
NeuromatchAcademy / course-content-dl

Licenses found: CC-BY-4.0 (LICENSE.md), BSD-3-Clause (LICENSE-CODE.md)
NMA deep learning course

Programming Languages

Jupyter Notebook

Projects that are alternatives to or similar to course-content-dl

Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks Course hosted on educative.io
Stars: ✭ 33 (-93.85%)
Mutual labels:  transformers, recurrent-neural-networks
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+497.58%)
Mutual labels:  transformers, recurrent-neural-networks
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (-69.27%)
Mutual labels:  transformers
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (-53.45%)
Mutual labels:  transformers
regulatory-prediction
Code and Data to accompany "Dilated Convolutions for Modeling Long-Distance Genomic Dependencies", presented at the ICML 2017 Workshop on Computational Biology
Stars: ✭ 26 (-95.16%)
Mutual labels:  recurrent-neural-networks
keras-malicious-url-detector
Malicious URL detector using keras recurrent networks and scikit-learn classifiers
Stars: ✭ 24 (-95.53%)
Mutual labels:  recurrent-neural-networks
molecule-attention-transformer
Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-91.43%)
Mutual labels:  transformers
PyPOMDP
Python implementation of POMDP framework and PBVI & POMCP algorithms.
Stars: ✭ 60 (-88.83%)
Mutual labels:  reinforcement-learning-algorithms
deepfrog
An NLP-suite powered by deep learning
Stars: ✭ 16 (-97.02%)
Mutual labels:  transformers
CVPR21 PASS
PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (-89.76%)
Mutual labels:  continual-learning
DeepSegmentor
Sequence Segmentation using Joint RNN and Structured Prediction Models (ICASSP 2017)
Stars: ✭ 17 (-96.83%)
Mutual labels:  recurrent-neural-networks
SpeakerDiarization RNN CNN LSTM
Speaker diarization is the problem of separating speakers in an audio recording. There could be any number of speakers, and the final result should state when each speaker starts and ends. In this project, we analyze a given audio file with 2 channels and 2 speakers (on separate channels).
Stars: ✭ 56 (-89.57%)
Mutual labels:  recurrent-neural-networks
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-95.34%)
Mutual labels:  transformers
entity-network
Tensorflow implementation of "Tracking the World State with Recurrent Entity Networks" [https://arxiv.org/abs/1612.03969] by Henaff, Weston, Szlam, Bordes, and LeCun.
Stars: ✭ 58 (-89.2%)
Mutual labels:  recurrent-neural-networks
COVID-19-Tweet-Classification-using-Roberta-and-Bert-Simple-Transformers
Rank 1 / 216
Stars: ✭ 24 (-95.53%)
Mutual labels:  transformers
sequence-rnn-py
Sequence analysis using Recurrent Neural Networks (RNN) based on Keras
Stars: ✭ 28 (-94.79%)
Mutual labels:  recurrent-neural-networks
deep-learning
Assignments done for Udacity's Deep Learning MOOC with Vincent Vanhoucke
Stars: ✭ 94 (-82.5%)
Mutual labels:  recurrent-neural-networks
TD3-BipedalWalkerHardcore-v2
Solve BipedalWalkerHardcore-v2 with TD3
Stars: ✭ 41 (-92.36%)
Mutual labels:  reinforcement-learning-algorithms
converse
Conversational text analysis using various NLP techniques
Stars: ✭ 147 (-72.63%)
Mutual labels:  transformers
NeuroAI
NeuroAI-UW seminar, a regular weekly seminar for the UW community, organized by NeuroAI Shlizerman Lab.
Stars: ✭ 36 (-93.3%)
Mutual labels:  recurrent-neural-networks

Neuromatch Academy Deep Learning (NMA-DL) syllabus

July 11-29, 2022

Objectives: Gain hands-on, code-first experience with deep learning theories, models, and skills that are useful for applications and for advancing science. We focus on how to decide which problems can be tackled with deep learning, how to determine which model is best, how to best implement a model, how to visualize and justify findings, and how neuroscience can inspire deep learning. Throughout, we emphasize the ethical use of DL.
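As a flavor of this code-first approach, the sketch below shows a minimal PyTorch training loop of the kind the tutorials build on. The toy data, architecture, and hyperparameters are illustrative assumptions, not taken from the course materials:

```python
# A minimal sketch of a code-first PyTorch workflow (illustrative only;
# the toy task, model size, and hyperparameters are assumptions, not course content).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy regression data: y = 3x + noise
x = torch.linspace(-1, 1, 200).unsqueeze(1)
y = 3 * x + 0.1 * torch.randn_like(x)

# A small multilayer perceptron
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Standard training loop: forward pass, loss, backward pass, parameter update
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```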

Please check out expected prerequisites here!

The content should primarily be accessed from our ebook: https://deeplearning.neuromatch.io/ [under continuous development]

Schedule for 2022: https://github.com/NeuromatchAcademy/course-content-dl/blob/main/tutorials/Schedule/daily_schedules.md


Licensing

The contents of this repository are shared under a Creative Commons Attribution 4.0 International License.

Software elements are additionally licensed under the BSD (3-Clause) License.

Derivative works may use the license that is more appropriate to the relevant context.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].