Pytorch Sentiment Analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+497.58%)
Mutual labels: transformers, recurrent-neural-networks
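The tutorials above build recurrent models for sentiment classification. The core recurrence they rely on can be sketched in plain NumPy (hypothetical dimensions, not code from the repo):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 8-dim token embeddings, 16-dim hidden state.
embed_dim, hidden_dim = 8, 16
W_xh = rng.normal(scale=0.1, size=(embed_dim, hidden_dim))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
b_h = np.zeros(hidden_dim)

def rnn_forward(sequence):
    """Run a vanilla (Elman) RNN over a sequence of embedding vectors."""
    h = np.zeros(hidden_dim)
    for x in sequence:
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)  # one recurrence step
    return h  # final hidden state, fed to a sentiment classifier head

tokens = rng.normal(size=(5, embed_dim))  # a 5-token "sentence"
final_h = rnn_forward(tokens)
print(final_h.shape)  # (16,)
```

In practice the tutorials use `torch.nn` layers (embeddings, LSTMs, transformers) rather than raw matrices, but the per-step update is the same idea.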
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (-69.27%)
Mutual labels: transformers
pytorch-vit: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (-53.45%)
Mutual labels: transformers
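The ViT paper in the entry above treats an image as a sequence of 16x16 patches. The patchification step can be sketched as follows (assumed shapes for illustration, not the repo's code):

```python
import numpy as np

def to_patches(image, patch=16):
    """Split an (H, W, C) image into flattened non-overlapping patches.

    Returns an (N, patch*patch*C) array: the token sequence a ViT embeds.
    """
    h, w, c = image.shape
    assert h % patch == 0 and w % patch == 0, "image must tile evenly"
    # (H//p, p, W//p, p, C) -> (H//p, W//p, p, p, C) -> (N, p*p*C)
    grid = image.reshape(h // patch, patch, w // patch, patch, c)
    grid = grid.transpose(0, 2, 1, 3, 4)
    return grid.reshape(-1, patch * patch * c)

img = np.zeros((224, 224, 3))   # standard 224x224 RGB input
tokens = to_patches(img)
print(tokens.shape)  # (196, 768): 14x14 patches, each 16*16*3 values
```

Each flattened patch is then linearly projected and passed, with a class token and position embeddings, through a standard transformer encoder.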
regulatory-prediction: Code and data to accompany "Dilated Convolutions for Modeling Long-Distance Genomic Dependencies", presented at the ICML 2017 Workshop on Computational Biology
Stars: ✭ 26 (-95.16%)
Mutual labels: recurrent-neural-networks
keras-malicious-url-detector: Malicious URL detector using Keras recurrent networks and scikit-learn classifiers
Stars: ✭ 24 (-95.53%)
Mutual labels: recurrent-neural-networks
molecule-attention-transformer: PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-91.43%)
Mutual labels: transformers
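The transformer-based entries above all build on the same primitive. Here is a minimal scaled dot-product self-attention in NumPy (illustrative only, with made-up dimensions, not any repo's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a (tokens, dim) matrix."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (tokens, tokens) similarities
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # weighted mix of values

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(4, d))  # 4 tokens, e.g. atoms in a small molecule
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Molecule Attention Transformer augments these attention scores with inter-atomic distance and adjacency information, but the underlying operation is this one.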
PyPOMDP: Python implementation of a POMDP framework and the PBVI & POMCP algorithms.
Stars: ✭ 60 (-88.83%)
Mutual labels: reinforcement-learning-algorithms
deepfrog: An NLP suite powered by deep learning
Stars: ✭ 16 (-97.02%)
Mutual labels: transformers
CVPR21 PASS: PyTorch implementation of our CVPR 2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (-89.76%)
Mutual labels: continual-learning
DeepSegmentor: Sequence Segmentation using Joint RNN and Structured Prediction Models (ICASSP 2017)
Stars: ✭ 17 (-96.83%)
Mutual labels: recurrent-neural-networks
SpeakerDiarization RNN CNN LSTM: Speaker diarization is the problem of separating speakers in an audio recording. There can be any number of speakers, and the final result should state when each speaker starts and stops speaking. In this project, we analyze a given audio file with 2 channels and 2 speakers (on separate channels).
Stars: ✭ 56 (-89.57%)
Mutual labels: recurrent-neural-networks
modules: The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-95.34%)
Mutual labels: transformers
entity-network: TensorFlow implementation of "Tracking the World State with Recurrent Entity Networks" [https://arxiv.org/abs/1612.03969] by Henaff, Weston, Szlam, Bordes, and LeCun.
Stars: ✭ 58 (-89.2%)
Mutual labels: recurrent-neural-networks
sequence-rnn-py: Sequence analysis using recurrent neural networks (RNNs), based on Keras
Stars: ✭ 28 (-94.79%)
Mutual labels: recurrent-neural-networks
deep-learning: Assignments done for Udacity's Deep Learning MOOC with Vincent Vanhoucke
Stars: ✭ 94 (-82.5%)
Mutual labels: recurrent-neural-networks
converse: Conversational text analysis using various NLP techniques
Stars: ✭ 147 (-72.63%)
Mutual labels: transformers
NeuroAI: NeuroAI-UW seminar, a regular weekly seminar for the UW community, organized by the NeuroAI Shlizerman Lab.
Stars: ✭ 36 (-93.3%)
Mutual labels: recurrent-neural-networks