Attentive Neural Processes: an implementation of "recurrent attentive neural processes" for forecasting power usage (with an LSTM baseline and MC Dropout).
Stars: ✭ 33 (-73.39%)
Pytorch Gat: My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+632.26%)
Absa Pytorch: Aspect-Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+852.42%)
Switchnorm Segmentation: Switchable Normalization for semantic image segmentation and scene parsing.
Stars: ✭ 47 (-62.1%)
Simplecvreproduction: Reproductions of simple CV projects, including attention modules, classification, object detection, segmentation, keypoint detection, tracking 😄, etc.
Stars: ✭ 602 (+385.48%)
Captcharecognition: End-to-end variable-length captcha recognition using CNN+RNN+Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-21.77%)
Lightnet: LightNet, light-weight networks for semantic image segmentation (Cityscapes and Mapillary Vistas datasets).
Stars: ✭ 698 (+462.9%)
Time Attention: Implementation of an RNN for time-series prediction from the paper https://arxiv.org/abs/1704.02971.
Stars: ✭ 52 (-58.06%)
Tusimple Duc: Understanding Convolution for Semantic Segmentation.
Stars: ✭ 567 (+357.26%)
Attentions: PyTorch implementations of several attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-68.55%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute.
Stars: ✭ 1,497 (+1107.26%)
Deeplabv3 Plus: TensorFlow 2.3.0 implementation of DeepLabV3-Plus.
Stars: ✭ 32 (-74.19%)
Nlp Tutorial: Natural Language Processing tutorial for deep learning researchers.
Stars: ✭ 9,895 (+7879.84%)
Nlp tensorflow project: TensorFlow implementations of several NLP projects, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-78.23%)
Leader Line: Draw a leader line in your web page.
Stars: ✭ 1,872 (+1409.68%)
Tf Rnn Attention: TensorFlow implementation of an attention mechanism for text classification tasks.
Stars: ✭ 735 (+492.74%)
Deeplearning Nlp Models: A small, interpretable codebase re-implementing a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-48.39%)
Efficient Segmentation Networks: Lightweight models for real-time semantic segmentation in PyTorch (including SQNet, LinkNet, SegNet, UNet, ENet, ERFNet, EDANet, ESPNet, ESPNetv2, LEDNet, ESNet, FSSNet, CGNet, DABNet, Fast-SCNN, ContextNet, FPENet, etc.).
Stars: ✭ 579 (+366.94%)
Yolov4 Pytorch: A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4, with PASCAL VOC and COCO.
Stars: ✭ 1,070 (+762.9%)
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR Transformer network for Mandarin Chinese.
Stars: ✭ 565 (+355.65%)
Numpy Ml: Machine learning, in NumPy.
Stars: ✭ 11,100 (+8851.61%)
Biblosa Pytorch: Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-65.32%)
Attention Transfer: Improving Convolutional Networks via Attention Transfer (ICLR 2017).
Stars: ✭ 1,231 (+892.74%)
Attentioncluster: TensorFlow implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification".
Stars: ✭ 33 (-73.39%)
Nlp Models Tensorflow: Gathers machine learning and TensorFlow deep learning models for NLP problems, 1.13 < TensorFlow < 2.0.
Stars: ✭ 1,603 (+1192.74%)
Pytorch Auto Drive: Segmentation models (ERFNet, ENet, DeepLab, FCN...) and lane detection models (SCNN, SAD, PRNet, RESA, LSTR...) based on PyTorch 1.6 with mixed-precision training.
Stars: ✭ 32 (-74.19%)
Defactonlp: DeFactoNLP, an automated fact-checking system that uses named entity recognition, TF-IDF vector comparison, and decomposable attention models.
Stars: ✭ 30 (-75.81%)
Isab Pytorch: An implementation of the (Induced) Set Attention Block, from the Set Transformers paper.
Stars: ✭ 21 (-83.06%)
Hatn: Hierarchical Attention Transfer Network for Cross-domain Sentiment Classification (AAAI'18).
Stars: ✭ 73 (-41.13%)
Cell Detr: Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-79.03%)
Deeplab V3 Plus Cityscapes: mIoU = 80.02 on Cityscapes. My implementation of DeepLabv3+ (also known as 'Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation'), trained on the Cityscapes dataset.
Stars: ✭ 121 (-2.42%)
Enjoy Hamburger: [ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-44.35%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97 (-21.77%)
Text Classification: Implementations of text-classification papers on the DBpedia dataset.
Stars: ✭ 682 (+450%)
Global Self Attention Network: A PyTorch implementation of Global Self-Attention Network, a fully-attentional backbone for vision tasks.
Stars: ✭ 64 (-48.39%)
Vad: Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM, and ACAM-based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+401.61%)
Bisenet: My implementation of BiSeNet, with BiSeNetV2 added.
Stars: ✭ 589 (+375%)
Nlp Journey: Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+940.32%)
Fluence: A deep learning library based on PyTorch, focused on low-resource language research and robustness.
Stars: ✭ 54 (-56.45%)
Fastpunct: Punctuation restoration and spell-correction experiments.
Stars: ✭ 121 (-2.42%)
Sightseq: Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection.
Stars: ✭ 116 (-6.45%)
Dabnet: Depth-wise Asymmetric Bottleneck for Real-time Semantic Segmentation (BMVC 2019).
Stars: ✭ 109 (-12.1%)
Eval On Nn Of Rc: Empirical Evaluation on Current Neural Networks on Cloze-style Reading Comprehension.
Stars: ✭ 84 (-32.26%)