towhee: A framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+416.35%)
Nlp Experiments In Pytorch: PyTorch repository for text categorization and NER experiments in Turkish and English.
Stars: ✭ 35 (-77.99%)
YOLOv5-Lite: 🍅 Lighter, faster, and easier to deploy. Evolved from yolov5; the model is only 930+ KB (int8) and 1.7 MB (fp16), and it can reach 10+ FPS on a Raspberry Pi 4B with a 320×320 input.
Stars: ✭ 1,230 (+673.58%)
MASTER-pytorch: Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021).
Stars: ✭ 263 (+65.41%)
transformer: A PyTorch implementation of "Attention Is All You Need".
Stars: ✭ 28 (-82.39%)
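The entry above links a PyTorch implementation of "Attention Is All You Need"; the core operation any such repository implements is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal dependency-free Python sketch for illustration (not code from the linked repo):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output row = attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(scaled_dot_product_attention(Q, K, V))
```

Real implementations vectorize this with batched matrix multiplies and add masking and multi-head projections, but the arithmetic per head is exactly this.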
Keras Textclassification: Chinese text classification with Keras NLP: long-text, short-sentence, multi-label, and sentence-similarity classification; base classes for building embedding layers and network graphs; models include FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, Bert, Xlnet, Albert, Attention, DeepMoji, HAN, CapsuleNet, Transformer-encoder, Seq2seq, SWEM, LEAM, TextGCN.
Stars: ✭ 914 (+474.84%)
Kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition.
Stars: ✭ 190 (+19.5%)
graph-transformer-pytorch: Implementation of Graph Transformer in PyTorch, for potential use in replicating Alphafold2.
Stars: ✭ 81 (-49.06%)
Transgan: [Preprint] "TransGAN: Two Transformers Can Make One Strong GAN", Yifan Jiang, Shiyu Chang, Zhangyang Wang.
Stars: ✭ 864 (+443.4%)
bLVNet-TAM: The official code for the NeurIPS 2019 paper: Quanfu Fan, Richard Chen, Hilde Kuehne, Marco Pistoia, David Cox, "More Is Less: Learning Efficient Video Representations by Temporal Aggregation Modules".
Stars: ✭ 54 (-66.04%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ on top of NiuTensor for fast tensor APIs.
Stars: ✭ 112 (-29.56%)
Cell Detr: Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-83.65%)
verseagility: Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and take models into production.
Stars: ✭ 23 (-85.53%)
Sentimentanalysis: Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Stars: ✭ 186 (+16.98%)
zero: A neural machine translation system.
Stars: ✭ 121 (-23.9%)
Turbotransformers: A fast and user-friendly runtime for transformer inference (Bert, Albert, GPT2, decoders, etc.) on CPU and GPU.
Stars: ✭ 826 (+419.5%)
wenet: Production-first and production-ready end-to-end speech recognition toolkit.
Stars: ✭ 2,384 (+1399.37%)
query-selector: Long-term series forecasting with Query Selector, an efficient model of sparse attention.
Stars: ✭ 63 (-60.38%)
MusicTransformer-Pytorch: MusicTransformer written for MaestroV2 using the PyTorch framework for music generation.
Stars: ✭ 106 (-33.33%)
Getting Things Done With Pytorch: Jupyter Notebook tutorials on solving real-world problems with machine learning and deep learning using PyTorch. Topics: face detection with Detectron 2, time-series anomaly detection with LSTM autoencoders, object detection with YOLO v5, building your first neural network, time-series forecasting of daily coronavirus cases, sentiment analysis with BERT.
Stars: ✭ 738 (+364.15%)
Transformer Clinic: Understanding the Difficulty of Training Transformers.
Stars: ✭ 179 (+12.58%)
AdaSpeech: Adaptive Text to Speech for Custom Voice.
Stars: ✭ 108 (-32.08%)
Trax: Deep learning with clear code and speed.
Stars: ✭ 6,666 (+4092.45%)
Transformer-Transducer: PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020).
Stars: ✭ 61 (-61.64%)
Bert For Tf2: A Keras TensorFlow 2.0 implementation of BERT, ALBERT, and adapter-BERT.
Stars: ✭ 683 (+329.56%)
ICON: (TPAMI 2022) Salient Object Detection via Integrity Learning.
Stars: ✭ 125 (-21.38%)
Transformers.jl: Julia implementation of Transformer models.
Stars: ✭ 173 (+8.81%)
SegFormer: Official PyTorch implementation of SegFormer.
Stars: ✭ 1,264 (+694.97%)
Deep Ctr Prediction: CTR prediction models for ad recommendation based on deep learning.
Stars: ✭ 628 (+294.97%)
densecap: Dense video captioning in PyTorch.
Stars: ✭ 37 (-76.73%)
sticker2: Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-91.19%)
Graphormer: A deep learning package that lets researchers and developers train custom models for molecule modeling tasks, aiming to accelerate research and applications of AI in molecular science, such as material design and drug discovery.
Stars: ✭ 1,194 (+650.94%)
Gpt 2 Tensorflow2.0: OpenAI GPT2 pre-training and sequence prediction implementation in TensorFlow 2.0.
Stars: ✭ 172 (+8.18%)
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning.
Stars: ✭ 121 (-23.9%)
React Native Svg Transformer: Import SVG files in your React Native project the same way that you would in a web application.
Stars: ✭ 568 (+257.23%)
german-sentiment: A data set and model for German sentiment classification.
Stars: ✭ 37 (-76.73%)
DolboNet: A Russian-language chatbot for Discord built on the Transformer architecture.
Stars: ✭ 53 (-66.67%)
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR model with a Transformer network for Mandarin Chinese.
Stars: ✭ 565 (+255.35%)
basis-expansions: Basis expansion transformers in sklearn style.
Stars: ✭ 74 (-53.46%)
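"sklearn style" in the entry above means exposing the scikit-learn fit/transform contract. A minimal, hypothetical sketch of a polynomial basis-expansion transformer in that style (the class name and behavior are illustrative assumptions, not the basis-expansions package's actual API):

```python
class PolynomialBasis:
    """Hypothetical sklearn-style transformer (not the basis-expansions
    package's actual API): expands a 1-D feature into polynomial basis
    columns [x, x**2, ..., x**degree]."""

    def __init__(self, degree=3):
        self.degree = degree

    def fit(self, X, y=None):
        # The expansion is stateless, but fit still returns self so the
        # class honors the sklearn fit/transform (and fit().transform())
        # chaining convention.
        return self

    def transform(self, X):
        # Each scalar input becomes one row of basis-function values.
        return [[x ** d for d in range(1, self.degree + 1)] for x in X]

expander = PolynomialBasis(degree=3)
print(expander.fit([2.0]).transform([2.0]))  # one row: [x, x^2, x^3]
```

Because it follows the fit/transform contract, a transformer like this can slot into an sklearn `Pipeline` ahead of a linear model to fit nonlinear curves.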
BMT: Source code for "Bi-modal Transformer for Dense Video Captioning" (BMVC 2020).
Stars: ✭ 192 (+20.75%)
Rust Bert: Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...).
Stars: ✭ 510 (+220.75%)
transformer-ls: Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+26.42%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging".
Stars: ✭ 55 (-65.41%)
hififace: Unofficial PyTorch implementation of HifiFace (https://arxiv.org/abs/2106.09965).
Stars: ✭ 227 (+42.77%)
kaggle-champs: Code for the CHAMPS Predicting Molecular Properties Kaggle competition.
Stars: ✭ 49 (-69.18%)
bytekit: A Java library for byte manipulation (not a bytecode library).
Stars: ✭ 40 (-74.84%)
Torchnlp: Easy-to-use NLP library built on PyTorch and TorchText.
Stars: ✭ 233 (+46.54%)
Gpt2 Chitchat: GPT2 model for Chinese chitchat, implementing the MMI idea from DialoGPT.
Stars: ✭ 1,230 (+673.58%)