Trax — Deep Learning with Clear Code and Speed
Stars: ✭ 6,666 (+5190.48%)
verseagility — Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and move models into production.
Stars: ✭ 23 (-81.75%)
awesome-transformer-search — A curated list of awesome resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (+53.97%)
zero — A neural machine translation system
Stars: ✭ 121 (-3.97%)
Bert For Tf2 — A Keras TensorFlow 2.0 implementation of BERT, ALBERT, and adapter-BERT.
Stars: ✭ 683 (+442.06%)
wenet — Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 2,384 (+1792.06%)
Embedding As Service — One-stop solution to encode sentences into fixed-length vectors using various embedding techniques
Stars: ✭ 151 (+19.84%)
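Encoding a sentence into a fixed-length vector, as this entry describes, can be illustrated with a minimal mean-pooling sketch. This is not Embedding As Service's actual API — the embedding table and function below are hypothetical, purely to show the idea:

```python
import numpy as np

def embed_sentence(sentence, word_vectors, dim=4):
    """Encode a sentence into a fixed-length vector by mean-pooling the
    vectors of its known words (zero vector if no word is known)."""
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    if not vecs:
        return np.zeros(dim)
    return np.mean(vecs, axis=0)

# Toy 4-dimensional "embedding table" with made-up values.
word_vectors = {
    "transformers": np.array([1.0, 0.0, 0.0, 1.0]),
    "are": np.array([0.0, 1.0, 0.0, 1.0]),
    "fast": np.array([0.0, 0.0, 1.0, 1.0]),
}

v = embed_sentence("Transformers are fast", word_vectors)
```

Regardless of sentence length, the result always has the same dimensionality as the word vectors, which is what makes such encodings usable as features downstream.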
MusicTransformer-Pytorch — MusicTransformer for music generation, written for MaestroV2 using the PyTorch framework
Stars: ✭ 106 (-15.87%)
Deep Ctr Prediction — CTR prediction models based on deep learning (deep-learning CTR estimation models for ad recommendation)
Stars: ✭ 628 (+398.41%)
VideoTransformer-pytorch — PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (+26.19%)
AdaSpeech — AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (-14.29%)
Transformer-Transducer — PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (-51.59%)
Bigbird — Transformers for Longer Sequences
Stars: ✭ 146 (+15.87%)
React Native Svg Transformer — Import SVG files in your React Native project the same way that you would in a web application.
Stars: ✭ 568 (+350.79%)
ICON — (TPAMI 2022) Salient Object Detection via Integrity Learning.
Stars: ✭ 125 (-0.79%)
nested-transformer — Nested Hierarchical Transformer: https://arxiv.org/pdf/2105.12723.pdf
Stars: ✭ 174 (+38.1%)
SegFormer — Official PyTorch implementation of SegFormer
Stars: ✭ 1,264 (+903.17%)
Speech Transformer — A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+348.41%)
Tensorflowasr — End-to-end speech recognition models built on TensorFlow 2, with an RTF (real-time factor) of about 0.1 / Mandarin state-of-the-art automatic speech recognition in TensorFlow 2
Stars: ✭ 145 (+15.08%)
sticker2 — Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-88.89%)
Graphormer — A deep learning package that allows researchers and developers to train custom models for molecule modeling tasks, aiming to accelerate research and applications in AI for molecular science, such as material design and drug discovery.
Stars: ✭ 1,194 (+847.62%)
fastT5 — ⚡ Boost the inference speed of T5 models by 5x and reduce model size by 3x.
Stars: ✭ 421 (+234.13%)
h-transformer-1d — Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-3.97%)
Rust Bert — Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Stars: ✭ 510 (+304.76%)
german-sentiment — A dataset and model for German sentiment classification.
Stars: ✭ 37 (-70.63%)
Tupe — Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT.
Stars: ✭ 143 (+13.49%)
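The core idea behind TUPE — computing content-to-content and position-to-position attention scores with separate, untied projections and summing them, instead of adding positions to token embeddings before one shared projection — can be sketched in a few lines. This is an illustrative NumPy sketch of the scoring rule as described in the paper, not the repository's implementation; all names and shapes are assumptions:

```python
import numpy as np

def untied_attention_logits(x, p, wq, wk, uq, uk):
    """TUPE-style untied attention logits: the content term (x Wq)(x Wk)^T
    and the positional term (p Uq)(p Uk)^T use separate projection
    matrices and are summed, then rescaled."""
    d = wq.shape[1]
    content = (x @ wq) @ (x @ wk).T    # token-token scores
    position = (p @ uq) @ (p @ uk).T   # position-position scores
    return (content + position) / np.sqrt(2 * d)  # rescaled to keep variance comparable

rng = np.random.default_rng(0)
n, d = 5, 8
x = rng.normal(size=(n, d))   # token embeddings
p = rng.normal(size=(n, d))   # absolute position embeddings
wq, wk, uq, uk = (rng.normal(size=(d, d)) for _ in range(4))
logits = untied_attention_logits(x, p, wq, wk, uq, uk)
```

Untying the projections lets the model learn positional attention patterns independently of token content, which the paper argues the standard additive scheme entangles.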
DolboNet — A Russian-language Discord chatbot built on the Transformer architecture
Stars: ✭ 53 (-57.94%)
Lightseq — LightSeq: A High-Performance Inference Library for Sequence Processing and Generation
Stars: ✭ 501 (+297.62%)
VT-UNet — [MICCAI 2022] Official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation"
Stars: ✭ 151 (+19.84%)
basis-expansions — Basis-expansion transformers in sklearn style.
Stars: ✭ 74 (-41.27%)
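An "sklearn-style" transformer means an object exposing the `fit`/`transform` convention so it can slot into scikit-learn pipelines. A minimal, dependency-free sketch of a basis expansion in that style — illustrative only, not the basis-expansions package's actual API:

```python
import numpy as np

class PolynomialBasis:
    """A minimal basis-expansion transformer following sklearn's
    fit/transform convention: maps one feature column to the columns
    [x, x^2, ..., x^degree]. Hypothetical class, for illustration."""
    def __init__(self, degree=3):
        self.degree = degree

    def fit(self, X, y=None):
        return self  # stateless: nothing to learn from the data

    def transform(self, X):
        X = np.asarray(X, dtype=float).reshape(-1, 1)
        return np.hstack([X ** d for d in range(1, self.degree + 1)])

    def fit_transform(self, X, y=None):
        return self.fit(X, y).transform(X)

Xt = PolynomialBasis(degree=2).fit_transform([1.0, 2.0, 3.0])
```

Because the interface matches scikit-learn's estimator contract, adding `BaseEstimator` and `TransformerMixin` as base classes would make such a transformer usable directly inside a `Pipeline`.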
Awesome Visual Transformer — A collection of papers on Transformers for vision; Awesome Transformer with Computer Vision (CV)
Stars: ✭ 475 (+276.98%)
proc-that — proc(ess)-that, an easily extendable ETL tool for Node.js, written in TypeScript.
Stars: ✭ 25 (-80.16%)
Nlp research — NLP research: a TensorFlow-based NLP deep learning project supporting four tasks: text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (+11.9%)
keyword-transformer — Official implementation of the Keyword Transformer: https://arxiv.org/abs/2104.00769
Stars: ✭ 76 (-39.68%)
Seq2seqchatbots — A wrapper around tensor2tensor to flexibly train, interact with, and generate data for neural chatbots.
Stars: ✭ 466 (+269.84%)
kospeech — Open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (+261.9%)
sparql-transformer — A handier way to use SPARQL data in your web app
Stars: ✭ 38 (-69.84%)
TitleStylist — Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (-42.86%)
Jukebox — Code for the paper "Jukebox: A Generative Model for Music"
Stars: ✭ 4,863 (+3759.52%)
Transformer-MM-Explainability — [ICCV 2021, Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers," a novel method for visualizing any Transformer-based network, including examples for DETR and VQA.
Stars: ✭ 484 (+284.13%)
Sightseq — Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (-7.94%)
alpr utils — ALPR model for Chinese license plates in unconstrained scenarios
Stars: ✭ 158 (+25.4%)
Joeynmt — Minimalist NMT for educational purposes
Stars: ✭ 420 (+233.33%)
YaEtl — Yet Another ETL in PHP
Stars: ✭ 60 (-52.38%)
Ner Bert Pytorch — PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+97.62%)
Pytorch Original Transformer — My implementation of the original Transformer model (Vaswani et al.). Additionally includes the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+226.19%)
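The core building block of the original Transformer model (Vaswani et al.) that such implementations center on is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A minimal NumPy sketch of that formula, independent of this repository's actual code:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention from "Attention Is All You Need":
    softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.normal(size=(3, 8))  # 3 queries of dimension 8
k = rng.normal(size=(4, 8))  # 4 keys
v = rng.normal(size=(4, 8))  # 4 values
out, w = scaled_dot_product_attention(q, k, v)
```

Each output row is a convex combination of the value rows, with weights determined by query-key similarity; the 1/√d_k scaling keeps the logits from saturating the softmax as the dimension grows.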
text simplification — Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq models).
Stars: ✭ 66 (-47.62%)
Neural-Scam-Artist — Web scraping, document deduplication, and GPT-2 fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-85.71%)
libai — LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+125.4%)
vietnamese-roberta — A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-82.54%)
Lumen Api Starter — An API starter project built on Lumen 8, with a carefully designed directory structure, a consistent and standardized response format, and best practices for the Repository-pattern architecture.
Stars: ✭ 197 (+56.35%)
transformer — Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-52.38%)