Awesome Bert Nlp: A curated list of NLP resources focused on BERT, the attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+829.51%)
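The attention mechanism that ties these resources together can be sketched in a few lines of plain Python. This is an illustrative scaled dot-product attention over lists of vectors, not code from any of the listed repositories:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention: each query attends over all keys,
    and the output is the attention-weighted average of the values."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out
```

With one-hot values, each output row is just the attention weights themselves, which makes the weighting easy to inspect.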
SIGIR2021 Conure: One Person, One Model, One World: Learning Continual User Representation without Forgetting.
Stars: ✭ 23 (-62.3%)
Transfer Nlp: An NLP library designed for reproducible experimentation management.
Stars: ✭ 287 (+370.49%)
Bert language understanding: Pre-training of Deep Bidirectional Transformers for Language Understanding; also pre-trains TextCNN.
Stars: ✭ 933 (+1429.51%)
Haystack: 🔍 An open-source NLP framework that leverages Transformer models, enabling developers to implement production-ready neural search, question answering, semantic document search, and summarization for a wide range of applications.
Stars: ✭ 3,409 (+5488.52%)
Spacy Transformers: 🛸 Use pretrained transformers like BERT, XLNet, and GPT-2 in spaCy.
Stars: ✭ 919 (+1406.56%)
Transformers: 🤗 State-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+91280.33%)
Pytorch Openai Transformer Lm: 🐥 A PyTorch implementation of OpenAI's fine-tuned Transformer language model, with a script to import the weights pre-trained by OpenAI.
Stars: ✭ 1,268 (+1978.69%)
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+568.85%)
Bert Keras: Keras implementation of BERT with pre-trained weights.
Stars: ✭ 820 (+1244.26%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models for the Filipino language.
Stars: ✭ 22 (-63.93%)
Relational Rnn Pytorch: An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (+286.89%)
Bert Sklearn: A scikit-learn wrapper for Google's BERT model.
Stars: ✭ 182 (+198.36%)
Gpt2 French: GPT-2 French demo.
Stars: ✭ 47 (-22.95%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms.
Stars: ✭ 204 (+234.43%)
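FNet's core idea is to replace self-attention with an unparameterized 2D Fourier transform, keeping only the real part. A minimal pure-Python sketch of that token-mixing step (a naive DFT for clarity; the repository above uses PyTorch FFTs):

```python
import cmath

def dft(xs):
    # Naive discrete Fourier transform of a 1-D sequence.
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(xs))
            for k in range(n)]

def fnet_mixing(tokens):
    """FNet-style mixing: DFT along the sequence dimension, then along
    the hidden dimension, keeping the real part of the result.
    (The 2D transform is separable, so the order does not matter.)"""
    seq_len, hidden = len(tokens), len(tokens[0])
    # DFT along the sequence dimension, one hidden channel at a time.
    cols = [dft([tokens[t][h] for t in range(seq_len)]) for h in range(hidden)]
    mixed = [[cols[h][t] for h in range(hidden)] for t in range(seq_len)]
    # DFT along the hidden dimension, then take the real part.
    return [[v.real for v in dft(row)] for row in mixed]
```

Because the transform has no learned parameters, this sublayer is a fixed linear mixing of every token with every other token, which is what lets FNet trade a little accuracy for a large speedup over attention.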
AITQA: Resources for the IBM Airlines Table-Question-Answering benchmark.
Stars: ✭ 12 (-80.33%)
Getting Things Done With Pytorch: Jupyter Notebook tutorials on solving real-world problems with machine learning and deep learning using PyTorch. Topics: face detection with Detectron2, time-series anomaly detection with LSTM autoencoders, object detection with YOLOv5, building your first neural network, time-series forecasting for daily coronavirus cases, and sentiment analysis with BERT.
Stars: ✭ 738 (+1109.84%)
Bert Pytorch: Google AI 2018 BERT PyTorch implementation.
Stars: ✭ 4,642 (+7509.84%)
Tupe: Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT.
Stars: ✭ 143 (+134.43%)
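The "untied" idea behind TUPE is that content and position contribute to the attention logits through separate projections, rather than being summed into one embedding before the query/key projections. A hedged pure-Python sketch of that logit computation (weight names `Wq`, `Wk`, `Uq`, `Uk` are illustrative, not the repository's API):

```python
import math

def matmul(A, B):
    # Plain list-of-lists matrix multiply.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def untied_attention_logits(X, P, Wq, Wk, Uq, Uk):
    """TUPE-style attention logits: a content-content term from the token
    embeddings X plus a separately parameterized position-position term
    from the positional embeddings P, jointly scaled by sqrt(2d)."""
    d = len(Wq[0])
    scale = math.sqrt(2 * d)
    content = matmul(matmul(X, Wq), transpose(matmul(X, Wk)))
    position = matmul(matmul(P, Uq), transpose(matmul(P, Uk)))
    n = len(X)
    return [[(content[i][j] + position[i][j]) / scale for j in range(n)]
            for i in range(n)]
```

Keeping the two terms separate means positional correlations are learned with their own projections instead of being entangled with word content, which is the paper's central argument.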
Gpt Scrolls: A collaborative collection of open-source, safe GPT-3 prompts that work well.
Stars: ✭ 195 (+219.67%)
Flow Forecast: A deep learning PyTorch library for time-series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (+503.28%)
wechsel: Code for WECHSEL: Effective Initialization of Subword Embeddings for Cross-Lingual Transfer of Monolingual Language Models.
Stars: ✭ 39 (-36.07%)
Cross Domain ner: Cross-domain NER using cross-domain language modeling; code for an ACL 2019 paper.
Stars: ✭ 67 (+9.84%)
Gpt2: PyTorch implementation of OpenAI GPT-2.
Stars: ✭ 64 (+4.92%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+275.41%)
Context-Transformer: Context-Transformer: Tackling Object Confusion for Few-Shot Detection (AAAI 2020).
Stars: ✭ 89 (+45.9%)
DAN: Code release for "Learning Transferable Features with Deep Adaptation Networks" (ICML 2015).
Stars: ✭ 149 (+144.26%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21).
Stars: ✭ 44 (-27.87%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021).
Stars: ✭ 127 (+108.2%)
Syn2Real: Repository for transfer learning using deep CNNs trained with synthetic images.
Stars: ✭ 16 (-73.77%)
Transformer-ocr: Handwritten text recognition using Transformers.
Stars: ✭ 92 (+50.82%)
image-classification: A collection of SOTA image classification models in PyTorch.
Stars: ✭ 70 (+14.75%)
LaTeX-OCR: pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+2467.21%)
transformer-slt: Sign Language Translation with Transformers (COLING 2020; ECCV'20 SLRTP Workshop).
Stars: ✭ 92 (+50.82%)
DeepPhonemizer: Grapheme-to-phoneme conversion with deep learning.
Stars: ✭ 152 (+149.18%)
towhee: A framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+1245.9%)
transform-graphql: ⚙️ Transformer function to transform GraphQL directives; for example, create a model CRUD directive.
Stars: ✭ 23 (-62.3%)
MetaHeac: The official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising" (KDD 2021).
Stars: ✭ 36 (-40.98%)
YOLOv5-Lite: 🍅 Lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16), and it can reach 10+ FPS on a Raspberry Pi 4B with a 320×320 input.
Stars: ✭ 1,230 (+1916.39%)
OverlapPredator: PREDATOR: Registration of 3D Point Clouds with Low Overlap (CVPR 2021, oral).
Stars: ✭ 293 (+380.33%)
Word-Prediction-Ngram: Next-word prediction using an n-gram probabilistic model with various smoothing techniques.
Stars: ✭ 25 (-59.02%)
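The smoothed n-gram approach behind this project can be illustrated in a few lines: count bigrams, then score candidate next words with add-k (Laplace) smoothing so unseen continuations still get nonzero probability. A minimal sketch (function names are illustrative, not the repository's API):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count bigram successors and collect the vocabulary."""
    bigrams = defaultdict(Counter)
    vocab = set()
    for sentence in corpus:
        words = sentence.lower().split()
        vocab.update(words)
        for w1, w2 in zip(words, words[1:]):
            bigrams[w1][w2] += 1
    return bigrams, vocab

def predict_next(bigrams, vocab, word, k=1.0):
    """Return the most probable next word under add-k smoothing:
    P(w2|w1) = (count(w1,w2) + k) / (count(w1) + k*|V|)."""
    counts = bigrams.get(word, Counter())
    total = sum(counts.values())
    V = len(vocab)
    probs = {w: (counts[w] + k) / (total + k * V) for w in vocab}
    return max(probs, key=probs.get)
```

Varying k (or swapping in Good-Turing or Kneser-Ney smoothing, as the project does) changes how much probability mass is reserved for unseen bigrams.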
TransPose: PyTorch implementation of "TransPose: Keypoint Localization via Transformer" (ICCV 2021).
Stars: ✭ 250 (+309.84%)
super-gradients: Easily train or fine-tune SOTA computer-vision models with one open-source training library.
Stars: ✭ 429 (+603.28%)
HRFormer: The official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+485.25%)
tying-wv-and-wc: Implementation of "Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling".
Stars: ✭ 39 (-36.07%)
gpt-j: A GPT-J API for Python 3 to generate text, blogs, code, and more.
Stars: ✭ 101 (+65.57%)