701 open source projects that are alternatives to or similar to MinTL

Nlp Paper
NLP Paper
Stars: ✭ 484 (+693.44%)
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+829.51%)
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-62.3%)
Mutual labels:  transformer, transfer-learning
Transfer Nlp
NLP library designed for reproducible experimentation management
Stars: ✭ 287 (+370.49%)
Bert language understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (+1429.51%)
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+5488.52%)
Spacy Transformers
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
Stars: ✭ 919 (+1406.56%)
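
As a rough illustration of the kind of usage spacy-transformers enables (a minimal sketch, assuming spaCy and the `en_core_web_trf` pipeline are installed):

```python
# Minimal sketch of a transformer-backed spaCy pipeline.
# Assumes spacy and spacy-transformers are installed and the model was
# downloaded with: python -m spacy download en_core_web_trf
import spacy

nlp = spacy.load("en_core_web_trf")  # transformer-based English pipeline
doc = nlp("MinTL fine-tunes pretrained language models for task-oriented dialogue.")

for ent in doc.ents:                 # entities predicted by the transformer pipeline
    print(ent.text, ent.label_)
```
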
Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+91280.33%)
Mutual labels:  transformer, language-model
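
For context, a minimal sketch of the Transformers `pipeline` API; the default sentiment model is chosen by the library and may change between versions:

```python
# Minimal sketch using the Hugging Face Transformers pipeline API.
# The model weights are downloaded on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Task-oriented dialogue with transfer learning works surprisingly well."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```
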
Pytorch Openai Transformer Lm
🐥A PyTorch implementation of OpenAI's finetuned transformer language model with a script to import the weights pre-trained by OpenAI
Stars: ✭ 1,268 (+1978.69%)
Mutual labels:  transformer, language-model
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+568.85%)
Mutual labels:  transformer, language-model
Bert Keras
Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+1244.26%)
Mutual labels:  transformer, transfer-learning
Vietnamese Electra
ELECTRA model pre-trained on a Vietnamese corpus
Stars: ✭ 55 (-9.84%)
Mutual labels:  transformer, language-model
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-63.93%)
Mutual labels:  transformer, transfer-learning
Relational Rnn Pytorch
An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (+286.89%)
Mutual labels:  transformer, language-model
Highway-Transformer
[ACL '20] Highway Transformer: A Gated Transformer.
Stars: ✭ 26 (-57.38%)
Mutual labels:  transformer, language-model
Bert Sklearn
A scikit-learn wrapper for Google's BERT model
Stars: ✭ 182 (+198.36%)
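
A hedged sketch of the scikit-learn-style interface such a wrapper exposes; the `BertClassifier` class name and its defaults are assumptions to check against the repository:

```python
# Hedged sketch of an sklearn-style BERT wrapper as described above.
# BertClassifier and its defaults are assumptions; consult the bert-sklearn
# repository for the exact interface.
from bert_sklearn import BertClassifier

X_train = ["book a table for two", "what is the weather tomorrow"]
y_train = ["restaurant", "weather"]

model = BertClassifier()      # sklearn-style estimator wrapping BERT
model.fit(X_train, y_train)   # fine-tunes BERT on the labelled text
print(model.predict(["reserve a table tonight"]))
```
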
Gpt2 French
GPT-2 French demo
Stars: ✭ 47 (-22.95%)
Mutual labels:  transformer, language-model
FNet-pytorch
Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+234.43%)
Mutual labels:  transformer, language-model
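
To illustrate the idea named in the description (a generic sketch, not the repository's code): FNet replaces self-attention with an unparameterized Fourier transform over the token and hidden dimensions, keeping only the real part:

```python
# Illustrative sketch of FNet's token-mixing layer: a 2D FFT over the
# sequence and hidden dimensions, real part only (no learned attention weights).
import torch
import torch.nn as nn

class FourierMixing(nn.Module):
    def forward(self, x):                        # x: (batch, seq_len, hidden)
        return torch.fft.fft2(x, dim=(-2, -1)).real

x = torch.randn(2, 16, 64)
print(FourierMixing()(x).shape)                  # torch.Size([2, 16, 64])
```
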
Indonesian Language Models
Indonesian Language Models and Their Usage
Stars: ✭ 64 (+4.92%)
Mutual labels:  transformer, language-model
AITQA
resources for the IBM Airlines Table-Question-Answering Benchmark
Stars: ✭ 12 (-80.33%)
Mutual labels:  transformer, transfer-learning
Abstractive Summarization With Transfer Learning
Abstractive summarization using BERT as the encoder and a Transformer decoder
Stars: ✭ 358 (+486.89%)
Mutual labels:  transformer, transfer-learning
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+1109.84%)
Mutual labels:  transformer, transfer-learning
Bert Pytorch
PyTorch implementation of Google AI's 2018 BERT
Stars: ✭ 4,642 (+7509.84%)
Mutual labels:  transformer, language-model
Tupe
Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models like BERT.
Stars: ✭ 143 (+134.43%)
Mutual labels:  transformer, language-model
Gpt Scrolls
A collaborative collection of open-source safe GPT-3 prompts that work well
Stars: ✭ 195 (+219.67%)
Mutual labels:  transformer, language-model
Flow Forecast
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (+503.28%)
Mutual labels:  transformer, transfer-learning
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-36.07%)
Cross Domain ner
Cross-domain NER using cross-domain language modeling; code for the ACL 2019 paper
Stars: ✭ 67 (+9.84%)
Gpt2
PyTorch Implementation of OpenAI GPT-2
Stars: ✭ 64 (+4.92%)
Mutual labels:  transformer, language-model
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+275.41%)
Context-Transformer
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (+45.9%)
Mutual labels:  transformer, transfer-learning
DAN
Code release of "Learning Transferable Features with Deep Adaptation Networks" (ICML 2015)
Stars: ✭ 149 (+144.26%)
Mutual labels:  transfer-learning
laravel5-hal-json
Laravel 5 HAL+JSON API Transformer Package
Stars: ✭ 15 (-75.41%)
Mutual labels:  transformer
laravel-scene
Laravel Transformer
Stars: ✭ 27 (-55.74%)
Mutual labels:  transformer
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-27.87%)
Mutual labels:  transformer
Vision-Language-Transformer
Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (+108.2%)
Mutual labels:  transformer
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (+3.28%)
Mutual labels:  transformer
tf2-transformer-chatbot
Transformer Chatbot in TensorFlow 2 with TPU support.
Stars: ✭ 94 (+54.1%)
Mutual labels:  transformer
chainer-notebooks
Jupyter notebooks for Chainer hands-on
Stars: ✭ 23 (-62.3%)
Mutual labels:  language-model
Syn2Real
Repository for Transfer Learning using Deep CNNs trained with synthetic images
Stars: ✭ 16 (-73.77%)
Mutual labels:  transfer-learning
Transformer-ocr
Handwritten text recognition using transformers.
Stars: ✭ 92 (+50.82%)
Mutual labels:  transformer
image-classification
A collection of SOTA Image Classification Models in PyTorch
Stars: ✭ 70 (+14.75%)
Mutual labels:  transformer
LaTeX-OCR
pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+2467.21%)
Mutual labels:  transformer
transformer-slt
Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (+50.82%)
Mutual labels:  transformer
DeepPhonemizer
Grapheme to phoneme conversion with deep learning.
Stars: ✭ 152 (+149.18%)
Mutual labels:  transformer
towhee
Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+1245.9%)
Mutual labels:  transformer
transform-graphql
⚙️ Transformer function to transform GraphQL directives, e.g. creating a model CRUD directive.
Stars: ✭ 23 (-62.3%)
Mutual labels:  transformer
MetaHeac
Official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising" (KDD 2021).
Stars: ✭ 36 (-40.98%)
Mutual labels:  transfer-learning
bert-movie-reviews-sentiment-classifier
Build a Movie Reviews Sentiment Classifier with Google's BERT Language Model
Stars: ✭ 12 (-80.33%)
Mutual labels:  language-model
YOLOv5-Lite
🍅🍅🍅 YOLOv5-Lite: lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) and 1.7 MB (fp16) and reaches 10+ FPS on a Raspberry Pi 4B at a 320×320 input size.
Stars: ✭ 1,230 (+1916.39%)
Mutual labels:  transformer
OverlapPredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+380.33%)
Mutual labels:  transformer
Word-Prediction-Ngram
Next Word Prediction using n-gram Probabilistic Model with various Smoothing Techniques
Stars: ✭ 25 (-59.02%)
Mutual labels:  language-model
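
To make the technique concrete (a generic sketch, not the repository's code): a bigram predictor with add-one (Laplace) smoothing picks the next word by smoothed conditional probability:

```python
# Illustrative bigram next-word predictor with add-one (Laplace) smoothing.
from collections import Counter, defaultdict

corpus = "i want to book a table i want to check the weather".split()
vocab = set(corpus)

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word_prob(prev, word):
    # P(word | prev) with add-one smoothing over the vocabulary
    return (bigrams[prev][word] + 1) / (sum(bigrams[prev].values()) + len(vocab))

print(max(vocab, key=lambda w: next_word_prob("want", w)))  # -> "to"
```
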
Deep-Learning-Experiments-implemented-using-Google-Colab
Colab Compatible FastAI notebooks for NLP and Computer Vision Datasets
Stars: ✭ 16 (-73.77%)
Mutual labels:  transfer-learning
Awesome-Dialogue-State-Tracking
Dialogue State Tracking (DST) Papers, Datasets, Resources 🤩
Stars: ✭ 94 (+54.1%)
Mutual labels:  task-oriented-dialogue
TransPose
PyTorch Implementation for "TransPose: Keypoint localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+309.84%)
Mutual labels:  transformer
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+603.28%)
Mutual labels:  transfer-learning
HRFormer
This is an official implementation of our NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+485.25%)
Mutual labels:  transformer
tying-wv-and-wc
Implementation for "Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling"
Stars: ✭ 39 (-36.07%)
Mutual labels:  language-model
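
To illustrate the tying described above (a generic PyTorch sketch, not the repository's code): the output softmax projection reuses the input embedding matrix:

```python
# Generic PyTorch sketch of weight tying as in
# "Tying Word Vectors and Word Classifiers"; sizes are arbitrary.
import torch.nn as nn

class TiedLM(nn.Module):
    def __init__(self, vocab_size=10000, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size, bias=False)
        self.out.weight = self.embed.weight   # tie output weights to the embedding

    def forward(self, tokens):                # tokens: (batch, seq_len) of token ids
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)                    # logits over the vocabulary
```
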
gpt-j
A GPT-J API for Python 3 to generate text, blogs, code, and more
Stars: ✭ 101 (+65.57%)
Mutual labels:  language-model
Music-Genre-Classification
Genre Classification using Convolutional Neural Networks
Stars: ✭ 27 (-55.74%)
Mutual labels:  transfer-learning
1-60 of 701 similar projects