Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. (usage sketch below)
Stars: ⭐ 55,742 (+997.93%)
Mutual labels: natural-language-processing, language-model, natural-language-understanding, bert
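A minimal fill-mask sketch using the Hugging Face pipeline API; the checkpoint name bert-base-uncased is a standard Hub model chosen here for illustration, not something this listing specifies.

```python
# Minimal sketch: masked-token prediction with the pipeline API.
from transformers import pipeline

# "bert-base-uncased" is an illustrative stock checkpoint; any
# fill-mask-capable model from the Hub works here.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for prediction in fill_mask("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 4))
```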
Spark NLP
State-of-the-art Natural Language Processing. (usage sketch below)
Stars: ⭐ 2,518 (-50.4%)
Mutual labels: natural-language-processing, transformers, bert
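A hedged sketch of running one of Spark NLP's stock pretrained pipelines; explain_document_dl is a standard pipeline shipped by John Snow Labs, and sparknlp.start() assumes a working PySpark installation.

```python
# Sketch: running a stock pretrained Spark NLP pipeline.
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()  # spins up a SparkSession configured for Spark NLP
pipeline = PretrainedPipeline("explain_document_dl", lang="en")

result = pipeline.annotate("Spark NLP runs on top of Apache Spark.")
print(result["entities"])  # named entities found by the pipeline
```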
Easy Bert
A Dead Simple BERT API for Python and Java (https://github.com/google-research/bert)
Stars: ⭐ 106 (-97.91%)
Mutual labels: natural-language-processing, language-model, natural-language-understanding
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ⭐ 203 (-96%)
Mutual labels: natural-language-processing, language-model, natural-language-understanding
Clue
中文语言理解测评基准 (Chinese Language Understanding Evaluation Benchmark): datasets, baselines, pre-trained models, corpus and leaderboard.
Stars: ⭐ 2,425 (-52.24%)
Mutual labels: language-model, transformers, bert
Bert As Service
Mapping a variable-length sentence to a fixed-length vector using a BERT model. (usage sketch below)
Stars: ⭐ 9,779 (+92.61%)
Mutual labels: natural-language-processing, natural-language-understanding, bert
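The client/server split is the point of bert-as-service; here is a sketch of the client side, assuming a server was already launched with bert-serving-start against a downloaded BERT checkpoint.

```python
# Sketch: encoding sentences to fixed-length vectors via a running
# bert-serving server (start one with:
#   bert-serving-start -model_dir /path/to/bert -num_worker=1).
from bert_serving.client import BertClient

bc = BertClient()  # connects to localhost:5555/5556 by default
vectors = bc.encode(["First do it", "then do it right"])
print(vectors.shape)  # (2, 768) for a BERT-base model
```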
Spacy Transformers
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy. (usage sketch below)
Stars: ⭐ 919 (-81.9%)
Mutual labels: natural-language-processing, language-model, natural-language-understanding
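A short sketch using the transformer-backed pipeline that spacy-transformers powers; en_core_web_trf is spaCy's stock English transformer pipeline and must be downloaded separately.

```python
# Sketch: a transformer-backed spaCy pipeline.
# Install the model first: python -m spacy download en_core_web_trf
import spacy

nlp = spacy.load("en_core_web_trf")
doc = nlp("Google released BERT in 2018.")
print([(ent.text, ent.label_) for ent in doc.ents])
```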
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ⭐ 39 (-99.23%)
Mutual labels: transformers, language-model, bert
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ⭐ 229 (-95.49%)
Mutual labels: transformers, language-model, bert
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining.
Stars: ⭐ 109 (-97.85%)
Mutual labels: transformers, language-model, natural-language-understanding
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications. (usage sketch below)
Stars: ⭐ 3,409 (-32.85%)
Mutual labels: language-model, transformers, bert
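A hedged sketch of an extractive question-answering pipeline following Haystack's 1.x module layout; paths and class names have moved between releases, so treat this as illustrative rather than version-exact.

```python
# Sketch: retriever + reader QA pipeline, Haystack 1.x-style API.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import TfidfRetriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

store = InMemoryDocumentStore()
store.write_documents([{"content": "Haystack pairs a retriever with a reader for QA."}])

pipe = ExtractiveQAPipeline(
    reader=FARMReader("deepset/roberta-base-squad2"),
    retriever=TfidfRetriever(document_store=store),
)
result = pipe.run(query="What does Haystack pair with a reader?")
print(result["answers"][0].answer)
```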
text2class
Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT.
Stars: ⭐ 15 (-99.7%)
Mutual labels: transformers, bert, natural-language-understanding
Chars2vec
An RNN-based model producing character-level word embeddings for handling real-world texts. (usage sketch below)
Stars: ⭐ 130 (-97.44%)
Mutual labels: natural-language-processing, language-model, natural-language-understanding
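A brief sketch of the chars2vec API as shown in its README; eng_50 is one of the shipped pretrained English models, where the suffix is the embedding dimensionality.

```python
# Sketch: character-level word embeddings with chars2vec.
import chars2vec

c2v = chars2vec.load_model("eng_50")  # pretrained English model, 50-dim
embeddings = c2v.vectorize_words(["misspeled", "misspelled"])
print(embeddings.shape)  # (2, 50); similar spellings map to nearby vectors
```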
PyTorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ⭐ 3,209 (-36.79%)
Mutual labels: natural-language-processing, transformers, bert
label-studio-transformers
Label data using HuggingFace's transformers and automatically get a prediction service.
Stars: ⭐ 117 (-97.7%)
Mutual labels: transformers, bert, natural-language-understanding
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ⭐ 61 (-98.8%)
Mutual labels: transformers, bert, natural-language-understanding
Bert Pytorch
Google AI 2018 BERT PyTorch implementation.
Stars: ⭐ 4,642 (-8.57%)
Mutual labels: language-model, bert
policy-data-analyzer
Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ⭐ 22 (-99.57%)
Mutual labels: transformers, bert
few-shot-lm
The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021).
Stars: ⭐ 32 (-99.37%)
Mutual labels: gpt, language-model
Practical NLP
Official repository for "Practical Natural Language Processing" (O'Reilly Media).
Stars: ⭐ 452 (-91.1%)
Mutual labels: natural-language-processing, natural-language-understanding