Tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+5944.05%)
gpt-j: A GPT-J API to use with python3 to generate text, blogs, code, and more
Stars: ✭ 101 (+20.24%)
backprop: Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+172.62%)
COCO-LM: [NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+29.76%)
minicons: Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-66.67%)
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-53.57%)
Clue: Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus, and leaderboard.
Stars: ✭ 2,425 (+2786.9%)
Haystack: 🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search, and summarization for a wide range of applications.
Stars: ✭ 3,409 (+3958.33%)
KB-ALBERT: A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank.
Stars: ✭ 215 (+155.95%)
gpt-j-api: API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend.
Stars: ✭ 248 (+195.24%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch.
Stars: ✭ 103 (+22.62%)
planner: Lightweight, interactive planning tool that visualizes a series of tasks using an HTML canvas.
Stars: ✭ 502 (+497.62%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022.
Stars: ✭ 165 (+96.43%)
fix: Allows you to use OpenAI Codex to fix errors in the command line.
Stars: ✭ 72 (-14.29%)
gdc: Code for the ICLR 2021 paper "A Distributional Approach to Controlled Text Generation".
Stars: ✭ 94 (+11.9%)
label-studio-transformers: Label data using HuggingFace's transformers and automatically get a prediction service.
Stars: ✭ 117 (+39.29%)
deepconsensus: DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (+47.62%)
xpandas: Universal 1D/2D data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-70.24%)
vim codex: A simple plugin for Vim that allows you to use OpenAI Codex.
Stars: ✭ 181 (+115.48%)
anonymisation: Anonymisation of French legal cases based on Flair embeddings.
Stars: ✭ 85 (+1.19%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21).
Stars: ✭ 44 (-47.62%)
open clip: An open source implementation of CLIP.
Stars: ✭ 1,534 (+1726.19%)
minGPT-TF: A minimal TF2 re-implementation of OpenAI GPT training.
Stars: ✭ 36 (-57.14%)
gpt-neo-fine-tuning-example: Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed.
Stars: ✭ 157 (+86.9%)
lightning-transformers: Flexible components pairing 🤗 Transformers with PyTorch Lightning.
Stars: ✭ 551 (+555.95%)
Black-Box-Tuning: [ICML 2022] Black-Box Tuning for Language-Model-as-a-Service.
Stars: ✭ 99 (+17.86%)
LM-CNLC: Chinese Natural Language Correction via Language Model.
Stars: ✭ 15 (-82.14%)
converse: Conversational text analysis using various NLP techniques.
Stars: ✭ 147 (+75%)
XENA: XENA is a managed remote administration platform for botnet creation and development, powered by blockchain and machine learning. It aims to provide an ecosystem that serves bot herders, favoring secrecy and resiliency over performance. It is micro-service oriented, allowing for specialization and a lower footprint. Join the community of the ulti…
Stars: ✭ 127 (+51.19%)
Robotics-Planning-Dynamics-and-Control: RPDC: This contains all my MATLAB code for Robotics, Planning, Dynamics and Control. The implementations model various kinds of manipulators and mobile robots for position control, trajectory planning, and path planning problems.
Stars: ✭ 171 (+103.57%)
Angry-HEX: An artificial player for the popular video game Angry Birds.
Stars: ✭ 16 (-80.95%)
DocSum: A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (-30.95%)
interval: This PHP library provides tools for handling intervals; for instance, you can compute the union or intersection of two intervals.
Stars: ✭ 25 (-70.24%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-54.76%)
WellcomeML: Repository for machine learning utilities at the Wellcome Trust.
Stars: ✭ 31 (-63.1%)
Transformers-Tutorials: This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+3266.67%)
pytorch-vit: PyTorch implementation of "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale".
Stars: ✭ 250 (+197.62%)
mongolian-nlp: Useful resources for Mongolian NLP.
Stars: ✭ 119 (+41.67%)
cscg: Code Generation as a Dual Task of Code Summarization.
Stars: ✭ 28 (-66.67%)
BotSmartScheduler: Enhance your planning capabilities with this smart bot!
Stars: ✭ 44 (-47.62%)
PCPM: Presenting Collection of Pretrained Models: links to pretrained models in NLP and voice.
Stars: ✭ 21 (-75%)
modules: The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-70.24%)
transformers-interpret: Model explainability that works seamlessly with 🤗 transformers. Explain your transformer model in just two lines of code.
Stars: ✭ 861 (+925%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-33.33%)
elastic transformers: Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers.
Stars: ✭ 153 (+82.14%)
pysentimiento: A multilingual Python toolkit for sentiment analysis and social NLP tasks.
Stars: ✭ 274 (+226.19%)
molecule-attention-transformer: PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules.
Stars: ✭ 46 (-45.24%)
DnaWeaver: A route planner for DNA assembly.
Stars: ✭ 20 (-76.19%)
awesome-agi-cocosci: An awesome & curated list for Artificial General Intelligence, an emerging interdisciplinary field that combines artificial intelligence and computational cognitive science.
Stars: ✭ 81 (-3.57%)
transformer generalization: The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-30.95%)
CoLAKE: [COLING 2020] CoLAKE: Contextualized Language and Knowledge Embedding.
Stars: ✭ 86 (+2.38%)
subword-lstm-lm: LSTM language model with subword units as input representations.
Stars: ✭ 45 (-46.43%)
scrum-planning-poker: Please feel free to try it and give feedback by searching for Scrum敏捷估算 (Scrum Agile Estimation) in WeChat mini programs.
Stars: ✭ 30 (-64.29%)