KB-ALBERT: A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+32.72%)
C-Tran: General Multi-label Image Classification with Transformers
Stars: ✭ 106 (-34.57%)
VQGAN-CLIP-Docker: Zero-Shot Text-to-Image Generation with VQGAN+CLIP, Dockerized
Stars: ✭ 58 (-64.2%)
Transmogrifai: TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable, strongly typed machine learning workflows on Apache Spark with minimal hand-tuning
Stars: ✭ 2,084 (+1186.42%)
TorchBlocks: A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (-47.53%)
transganformer: Implementation of TransGanFormer, an all-attention GAN that combines findings from the recent GanFormer and TransGAN papers
Stars: ✭ 137 (-15.43%)
spark-transformers: A library for exporting Apache Spark MLlib models for use in any Java application with no other dependencies.
Stars: ✭ 39 (-75.93%)
text2keywords: Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (-67.28%)
OpenDialog: An open-source package for Chinese open-domain conversational chatbots (a Chinese chit-chat dialogue system with one-click WeChat chatbot deployment)
Stars: ✭ 94 (-41.98%)
Vit Pytorch: Implementation of the Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Stars: ✭ 7,199 (+4343.83%)
text2text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+16.05%)
Simpletransformers: Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Stars: ✭ 2,881 (+1678.4%)
Yuno: Yuno is a context-based search engine for anime.
Stars: ✭ 320 (+97.53%)
robo-vln: PyTorch code for the ICRA'21 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (-79.01%)
Nlp Architect: A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+1608.64%)
ttt: A package for fine-tuning Transformers on TPUs, written in TensorFlow 2.0+
Stars: ✭ 35 (-78.4%)
HugsVision: HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (-4.94%)
Fast Bert: A super-easy library for BERT-based NLP models
Stars: ✭ 1,678 (+935.8%)
eve-bot: EVE bot, a customer-service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (-80.86%)
trapper: State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-82.72%)
robustness-vit: Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (-51.85%)
BERT-NER: Using pre-trained BERT models for Chinese and English NER with 🤗 Transformers
Stars: ✭ 114 (-29.63%)
HVT: [ICCV 2021] Official implementation of "Scalable Vision Transformers with Hierarchical Pooling"
Stars: ✭ 26 (-83.95%)
Tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+3033.95%)
classy: classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-62.35%)
Dalle Pytorch: Implementation/replication of DALL-E, OpenAI's text-to-image Transformer, in PyTorch
Stars: ✭ 3,661 (+2159.88%)
n-grammer-pytorch: Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch
Stars: ✭ 50 (-69.14%)
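The n-gram augmentation that N-Grammer builds on starts from plain n-gram extraction, which can be sketched in a few lines of Python (an illustrative sketch of the concept, not the n-grammer-pytorch API):

```python
def ngrams(tokens, n=2):
    """Return all contiguous n-grams of a token sequence as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Bigrams of a short sentence
print(ngrams(["the", "cat", "sat"], 2))  # → [('the', 'cat'), ('cat', 'sat')]
```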
CogView: Text-to-image generation. The repo for the NeurIPS 2021 paper "CogView: Mastering Text-to-Image Generation via Transformers".
Stars: ✭ 708 (+337.04%)
smaller-transformers: Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0.
Stars: ✭ 66 (-59.26%)
nlp-papers: Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-46.3%)
text2class: Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-90.74%)
hashformers: Hashformers is a framework for hashtag segmentation with transformers.
Stars: ✭ 18 (-88.89%)
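hashformers ranks candidate hashtag segmentations with transformer language models; the underlying segmentation search can be sketched as a dictionary-based dynamic program (a toy sketch with a hand-made vocabulary, not the hashformers API):

```python
def segment(hashtag, vocab):
    """Split a hashtag into known vocabulary words via dynamic programming.
    Returns a list of words, or None if no full segmentation exists."""
    text = hashtag.lstrip("#").lower()
    n = len(text)
    best = [None] * (n + 1)  # best[i] holds a segmentation of text[:i]
    best[0] = []
    for i in range(1, n + 1):
        for j in range(i):
            if best[j] is not None and text[j:i] in vocab:
                best[i] = best[j] + [text[j:i]]
                break
    return best[n]

vocab = {"natural", "language", "processing"}
print(segment("#naturallanguageprocessing", vocab))
# → ['natural', 'language', 'processing']
```

A real system scores many competing segmentations with a language model instead of taking the first dictionary match.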
minicons: A utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-82.72%)
Spark Nlp: State-of-the-art Natural Language Processing
Stars: ✭ 2,518 (+1454.32%)
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in PyTorch
Stars: ✭ 347 (+114.2%)
policy-data-analyzer: Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US, and India, bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning, and text analysis pipelines.
Stars: ✭ 22 (-86.42%)
COCO-LM: [NeurIPS 2021] Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (-32.72%)
simple transformers: Simple transformer implementations that I can understand
Stars: ✭ 18 (-88.89%)
remixer-pytorch: Implementation of the Remixer block from the Remixer paper, in PyTorch
Stars: ✭ 37 (-77.16%)
Product-Categorization-NLP: Multi-class text classification for products based on their descriptions, using machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
Stars: ✭ 30 (-81.48%)
BangalASR: Transformer-based Bangla speech recognition
Stars: ✭ 20 (-87.65%)
golgotha: Contextualised embeddings and language modelling using BERT and friends, in R
Stars: ✭ 39 (-75.93%)
Clue: Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard
Stars: ✭ 2,425 (+1396.91%)
small-text: Active Learning for Text Classification in Python
Stars: ✭ 241 (+48.77%)
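Active-learning libraries like small-text typically query the examples a model is least sure about; the classic least-confidence strategy can be sketched in plain Python (an illustrative sketch, not small-text's API):

```python
def least_confident(probs, k=2):
    """Uncertainty sampling: return indices of the k examples whose top
    predicted class probability is lowest (least model confidence)."""
    ranked = sorted(range(len(probs)), key=lambda i: max(probs[i]))
    return ranked[:k]

# Predicted class probabilities for a pool of four unlabeled examples
pool = [[0.9, 0.1], [0.55, 0.45], [0.6, 0.4], [0.99, 0.01]]
print(least_confident(pool))  # → [1, 2]
```

The selected examples are then sent to a human annotator, and the classifier is retrained on the grown labeled set.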
erc: Emotion recognition in conversation
Stars: ✭ 34 (-79.01%)
ParsBigBird: Persian BERT for long-range sequences
Stars: ✭ 58 (-64.2%)
Pytorch Sentiment Analysis: Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+1880.86%)
bangla-bert: Bangla-Bert is a pretrained BERT model for the Bengali language
Stars: ✭ 41 (-74.69%)
MISE: Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+32.1%)
Haystack: 🔍 Haystack is an open-source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search, and summarization for a wide range of applications.
Stars: ✭ 3,409 (+2004.32%)
optimum: 🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
Stars: ✭ 567 (+250%)
knowledge-neurons: A library for finding knowledge neurons in pretrained transformer models.
Stars: ✭ 72 (-55.56%)
KoBERT-NER: NER task with KoBERT (with the Naver NLP Challenge dataset)
Stars: ✭ 76 (-53.09%)
KoSpacing: Automatic Korean word spacing with R
Stars: ✭ 76 (-53.09%)
Nn: 🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+3430.86%)