Simpletransformers
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Stars: ✭ 2,881 (+2116.15%)
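As a quick, hedged illustration of the library's high-level API (the toy data and checkpoint choice below are assumptions, not taken from this list):

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Hypothetical toy data; the library expects "text" and "labels" columns.
train_df = pd.DataFrame(
    [["this library is great", 1], ["this made me sad", 0]],
    columns=["text", "labels"],
)

# Model type plus a pretrained checkpoint name, in the style of the README.
model = ClassificationModel("roberta", "roberta-base", use_cuda=False)
model.train_model(train_df)

predictions, raw_outputs = model.predict(["a new sentence to classify"])
```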
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Stars: ✭ 7,199 (+5437.69%)
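For reference, a minimal usage sketch in the style of the repository's README; the hyperparameters are illustrative and may differ between versions:

```python
import torch
from vit_pytorch import ViT

# Illustrative hyperparameters; see the repository for recommended settings.
model = ViT(
    image_size=256,
    patch_size=32,
    num_classes=1000,
    dim=1024,
    depth=6,
    heads=16,
    mlp_dim=2048,
)

images = torch.randn(1, 3, 256, 256)  # one dummy RGB image
logits = model(images)                # shape: (1, 1000)
```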
KB-ALBERT
A Korean-language ALBERT model specialized for the economics/finance domain, released by KB Kookmin Bank
Stars: ✭ 215 (+65.38%)
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+2029.23%)
HugsVision
HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+18.46%)
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (-16.92%)
Fast Bert
A super-easy library for BERT-based NLP models
Stars: ✭ 1,678 (+1190.77%)
Yuno
Yuno is a context-based search engine for anime.
Stars: ✭ 320 (+146.15%)
gpl
A powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields a large improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+66.15%)
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (-16.15%)
simple transformers
Simple transformer implementations that I can understand
Stars: ✭ 18 (-86.15%)
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-49.23%)
Nn
🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Stars: ✭ 5,720 (+4300%)
CGCF-ConfGen
🧪 Learning Neural Generative Dynamics for Molecular Conformation Generation (ICLR 2021)
Stars: ✭ 41 (-68.46%)
Transmogrifai
TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable, strongly typed machine learning workflows on Apache Spark with minimal hand-tuning
Stars: ✭ 2,084 (+1503.08%)
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (-16.15%)
Reformer Pytorch
Reformer, the efficient Transformer, in PyTorch
Stars: ✭ 1,644 (+1164.62%)
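A minimal language-model sketch in the style of the repository's README; the constructor arguments are illustrative and may vary across versions:

```python
import torch
from reformer_pytorch import ReformerLM

# Illustrative settings; Reformer's LSH attention targets long sequences.
model = ReformerLM(
    num_tokens=20000,
    dim=512,
    depth=6,
    heads=8,
    max_seq_len=8192,
    causal=True,
)

tokens = torch.randint(0, 20000, (1, 8192))  # dummy token ids
logits = model(tokens)                       # shape: (1, 8192, 20000)
```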
VQGAN-CLIP-Docker
Zero-shot text-to-image generation with VQGAN+CLIP, Dockerized
Stars: ✭ 58 (-55.38%)
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (-3.08%)
transganformer
Implementation of TransGanFormer, an all-attention GAN that combines the findings of the recent GANformer and TransGAN papers
Stars: ✭ 137 (+5.38%)
uniformer-pytorch
Implementation of Uniformer, a simple attention-plus-3D-convolution network that achieved SOTA on a number of video classification tasks, debuted at ICLR 2022
Stars: ✭ 90 (-30.77%)
trapper
State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-78.46%)
KoBERT-Transformers
KoBERT on 🤗 Huggingface Transformers 🤗 (with bug fixes)
Stars: ✭ 162 (+24.62%)
knowledge-neurons
A library for finding knowledge neurons in pretrained transformer models.
Stars: ✭ 72 (-44.62%)
Basic-UI-for-GPT-J-6B-with-low-vram
A repository for running GPT-J-6B on low-VRAM machines (a minimum of 4.2 GB of VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Loading the model requires 12 GB of free RAM.
Stars: ✭ 90 (-30.77%)
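The repository ships its own loading scheme; as a generic illustration of the same idea (half-precision weights to shrink the memory footprint), a plain Hugging Face Transformers load might look like the following, which is not this repository's code:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Generic low-memory load, not this repository's own loader:
# fp16 weights roughly halve memory versus fp32.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    torch_dtype=torch.float16,  # half-precision weights
    low_cpu_mem_usage=True,     # avoid materializing weights twice in RAM
)
```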
nlp-papers
Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-33.08%)
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+44.62%)
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+2368.46%)
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-83.85%)
Dalle Pytorch
Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
Stars: ✭ 3,661 (+2716.15%)
Ask2Transformers
A framework for textual-entailment-based zero-shot text classification
Stars: ✭ 102 (-21.54%)
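The underlying technique, NLI-based zero-shot classification, can be sketched with the generic Hugging Face pipeline; this illustrates the approach, not Ask2Transformers' own API:

```python
from transformers import pipeline

# An NLI model scores each candidate label as a hypothesis entailed
# by the input text; no task-specific training is needed.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(
    "The central bank raised interest rates by 50 basis points.",
    candidate_labels=["economy", "sports", "politics"],
)
print(result["labels"][0])  # highest-scoring label
```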
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+1836.92%)
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (-34.62%)
Clue
Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus, and leaderboard
Stars: ✭ 2,425 (+1765.38%)
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We will leverage machine learning, deep learning, and deep transfer learning to learn and solve popular NLP tasks including NER, Classification, Recommendation / Information Retrieval, Summarization, Language Translation, Q&A and T…
Stars: ✭ 127 (-2.31%)
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+2522.31%)
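A minimal extractive-QA sketch against the Haystack 1.x API; module paths moved between major versions, so treat this as illustrative:

```python
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader
from haystack.pipelines import ExtractiveQAPipeline

# Index a toy document, retrieve with BM25, and read answers with a
# SQuAD-tuned transformer; component names follow Haystack 1.x.
document_store = InMemoryDocumentStore(use_bm25=True)
document_store.write_documents([{"content": "Haystack is an open source NLP framework."}])

retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")
pipeline = ExtractiveQAPipeline(reader, retriever)

result = pipeline.run(query="What is Haystack?", params={"Retriever": {"top_k": 3}})
```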
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-41.54%)
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+3805.38%)
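A quicktour-style sketch of training a BPE tokenizer with the library; the corpus path is a placeholder:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

# Train a byte-pair-encoding tokenizer from plain-text files
# ("corpus.txt" is a placeholder path).
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(special_tokens=["[UNK]", "[CLS]", "[SEP]", "[PAD]", "[MASK]"])
tokenizer.train(files=["corpus.txt"], trainer=trainer)

encoding = tokenizer.encode("Fast tokenization for research and production.")
print(encoding.tokens)
```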
transformers-lightning
A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.
Stars: ✭ 45 (-65.38%)
CogView
Text-to-image generation. The repo for the NeurIPS 2021 paper "CogView: Mastering Text-to-Image Generation via Transformers".
Stars: ✭ 708 (+444.62%)
LIT
[AAAI 2022] The official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-39.23%)
hashformers
Hashformers is a framework for hashtag segmentation with transformers.
Stars: ✭ 18 (-86.15%)
Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+272.31%)
policy-data-analyzer
Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (-83.08%)
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-43.08%)
BangalASR
Transformer-based Bangla speech recognition
Stars: ✭ 20 (-84.62%)
erc
Emotion recognition in conversation
Stars: ✭ 34 (-73.85%)
class-norm
Class Normalization for Continual Zero-Shot Learning
Stars: ✭ 34 (-73.85%)
bangla-bert
Bangla-Bert is a pretrained BERT model for the Bengali language
Stars: ✭ 41 (-68.46%)
KnowledgeEditor
Code for "Editing Factual Knowledge in Language Models"
Stars: ✭ 86 (-33.85%)
Fengshenbang-LM
Fengshenbang-LM is an open-source family of large models led by the Cognitive Computing and Natural Language Research Center at IDEA (International Digital Economy Academy), intended as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+1294.62%)
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (+235.38%)
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (-2.31%)
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-85.38%)
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (-13.08%)
convolutional-attention
Repository for the code of the paper "A Convolutional Attention Network for Extreme Summarization of Source Code"
Stars: ✭ 118 (-9.23%)