
huggingface / awesome-huggingface

License: Apache-2.0
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.

Projects that are alternatives of or similar to awesome-huggingface

erc
Emotion recognition in conversation
Stars: ✭ 34 (-92.2%)
Mutual labels:  transformers, huggingface
converse
Conversational text Analysis using various NLP techniques
Stars: ✭ 147 (-66.28%)
Mutual labels:  transformers, huggingface
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (-74.08%)
Mutual labels:  transformers, huggingface
chef-transformer
Chef Transformer 🍲.
Stars: ✭ 29 (-93.35%)
Mutual labels:  transformers, huggingface
HugsVision
HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (-64.68%)
Mutual labels:  transformers, huggingface
policy-data-analyzer
Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (-94.95%)
Mutual labels:  transformers, huggingface
danish transformers
A collection of Danish Transformers
Stars: ✭ 30 (-93.12%)
Mutual labels:  transformers, huggingface
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-84.86%)
Mutual labels:  transformers
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop in Open Data Science Conference Europe 2020. We will leverage machine learning, deep learning and deep transfer learning to learn and solve popular tasks using NLP including NER, Classification, Recommendation / Information Retrieval, Summarization, Classification, Language Translation, Q&A and T…
Stars: ✭ 127 (-70.87%)
Mutual labels:  transformers
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-95.18%)
Mutual labels:  transformers
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (-75.23%)
Mutual labels:  transformers
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (-52.06%)
Mutual labels:  huggingface
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (-56.88%)
Mutual labels:  transformers
AnimeGANv3
Use AnimeGANv3 to make your own animation works, including turning photos or videos into anime.
Stars: ✭ 878 (+101.38%)
Mutual labels:  huggingface
Ask2Transformers
A Framework for Textual Entailment based Zero Shot text classification
Stars: ✭ 102 (-76.61%)
Mutual labels:  transformers
Neural-Scam-Artist
Web Scraping, Document Deduplication & GPT-2 Fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-95.87%)
Mutual labels:  huggingface
hf-experiments
Experiments with Hugging Face 🔬 🤗
Stars: ✭ 37 (-91.51%)
Mutual labels:  huggingface
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks Course hosted on educative.io
Stars: ✭ 33 (-92.43%)
Mutual labels:  transformers
Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation for Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Including examples for DETR, VQA.
Stars: ✭ 484 (+11.01%)
Mutual labels:  transformers
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3d convolutional net that achieved SOTA in a number of video classification tasks, debuted in ICLR 2022
Stars: ✭ 90 (-79.36%)
Mutual labels:  transformers

awesome-huggingface

This is a list of some wonderful open-source projects & applications integrated with Hugging Face libraries.

How to contribute

🤗 Official Libraries

First-party cool stuff made with ❤️ by 🤗 Hugging Face.

  • transformers - State-of-the-art natural language processing for Jax, PyTorch and TensorFlow.
  • datasets - The largest hub of ready-to-use NLP datasets for ML models with fast, easy-to-use and efficient data manipulation tools.
  • tokenizers - Fast state-of-the-art tokenizers optimized for research and production.
  • knockknock - Get notified when your training ends with only two additional lines of code.
  • accelerate - A simple way to train and use PyTorch models with multi-GPU, TPU, mixed-precision.
  • autonlp - Train state-of-the-art natural language processing models and deploy them in a scalable environment automatically.
  • nn_pruning - Prune a model while finetuning or training.
  • huggingface_hub - Client library to download and publish models and other files on the huggingface.co hub.
  • tune - A benchmark for comparing Transformer-based models.
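
As a quick, hedged taste of how these pieces fit together, the sketch below loads a small slice of a public dataset with datasets and scores it with a default transformers sentiment pipeline (the dataset and model choices are purely illustrative):

```python
# Minimal sketch: datasets + transformers working together.
# The "imdb" dataset and the pipeline's default model are illustrative choices.
from datasets import load_dataset
from transformers import pipeline

dataset = load_dataset("imdb", split="test[:5]")   # grab a few examples
classifier = pipeline("sentiment-analysis")        # downloads a default model on first run

for example in dataset:
    result = classifier(example["text"][:512])[0]  # truncate long reviews for brevity
    print(result["label"], round(result["score"], 3))
```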

👩‍🏫 Tutorials

Learn how to use Hugging Face toolkits, step-by-step.

  • Official Course (from Hugging Face) - The official course series provided by 🤗 Hugging Face.
  • transformers-tutorials (by @nielsrogge) - Tutorials for applying multiple models on real-world datasets.

🧰 NLP Toolkits

NLP toolkits built upon Transformers. Swiss Army knives!

  • AllenNLP (from AI2) - An open-source NLP research library.
  • Graph4NLP - Enabling easy use of Graph Neural Networks for NLP.
  • Lightning Transformers - Transformers with PyTorch Lightning interface.
  • Adapter Transformers - Extension to the Transformers library, integrating adapters into state-of-the-art language models.
  • Obsei - A low-code AI workflow automation tool that performs various NLP tasks in the workflow pipeline.
  • Trapper (from OBSS) - State-of-the-art NLP through transformer models in a modular design and consistent APIs.

🥑 Text Representation

Converting a sentence to a vector.

  • Sentence Transformers (from UKPLab) - Widely used encoders computing dense vector representations for sentences, paragraphs, and images.
  • WhiteningBERT (from Microsoft) - An easy unsupervised sentence embedding approach with whitening.
  • SimCSE (from Princeton) - State-of-the-art sentence embedding with contrastive learning.
  • DensePhrases (from Princeton) - Learning dense representations of phrases at scale.
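
For example, a minimal Sentence Transformers sketch (using one of its published checkpoints, all-MiniLM-L6-v2) that embeds two sentences and compares them with cosine similarity:

```python
# Minimal sketch with sentence-transformers; "all-MiniLM-L6-v2" is one published checkpoint.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
sentences = ["A cat sits on the mat.", "A kitten is resting on a rug."]

embeddings = model.encode(sentences, convert_to_tensor=True)  # one dense vector per sentence
similarity = util.cos_sim(embeddings[0], embeddings[1])       # cosine similarity between the two
print(float(similarity))
```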

⚙️ Inference Engines

Highly optimized inference engines implementing Transformers-compatible APIs.

  • TurboTransformers (from Tencent) - An inference engine for transformers with fast C++ API.
  • FasterTransformer (from Nvidia) - A script and recipe to run the highly optimized transformer-based encoder and decoder component on NVIDIA GPUs.
  • lightseq (from ByteDance) - A high performance inference library for sequence processing and generation implemented in CUDA.
  • FastSeq (from Microsoft) - Efficient implementation of popular sequence models (e.g., Bart, ProphetNet) for text generation, summarization, translation tasks etc.

🌗 Model Scalability

Parallelizing models across multiple GPUs.

  • Parallelformers (from TUNiB) - A library for model parallel deployment.
  • OSLO (from TUNiB) - A library that supports various features to help you train large-scale models.
  • DeepSpeed (from Microsoft) - DeepSpeed-ZeRO scales any model size with little to no change to the model. Integrated with the HF Trainer (see the sketch after this list).
  • fairscale (from Facebook) - Implements ZeRO protocol as well. Integrated with HF Trainer.
  • ColossalAI (from Hpcaitech) - A Unified Deep Learning System for Large-Scale Parallel Training (1D, 2D, 2.5D, 3D and sequence parallelism, and ZeRO protocol).
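
The DeepSpeed sketch referenced above: HF TrainingArguments accept a ZeRO config as a dict or a JSON path. The stage-2 settings here are illustrative, and the "auto" values are filled in by the Trainer integration.

```python
# Sketch only: ZeRO stage-2 via the Hugging Face Trainer's DeepSpeed integration.
# The config values below are illustrative; tune them for your model and hardware.
from transformers import TrainingArguments

ds_config = {
    "zero_optimization": {"stage": 2},
    "fp16": {"enabled": True},
    "train_micro_batch_size_per_gpu": "auto",  # "auto" lets the HF integration fill this in
    "gradient_accumulation_steps": "auto",
}

training_args = TrainingArguments(
    output_dir="outputs",
    per_device_train_batch_size=8,
    deepspeed=ds_config,  # a dict or a path to a JSON file both work
)
# A Trainer built with these arguments is then launched with the `deepspeed` launcher.
```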

🏎️ Model Compression/Acceleration

Compressing or accelerating models for improved inference speed.

  • torchdistill - PyTorch-based modular, configuration-driven framework for knowledge distillation.
  • TextBrewer (from HFL) - State-of-the-art distillation methods to compress language models.
  • BERT-of-Theseus (from Microsoft) - Compressing BERT by progressively replacing the components of the original BERT.

🏹️ Adversarial Attack

Conducting adversarial attacks to test model robustness.

  • TextAttack (from UVa) - A Python framework for adversarial attacks, data augmentation, and model training in NLP.
  • TextFlint (from Fudan) - A unified multilingual robustness evaluation toolkit for NLP.
  • OpenAttack (from THU) - An open-source textual adversarial attack toolkit.
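
As a hedged example, the sketch below runs TextAttack's TextFooler recipe against a published IMDB classifier (the model and dataset names are illustrative choices from the Hub):

```python
# Sketch: running the TextFooler recipe with TextAttack against a HF classifier.
from textattack import Attacker, AttackArgs
from textattack.attack_recipes import TextFoolerJin2019
from textattack.datasets import HuggingFaceDataset
from textattack.models.wrappers import HuggingFaceModelWrapper
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "textattack/bert-base-uncased-imdb"  # illustrative fine-tuned checkpoint
model = AutoModelForSequenceClassification.from_pretrained(name)
tokenizer = AutoTokenizer.from_pretrained(name)

model_wrapper = HuggingFaceModelWrapper(model, tokenizer)
attack = TextFoolerJin2019.build(model_wrapper)
dataset = HuggingFaceDataset("imdb", split="test")

attacker = Attacker(attack, dataset, AttackArgs(num_examples=10))
attacker.attack_dataset()
```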

🔁 Style Transfer

Transfer the style of text! Now you know why it's called a transformer.

  • Styleformer - A neural language style transfer framework to transfer text smoothly between styles.
  • ConSERT - A contrastive framework for self-supervised sentence representation transfer.

💢 Sentiment Analysis

Analyzing the sentiment and emotions of human beings.

  • conv-emotion - Implementation of different architectures for emotion recognition in conversations.

🙅 Grammatical Error Correction

You made a typo! Let me correct it.

  • Gramformer - A framework for detecting, highlighting and correcting grammatical errors on natural language text.
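
A hedged Gramformer sketch following its README-style usage (models=1 selects the corrector model; the example sentence is made up):

```python
# Sketch based on Gramformer's documented usage; models=1 selects the corrector.
from gramformer import Gramformer

gf = Gramformer(models=1, use_gpu=False)

sentence = "He are moving here."
for corrected in gf.correct(sentence, max_candidates=1):  # yields candidate corrections
    print(corrected)
```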

🗺 Translation

Translating between different languages.

  • dl-translate - A deep learning-based translation library based on HF Transformers.
  • EasyNMT (from UKPLab) - Easy-to-use, state-of-the-art translation library and Docker images based on HF Transformers.
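
For instance, a minimal EasyNMT sketch (the "opus-mt" model family is one of its documented options; the source language is auto-detected):

```python
# Sketch with EasyNMT; "opus-mt" is one of the model families it supports.
from easynmt import EasyNMT

model = EasyNMT("opus-mt")
print(model.translate("Dies ist ein Satz in Deutsch.", target_lang="en"))  # source language auto-detected
```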

📖 Knowledge and Entity

Learning knowledge, mining entities, connecting the world.

  • PURE (from Princeton) - Entity and relation extraction from text.

🎙 Speech

Speech processing powered by HF libraries. Need for speech!

  • s3prl - A self-supervised speech pre-training and representation learning toolkit.
  • speechbrain - A PyTorch-based speech toolkit.
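
As a hedged example, the sketch below transcribes an audio file with one of SpeechBrain's pretrained ASR models from the Hub (the model id and file path are illustrative):

```python
# Sketch: transcribing an audio file with a pretrained SpeechBrain ASR model from the Hub.
from speechbrain.pretrained import EncoderDecoderASR

asr_model = EncoderDecoderASR.from_hparams(
    source="speechbrain/asr-crdnn-rnnlm-librispeech",
    savedir="pretrained_models/asr-crdnn-rnnlm-librispeech",
)
print(asr_model.transcribe_file("path/to/your_audio.wav"))  # replace with a real 16 kHz wav file
```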

🤯 Multi-modality

Understanding the world from different modalities.

  • ViLT (from Kakao) - A vision-and-language transformer without convolution or region supervision.
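
ViLT is also available directly in transformers; here is a short visual question answering sketch with the fine-tuned VQA checkpoint (the image URL is just a sample COCO picture):

```python
# Sketch: visual question answering with ViLT through transformers.
import requests
from PIL import Image
from transformers import ViltProcessor, ViltForQuestionAnswering

url = "http://images.cocodataset.org/val2017/000000039769.jpg"  # sample COCO image (two cats)
image = Image.open(requests.get(url, stream=True).raw)
question = "How many cats are there?"

processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-finetuned-vqa")
model = ViltForQuestionAnswering.from_pretrained("dandelin/vilt-b32-finetuned-vqa")

inputs = processor(image, question, return_tensors="pt")
logits = model(**inputs).logits
print("Answer:", model.config.id2label[logits.argmax(-1).item()])
```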

🤖 Reinforcement Learning

Combining RL magic with NLP!

  • trl - Fine-tune transformers using Proximal Policy Optimization (PPO) to align with human preferences.

❓ Question Answering

Searching for answers? Transformers to the rescue!

  • Haystack (from deepset) - End-to-end framework for developing and deploying question-answering systems in the wild.
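
Haystack's own API has changed across major versions, so as a neutral, hedged illustration here is a plain transformers question-answering pipeline, i.e. the kind of extractive reader such frameworks orchestrate (the reader model is one of deepset's published checkpoints):

```python
# Not Haystack itself -- a plain transformers QA pipeline, the kind of reader Haystack orchestrates.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
result = qa(
    question="Who maintains Haystack?",
    context="Haystack is an open-source question answering framework maintained by deepset.",
)
print(result["answer"], round(result["score"], 3))
```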

💁 Recommender Systems

I think this is just right for you!

  • Transformers4Rec (from Nvidia) - A flexible and efficient library powered by Transformers for sequential and session-based recommendations.

⚖️ Evaluation

Evaluating NLP outputs powered by HF datasets!

  • Jury (from OBSS) - An easy-to-use tool for evaluating NLP model outputs, specifically for NLG (Natural Language Generation), offering various automated text-to-text metrics.
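
A hedged Jury sketch following its README-style call pattern (the default metric set is whatever the installed version ships with; inputs are lists of candidate/reference lists):

```python
# Sketch following Jury's documented usage; inputs are lists of candidate/reference lists.
from jury import Jury

scorer = Jury()
predictions = [["the cat sat on the mat"], ["a dog is running in the park"]]
references = [["the cat is sitting on the mat"], ["a dog runs through the park"]]

scores = scorer(predictions=predictions, references=references)
print(scores)
```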

🔍 Neural Search

Search, but with the power of neural networks!

  • Jina Integration - Jina integration of Hugging Face Accelerated API.
  • Weaviate Integration (text2vec) (QA) - Weaviate integration of Hugging Face Transformers.
  • ColBERT (from Stanford) - A fast and accurate retrieval model, enabling scalable BERT-based search over large text collections in tens of milliseconds.

☁ Cloud

Cloud makes your life easy!

  • Amazon SageMaker - Making it easier than ever to train Hugging Face Transformer models in Amazon SageMaker.
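
A hedged sketch of the Hugging Face estimator in the SageMaker Python SDK; the IAM role, entry-point script, S3 path and framework versions below are placeholders to adapt to your own account:

```python
# Sketch: launching a training job with the Hugging Face estimator from the SageMaker Python SDK.
# Role, entry point, S3 path and version strings below are placeholders for your own setup.
from sagemaker.huggingface import HuggingFace

huggingface_estimator = HuggingFace(
    entry_point="train.py",            # your training script using transformers/datasets
    source_dir="./scripts",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    transformers_version="4.26",       # pick versions supported by the SDK's container images
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={"epochs": 1, "model_name_or_path": "distilbert-base-uncased"},
)
huggingface_estimator.fit({"train": "s3://your-bucket/train"})  # placeholder S3 path
```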

📱 Hardware

The infrastructure enabling the magic to happen.

  • Qualcomm - Collaboration on enabling Transformers in Snapdragon.
  • Intel - Collaboration with Intel for configuration options.