Adapter-Hub / Adapter Transformers

Licence: apache-2.0
Huggingface Transformers + Adapters = ❤️

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Adapter Transformers

Nlp
Selected Machine Learning algorithms for natural language processing and semantic analysis in Golang
Stars: ✭ 304 (-10.06%)
Mutual labels:  natural-language-processing
Aspect Based Sentiment Analysis
A paper list for aspect based sentiment analysis.
Stars: ✭ 311 (-7.99%)
Mutual labels:  natural-language-processing
Chakin
Simple downloader for pre-trained word vectors
Stars: ✭ 323 (-4.44%)
Mutual labels:  natural-language-processing
Nlprule
A fast, low-resource Natural Language Processing and Text Correction library written in Rust.
Stars: ✭ 309 (-8.58%)
Mutual labels:  natural-language-processing
Trankit
Trankit is a Light-Weight Transformer-based Python Toolkit for Multilingual Natural Language Processing
Stars: ✭ 311 (-7.99%)
Mutual labels:  natural-language-processing
Displacy
💥 displaCy.js: An open-source NLP visualiser for the modern web
Stars: ✭ 311 (-7.99%)
Mutual labels:  natural-language-processing
Pyresparser
A simple resume parser used for extracting information from resumes
Stars: ✭ 297 (-12.13%)
Mutual labels:  natural-language-processing
Dynamic Memory Networks In Theano
Implementation of Dynamic memory networks by Kumar et al. http://arxiv.org/abs/1506.07285
Stars: ✭ 334 (-1.18%)
Mutual labels:  natural-language-processing
Biosentvec
BioWordVec & BioSentVec: pre-trained embeddings for biomedical words and sentences
Stars: ✭ 308 (-8.88%)
Mutual labels:  natural-language-processing
Clause
🏇 Chatbot, natural language understanding, semantic understanding
Stars: ✭ 323 (-4.44%)
Mutual labels:  natural-language-processing
Awesome Arabic
A curated list of awesome projects and dev/design resources for supporting Arabic computational needs.
Stars: ✭ 309 (-8.58%)
Mutual labels:  natural-language-processing
Ltp
Language Technology Platform
Stars: ✭ 3,648 (+979.29%)
Mutual labels:  natural-language-processing
Bytenet Tensorflow
ByteNet for character-level language modelling
Stars: ✭ 319 (-5.62%)
Mutual labels:  natural-language-processing
Nlp101
NLP 101: a resource repository for Deep Learning and Natural Language Processing
Stars: ✭ 305 (-9.76%)
Mutual labels:  natural-language-processing
Adam qas
ADAM - A Question Answering System. Inspired from IBM Watson
Stars: ✭ 330 (-2.37%)
Mutual labels:  natural-language-processing
Graphbrain
Language, Knowledge, Cognition
Stars: ✭ 294 (-13.02%)
Mutual labels:  natural-language-processing
Gcn Over Pruned Trees
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction (authors' PyTorch implementation)
Stars: ✭ 312 (-7.69%)
Mutual labels:  natural-language-processing
Matchzoo
Facilitating the design, comparison and sharing of deep text matching models.
Stars: ✭ 3,568 (+955.62%)
Mutual labels:  natural-language-processing
Nndial
NNDial is an open source toolkit for building end-to-end trainable task-oriented dialogue models. It is released by Tsung-Hsien (Shawn) Wen from Cambridge Dialogue Systems Group under Apache License 2.0.
Stars: ✭ 332 (-1.78%)
Mutual labels:  natural-language-processing
Ai Deadlines
⏰ AI conference deadline countdowns
Stars: ✭ 3,852 (+1039.64%)
Mutual labels:  natural-language-processing

adapter-transformers

A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models

adapter-transformers is an extension of HuggingFace's Transformers library that integrates adapters into state-of-the-art language models and connects to AdapterHub, a central repository for pre-trained adapter modules.

This library can be used as a drop-in replacement for HuggingFace Transformers and is regularly synchronized with upstream changes.
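To make the adapter idea concrete, here is a minimal, library-independent sketch of the bottleneck adapter computation: a small down-projection, a nonlinearity, an up-projection, and a residual connection, so only a handful of parameters per layer are trained. All names and sizes below are illustrative, not the library's own code.

```python
import random

def matvec(W, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def adapter_forward(h, W_down, W_up):
    """Bottleneck adapter: project down, apply ReLU, project up,
    then add the result back onto the input (residual connection)."""
    z = [max(0.0, v) for v in matvec(W_down, h)]  # down-projection + ReLU
    u = matvec(W_up, z)                           # up-projection
    return [hi + ui for hi, ui in zip(h, u)]      # residual add

# Hidden size 8, bottleneck size 2: only 2 * 8 * 2 = 32 trainable weights
# per adapter, instead of updating the full layer.
random.seed(0)
hidden, bottleneck = 8, 2
W_down = [[random.gauss(0, 0.02) for _ in range(hidden)] for _ in range(bottleneck)]
W_up = [[random.gauss(0, 0.02) for _ in range(bottleneck)] for _ in range(hidden)]
h = [1.0] * hidden
out = adapter_forward(h, W_down, W_up)
print(len(out))  # the hidden dimension is preserved
```

With near-zero initialization the adapter starts out close to the identity function, which is one reason adapters can be inserted into a pre-trained model without disrupting it.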

Quick tour

adapter-transformers currently supports Python 3.6+ and PyTorch 1.1.0+. After installing PyTorch, you can install adapter-transformers from PyPI ...

pip install -U adapter-transformers

... or from source by cloning the repository:

git clone https://github.com/adapter-hub/adapter-transformers.git
cd adapter-transformers
pip install .

Getting Started

HuggingFace's great documentation on getting started with Transformers can be found here. adapter-transformers is fully compatible with Transformers.

To get started with adapters, refer to these locations:

  • Colab notebook tutorials, a series of notebooks providing an introduction to all the main concepts of (adapter-)transformers and AdapterHub
  • https://docs.adapterhub.ml, our documentation on training and using adapters with adapter-transformers
  • https://adapterhub.ml to explore available pre-trained adapter modules and share your own adapters
  • Examples folder of this repository, containing HuggingFace's example training scripts, many of which have been adapted for training adapters
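A rough back-of-the-envelope calculation shows why adapters are attractive for training and sharing: only the small bottleneck modules are stored per task. The layer layout and reduction factor below are illustrative assumptions for a BERT-base-sized model, not measurements from the library.

```python
# Approximate parameter count for bottleneck adapters in a
# BERT-base-sized model (hidden size 768, 12 layers), assuming one
# adapter module per layer and a reduction factor of 16.
hidden, layers, reduction = 768, 12, 16
bottleneck = hidden // reduction  # 48

per_adapter = (hidden * bottleneck + bottleneck   # down-projection + bias
               + bottleneck * hidden + hidden)    # up-projection + bias
total = per_adapter * layers

print(f"{total:,} adapter parameters")  # under 1M, vs. ~110M full-model parameters
```

Under these assumptions, a trained adapter is well under 1% of the size of the full model, which is what makes hosting many task-specific adapters on AdapterHub practical.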

Citation

If you find this library useful, please cite our paper AdapterHub: A Framework for Adapting Transformers:

@inproceedings{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Pfeiffer, Jonas and
            R{\"u}ckl{\'e}, Andreas and
            Poth, Clifton and
            Kamath, Aishwarya and
            Vuli{\'c}, Ivan and
            Ruder, Sebastian and
            Cho, Kyunghyun and
            Gurevych, Iryna},
    booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
    pages={46--54},
    year={2020}
}
Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].