
nyu-mll / jiant

License: MIT
jiant is an NLP toolkit


Projects that are alternatives to or similar to jiant

Bert language understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (-18.66%)
Mutual labels:  transfer-learning
Transfer Learning Toolkit
Transfer Learning Toolkit for Primary Researchers
Stars: ✭ 50 (-95.64%)
Mutual labels:  transfer-learning
Drivebot
TensorFlow deep RL for driving a rover around
Stars: ✭ 62 (-94.59%)
Mutual labels:  transfer-learning
Gradfeat20
Gradients as Features for Deep Representation Learning
Stars: ✭ 30 (-97.38%)
Mutual labels:  transfer-learning
Graphrole
Automatic feature extraction and node role assignment for transfer learning on graphs (ReFeX & RolX)
Stars: ✭ 38 (-96.69%)
Mutual labels:  transfer-learning
Qh finsight
The first transfer learning competition in China: the Ping An Qianhai Credit "Haoxin Cup" Transfer Learning Big Data Algorithm Competition, entry by the FInSight team (ranked 3rd among algorithm solutions)
Stars: ✭ 55 (-95.2%)
Mutual labels:  transfer-learning
Spacy Transformers
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
Stars: ✭ 919 (-19.88%)
Mutual labels:  transfer-learning
Farm
🏡 Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Stars: ✭ 1,140 (-0.61%)
Mutual labels:  transfer-learning
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
Stars: ✭ 1,031 (-10.11%)
Mutual labels:  transfer-learning
Weakly Supervised 3d Object Detection
Weakly Supervised 3D Object Detection from Point Clouds (VS3D), ACM MM 2020
Stars: ✭ 61 (-94.68%)
Mutual labels:  transfer-learning
Seismic Transfer Learning
Deep-learning seismic facies on state-of-the-art CNN architectures
Stars: ✭ 32 (-97.21%)
Mutual labels:  transfer-learning
Teacher Student Training
This repository stores the files used for my summer internship's work on "teacher-student learning", an experimental method for training deep neural networks using a trained teacher model.
Stars: ✭ 34 (-97.04%)
Mutual labels:  transfer-learning
Big transfer
Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper.
Stars: ✭ 1,096 (-4.45%)
Mutual labels:  transfer-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials on transfer learning.
Stars: ✭ 8,481 (+639.41%)
Mutual labels:  transfer-learning
Pathnet Pytorch
PyTorch implementation of PathNet: Evolution Channels Gradient Descent in Super Neural Networks
Stars: ✭ 63 (-94.51%)
Mutual labels:  transfer-learning
Deepfakes video classification
Deepfakes Video classification via CNN, LSTM, C3D and triplets
Stars: ✭ 24 (-97.91%)
Mutual labels:  transfer-learning
Average Word2vec
🔤 Calculate average word embeddings (word2vec) from documents for transfer learning
Stars: ✭ 52 (-95.47%)
Mutual labels:  transfer-learning
Cross Domain ner
Cross-domain NER using cross-domain language modeling, code for ACL 2019 paper
Stars: ✭ 67 (-94.16%)
Mutual labels:  transfer-learning
Sru Deeplearning Workshop
A 12-hour deep learning course using the Keras framework
Stars: ✭ 66 (-94.25%)
Mutual labels:  transfer-learning
Neural Painters X
Neural Painters
Stars: ✭ 61 (-94.68%)
Mutual labels:  transfer-learning

jiant is an NLP toolkit

The multitask and transfer learning toolkit for natural language processing research


Why should I use jiant?

A few additional things you might want to know about jiant:

  • jiant is configuration-file driven
  • jiant is built with PyTorch
  • jiant integrates with the Hugging Face datasets library to manage task data
  • jiant integrates with the Hugging Face transformers library to manage models and tokenizers (a brief sketch of these two libraries follows this list)
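For context, here is a minimal sketch of what those two Hugging Face libraries do on their own. This is plain datasets/transformers usage, not the jiant API; jiant adds task formatting, caching, and multitask training on top of these building blocks.

from datasets import load_dataset
from transformers import AutoModel, AutoTokenizer

# Load the MRPC paraphrase data that jiant would otherwise download and cache for you
mrpc = load_dataset("glue", "mrpc", split="train")

# Load a pretrained encoder and its tokenizer by name
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

# Encode one sentence pair and run it through the encoder
encoded = tokenizer(mrpc[0]["sentence1"], mrpc[0]["sentence2"], return_tensors="pt")
outputs = model(**encoded)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)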

Getting Started

Installation

To import jiant from source (recommended for researchers):

git clone https://github.com/nyu-mll/jiant.git
cd jiant
pip install -r requirements.txt

# Add the following to your .bashrc or .bash_profile
export PYTHONPATH=/path/to/jiant:$PYTHONPATH

If you plan to contribute to jiant, install additional dependencies with pip install -r requirements-dev.txt.

To install jiant from source (alternative for researchers):

git clone https://github.com/nyu-mll/jiant.git
cd jiant
pip install -e .

To install jiant from pip (recommended if you just want to train/use a model):

pip install jiant

We recommend that you install jiant in a virtual environment or a conda environment.
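For example, one way to set up an isolated environment before installing; the environment name jiant_env and the Python version are illustrative, not requirements:

# Using venv
python -m venv jiant_env
source jiant_env/bin/activate
pip install jiant

# Or using conda
conda create -n jiant_env python=3.8
conda activate jiant_env
pip install jiant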

To check that jiant was correctly installed, run a simple example.
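Before running a full example, a quick sanity check is to confirm that the package is importable at all; this simply prints where jiant was resolved from, assuming the installation or PYTHONPATH step above was completed:

python -c "import jiant; print(jiant.__file__)"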

Quick Introduction

The following example fine-tunes a RoBERTa model on the MRPC dataset.

Python version:

from jiant.proj.simple import runscript as run
import jiant.scripts.download_data.runscript as downloader

EXP_DIR = "/path/to/exp"

# Download the Data
downloader.download_data(["mrpc"], f"{EXP_DIR}/tasks")

# Set up the arguments for the Simple API
args = run.RunConfiguration(
   run_name="simple",
   exp_dir=EXP_DIR,
   data_dir=f"{EXP_DIR}/tasks",
   hf_pretrained_model_name_or_path="roberta-base",
   tasks="mrpc",
   train_batch_size=16,
   num_train_epochs=3
)

# Run!
run.run_simple(args)

Bash version:

EXP_DIR=/path/to/exp

python jiant/scripts/download_data/runscript.py \
    download \
    --tasks mrpc \
    --output_path ${EXP_DIR}/tasks
python jiant/proj/simple/runscript.py \
    run \
    --run_name simple \
    --exp_dir ${EXP_DIR}/ \
    --data_dir ${EXP_DIR}/tasks \
    --model_type roberta-base \
    --tasks mrpc \
    --train_batch_size 16 \
    --num_train_epochs 3

Examples of more complex training workflows are found here.

Contributing

The jiant project's contributing guidelines can be found here.

Looking for jiant v1.3.2?

jiant v1.3.2 has been moved to jiant-v1-legacy to support ongoing research with the library. jiant v2.x.x is more modular and scalable than jiant v1.3.2 and has been designed to reflect the needs of the current NLP research community. We strongly recommend that any new projects use jiant v2.x.x.

jiant 1.x has been used in several papers. For instructions on how to reproduce papers by jiant authors that refer readers to this site for documentation (including Tenney et al., Wang et al., Bowman et al., Kim et al., Warstadt et al.), refer to the jiant-v1-legacy README.

Citation

If you use jiant ≥ v2.0.0 in academic work, please cite it directly:

@misc{phang2020jiant,
    author = {Jason Phang and Phil Yeres and Jesse Swanson and Haokun Liu and Ian F. Tenney and Phu Mon Htut and Clara Vania and Alex Wang and Samuel R. Bowman},
    title = {\texttt{jiant} 2.0: A software toolkit for research on general-purpose text understanding models},
    howpublished = {\url{http://jiant.info/}},
    year = {2020}
}

If you use jiant ≤ v1.3.2 in academic work, please use the citation found here.

Acknowledgments

  • This work was made possible in part by a donation to NYU from Eric and Wendy Schmidt made by recommendation of the Schmidt Futures program, and by support from Intuit Inc.
  • We gratefully acknowledge the support of NVIDIA Corporation with the donation of a Titan V GPU used at NYU in this work.
  • Developer Jesse Swanson is supported by the Moore-Sloan Data Science Environment as part of the NYU Data Science Services initiative.

License

jiant is released under the MIT License.
