
iKernels / transformers-lightning

License: GPL-2.0
A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.

Programming Languages

python
Makefile

Projects that are alternatives to or similar to transformers-lightning

Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run gpt-j-6b on low vram machines (4.2 gb minimum vram for 2000 token context, 3.5 gb for 1000 token context). Model loading takes 12gb free ram.
Stars: ✭ 90 (+100%)
Mutual labels:  transformers
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop in Open Data Science Conference Europe 2020. We will leverage machine learning, deep learning and deep transfer learning to learn and solve popular tasks using NLP including NER, Classification, Recommendation / Information Retrieval, Summarization, Classification, Language Translation, Q&A and T…
Stars: ✭ 127 (+182.22%)
Mutual labels:  transformers
ginza-transformers
Use custom tokenizers in spacy-transformers
Stars: ✭ 15 (-66.67%)
Mutual labels:  transformers
KnowledgeEditor
Code for Editing Factual Knowledge in Language Models
Stars: ✭ 86 (+91.11%)
Mutual labels:  transformers
DynamicalBilliards.jl
An easy-to-use, modular, extendable and absurdly fast Julia package for dynamical billiards in two dimensions.
Stars: ✭ 97 (+115.56%)
Mutual labels:  models
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+317.78%)
Mutual labels:  transformers
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-53.33%)
Mutual labels:  transformers
Ask2Transformers
A Framework for Textual Entailment based Zero Shot text classification
Stars: ✭ 102 (+126.67%)
Mutual labels:  transformers
danish transformers
A collection of Danish Transformers
Stars: ✭ 30 (-33.33%)
Mutual labels:  transformers
translatable
Add multilingual support to your laravel 5 models
Stars: ✭ 34 (-24.44%)
Mutual labels:  models
mlx
Machine Learning eXchange (MLX). Data and AI Assets Catalog and Execution Engine
Stars: ✭ 132 (+193.33%)
Mutual labels:  models
Transformer-MM-Explainability
[ICCV 2021- Oral] Official PyTorch implementation for Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Including examples for DETR, VQA.
Stars: ✭ 484 (+975.56%)
Mutual labels:  transformers
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-57.78%)
Mutual labels:  transformers
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+151.11%)
Mutual labels:  transformers
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks Course hosted in educative.io
Stars: ✭ 33 (-26.67%)
Mutual labels:  transformers
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (+46.67%)
Mutual labels:  transformers
psyplot
Python package for interactive data visualization
Stars: ✭ 64 (+42.22%)
Mutual labels:  models
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (+182.22%)
Mutual labels:  transformers
mlreef
The collaboration workspace for Machine Learning
Stars: ✭ 1,409 (+3031.11%)
Mutual labels:  models
django-jsonfield-backport
Backport of the cross-DB JSONField model and form fields from Django 3.1.
Stars: ✭ 36 (-20%)
Mutual labels:  models

transformers-lightning

A collection of adapters, callbacks, datamodules, datasets, language-modeling utilities, loggers, models, schedulers and optimizers to better integrate PyTorch Lightning and Transformers.

I'm happy to announce that all the metrics contained in this package have been successfully integrated into torchmetrics.
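
For example, the retrieval metrics can now be used straight from torchmetrics. A minimal sketch (which metrics were migrated is an assumption on my part; check the torchmetrics documentation for the full list):

import torch
from torchmetrics import RetrievalMRR

# Mean Reciprocal Rank over two queries; `indexes` assigns each prediction to its query
mrr = RetrievalMRR()
preds = torch.tensor([0.2, 0.8, 0.4, 0.9])
target = torch.tensor([0, 1, 0, 1])
indexes = torch.tensor([0, 0, 1, 1])
print(mrr(preds, target, indexes=indexes))  # tensor(1.) since the relevant document ranks first in both queries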

Table of contents

1. Install

2. Documentation

3. Main file

4. Tests

Install

Install the latest stable release with:

pip install transformers-lightning

You can also install a specific older version, for example 0.5.4, with:

pip install git+https://github.com/iKernels/transformers-lightning.git@0.5.4 --upgrade
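
To check which version ended up installed, you can ask pip:

pip show transformers-lightning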

Documentation

Each component is documented in its corresponding folder of the repository.

Main file

We encourage you to structure your main file as follows:

import os
from argparse import ArgumentParser

import pytorch_lightning as pl
import torch

# project-specific modules defining your model and datamodule
from models import TransformerModel
from datamodules import YourDataModule

from transformers_lightning import callbacks
from transformers_lightning.defaults import DefaultConfig


# Print high precision tensor values
torch.set_printoptions(precision=16)

def main(hyperparameters):

    # instantiate PL model
    model = TransformerModel(hyperparameters)

    # test-tube logger (writes tensorboard-compatible logs)
    test_tube_logger = pl.loggers.TestTubeLogger(
        os.path.join(hyperparameters.output_dir, hyperparameters.tensorboard_dir), name=hyperparameters.name
    )

    # callback to save the pre-trained transformer model during training
    save_transformers_callback = callbacks.TransformersModelCheckpointCallback(hyperparameters)

    # instantiate PL trainer
    trainer = pl.Trainer.from_argparse_args(
        hyperparameters,
        default_root_dir=hyperparameters.output_dir,
        profiler=True,
        logger=test_tube_logger,
        callbacks=[save_transformers_callback],
        log_gpu_memory='all',
        weights_summary='full'
    )

    # Datasets
    datamodule = YourDataModule(hyperparameters, model, trainer)

    # Train!
    if datamodule.do_train():
        trainer.fit(model, datamodule=datamodule)

    # Test!
    if datamodule.do_test():
        trainer.test(model, datamodule=datamodule)


if __name__ == '__main__':

    parser = ArgumentParser()

    # Experiment name, used for checkpoints, pre-trained model names, logging and tensorboard
    parser.add_argument('--name', type=str, required=True, help='Name of the experiment, will be used to correctly retrieve checkpoints and logs')

    # I/O folders
    DefaultConfig.add_argparse_args(parser)

    # add model specific cli arguments
    TransformerModel.add_argparse_args(parser)
    YourDataModule.add_argparse_args(parser)

    # add callback / logger specific cli arguments
    callbacks.TransformersModelCheckpointCallback.add_argparse_args(parser)

    # add all the available trainer options to argparse
    # i.e. now --gpus --num_nodes ... --fast_dev_run all work in the cli
    pl.Trainer.add_argparse_args(parser)

    # parse the arguments into a Namespace
    hyperparameters = parser.parse_args()
    main(hyperparameters)
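
You can then launch a training run from the command line. A hypothetical invocation, assuming the script is saved as main.py and that DefaultConfig registers --output_dir as the code above suggests (the remaining flags come from the pytorch-lightning Trainer):

python main.py --name my-experiment --output_dir ./outputs --max_epochs 3 --gpus 1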

Tests

To run the tests, first install pytest:

pip install pytest

Now you can run the tests from the main folder. For example:

python -m pytest tests/callbacks

to test only the callbacks, or

python -m pytest tests

to run every test.
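
Pytest's standard selection options work as usual, e.g. filtering by keyword with -k (the keyword here is just an example):

python -m pytest tests -k "checkpoint"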
