
heartexlabs / label-studio-transformers

License: Apache-2.0
Label data using HuggingFace's transformers and automatically get a prediction service

Programming Languages

Python (139,335 projects; #7 most used programming language)

Projects that are alternatives to, or similar to, label-studio-transformers

classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-47.86%)
Mutual labels:  transformers, bert, natural-language-understanding
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+47542.74%)
Mutual labels:  bert, natural-language-understanding, pytorch-transformers
HugsVision
HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+31.62%)
Mutual labels:  transformers, bert, pytorch-transformers
text2class
Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-87.18%)
Mutual labels:  transformers, bert, natural-language-understanding
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+4239.32%)
Mutual labels:  transformers, bert, natural-language-understanding
anonymisation
Anonymization of legal cases (Fr) based on Flair embeddings
Stars: ✭ 85 (-27.35%)
Mutual labels:  transformers, bert
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+2642.74%)
Mutual labels:  transformers, bert
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+2317.09%)
Mutual labels:  transformers, bert
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (-6.84%)
Mutual labels:  transformers, natural-language-understanding
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+2813.68%)
Mutual labels:  transformers, bert
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+2265.81%)
Mutual labels:  transformers, bert
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+84.62%)
Mutual labels:  transformers, bert
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+2052.14%)
Mutual labels:  transformers, bert
Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Stars: ✭ 2,425 (+1972.65%)
Mutual labels:  transformers, bert
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (-67.52%)
Mutual labels:  transformers, bert
Fast Bert
Super easy library for BERT-based NLP models
Stars: ✭ 1,678 (+1334.19%)
Mutual labels:  transformers, bert
GLUE-bert4keras
GLUE benchmark code based on bert4keras
Stars: ✭ 59 (-49.57%)
Mutual labels:  bert, natural-language-understanding
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+60.68%)
Mutual labels:  transformers, bert
bert extension tf
BERT Extension in TensorFlow
Stars: ✭ 29 (-75.21%)
Mutual labels:  bert, natural-language-understanding
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-83.76%)
Mutual labels:  transformers, bert

Label Studio for Hugging Face's Transformers

Website · Docs · Twitter · Join Slack Community


Transfer learning for NLP models by annotating your textual data without any additional coding.

This package provides a ready-to-use container that links together Label Studio as the annotation frontend and Hugging Face's transformers models as the machine learning backend for NLP.


Quick Usage

Install Label Studio and other dependencies

pip install -r requirements.txt
Create ML backend with BERT classifier
label-studio-ml init my-ml-backend --script models/bert_classifier.py
cp models/utils.py my-ml-backend/utils.py

# Start ML backend at http://localhost:9090
label-studio-ml start my-ml-backend

# Start Label Studio in a new terminal with the same Python environment
label-studio start
  1. Create a project with Choices and Text tags in the labeling config.
  2. Connect the ML backend in the Project settings using the URL http://localhost:9090 (a simplified sketch of the backend script follows these steps).
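
For reference, the script passed to label-studio-ml init is a Python module that registers a model class with the label_studio_ml SDK. The sketch below is a simplified, hypothetical stand-in for models/bert_classifier.py: the class and tag names, the default checkpoint, and the fit signature are assumptions based on common SDK conventions, not a copy of the real script.

# sketch_classifier.py -- simplified illustration, not the actual models/bert_classifier.py
import torch
from label_studio_ml.model import LabelStudioMLBase  # assumed SDK base class
from transformers import AutoTokenizer, AutoModelForSequenceClassification

class SketchBertClassifier(LabelStudioMLBase):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # "bert-base-uncased" is only an illustrative default checkpoint
        self.tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
        self.model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
        self.labels = ["Positive", "Negative"]  # example values of the Choices tag

    def predict(self, tasks, **kwargs):
        # Return one prediction per task in Label Studio's result format
        predictions = []
        for task in tasks:
            text = task["data"]["text"]  # assumes the Text tag reads from $text
            inputs = self.tokenizer(text, return_tensors="pt", truncation=True)
            with torch.no_grad():
                logits = self.model(**inputs).logits
            idx = int(logits.argmax(dim=-1))
            predictions.append({
                "result": [{
                    "from_name": "label", "to_name": "text", "type": "choices",
                    "value": {"choices": [self.labels[idx]]},
                }],
                "score": float(logits.softmax(dim=-1)[0, idx]),
            })
        return predictions

    def fit(self, completions, workdir=None, **kwargs):
        # The real script fine-tunes BERT on the annotated tasks here and
        # writes checkpoints into workdir; omitted in this sketch.
        return {"checkpoint_dir": workdir}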
Create ML backend with BERT named entity recognizer
label-studio-ml init my-ml-backend --script models/ner.py
cp models/utils.py my-ml-backend/utils.py

# Start ML backend at http://localhost:9090
label-studio-ml start my-ml-backend

# Start Label Studio in a new terminal with the same Python environment
label-studio start
  1. Create a project with Labels and Text tags in the labeling config.
  2. Connect the ML backend in the Project settings using the URL http://localhost:9090 (an example of the span prediction format follows these steps).
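
The NER backend returns span predictions rather than choices. For orientation, one predicted entity in Label Studio's result format looks roughly like the dictionary below; the from_name/to_name values must match the Labels and Text tag names in your labeling config, and the offsets and label are made-up examples.

# Illustrative shape of one NER prediction (all values are made up)
ner_prediction = {
    "result": [{
        "from_name": "label",  # name of the Labels tag in the config
        "to_name": "text",     # name of the Text tag in the config
        "type": "labels",
        "value": {
            "start": 0,          # character offset where the entity begins
            "end": 12,           # character offset where it ends
            "labels": ["ORG"],   # predicted entity label
        },
    }],
    "score": 0.87,
}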

Training and inference

The browser opens at http://localhost:8080. Upload your data on the Import page, then annotate it on the Labeling page. Once you've annotated a sufficient amount of data, go to the Model page and press the Start Training button. When training finishes, the model automatically starts serving predictions to Label Studio, and you'll find all model checkpoints inside the my-ml-backend/<ml-backend-id>/ directory (see the sketch below for reusing them outside Label Studio).
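
If the saved checkpoints follow the standard Hugging Face format (an assumption here, since the exact layout depends on the backend script), they can be reloaded for standalone inference with the transformers API. The checkpoint_dir path below is a placeholder.

# Hypothetical standalone inference from a saved checkpoint directory
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

checkpoint_dir = "my-ml-backend/<ml-backend-id>/model"  # placeholder: use your actual checkpoint path
tokenizer = AutoTokenizer.from_pretrained(checkpoint_dir)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint_dir)

inputs = tokenizer("Label Studio makes annotation easy.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities for the fine-tuned labels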

See the Label Studio documentation to read more about how to use the Machine Learning backend and build Human-in-the-Loop pipelines with Label Studio.

License

This software is licensed under the Apache 2.0 License. © 2020 Heartex.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].