stickeritis / sticker2

License: Blue Oak Model License 1.0.0
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot

Programming Languages

  • Rust
  • Nix
  • Python

Projects that are alternatives to or similar to sticker2

FasterTransformer
Transformer related optimization, including BERT, GPT
Stars: ✭ 1,571 (+11121.43%)
Mutual labels:  transformer, bert
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+24492.86%)
Mutual labels:  transformer, bert
Bert Pytorch
Google AI 2018 BERT pytorch implementation
Stars: ✭ 4,642 (+33057.14%)
Mutual labels:  transformer, bert
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (+128.57%)
Mutual labels:  transformer, bert
tensorflow-ml-nlp-tf2
Practice materials for "Natural Language Processing with TensorFlow 2 and Machine Learning: From Logistic Regression to BERT and GPT-3"
Stars: ✭ 245 (+1650%)
Mutual labels:  transformer, bert
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (+64.29%)
Mutual labels:  transformer, bert
Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+398057.14%)
Mutual labels:  transformer, bert
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (+814.29%)
Mutual labels:  transformer, bert
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+314.29%)
Mutual labels:  transformer, bert
sister
SImple SenTence EmbeddeR
Stars: ✭ 66 (+371.43%)
Mutual labels:  transformer, bert
text-generation-transformer
text generation based on transformer
Stars: ✭ 36 (+157.14%)
Mutual labels:  transformer, bert
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+1392.86%)
Mutual labels:  transformer, bert
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (+57.14%)
Mutual labels:  transformer, bert
bert in a flask
A dockerized flask API, serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (+128.57%)
Mutual labels:  transformer, bert
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (+50%)
Mutual labels:  transformer, bert
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+70578.57%)
Mutual labels:  transformer, bert
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (+542.86%)
Mutual labels:  transformer, bert
golgotha
Contextualised Embeddings and Language Modelling using BERT and Friends using R
Stars: ✭ 39 (+178.57%)
Mutual labels:  transformer, bert
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (+57.14%)
Mutual labels:  transformer, bert
les-military-mrc-rank7
LES Cup: Rank 7 solution for the 2nd national "Military Intelligent Machine Reading" challenge
Stars: ✭ 37 (+164.29%)
Mutual labels:  transformer, bert

sticker2

Warning: SyntaxDot supersedes sticker2.

Introduction

sticker2 is a sequence labeler that uses Transformer networks. sticker2 models can be trained from scratch or fine-tuned from pretrained models such as BERT or XLM-RoBERTa.

In principle, sticker2 can be used to perform any sequence labeling task, but so far the focus has been on:

  • Part-of-speech tagging
  • Morphological tagging
  • Topological field tagging
  • Lemmatization
  • Dependency parsing
  • Named entity recognition

The easiest way to get started with sticker2 is to use a pretrained model.

Features

  • Input representations:
    • Word pieces
    • Sentence pieces
  • Flexible sequence encoder/decoder architecture, which supports:
    • Simple sequence labels (e.g. POS, morphology, named entities)
    • Lemmatization, based on edit trees (a sketch follows this list)
    • Dependency parsing
    • Simple API to extend to other tasks
  • Model representations:
    • Transformers
    • Pretraining from BERT and XLM-RoBERTa models
  • Multi-task training and classification using scalar weighting (a sketch follows this list)
  • Model distillation
  • Deployment:
    • Standalone binary that links against PyTorch's libtorch
    • Very liberal license
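
The edit trees mentioned above turn lemmatization into a classification problem: rather than generating the lemma character by character, the model predicts a transformation rule that rewrites the inflected form into its lemma. The Rust sketch below illustrates the idea with plain suffix replacement only; real edit trees recurse on the longest common substring and can also edit prefixes, and all names here are hypothetical rather than part of sticker2's actual API.

    // Simplified illustration of edit-tree-style lemmatization: the
    // classifier predicts a rewrite rule, not the lemma string itself.
    // Hypothetical rule type: strip `cut` trailing characters, then
    // append `append`.
    #[derive(Debug, PartialEq)]
    struct SuffixRule {
        cut: usize,
        append: String,
    }

    impl SuffixRule {
        /// Derive the rule that maps `form` to `lemma`.
        fn derive(form: &str, lemma: &str) -> SuffixRule {
            // Length of the longest common prefix, in characters.
            let common = form
                .chars()
                .zip(lemma.chars())
                .take_while(|(a, b)| a == b)
                .count();
            SuffixRule {
                cut: form.chars().count() - common,
                append: lemma.chars().skip(common).collect(),
            }
        }

        /// Apply the rule to a (possibly unseen) form.
        fn apply(&self, form: &str) -> String {
            let keep = form.chars().count().saturating_sub(self.cut);
            let mut lemma: String = form.chars().take(keep).collect();
            lemma.push_str(&self.append);
            lemma
        }
    }

    fn main() {
        // "walked" -> "walk" yields the rule: cut 2 characters, append nothing.
        let rule = SuffixRule::derive("walked", "walk");
        assert_eq!(rule, SuffixRule { cut: 2, append: String::new() });
        // The same rule label generalizes to forms never seen in training.
        assert_eq!(rule.apply("talked"), "talk");
    }

Because the label set consists of rules rather than strings, a rule learned from one word generalizes to unseen words with the same inflection pattern.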

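Scalar weighting is read here in the ELMo sense of scalar mixing, which is an assumption about the feature above: each task learns one scalar per encoder layer, the scalars are softmax-normalized, and the task's decoder consumes the weighted sum of all layer outputs, so several tasks can share one encoder while drawing on different depths. The sketch below shows the arithmetic on plain vectors for a single token; the shapes and function names are simplifications, not sticker2's API.

    // Minimal sketch of per-task scalar mixing over encoder layers.

    /// Softmax over the per-layer scalars of one task.
    fn softmax(scalars: &[f32]) -> Vec<f32> {
        let max = scalars.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
        let exps: Vec<f32> = scalars.iter().map(|&s| (s - max).exp()).collect();
        let sum: f32 = exps.iter().sum();
        exps.iter().map(|&e| e / sum).collect()
    }

    /// Mix the per-layer hidden states of one token with the
    /// softmax-normalized task-specific scalars.
    fn scalar_mix(layers: &[Vec<f32>], scalars: &[f32]) -> Vec<f32> {
        let weights = softmax(scalars);
        let mut mixed = vec![0.0; layers[0].len()];
        for (layer, &w) in layers.iter().zip(&weights) {
            for (m, &h) in mixed.iter_mut().zip(layer) {
                *m += w * h;
            }
        }
        mixed
    }

    fn main() {
        // Two layers of a 3-dimensional toy encoder, for a single token.
        let layers = vec![vec![1.0, 0.0, 2.0], vec![3.0, 1.0, 0.0]];
        // Equal scalars: the mix is the plain average of the two layers.
        let mixed = scalar_mix(&layers, &[0.0, 0.0]);
        assert!((mixed[0] - 2.0).abs() < 1e-6);
        assert!((mixed[2] - 1.0).abs() < 1e-6);
    }
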
Status

sticker2 is still under heavy development. However, models remain reusable and the API stays stable across patch releases within a given 0.y series (that is, for every fixed y in version 0.y.z).
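
If sticker2 is consumed as a library crate, that promise lines up with Cargo's default caret semantics, under which a requirement such as "0.4" stays within the 0.4.z series; the version number below is purely illustrative, not an assertion about a published release.

    # Hypothetical Cargo.toml fragment; "0.4" is an illustrative version.
    # Cargo's default caret semantics keep this within the 0.4.z series,
    # matching the promise that the API is stable for a fixed y in 0.y.z.
    [dependencies]
    sticker2 = "0.4"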

References

sticker2 uses techniques from, or was inspired by, the following papers:

Documentation

Issues

You can report bugs and feature requests in the sticker2 issue tracker.

License

sticker2 is licensed under the Blue Oak Model License version 1.0.0. The list of contributors is also available.

Credits

  • sticker2 is developed by Daniël de Kok & Tobias Pütz.
  • The Python precursor to sticker was developed by Erik Schill.
  • Sebastian Pütz and Patricia Fischer reviewed a lot of code across the sticker projects.