
513 open-source projects that are alternatives to or similar to Nlp Interview Notes

Nlp research
NLP research: TensorFlow-based NLP deep learning projects covering the four tasks of text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (-31.88%)
Mutual labels:  ner, transformer
Etagger
Reference TensorFlow code for named entity tagging
Stars: ✭ 100 (-51.69%)
Mutual labels:  ner, transformer
Meta Emb
Multilingual Meta-Embeddings for Named Entity Recognition (RepL4NLP & EMNLP 2019)
Stars: ✭ 28 (-86.47%)
Mutual labels:  ner, transformer
Bert Multitask Learning
BERT for Multitask Learning
Stars: ✭ 380 (+83.57%)
Mutual labels:  ner, transformer
Rust Bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Stars: ✭ 510 (+146.38%)
Mutual labels:  ner, transformer
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning" (from logistic regression to BERT and GPT-3)
Stars: ✭ 245 (+18.36%)
Mutual labels:  transformer, ner
Nlp Experiments In Pytorch
PyTorch repository for text categorization and NER experiments in Turkish and English.
Stars: ✭ 35 (-83.09%)
Mutual labels:  ner, transformer
Ner Bert Pytorch
PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+20.29%)
Mutual labels:  ner, transformer
verseagility
Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and take models into production.
Stars: ✭ 23 (-88.89%)
Mutual labels:  transformer, ner
chinese-nlp-ner
A BLSTM-CRF solution for Chinese named entity recognition
Stars: ✭ 14 (-93.24%)
Mutual labels:  ner
PAML
Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-44.93%)
Mutual labels:  transformer
ipymarkup
NER, syntax markup visualizations
Stars: ✭ 108 (-47.83%)
Mutual labels:  ner
lstm-crf-tagging
No description or website provided.
Stars: ✭ 13 (-93.72%)
Mutual labels:  ner
linformer
Implementation of Linformer for PyTorch
Stars: ✭ 119 (-42.51%)
Mutual labels:  transformer
huner
Named Entity Recognition for biomedical entities
Stars: ✭ 44 (-78.74%)
Mutual labels:  ner
KgCLUE
KgCLUE: large-scale Chinese open-source knowledge graph question answering
Stars: ✭ 131 (-36.71%)
Mutual labels:  ner
deep-molecular-optimization
Molecular optimization by capturing chemists' intuition, using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-71.01%)
Mutual labels:  transformer
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-89.37%)
Mutual labels:  transformer
segmenter
[ICCV2021] Official PyTorch implementation of Segmenter: Transformer for Semantic Segmentation
Stars: ✭ 463 (+123.67%)
Mutual labels:  transformer
transformer
Neutron: A PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-71.01%)
Mutual labels:  transformer
uformer-pytorch
Implementation of Uformer, an attention-based U-Net, in PyTorch
Stars: ✭ 54 (-73.91%)
Mutual labels:  transformer
TextPruner
A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (-54.59%)
Mutual labels:  transformer
fairseq-tagging
A Fairseq fork for sequence tagging/labeling tasks
Stars: ✭ 26 (-87.44%)
Mutual labels:  ner
Image-Caption
Using LSTM or Transformer to solve image captioning in PyTorch
Stars: ✭ 36 (-82.61%)
Mutual labels:  transformer
M3DETR
Code base for M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers
Stars: ✭ 47 (-77.29%)
Mutual labels:  transformer
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-93.72%)
Mutual labels:  transformer
Legal-Entity-Recognition
A Dataset of German Legal Documents for Named Entity Recognition
Stars: ✭ 98 (-52.66%)
Mutual labels:  ner
zero-administration-inference-with-aws-lambda-for-hugging-face
Zero administration inference with AWS Lambda for 🤗
Stars: ✭ 19 (-90.82%)
Mutual labels:  transformer
keras-bert-ner
Keras solution for the Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF models with a pretrained language model, supporting BERT/RoBERTa/ALBERT
Stars: ✭ 7 (-96.62%)
Mutual labels:  ner
amrlib
A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (-48.31%)
Mutual labels:  transformer
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: a linear attention without softmax
Stars: ✭ 111 (-46.38%)
Mutual labels:  transformer
well-classified-examples-are-underestimated
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-89.86%)
Mutual labels:  transformer
laravel-mutate
Mutate Laravel attributes
Stars: ✭ 13 (-93.72%)
Mutual labels:  transformer
max-deeplab
Unofficial implementation of MaX-DeepLab for Instance Segmentation
Stars: ✭ 84 (-59.42%)
Mutual labels:  transformer
svgs2fonts
npm svgs2fonts: converts SVG icons into icon font libraries (svgs -> svg, ttf, eot, woff, woff2), built with Node.js.
Stars: ✭ 29 (-85.99%)
Mutual labels:  transformer
NER corpus chinese
One-stop access to Chinese corpora for NER (named entity recognition)
Stars: ✭ 102 (-50.72%)
Mutual labels:  ner
NER-Multimodal-pytorch
PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)
Stars: ✭ 42 (-79.71%)
Mutual labels:  ner
Transformer-in-Transformer
An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-80.68%)
Mutual labels:  transformer
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-88.89%)
Mutual labels:  transformer
pytorch-transformer-chatbot
A simple chit-chat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-78.74%)
Mutual labels:  transformer
attention-is-all-you-need-paper
Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in neural information processing systems. 2017.
Stars: ✭ 97 (-53.14%)
Mutual labels:  transformer
tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Stars: ✭ 183 (-11.59%)
Mutual labels:  transformer
bert in a flask
A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-84.54%)
Mutual labels:  transformer
php-hal
HAL+JSON & HAL+XML API transformer outputting valid (PSR-7) API Responses.
Stars: ✭ 30 (-85.51%)
Mutual labels:  transformer
trapper
State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-86.47%)
Mutual labels:  transformer
react-taggy
A simple zero-dependency React component for tagging user-defined entities within a block of text.
Stars: ✭ 29 (-85.99%)
Mutual labels:  ner
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-84.54%)
Mutual labels:  transformer
TadTR
End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-73.43%)
Mutual labels:  transformer
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-89.86%)
Mutual labels:  transformer
Transformer tf2.0
A TensorFlow 2.0 implementation of the Transformer
Stars: ✭ 23 (-88.89%)
Mutual labels:  transformer
Embedding
A summary of embedding model code and study notes
Stars: ✭ 25 (-87.92%)
Mutual labels:  transformer
Swin-Transformer-Tensorflow
Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-78.26%)
Mutual labels:  transformer
charformer-pytorch
Implementation of the GBST block from the Charformer paper, in PyTorch
Stars: ✭ 74 (-64.25%)
Mutual labels:  transformer
laravel5-jsonapi-dingo
Laravel 5 with JSON API and Dingo together to build APIs fast
Stars: ✭ 29 (-85.99%)
Mutual labels:  transformer
Restormer
[CVPR 2022 Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+183.09%)
Mutual labels:  transformer
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library will be updated continuously.
Stars: ✭ 20 (-90.34%)
Mutual labels:  transformer
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (-38.16%)
Mutual labels:  transformer
mitie-ruby
Named-entity recognition for Ruby
Stars: ✭ 77 (-62.8%)
Mutual labels:  ner
few shot slot tagging and NER
PyTorch implementation of the paper: Vector Projection Network for Few-shot Slot Tagging in Natural Language Understanding. Su Zhu, Ruisheng Cao, Lu Chen and Kai Yu.
Stars: ✭ 17 (-91.79%)
Mutual labels:  ner
saint
The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Stars: ✭ 209 (+0.97%)
Mutual labels:  transformer
1-60 of 513 similar projects