
540 open-source projects that are alternatives to, or similar to, are-16-heads-really-better-than-1

Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-82.81%)
Mutual labels:  transformer, bert
tensorflow-ml-nlp-tf2
Practice materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from logistic regression to BERT and GPT-3)".
Stars: ✭ 245 (+91.41%)
Mutual labels:  transformer, bert
FasterTransformer
Transformer-related optimizations, including BERT and GPT.
Stars: ✭ 1,571 (+1127.34%)
Mutual labels:  transformer, bert
Kevinpro-NLP-demo
All the NLP you need here: personal implementations of fun NLP demos, currently including PyTorch implementations of 13 NLP applications.
Stars: ✭ 117 (-8.59%)
Mutual labels:  transformer, bert
les-military-mrc-rank7
Rank-7 solution for the 2nd national "Military Intelligence Machine Reading" challenge (莱斯杯).
Stars: ✭ 37 (-71.09%)
Mutual labels:  transformer, bert
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-54.69%)
Mutual labels:  transformer, bert
bert in a flask
A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-75%)
Mutual labels:  transformer, bert
Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+43448.44%)
Mutual labels:  transformer, bert
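
Many projects in this list build on the Hugging Face Transformers library, so a minimal sketch of how a pretrained BERT model is typically loaded with it is shown below. The checkpoint name "bert-base-uncased" and the example sentence are illustrative assumptions, not details taken from this listing.

    # Minimal sketch (assumes the transformers package is installed;
    # "bert-base-uncased" is just an example checkpoint).
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize one sentence and run a single forward pass.
    inputs = tokenizer("Are sixteen heads really better than one?", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)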
Bert Pytorch
PyTorch implementation of Google AI's 2018 BERT.
Stars: ✭ 4,642 (+3526.56%)
Mutual labels:  transformer, bert
sister
SImple SenTence EmbeddeR
Stars: ✭ 66 (-48.44%)
Mutual labels:  transformer, bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-57.81%)
Mutual labels:  transformer, bert
NLP-paper
NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-82.03%)
Mutual labels:  transformer, bert
Bertviz
Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.).
Stars: ✭ 3,443 (+2589.84%)
Mutual labels:  transformer, bert
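
For attention visualization of the kind Bertviz provides, a rough sketch is shown below, assuming bertviz's head_view interface and a Jupyter notebook for rendering; the checkpoint name is again only an illustrative choice.

    # Rough sketch (assumes the bertviz package and a Jupyter notebook;
    # "bert-base-uncased" is an example checkpoint, not prescribed here).
    from bertviz import head_view
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

    input_ids = tokenizer.encode("Are sixteen heads really better than one?", return_tensors="pt")
    attention = model(input_ids).attentions            # one attention tensor per layer
    tokens = tokenizer.convert_ids_to_tokens(input_ids[0])
    head_view(attention, tokens)                        # renders the interactive head view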
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-89.06%)
Mutual labels:  transformer, bert
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-83.59%)
Mutual labels:  transformer, bert
text-generation-transformer
Text generation based on the Transformer.
Stars: ✭ 36 (-71.87%)
Mutual labels:  transformer, bert
golgotha
Contextualised embeddings and language modelling with BERT and friends, in R.
Stars: ✭ 39 (-69.53%)
Mutual labels:  transformer, bert
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+7630.47%)
Mutual labels:  transformer, bert
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-75%)
Mutual labels:  transformer, bert
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-82.03%)
Mutual labels:  transformer, bert
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-82.81%)
Mutual labels:  transformer, bert
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+63.28%)
Mutual labels:  transformer, bert
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-65.62%)
Mutual labels:  transformer, bert
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (-29.69%)
Mutual labels:  transformer, bert
LightLM
Evaluation of high-performance lightweight models. Shared Tasks in NLPCC 2020, Task 1: Light Pre-Training Chinese Language Model for NLP Task.
Stars: ✭ 54 (-57.81%)
Mutual labels:  bert
Image-Caption
Image captioning with LSTM or Transformer models in PyTorch.
Stars: ✭ 36 (-71.87%)
Mutual labels:  transformer
Quality-Estimation2
Machine translation subtask: translation quality estimation by fine-tuning a Bi-LSTM on top of BERT.
Stars: ✭ 31 (-75.78%)
Mutual labels:  bert
zero-administration-inference-with-aws-lambda-for-hugging-face
Zero administration inference with AWS Lambda for 🤗
Stars: ✭ 19 (-85.16%)
Mutual labels:  transformer
ALBERT-Pytorch
PyTorch implementation of ALBERT (A Lite BERT for Self-supervised Learning of Language Representations).
Stars: ✭ 214 (+67.19%)
Mutual labels:  bert
bert sa
BERT sentiment analysis served with TensorFlow Serving through a RESTful API.
Stars: ✭ 35 (-72.66%)
Mutual labels:  bert
php-hal
HAL+JSON & HAL+XML API transformer outputting valid (PSR-7) API Responses.
Stars: ✭ 30 (-76.56%)
Mutual labels:  transformer
M3DETR
Code base for M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers
Stars: ✭ 47 (-63.28%)
Mutual labels:  transformer
Text and Audio classification with Bert
Text classification on Turkish texts with BERT.
Stars: ✭ 34 (-73.44%)
Mutual labels:  bert
Medi-CoQA
Conversational Question Answering on Clinical Text
Stars: ✭ 22 (-82.81%)
Mutual labels:  bert
bert quora question pairs
BERT model fine-tuning on Quora Question Pairs.
Stars: ✭ 28 (-78.12%)
Mutual labels:  bert
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+228.13%)
Mutual labels:  bert
OpenDialog
An open-source package for a Chinese open-domain conversational chatbot (Chinese chit-chat dialogue system with one-click deployment as a WeChat chatbot).
Stars: ✭ 94 (-26.56%)
Mutual labels:  bert
CoronaXiv
First Prize in HackJaipur Hackathon 2020 for Best ElasticSearch-based Product! Website: http://coronaxiv2.surge.sh/#/
Stars: ✭ 15 (-88.28%)
Mutual labels:  bert
SemEval2019Task3
Code for ANA at SemEval-2019 Task 3
Stars: ✭ 41 (-67.97%)
Mutual labels:  bert
TadTR
End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-57.03%)
Mutual labels:  transformer
well-classified-examples-are-underestimated
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-83.59%)
Mutual labels:  transformer
Transformer tf2.0
Transformer implemented in TensorFlow 2.0.
Stars: ✭ 23 (-82.03%)
Mutual labels:  transformer
Embedding
Embedding model code and a summary of study notes.
Stars: ✭ 25 (-80.47%)
Mutual labels:  transformer
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-52.34%)
Mutual labels:  bert
NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction"
Stars: ✭ 166 (+29.69%)
Mutual labels:  bert
robo-vln
PyTorch code for the ICRA'21 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation".
Stars: ✭ 34 (-73.44%)
Mutual labels:  bert
bert tokenization for java
A Java version of the Chinese tokenization described in BERT.
Stars: ✭ 39 (-69.53%)
Mutual labels:  bert
max-deeplab
Unofficial implementation of MaX-DeepLab for Instance Segmentation
Stars: ✭ 84 (-34.37%)
Mutual labels:  transformer
Restormer
[CVPR 2022, Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+357.81%)
Mutual labels:  transformer
KAREN
KAREN: Unifying Hatespeech Detection and Benchmarking
Stars: ✭ 18 (-85.94%)
Mutual labels:  bert
mcQA
🔮 Answering multiple choice questions with Language Models.
Stars: ✭ 23 (-82.03%)
Mutual labels:  bert
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-60.16%)
Mutual labels:  transformer
TextPair
Text-pair relation comparison: semantic similarity, literal similarity, textual entailment, and more.
Stars: ✭ 44 (-65.62%)
Mutual labels:  bert
bert experimental
Code and supplementary materials for a series of Medium articles about the BERT model.
Stars: ✭ 72 (-43.75%)
Mutual labels:  bert
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. It will be updated continually.
Stars: ✭ 20 (-84.37%)
Mutual labels:  transformer
bern
A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (+17.97%)
Mutual labels:  bert
TorchBlocks
A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (-33.59%)
Mutual labels:  bert
Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020), in PyTorch and TensorFlow.
Stars: ✭ 26 (-79.69%)
Mutual labels:  transformer
WSDM-Cup-2019
[ACM-WSDM] 3rd place solution at WSDM Cup 2019, Fake News Classification on Kaggle.
Stars: ✭ 62 (-51.56%)
Mutual labels:  bert
1-60 of 540 similar projects