
543 open source projects that are alternatives to or similar to bert_in_a_flask

charformer-pytorch
Implementation of the GBST block from the Charformer paper, in Pytorch
Stars: ✭ 74 (+131.25%)
Mutual labels:  transformer
Transformer-in-Transformer
An Implementation of Transformer in Transformer in TensorFlow for image classification, attention inside local patches
Stars: ✭ 40 (+25%)
Mutual labels:  transformer
trapper
State-of-the-art NLP through transformer models in a modular design and consistent APIs.
Stars: ✭ 28 (-12.5%)
Mutual labels:  transformer
pytorch-transformer-chatbot
A simple chitchat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (+37.5%)
Mutual labels:  transformer
galerkin-transformer
[NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (+246.88%)
Mutual labels:  transformer
tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Stars: ✭ 183 (+471.88%)
Mutual labels:  transformer
bert tokenization for java
A Java version of the Chinese tokenization described in BERT.
Stars: ✭ 39 (+21.88%)
Mutual labels:  bert
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (+271.88%)
Mutual labels:  transformer
mcQA
🔮 Answering multiple choice questions with Language Models.
Stars: ✭ 23 (-28.12%)
Mutual labels:  bert
KAREN
KAREN: Unifying Hatespeech Detection and Benchmarking
Stars: ✭ 18 (-43.75%)
Mutual labels:  bert
T3
[EMNLP 2020] "T3: Tree-Autoencoder Constrained Adversarial Text Generation for Targeted Attack" by Boxin Wang, Hengzhi Pei, Boyuan Pan, Qian Chen, Shuohang Wang, Bo Li
Stars: ✭ 25 (-21.87%)
Mutual labels:  bert
bern
A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (+371.88%)
Mutual labels:  bert
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+1212.5%)
Mutual labels:  bert
WSDM-Cup-2019
[ACM-WSDM] 3rd place solution at WSDM Cup 2019, Fake News Classification on Kaggle.
Stars: ✭ 62 (+93.75%)
Mutual labels:  bert
SemEval2019Task3
Code for ANA at SemEval-2019 Task 3
Stars: ✭ 41 (+28.13%)
Mutual labels:  bert
laravel5-jsonapi-dingo
Laravel 5 with JSON API and Dingo to build APIs fast
Stars: ✭ 29 (-9.37%)
Mutual labels:  transformer
kwx
BERT, LDA, and TFIDF based keyword extraction in Python
Stars: ✭ 33 (+3.13%)
Mutual labels:  bert
robo-vln
Pytorch code for ICRA'21 paper: "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (+6.25%)
Mutual labels:  bert
FinBERT
A Pretrained BERT Model for Financial Communications. https://arxiv.org/abs/2006.08097
Stars: ✭ 193 (+503.13%)
Mutual labels:  bert
Restormer
[CVPR 2022--Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+1731.25%)
Mutual labels:  transformer
TextPruner
A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (+193.75%)
Mutual labels:  transformer
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-37.5%)
Mutual labels:  transformer
cdQA-ui
⛔ [NOT MAINTAINED] A web interface for cdQA and other question answering systems.
Stars: ✭ 19 (-40.62%)
Mutual labels:  bert
Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (In Pytorch and Tensorflow)
Stars: ✭ 26 (-18.75%)
Mutual labels:  transformer
policy-data-analyzer
Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (-31.25%)
Mutual labels:  bert
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (+59.38%)
Mutual labels:  transformer
SQUAD2.Q-Augmented-Dataset
Augmented version of SQuAD 2.0 questions
Stars: ✭ 31 (-3.12%)
Mutual labels:  bert
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+318.75%)
Mutual labels:  transformer
FewCLUE
FewCLUE: a Chinese few-shot learning evaluation benchmark
Stars: ✭ 251 (+684.38%)
Mutual labels:  bert
uformer-pytorch
Implementation of Uformer, Attention-based Unet, in Pytorch
Stars: ✭ 54 (+68.75%)
Mutual labels:  transformer
PIE
Fast + Non-Autoregressive Grammatical Error Correction using BERT. Code and Pre-trained models for paper "Parallel Iterative Edit Models for Local Sequence Transduction": www.aclweb.org/anthology/D19-1435.pdf (EMNLP-IJCNLP 2019)
Stars: ✭ 164 (+412.5%)
Mutual labels:  bert
KoBERT-nsmc
Naver movie review sentiment classification with KoBERT
Stars: ✭ 57 (+78.13%)
Mutual labels:  bert
kosr
Korean speech recognition based on the Transformer
Stars: ✭ 25 (-21.87%)
Mutual labels:  transformer
BERT-Chinese-Couplet
BERT for automatic Chinese couplet generation
Stars: ✭ 19 (-40.62%)
Mutual labels:  bert
enformer-pytorch
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+356.25%)
Mutual labels:  transformer
SOLQ
"SOLQ: Segmenting Objects by Learning Queries", SOLQ is an end-to-end instance segmentation framework with Transformer.
Stars: ✭ 159 (+396.88%)
Mutual labels:  transformer
TS-CAM
Codes for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Stars: ✭ 96 (+200%)
Mutual labels:  transformer
RSTNet
RSTNet: Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (+121.88%)
Mutual labels:  transformer
VideoBERT
Using VideoBERT to tackle video prediction
Stars: ✭ 56 (+75%)
Mutual labels:  bert
zero-administration-inference-with-aws-lambda-for-hugging-face
Zero-administration inference with AWS Lambda for 🤗 Hugging Face models
Stars: ✭ 19 (-40.62%)
Mutual labels:  transformer
text2keywords
Trained T5 and T5-large models for creating keywords from text
Stars: ✭ 53 (+65.63%)
Mutual labels:  transformer
DocProduct
Medical Q&A with Deep Language Models
Stars: ✭ 527 (+1546.88%)
Mutual labels:  bert
OpenDialog
An open-source package for Chinese open-domain conversational chatbots (Chinese chitchat dialogue system with one-click deployment of a WeChat chatbot)
Stars: ✭ 94 (+193.75%)
Mutual labels:  bert
deformer
[ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
Stars: ✭ 111 (+246.88%)
Mutual labels:  transformer
knowledge-graph-nlp-in-action
From model training to deployment: hands-on Knowledge Graph and NLP. Uses TensorFlow, BERT+Bi-LSTM+CRF, Neo4j, and more, covering Named Entity Recognition, Text Classification, Information Extraction, and Relation Extraction tasks.
Stars: ✭ 58 (+81.25%)
Mutual labels:  bert
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (+90.63%)
Mutual labels:  bert
CSV2RDF
Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (+50%)
Mutual labels:  transformer
text-style-transfer-benchmark
Text style transfer benchmark
Stars: ✭ 56 (+75%)
Mutual labels:  transformer
TextPair
Text pair relation comparison: semantic similarity, surface similarity, textual entailment, and more
Stars: ✭ 44 (+37.5%)
Mutual labels:  bert
MinTL
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (+90.63%)
Mutual labels:  transformer
ParsBigBird
Persian BERT for long-range sequences
Stars: ✭ 58 (+81.25%)
Mutual labels:  bert
ADL2019
Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-37.5%)
Mutual labels:  bert
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+487.5%)
Mutual labels:  bert
HRFormer
This is an official implementation of our NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+1015.63%)
Mutual labels:  transformer
ark-nlp
A private NLP coding package that quickly implements SOTA solutions.
Stars: ✭ 232 (+625%)
Mutual labels:  bert
NER-FunTool
This NER project covers multiple Chinese datasets, with models including BiLSTM+CRF, BERT+Softmax, BERT+Cascade, and BERT+WOL; models are deployed with TF Serving for both online and offline inference.
Stars: ✭ 56 (+75%)
Mutual labels:  bert
Transformer Survey Study
"A survey of Transformer" paper study 👩🏻‍💻🧑🏻‍💻 KoreaUniv. DSBA Lab
Stars: ✭ 166 (+418.75%)
Mutual labels:  transformer
LaTeX-OCR
pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+4793.75%)
Mutual labels:  transformer
Swin-Transformer-Tensorflow
Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (+40.63%)
Mutual labels:  transformer
iamQA
Chinese Wikipedia QA reading-comprehension system, using an NER model trained on CCKS2016 data and a reading-comprehension model trained on CMRC2018, plus W2V word-vector search; deployed with TorchServe
Stars: ✭ 46 (+43.75%)
Mutual labels:  bert