308 open-source projects that are alternatives to or similar to bangla-bert

ttt
A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+
Stars: ✭ 35 (-14.63%)
Mutual labels:  transformers
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-39.02%)
Mutual labels:  transformers
DeepNER
An easy-to-use, modular, and extensible package of deep-learning-based Named Entity Recognition models.
Stars: ✭ 9 (-78.05%)
Mutual labels:  bert
SemEval2019Task3
Code for ANA at SemEval-2019 Task 3
Stars: ✭ 41 (+0%)
Mutual labels:  bert
muse-as-service
REST API for sentence tokenization and embedding using Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (+9.76%)
Mutual labels:  bert
TermiNetwork
🌏 A zero-dependency networking solution for building modern and secure iOS, watchOS, macOS and tvOS applications.
Stars: ✭ 80 (+95.12%)
Mutual labels:  transformers
SentimentAnalysis
(BOW, TF-IDF, Word2Vec, BERT) Word Embeddings + (SVM, Naive Bayes, Decision Tree, Random Forest) Base Classifiers + Pre-trained BERT on Tensorflow Hub + 1-D CNN and Bi-Directional LSTM on IMDB Movie Reviews Dataset
Stars: ✭ 40 (-2.44%)
Mutual labels:  bert
deepfrog
An NLP suite powered by deep learning
Stars: ✭ 16 (-60.98%)
Mutual labels:  transformers
serverless-transformers-on-aws-lambda
Deploy transformers serverless on AWS Lambda
Stars: ✭ 100 (+143.9%)
Mutual labels:  transformers
viewpoint-mining
Opinion mining and sentiment analysis of e-commerce reviews based on BERT, framed as an NER-style task
Stars: ✭ 31 (-24.39%)
Mutual labels:  bert
spark-transformers
Spark-Transformers: Library for exporting Apache Spark MLlib models so they can be used in any Java application with no other dependencies.
Stars: ✭ 39 (-4.88%)
Mutual labels:  transformers
mcQA
🔮 Answering multiple choice questions with Language Models.
Stars: ✭ 23 (-43.9%)
Mutual labels:  bert
cdQA-ui
⛔ [NOT MAINTAINED] A web interface for cdQA and other question answering systems.
Stars: ✭ 19 (-53.66%)
Mutual labels:  bert
BERT-Chinese-Couplet
BERT for Chinese Couplets | BERT for automatic couplet generation
Stars: ✭ 19 (-53.66%)
Mutual labels:  bert
tfbert
Pretrained-model wrapper based on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, evaluation, and prediction.
Stars: ✭ 54 (+31.71%)
Mutual labels:  bert
CoronaXiv
First Prize in HackJaipur Hackathon 2020 for Best ElasticSearch-based Product! Website: http://coronaxiv2.surge.sh/#/
Stars: ✭ 15 (-63.41%)
Mutual labels:  bert
JD2Skills-BERT-XMLC
Code and dataset for Bhola et al. (2020), "Retrieving Skills from Job Descriptions: A Language Model Based Extreme Multi-label Classification Framework"
Stars: ✭ 33 (-19.51%)
Mutual labels:  bert
pytorch-vit
PyTorch implementation of "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale"
Stars: ✭ 250 (+509.76%)
Mutual labels:  transformers
bern
A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (+268.29%)
Mutual labels:  bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+31.71%)
Mutual labels:  bert
Quality-Estimation2
Machine translation subtask of translation quality estimation: fine-tuning a Bi-LSTM stacked on top of BERT
Stars: ✭ 31 (-24.39%)
Mutual labels:  bert
molecule-attention-transformer
PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (+12.2%)
Mutual labels:  transformers
Product-Categorization-NLP
Multi-class text classification for products based on their descriptions, using machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
Stars: ✭ 30 (-26.83%)
Mutual labels:  transformers
converse
Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+258.54%)
Mutual labels:  transformers
ai web RISKOUT BTS
National defense risk management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-56.1%)
Mutual labels:  bert
SQUAD2.Q-Augmented-Dataset
Augmented version of SQuAD 2.0 for questions
Stars: ✭ 31 (-24.39%)
Mutual labels:  bert
bert-movie-reviews-sentiment-classifier
Build a Movie Reviews Sentiment Classifier with Google's BERT Language Model
Stars: ✭ 12 (-70.73%)
Mutual labels:  bert
textwiser
[AAAI 2021] TextWiser: Text Featurization Library
Stars: ✭ 26 (-36.59%)
Mutual labels:  bert
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (+212.2%)
Mutual labels:  bert
text2keywords
Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (+29.27%)
Mutual labels:  transformers
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+302.44%)
Mutual labels:  transformers
PyTorch-Model-Compare
Compare neural networks by their feature similarity
Stars: ✭ 119 (+190.24%)
Mutual labels:  transformers
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (+41.46%)
Mutual labels:  transformers
minicons
Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-31.71%)
Mutual labels:  transformers
small-text
Active Learning for Text Classification in Python
Stars: ✭ 241 (+487.8%)
Mutual labels:  transformers
xpandas
Universal 1d/2d data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-39.02%)
Mutual labels:  transformers
WellcomeML
Repository for Machine Learning utils at the Wellcome Trust
Stars: ✭ 31 (-24.39%)
Mutual labels:  transformers
DocProduct
Medical Q&A with Deep Language Models
Stars: ✭ 527 (+1185.37%)
Mutual labels:  bert
mirror-bert
[EMNLP 2021] Mirror-BERT: Converting Pretrained Language Models to universal text encoders without labels.
Stars: ✭ 56 (+36.59%)
Mutual labels:  bert
C-Tran
General Multi-label Image Classification with Transformers
Stars: ✭ 106 (+158.54%)
Mutual labels:  transformers
NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction"
Stars: ✭ 166 (+304.88%)
Mutual labels:  bert
eve-bot
EVE bot, a customer service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (-24.39%)
Mutual labels:  transformers
BERT-embedding
A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
Stars: ✭ 24 (-41.46%)
Mutual labels:  bert
task-transferability
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-14.63%)
Mutual labels:  bert
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (+119.51%)
Mutual labels:  bert
rasa milktea chatbot
Chatbot built on the Rasa framework, using a Chinese BERT model for intent analysis
Stars: ✭ 97 (+136.59%)
Mutual labels:  bert
bert experimental
Code and supplementary materials for a series of Medium articles about the BERT model
Stars: ✭ 72 (+75.61%)
Mutual labels:  bert
berserker
Berserker - BERt chineSE woRd toKenizER
Stars: ✭ 17 (-58.54%)
Mutual labels:  bert
bert-tensorflow-pytorch-spacy-conversion
Instructions for converting a BERT TensorFlow model to work with Hugging Face's pytorch-transformers and spaCy. This walk-through uses DeepPavlov's RuBERT as an example.
Stars: ✭ 26 (-36.59%)
Mutual labels:  bert
trove
Weakly supervised medical named entity classification
Stars: ✭ 55 (+34.15%)
Mutual labels:  bert
ark-nlp
A private NLP package that quickly implements SOTA solutions.
Stars: ✭ 232 (+465.85%)
Mutual labels:  bert
BERT-for-Chinese-Question-Answering
No description or website provided.
Stars: ✭ 75 (+82.93%)
Mutual labels:  bert
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+924.39%)
Mutual labels:  bert
n-grammer-pytorch
Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch
Stars: ✭ 50 (+21.95%)
Mutual labels:  transformers
lightning-transformers
Flexible components pairing 🤗 Transformers with PyTorch Lightning
Stars: ✭ 551 (+1243.9%)
Mutual labels:  transformers
robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+90.24%)
Mutual labels:  transformers
embeddings
Embeddings: state-of-the-art text representations for Natural Language Processing tasks; the initial version of the library focuses on the Polish language
Stars: ✭ 27 (-34.15%)
Mutual labels:  lm
classifier multi label
Multi-label text classification with BERT and ALBERT
Stars: ✭ 127 (+209.76%)
Mutual labels:  bert
LMMS
Language Modelling Makes Sense - WSD (and more) with Contextual Embeddings
Stars: ✭ 79 (+92.68%)
Mutual labels:  bert
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch
Stars: ✭ 103 (+151.22%)
Mutual labels:  transformers
61-120 of 308 similar projects