
24 open-source projects that are alternatives to, or similar to, RoBERTaABSA

AspectBasedSentimentAnalysis
Aspect-Based Sentiment Analysis (ABSA) is a specialized form of sentiment analysis: an opinion is expressed about an explicit target (the opinion target), and extracting these aspect-polarity pairs is the ABSA task (a minimal illustration of the task format follows this entry).
Stars: ✭ 61 (-45.54%)
Mutual labels:  absa
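To make aspect-polarity extraction concrete, here is a minimal, hypothetical sketch of the input/output an ABSA system deals with; the sentence, aspect terms, and labels are illustrative and not taken from this repository:

```python
# Hypothetical ABSA example: one sentence can mention several aspects,
# each carrying its own polarity.
sentence = "The battery life is great, but the screen is too dim."

# Aspect-polarity pairs an ABSA system is expected to extract:
absa_output = [
    {"aspect": "battery life", "polarity": "positive"},
    {"aspect": "screen", "polarity": "negative"},
]

for item in absa_output:
    print(f"{item['aspect']}: {item['polarity']}")
```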
ABSADatasets
Public and community-shared datasets for aspect-based sentiment analysis and text classification
Stars: ✭ 49 (-56.25%)
Mutual labels:  absa
MemNet ABSA
No description or website provided.
Stars: ✭ 20 (-82.14%)
Mutual labels:  absa
Transferable-E2E-ABSA
Transferable End-to-End Aspect-based Sentiment Analysis with Selective Adversarial Learning (EMNLP'19)
Stars: ✭ 62 (-44.64%)
Mutual labels:  absa
Bertviz
Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.); a minimal usage sketch follows this entry.
Stars: ✭ 3,443 (+2974.11%)
Mutual labels:  roberta
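As a rough illustration, here is a sketch of driving Bertviz from a Hugging Face model, assuming bertviz and transformers are installed; the roberta-base checkpoint and the example sentence are arbitrary choices, not specifics of this list:

```python
from transformers import AutoModel, AutoTokenizer
from bertviz import head_view  # assumes bertviz is installed

# Load a RoBERTa model configured to return attention weights.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base", output_attentions=True)

inputs = tokenizer("The battery life is great", return_tensors="pt")
outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
# Renders an interactive per-head attention view (typically in a notebook).
head_view(outputs.attentions, tokens)
```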
Clue
Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus, and leaderboard
Stars: ✭ 2,425 (+2065.18%)
Mutual labels:  roberta
Roberta zh
Chinese pre-trained RoBERTa models: RoBERTa for Chinese
Stars: ✭ 1,953 (+1643.75%)
Mutual labels:  roberta
Chinese Bert Wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series); a loading sketch follows this entry.
Stars: ✭ 6,357 (+5575.89%)
Mutual labels:  roberta
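The Chinese RoBERTa/BERT-wwm checkpoints above are normally loaded through the standard transformers API. A minimal sketch, assuming the commonly published hfl/chinese-roberta-wwm-ext checkpoint; the model name is an assumption, not taken from this page:

```python
from transformers import BertModel, BertTokenizer

# Chinese RoBERTa-wwm weights are distributed in BERT format,
# so they are loaded with the Bert* classes.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")

inputs = tokenizer("今天天气不错", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```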
Albert zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
Stars: ✭ 3,500 (+3025%)
Mutual labels:  roberta
CLUE pytorch
PyTorch baselines for the CLUE benchmark
Stars: ✭ 72 (-35.71%)
Mutual labels:  roberta
erc
Emotion recognition in conversation
Stars: ✭ 34 (-69.64%)
Mutual labels:  roberta
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+275%)
Mutual labels:  roberta
koclip
KoCLIP: Korean port of OpenAI CLIP, in Flax
Stars: ✭ 80 (-28.57%)
Mutual labels:  roberta
Tianchi2020ChineseMedicineQuestionGeneration
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge
Stars: ✭ 20 (-82.14%)
Mutual labels:  roberta
COVID-19-Tweet-Classification-using-Roberta-and-Bert-Simple-Transformers
Rank 1 / 216
Stars: ✭ 24 (-78.57%)
Mutual labels:  roberta
PoLitBert
Polish RoBERTa model trained on Polish literature, Wikipedia, and the OSCAR corpus. The main assumption is that high-quality text yields a good model.
Stars: ✭ 25 (-77.68%)
Mutual labels:  roberta
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (-66.07%)
Mutual labels:  roberta
roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, obtained by distilling RoBERTa-wwm-large into the base architecture; a generic distillation sketch follows this entry.
Stars: ✭ 61 (-45.54%)
Mutual labels:  roberta
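For context, a minimal, generic knowledge-distillation loss in PyTorch; the temperature and the random logits standing in for teacher/student outputs are illustrative assumptions, not this repository's training code:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label KL loss: the student mimics the teacher's output distribution."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_soft_student = F.log_softmax(student_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (t * t)

# Illustrative usage with random logits in place of real model outputs.
student_logits = torch.randn(4, 10)
teacher_logits = torch.randn(4, 10)
print(distillation_loss(student_logits, teacher_logits))
```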
openroberta-lab
The programming environment »Open Roberta Lab« by Fraunhofer IAIS enables children and adolescents to program robots. A variety of programming blocks is provided to program the robot's motors and sensors. Open Roberta Lab uses a graphical programming approach so that beginners can start coding seamlessly. As a cloud-based applica…
Stars: ✭ 98 (-12.5%)
Mutual labels:  roberta
RECCON
This repository contains the dataset and the PyTorch implementations of the models from the paper Recognizing Emotion Cause in Conversations.
Stars: ✭ 126 (+12.5%)
Mutual labels:  roberta
japanese-pretrained-models
Code for producing Japanese pretrained models provided by rinna Co., Ltd.
Stars: ✭ 484 (+332.14%)
Mutual labels:  roberta
Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-75%)
Mutual labels:  roberta
les-military-mrc-rank7
LES Cup: the 2nd National "Military Intelligent Machine Reading" Challenge - rank-7 solution
Stars: ✭ 37 (-66.96%)
Mutual labels:  roberta
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-80.36%)
Mutual labels:  roberta