MSAF: Official implementation of the paper "MSAF: Multimodal Split Attention Fusion"
Stars: ✭ 47 (-17.54%)
SIGIR2021 Conure: One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-59.65%)
erc: Emotion recognition in conversation
Stars: ✭ 34 (-40.35%)
hfusion: Multimodal sentiment analysis using hierarchical fusion with context modeling
Stars: ✭ 42 (-26.32%)
BBFN: Implementation of the paper "Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis"
Stars: ✭ 42 (-26.32%)
task-transferability: Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-38.6%)
ai web RISKOUT BTS: Defense risk management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-68.42%)
bert-tensorflow-pytorch-spacy-conversion: Instructions for converting a BERT TensorFlow model to work with HuggingFace's pytorch-transformers and spaCy. This walk-through uses DeepPavlov's RuBERT as an example.
Stars: ✭ 26 (-54.39%)
AGHMN: Implementation of the AAAI 2020 paper "Real-Time Emotion Recognition via Attention Gated Hierarchical Memory Network"
Stars: ✭ 25 (-56.14%)
tfbert: Pretrained-model toolkit based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
Stars: ✭ 54 (-5.26%)
classifier multi label: Multi-label text classification with BERT and ALBERT
Stars: ✭ 127 (+122.81%)
BERT-embedding: A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
Stars: ✭ 24 (-57.89%)
parsbert-ner: 🤗 ParsBERT Persian NER Tasks
Stars: ✭ 15 (-73.68%)
berserker: Berserker - BERt chineSE woRd toKenizER
Stars: ✭ 17 (-70.18%)
GCL: List of Publications in Graph Contrastive Learning
Stars: ✭ 25 (-56.14%)
muse-as-service: REST API for sentence tokenization and embedding using the Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (-21.05%)
Xpersona: XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-5.26%)
BossNAS: (ICCV 2021) Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (+119.3%)
mirror-bert: [EMNLP 2021] Mirror-BERT: Converting Pretrained Language Models to universal text encoders without labels.
Stars: ✭ 56 (-1.75%)
STEP: Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Stars: ✭ 39 (-31.58%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-1.75%)
rasa milktea chatbot: Chinese chatbot built on the Rasa framework, using a Chinese BERT model for intent analysis
Stars: ✭ 97 (+70.18%)
CVPR21 PASS: PyTorch implementation of our CVPR 2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (-3.51%)
SoCo: [NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning
Stars: ✭ 125 (+119.3%)
DeepNER: An easy-to-use, modular, and extensible package of deep-learning-based named entity recognition models.
Stars: ✭ 9 (-84.21%)
trove: Weakly supervised medical named entity classification
Stars: ✭ 55 (-3.51%)
MSF: Official code for "Mean Shift for Self-Supervised Learning"
Stars: ✭ 42 (-26.32%)
esvit: EsViT: Efficient self-supervised Vision Transformers
Stars: ✭ 323 (+466.67%)
textwiser: [AAAI 2021] TextWiser: Text Featurization Library
Stars: ✭ 26 (-54.39%)
Emotion and Polarity SO: An emotion classifier for text containing technical content from the SE domain
Stars: ✭ 74 (+29.82%)
BYOL: Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 102 (+78.95%)
pillar-motion: Self-Supervised Pillar Motion Learning for Autonomous Driving (CVPR 2021)
Stars: ✭ 98 (+71.93%)
JD2Skills-BERT-XMLC: Code and dataset for Bhola et al. (2020), "Retrieving Skills from Job Descriptions: A Language Model Based Extreme Multi-label Classification Framework"
Stars: ✭ 33 (-42.11%)
iMIX: A framework for multimodal intelligence research from Inspur HSSLAB.
Stars: ✭ 21 (-63.16%)
label-studio-transformers: Label data using HuggingFace's transformers and automatically get a prediction service
Stars: ✭ 117 (+105.26%)
TensorFusionNetwork: Code for the EMNLP 2017 (oral) paper "Tensor Fusion Network for Multimodal Sentiment Analysis"
Stars: ✭ 55 (-3.51%)
bert nli: A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT)
Stars: ✭ 97 (+70.18%)
SentimentAnalysis: Word embeddings (BOW, TF-IDF, Word2Vec, BERT) with base classifiers (SVM, Naive Bayes, decision tree, random forest), plus pre-trained BERT from TensorFlow Hub and a 1-D CNN and bidirectional LSTM, on the IMDB movie reviews dataset
Stars: ✭ 40 (-29.82%)
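As a rough illustration of the TF-IDF featurization the SentimentAnalysis entry above compares against: a minimal plain-Python sketch (not the repository's code; the function and variable names here are our own):

```python
import math
from collections import Counter

def tf_idf(docs):
    """Toy bag-of-words TF-IDF: returns one {term: weight} dict per document.

    Weight = term frequency within the document * log(N / document frequency),
    so terms appearing in every document (like stop words) get weight 0.
    """
    n = len(docs)
    # Document frequency: in how many documents each term occurs
    df = Counter(t for doc in docs for t in set(doc.split()))
    weights = []
    for doc in docs:
        tf = Counter(doc.split())
        total = sum(tf.values())
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights
```

With two tiny documents, a term shared by both (e.g. "movie" in "good movie" and "bad movie") gets weight 0, while the distinguishing terms get positive weight.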
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-33.33%)
consistency: Implementation of models from our EMNLP 2019 paper "A Logic-Driven Framework for Consistency of Neural Models"
Stars: ✭ 26 (-54.39%)
anonymisation: Anonymization of French legal cases based on Flair embeddings
Stars: ✭ 85 (+49.12%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-22.81%)
simsiam-cifar10: Code to train the SimSiam model on CIFAR-10 using PyTorch
Stars: ✭ 33 (-42.11%)
Transformers-Tutorials: Demos built with the HuggingFace Transformers library.
Stars: ✭ 2,828 (+4861.4%)
info-nce-pytorch: PyTorch implementation of the InfoNCE loss for self-supervised learning.
Stars: ✭ 160 (+180.7%)
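For orientation, the InfoNCE loss mentioned above scores a query embedding against one positive and several negative keys via a temperature-scaled softmax. A minimal plain-Python sketch (not the repository's implementation; names are our own):

```python
import math

def info_nce(query, positive, negatives, temperature=0.1):
    """InfoNCE loss for a single query: cross-entropy that places the
    positive key at index 0 among the candidate keys."""
    def cos(a, b):
        # Cosine similarity between two vectors
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    # Similarity logits: positive first, then all negatives
    logits = [cos(query, positive) / temperature]
    logits += [cos(query, n) / temperature for n in negatives]
    # Numerically stable softmax cross-entropy against index 0
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    return -math.log(exps[0] / sum(exps))
```

When the query matches the positive and is dissimilar to the negatives the loss is near zero; when it matches a negative instead, the loss is large.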
referit3d: Code accompanying our ECCV 2020 paper on 3D Neural Listeners.
Stars: ✭ 59 (+3.51%)
soxan: Wav2Vec for speech recognition and audio classification
Stars: ✭ 113 (+98.25%)