All Projects → Text Classification Models Pytorch → Similar Projects or Alternatives

1,689 open-source projects that are alternatives to or similar to Text Classification Models Pytorch

Mead Baseline
Deep-Learning Model Exploration and Development for NLP
Stars: ✭ 238 (-37.2%)
NLP-paper
🎨 NLP (natural language processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-93.93%)
Mutual labels:  transformer, seq2seq, fasttext
Rcnn Text Classification
TensorFlow implementation of "Recurrent Convolutional Neural Network for Text Classification" (AAAI 2015)
Stars: ✭ 127 (-66.49%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+801.85%)
Mutual labels:  attention, seq2seq, transformer
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model based on a relational news-extraction method, applied to stock prediction
Stars: ✭ 33 (-91.29%)
Mutual labels:  seq2seq, attention, fasttext
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-72.03%)
Mutual labels:  attention, seq2seq, transformer
Text classification
All kinds of text classification models, and more, built with deep learning
Stars: ✭ 7,179 (+1794.2%)
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (-1.06%)
Embedding
A summary of embedding-model code and study notes
Stars: ✭ 25 (-93.4%)
Mutual labels:  transformer, seq2seq, fasttext
Keras Textclassification
Chinese long-text and short-sentence classification, multi-label classification, and sentence-pair similarity with Keras NLP; base classes for building word/sentence embedding layers and network graphs; models include FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, BERT, XLNet, ALBERT, Attention, DeepMoji, HAN, CapsuleNet, Transformer encoder, Seq2seq, SWEM, LEAM, and TextGCN
Stars: ✭ 914 (+141.16%)
Mutual labels:  fasttext, rcnn, transformer
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+3.96%)
Mutual labels:  attention, seq2seq, transformer
Malware Classification
Towards Building an Intelligent Anti-Malware System: A Deep Learning Approach using Support Vector Machine for Malware Classification
Stars: ✭ 88 (-76.78%)
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-66.75%)
Rcnn Relation Extraction
TensorFlow implementation of Recurrent Convolutional Neural Network for Relation Extraction
Stars: ✭ 64 (-83.11%)
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-92.61%)
Mutual labels:  transformer, seq2seq, attention
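Several of the entries above implement "Attention Is All You Need". As a quick reference for what these repositories build on, here is a minimal, dependency-free sketch of the paper's scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k)V; the function and variable names are illustrative, not taken from any listed project:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors.

    For each query, compute dot products with all keys, scale by
    1/sqrt(d_k), softmax into weights, and return the weighted
    sum of the value vectors.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs
```

With a query that strongly matches one key, the softmax weights concentrate on that key and the output is essentially the corresponding value vector; real implementations batch this as matrix multiplications and add multi-head projections.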
Nlp research
NLP research: a TensorFlow-based NLP deep-learning project supporting four major tasks: text classification, sentence matching, sequence labeling, and text generation
Stars: ✭ 141 (-62.8%)
Mutual labels:  classification, fasttext, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+7.65%)
Mutual labels:  attention, seq2seq, transformer
Nlp Journey
Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, and machine translation. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+240.37%)
Mutual labels:  classification, attention, fasttext
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (-22.16%)
Ner Bert
BERT-NER (nert-bert) with Google BERT https://github.com/google-research.
Stars: ✭ 339 (-10.55%)
Mutual labels:  classification, attention
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-89.18%)
Mutual labels:  transformer, attention
chinese ancient poetry
seq2seq, attention, TensorFlow, TextRank, context
Stars: ✭ 30 (-92.08%)
Mutual labels:  seq2seq, attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-87.07%)
Mutual labels:  transformer, attention
tensorflow-chatbot-chinese
Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-86.81%)
Mutual labels:  seq2seq, attention
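The Bahdanau attention used by the chatbot above is additive rather than dot-product based: each encoder state h_i is scored against the decoder state s as e_i = vᵀ tanh(W_s s + W_h h_i), and the scores are softmaxed into alignment weights. A minimal sketch of the scoring step, with illustrative names and hand-rolled matrix-vector products (not code from the listed repository):

```python
import math

def matvec(M, x):
    """Matrix-vector product for a matrix given as a list of rows."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def bahdanau_scores(decoder_state, encoder_states, W_s, W_h, v):
    """Additive (Bahdanau) attention scores: e_i = v . tanh(W_s s + W_h h_i)."""
    proj_s = matvec(W_s, decoder_state)  # project decoder state once
    scores = []
    for h in encoder_states:
        proj_h = matvec(W_h, h)          # project each encoder state
        hidden = [math.tanh(a + b) for a, b in zip(proj_s, proj_h)]
        scores.append(sum(vi * hi for vi, hi in zip(v, hidden)))
    return scores
```

A softmax over the returned scores yields the attention weights used to form the context vector; in a trained model, W_s, W_h, and v are learned parameters.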
german-sentiment
A dataset and model for German sentiment classification.
Stars: ✭ 37 (-90.24%)
Mutual labels:  transformer, fasttext
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from Logistic Regression to BERT and GPT-3)"
Stars: ✭ 245 (-35.36%)
Mutual labels:  transformer, seq2seq
Transfer Learning Suite
Transfer Learning Suite in Keras. Perform transfer learning using any built-in Keras image classification model easily!
Stars: ✭ 212 (-44.06%)
chatbot
A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (-75.2%)
Mutual labels:  seq2seq, attention
Seq2Seq-chatbot
TensorFlow implementation of a Twitter chatbot
Stars: ✭ 18 (-95.25%)
Mobilenet V2
A Complete and Simple Implementation of MobileNet-V2 in PyTorch
Stars: ✭ 206 (-45.65%)
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+20.32%)
Mutual labels:  transformer, seq2seq
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (-85.49%)
Mutual labels:  transformer, seq2seq
Keras Anomaly Detection
Anomaly detection implemented in Keras
Stars: ✭ 335 (-11.61%)
Cnn 3d Images Tensorflow
3D image classification using CNN (Convolutional Neural Network)
Stars: ✭ 199 (-47.49%)
classifier multi label seq2seq attention
Multi-label text classification with BERT/ALBERT, seq2seq, attention, and beam search
Stars: ✭ 26 (-93.14%)
Mutual labels:  seq2seq, attention
Deep-Learning-Tensorflow
A collection of TensorFlow deep-learning models.
Stars: ✭ 50 (-86.81%)
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-84.96%)
Mutual labels:  transformer, attention
Conformer
Official code for Conformer: Local Features Coupling Global Representations for Visual Recognition
Stars: ✭ 345 (-8.97%)
Mutual labels:  transformer, classification
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-92.35%)
Mutual labels:  transformer, attention
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-94.72%)
Pytorch Randaugment
Unofficial PyTorch Reimplementation of RandAugment.
Stars: ✭ 323 (-14.78%)
Cnn Svm
An Architecture Combining Convolutional Neural Network (CNN) and Linear Support Vector Machine (SVM) for Image Classification
Stars: ✭ 170 (-55.15%)
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-68.07%)
Mutual labels:  transformer, attention
verseagility
Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and bring models into production.
Stars: ✭ 23 (-93.93%)
Mutual labels:  transformer, classification
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-83.38%)
Mutual labels:  transformer, attention
First Steps Towards Deep Learning
An open-source book on deep learning.
Stars: ✭ 376 (-0.79%)
HRFormer
This is an official implementation of our NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (-5.8%)
Mutual labels:  transformer, classification
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-88.65%)
RNNSearch
An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-88.65%)
Mutual labels:  seq2seq, attention
dl-relu
Deep Learning using Rectified Linear Units (ReLU)
Stars: ✭ 20 (-94.72%)
transformer
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-84.17%)
Mutual labels:  transformer, seq2seq
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (-15.83%)
Mutual labels:  attention, transformer
deep-molecular-optimization
Molecular optimization by capturing chemist’s intuition using the Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-84.17%)
Mutual labels:  transformer, seq2seq
well-classified-examples-are-underestimated
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-94.46%)
Mutual labels:  transformer, classification
visualization
A collection of visualization functions
Stars: ✭ 189 (-50.13%)
Mutual labels:  transformer, attention
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-88.39%)
Mutual labels:  transformer, seq2seq
dts
A Keras library for multi-step time-series forecasting.
Stars: ✭ 130 (-65.7%)
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-86.54%)
Mutual labels:  transformer, attention
Encoder decoder
Four styles of encoder-decoder models in Python, Theano, Keras, and Seq2Seq
Stars: ✭ 269 (-29.02%)
Mutual labels:  attention, seq2seq
1-60 of 1689 similar projects