Text-Classification-LSTMs-PyTorch: A baseline model for text classification, implemented as an LSTM-based model in PyTorch. To aid understanding of the model, it is demonstrated on a Tweets dataset from Kaggle.
Stars: ✭ 45 (-75.54%)
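To make the kind of baseline the entry above describes more concrete, here is a minimal sketch of the standard LSTM gate equations in plain Python. It uses scalar inputs and states purely for illustration; the function and weight names are mine, not code from the repository.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM time step for scalar input and state (illustration only).

    W maps each gate name in {"i", "f", "o", "g"} to a tuple
    (w_x, w_h, b), applied as w_x*x + w_h*h_prev + b.
    """
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])    # input gate
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])    # forget gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])    # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate state
    c = f * c_prev + i * g        # new cell state: keep some old, add some new
    h = o * math.tanh(c)          # new hidden state, gated by the output gate
    return h, c
```

In a real classifier the final hidden state (a vector, not a scalar) would be fed into a linear layer plus softmax over the class labels.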
Text classification: All kinds of text classification models, and more, with deep learning.
Stars: ✭ 7,179 (+3801.63%)
Chatbot cn: A chatbot for the finance and legal domains (with casual-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front end is integrated via Django, and RESTful interfaces for the NLP and KG components are already packaged.
Stars: ✭ 791 (+329.89%)
Document Classifier Lstm: A bidirectional LSTM with attention for multiclass/multilabel text classification.
Stars: ✭ 136 (-26.09%)
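Several entries in this list pair a (bi)LSTM with attention. What attention adds is a pooling step: instead of keeping only the last hidden state, the per-token hidden states are combined with softmax weights derived from relevance scores. A hedged sketch in plain Python (names are illustrative, not taken from any of these repositories):

```python
import math

def attention_pool(hidden_states, scores):
    """Collapse per-token hidden states into a single document vector.

    hidden_states: list of equal-length vectors, one per token
    scores: one unnormalized relevance score per token
    """
    # Softmax over the token scores, stabilized by subtracting the max.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum of the hidden states, dimension by dimension.
    dim = len(hidden_states[0])
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]
```

In the real models the scores themselves are learned (e.g. a small feed-forward layer applied to each hidden state); here they are simply passed in.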
rnn-text-classification-tf: TensorFlow implementation of attention-based bidirectional RNN text classification.
Stars: ✭ 26 (-85.87%)
3HAN: An original implementation of "3HAN: A Deep Neural Network for Fake News Detection" (ICONIP 2017).
Stars: ✭ 29 (-84.24%)
Textclassifier: A text classifier implementing Hierarchical Attention Networks for document classification.
Stars: ✭ 985 (+435.33%)
Sarcasm Detection: Detecting sarcasm on Twitter using both traditional machine learning and deep learning techniques.
Stars: ✭ 73 (-60.33%)
Multi Label classification: Transforms multi-label classification into a sentence-pair task, gaining more training data and information.
Stars: ✭ 151 (-17.93%)
Classify Text: "20 Newsgroups" text classification with Python.
Stars: ✭ 149 (-19.02%)
Lotclass: [EMNLP 2020] Text Classification Using Label Names Only: A Language Model Self-Training Approach.
Stars: ✭ 160 (-13.04%)
Applied Dl 2018: Tel-Aviv Deep Learning Boot-camp: 12 applied deep learning labs.
Stars: ✭ 146 (-20.65%)
Browsecloud: A web app to create and browse text visualizations for automated customer listening.
Stars: ✭ 143 (-22.28%)
Frontalization: PyTorch deep learning face frontalization model.
Stars: ✭ 160 (-13.04%)
Monkeylearn Python: Official Python client for the MonkeyLearn API. Build and consume machine learning models for language processing from your Python apps.
Stars: ✭ 143 (-22.28%)
Attribute Aware Attention: [ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning.
Stars: ✭ 143 (-22.28%)
Fastnlp: fastNLP, a modularized and extensible NLP framework. Currently still in incubation.
Stars: ✭ 2,441 (+1226.63%)
Dive Into Dl Pytorch: This project converts the original MXNet implementations in the book Dive into Deep Learning into PyTorch implementations.
Stars: ✭ 14,234 (+7635.87%)
Parselawdocuments: A series of analyses over collected legal documents, including rule-based automatic segmentation, case similarity computation, case clustering, and statute recommendation (experiments currently cover marriage-related cases; extensible to other domains).
Stars: ✭ 138 (-25%)
Pan: [Only 272K parameters!] Efficient Image Super-Resolution Using Pixel Attention, ECCV Workshop, 2020.
Stars: ✭ 151 (-17.93%)
Macadam: A natural language processing toolkit built on TensorFlow (Keras) and bert4keras, focused on text classification, sequence labeling, and relation extraction. Supports embeddings including RANDOM, WORD2VEC, FASTTEXT, BERT, ALBERT, ROBERTA, NEZHA, XLNET, ELECTRA, and GPT-2; text classification algorithms including FineTune, FastText, TextCNN, CharCNN, BiRNN, RCNN, DCNN, CRNN, DeepMoji, SelfAttention, HAN, and Capsule; and sequence labeling algorithms including CRF, Bi-LSTM-CRF, CNN-LSTM, DGCNN, Bi-LSTM-LAN, Lattice-LSTM-Batch, and MRC.
Stars: ✭ 149 (-19.02%)
Hart: Hierarchical Attentive Recurrent Tracking.
Stars: ✭ 149 (-19.02%)
Textanalyzer: A text analyzer based on machine learning, statistics, and dictionaries. So far it supports hot-word extraction, text classification, part-of-speech tagging, named entity recognition, Chinese word segmentation, address extraction, synonyms, text clustering, word2vec models, edit distance, sentence similarity, word sentiment tendency, name recognition, idiom recognition, place-name recognition, organization recognition, traditional Chinese recognition, and pinyin transformation.
Stars: ✭ 162 (-11.96%)
Kashgari: A production-level NLP transfer-learning framework built on top of tf.keras for text labeling and text classification; includes Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+1114.67%)
Text Classification Demos: Neural models for text classification in TensorFlow, such as CNN, DPCNN, FastText, BERT, and others.
Stars: ✭ 144 (-21.74%)
Gat: Graph Attention Networks (https://arxiv.org/abs/1710.10903).
Stars: ✭ 2,229 (+1111.41%)
Uda pytorch: UDA (Unsupervised Data Augmentation) implemented in PyTorch.
Stars: ✭ 143 (-22.28%)
Spark Nlp: State-of-the-art natural language processing.
Stars: ✭ 2,518 (+1268.48%)
Seq2seq chatbot new: A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (-21.74%)
Onnxt5: Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-22.28%)
Bert serving: Exports a BERT model for serving.
Stars: ✭ 138 (-25%)
Prediction Flow: Deep-learning-based CTR models implemented in PyTorch.
Stars: ✭ 138 (-25%)
Lstm attention: Attention-based LSTM/Dense layers implemented in Keras.
Stars: ✭ 168 (-8.7%)
Vdcnn: Implementation of Very Deep Convolutional Neural Networks for Text Classification.
Stars: ✭ 158 (-14.13%)
Pytorch 101 Tutorial Series: A PyTorch 101 series covering everything from the basic building blocks all the way to building custom architectures.
Stars: ✭ 136 (-26.09%)
Adnet: Attention-guided CNN for image denoising (Neural Networks, 2020).
Stars: ✭ 135 (-26.63%)
Nlp estimator tutorial: Educational material on using the TensorFlow Estimator framework for text classification.
Stars: ✭ 131 (-28.8%)
Pytorch Playground: Base pretrained models and datasets in PyTorch (MNIST, SVHN, CIFAR10, CIFAR100, STL10, AlexNet, VGG16, VGG19, ResNet, Inception, SqueezeNet).
Stars: ✭ 2,201 (+1096.2%)
Eeg Dl: A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (-10.33%)
Picanet Implementation: PyTorch implementation of PiCANet: Learning Pixel-wise Contextual Attention for Saliency Detection.
Stars: ✭ 157 (-14.67%)
Perceiver Pytorch: Implementation of Perceiver (General Perception with Iterative Attention) in PyTorch.
Stars: ✭ 130 (-29.35%)
Pytorch Spectral Clustering: [Under development] Implementation of various methods for dimensionality reduction and spectral clustering in PyTorch.
Stars: ✭ 128 (-30.43%)
Ensemble Pytorch: A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning models.
Stars: ✭ 153 (-16.85%)
Abstractive Summarization: Implementation of abstractive summarization using an LSTM encoder-decoder architecture with local attention.
Stars: ✭ 128 (-30.43%)