
dongjun-Lee / transfer-learning-text-tf

Licence: other
Tensorflow implementation of Semi-supervised Sequence Learning (https://arxiv.org/abs/1511.01432)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to transfer-learning-text-tf

backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+179.27%)
Mutual labels:  text-classification, transfer-learning
Kashgari
Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification; it includes Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+2625.61%)
Mutual labels:  text-classification, transfer-learning
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-73.17%)
Mutual labels:  text-classification, transfer-learning
Chinese ulmfit
Chinese ULMFiT for sentiment analysis and text classification
Stars: ✭ 208 (+153.66%)
Mutual labels:  text-classification, transfer-learning
Bert language understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (+1037.8%)
Mutual labels:  text-classification, transfer-learning
ulm-basenet
Implementation of ULMFit algorithm for text classification via transfer learning
Stars: ✭ 94 (+14.63%)
Mutual labels:  text-classification, transfer-learning
Text-Classification-PyTorch
Implementation of papers for text classification task on SST-1/SST-2
Stars: ✭ 57 (-30.49%)
Mutual labels:  text-classification
ERNIE-text-classification-pytorch
This repo contains a PyTorch implementation of a pretrained ERNIE model for text classification.
Stars: ✭ 49 (-40.24%)
Mutual labels:  text-classification
nih-chest-xrays
A collection of projects which explore image classification on chest x-ray images (via the NIH dataset)
Stars: ✭ 32 (-60.98%)
Mutual labels:  transfer-learning
bns-short-text-similarity
📖 Uses Bi-Normal Separation to build document vectors, which are used to compute similarity for short sentences.
Stars: ✭ 24 (-70.73%)
Mutual labels:  text-classification
Skin Lesions Classification DCNNs
Transfer Learning with DCNNs (DenseNet, Inception V3, Inception-ResNet V2, VGG16) for skin lesions classification
Stars: ✭ 47 (-42.68%)
Mutual labels:  transfer-learning
few shot dialogue generation
Dialogue Knowledge Transfer Networks (DiKTNet)
Stars: ✭ 24 (-70.73%)
Mutual labels:  transfer-learning
fiap-ml-visao-computacional
Repository of the examples and challenges used in the Computer Vision course of FIAP's Machine Learning MBA program
Stars: ✭ 33 (-59.76%)
Mutual labels:  transfer-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-1.22%)
Mutual labels:  transfer-learning
text gcn tutorial
A tutorial & minimal example (8min on CPU) for Graph Convolutional Networks for Text Classification. AAAI 2019
Stars: ✭ 23 (-71.95%)
Mutual labels:  text-classification
cozmo-tensorflow
🤖 Cozmo the Robot recognizes objects with TensorFlow
Stars: ✭ 61 (-25.61%)
Mutual labels:  transfer-learning
tensorflow object detection helper tool
TensorFlow Object Detection API helper tool (custom object detection)
Stars: ✭ 30 (-63.41%)
Mutual labels:  transfer-learning
mrnet
Building an ACL tear detector to spot knee injuries from MRIs with PyTorch (MRNet)
Stars: ✭ 98 (+19.51%)
Mutual labels:  transfer-learning
NeuralNetworks
Implementation of a Neural Network that can detect whether a video is in-game or not
Stars: ✭ 64 (-21.95%)
Mutual labels:  transfer-learning
TextClassification
Text classification of Sina News based on scikit-learn; the dataset contains 1,000,000 documents across 10 classes, with a 1:1 train/test split. SVM and Bayes classifiers are used, with Bayes as the baseline.
Stars: ✭ 86 (+4.88%)
Mutual labels:  text-classification

Transfer Learning for Text Classification with Tensorflow

Tensorflow implementation of Semi-supervised Sequence Learning (https://arxiv.org/abs/1511.01432).

An auto-encoder or a language model is used as a pre-trained model to initialize the LSTM text classification model (a minimal sketch of this idea follows the list below).

  • SA-LSTM: Uses an auto-encoder as the pre-trained model.
  • LM-LSTM: Uses a language model as the pre-trained model.
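
As a rough illustration of the LM-LSTM case (not the repository's actual code), the sketch below pre-trains a next-token LSTM language model with tf.keras and then copies its embedding and LSTM weights into a classifier before supervised training; the layer sizes, variable names, and the choice of the Keras API are assumptions made for this example.

import tensorflow as tf

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 50000, 256, 512   # arbitrary example sizes
NUM_CLASSES = 14                                      # DBpedia ontology classes

# 1) Language model: predict the next token at every position.
embedding = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)
lm_lstm = tf.keras.layers.LSTM(HIDDEN_DIM, return_sequences=True)

lm_in = tf.keras.Input(shape=(None,), dtype=tf.int32)
lm_logits = tf.keras.layers.Dense(VOCAB_SIZE)(lm_lstm(embedding(lm_in)))
language_model = tf.keras.Model(lm_in, lm_logits)
language_model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
# language_model.fit(unlabeled_dataset, ...)   # pre-training step

# 2) Classifier: the same embedding, plus an LSTM initialized from the
#    pre-trained language-model LSTM.
clf_lstm = tf.keras.layers.LSTM(HIDDEN_DIM)    # returns the final hidden state
clf_in = tf.keras.Input(shape=(None,), dtype=tf.int32)
clf_logits = tf.keras.layers.Dense(NUM_CLASSES)(clf_lstm(embedding(clf_in)))
classifier = tf.keras.Model(clf_in, clf_logits)

# Transfer the pre-trained recurrent weights before supervised training.
clf_lstm.set_weights(lm_lstm.get_weights())

classifier.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
# classifier.fit(labeled_dbpedia_dataset, ...)  # supervised training step

In the SA-LSTM variant, the pre-training objective is reconstructing the input sequence with a sequence auto-encoder rather than predicting the next token, but the transferred weights initialize the classifier in the same way.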

Requirements

  • Python 3
  • Tensorflow
  • pip install -r requirements.txt

Usage

The DBpedia dataset is used for both pre-training and training.
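
For reference, a hypothetical loader for the commonly distributed DBpedia ontology classification CSVs (14 classes; columns of class index, title, and content) might look as follows; the file names and format are assumptions, not taken from this repository.

import csv

def load_dbpedia_csv(path):
    """Yield (label, text) pairs from a DBpedia-style CSV file."""
    with open(path, newline="", encoding="utf-8") as f:
        for class_idx, title, content in csv.reader(f):
            yield int(class_idx) - 1, f"{title} {content}"   # classes are 1-indexed

# Illustrative paths:
# train = list(load_dbpedia_csv("dbpedia_csv/train.csv"))
# test = list(load_dbpedia_csv("dbpedia_csv/test.csv"))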

Pre-train an auto-encoder or a language model

$ python pre_train.py --model="<MODEL>"

(<MODEL>: auto_encoder | language_model)

Train the LSTM text classification model

$ python train.py --pre_trained="<MODEL>"

(<MODEL>: none | auto_encoder | language_model)
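
For example, the full LM-LSTM workflow chains the two steps:

$ python pre_train.py --model="language_model"
$ python train.py --pre_trained="language_model"

Using --pre_trained="none" trains the plain LSTM baseline without any pre-trained initialization.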

Experimental Results

  • Orange lines: LSTM
  • Blue lines: SA-LSTM
  • Red lines: LM-LSTM

Loss

Accuracy

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].