
jensjepsen / imdb-transformer

Licence: other
A simple Neural Network for sentiment analysis, embedding sentences using a Transformer network.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to imdb-transformer

libai
LiBai(李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+992.31%)
Mutual labels:  transformer
Variational-Transformer
Variational Transformers for Diverse Response Generation
Stars: ✭ 79 (+203.85%)
Mutual labels:  transformer
dingo-serializer-switch
A middleware to switch fractal serializers in dingo
Stars: ✭ 49 (+88.46%)
Mutual labels:  transformer
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (+88.46%)
Mutual labels:  transformer
TokenLabeling
Pytorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Stars: ✭ 385 (+1380.77%)
Mutual labels:  transformer
transformer-ls
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+673.08%)
Mutual labels:  transformer
project-code-py
Leetcode using AI
Stars: ✭ 100 (+284.62%)
Mutual labels:  transformer
catr
Image Captioning Using Transformer
Stars: ✭ 206 (+692.31%)
Mutual labels:  transformer
les-military-mrc-rank7
Les Cup (莱斯杯): Rank 7 solution for the Second National "Military Intelligent Machine Reading" Challenge
Stars: ✭ 37 (+42.31%)
Mutual labels:  transformer
IMDb-Scout-Mod
Auto search for movie/series on torrent, usenet, ddl, subtitles, streaming, predb and other sites. Adds links to IMDb pages from hundreds various sites. Adds movies/series to Radarr/Sonarr. Adds external ratings from Metacritic, Rotten Tomatoes, Letterboxd, Douban, Allocine. Media Server indicators for Plex, Jellyfin, Emby. Dark theme/style for …
Stars: ✭ 177 (+580.77%)
Mutual labels:  imdb
TransBTS
This repo provides the official code for : 1) TransBTS: Multimodal Brain Tumor Segmentation Using Transformer (https://arxiv.org/abs/2103.04430) , accepted by MICCAI2021. 2) TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images(https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (+876.92%)
Mutual labels:  transformer
sparql-transformer
A more handy way to use SPARQL data in your web app
Stars: ✭ 38 (+46.15%)
Mutual labels:  transformer
VideoTransformer-pytorch
PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (+511.54%)
Mutual labels:  transformer
cape
Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (+11.54%)
Mutual labels:  transformer
text simplification
Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq variants).
Stars: ✭ 66 (+153.85%)
Mutual labels:  transformer
Cross-lingual-Summarization
Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention
Stars: ✭ 28 (+7.69%)
Mutual labels:  transformer
Transformer Temporal Tagger
Code and data from the paper BERT Got a Date: Introducing Transformers to Temporal Tagging
Stars: ✭ 55 (+111.54%)
Mutual labels:  transformer
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+703.85%)
Mutual labels:  transformer
Video-Action-Transformer-Network-Pytorch-
Implementation of the paper Video Action Transformer Network
Stars: ✭ 126 (+384.62%)
Mutual labels:  transformer
Neural-Scam-Artist
Web Scraping, Document Deduplication & GPT-2 Fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-30.77%)
Mutual labels:  transformer

Transformer Networks for Sentiment Analysis

Implements a simple binary classifier for sentiment analysis, embedding sentences using a Transformer network. Transformer networks were introduced in the paper Attention Is All You Need, where the authors achieved state-of-the-art performance on several NLP tasks.
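
For orientation, here is a minimal PyTorch sketch of such a classifier; the class name, layer choices, and defaults are illustrative assumptions, not the repository's actual code (positional information is deliberately omitted here and discussed under Usage):

import torch.nn as nn

class SentimentClassifier(nn.Module):
    """Illustrative sketch: embed tokens, encode with a Transformer, pool, classify."""
    def __init__(self, vocab_size, model_size=128, num_heads=4, num_blocks=2, dropout=0.1):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, model_size)
        layer = nn.TransformerEncoderLayer(d_model=model_size, nhead=num_heads,
                                           dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_blocks)
        self.out = nn.Linear(model_size, 2)   # two classes: negative / positive

    def forward(self, tokens):                # tokens: (batch, seq) of token ids
        x = self.embed(tokens)                # (batch, seq, model_size)
        x = self.encoder(x)                   # contextualized token representations
        return self.out(x.mean(dim=1))        # mean-pool over the sequence, then classify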

Usage

Run python train.py to train a model on the IMDB reviews dataset (it is downloaded automatically through torchtext if it is not already present). Training uses learned positional embeddings for the Transformer network, as opposed to the sinusoidal positional encodings introduced in the paper, as sketched below.
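
Learned positional embeddings amount to a trainable table indexed by position; a minimal sketch (the names and shapes here are assumptions, not the repository's code):

import torch
import torch.nn as nn

class LearnedPositions(nn.Module):
    """Sketch: one trainable vector per position, added to the token embeddings."""
    def __init__(self, max_length, model_size):
        super().__init__()
        self.table = nn.Embedding(max_length, model_size)

    def forward(self, x):                     # x: (batch, seq, model_size)
        positions = torch.arange(x.size(1), device=x.device)
        return x + self.table(positions)      # broadcasts over the batch dimension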

To use the Transformer module in another project, be sure to add some form of positional encoding to the input before passing it to the module, as the module does not add one itself.
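
For example, you could add the paper's sinusoidal encoding yourself before calling the module; a sketch assuming an even model size and a hypothetical module instance named transformer:

import math
import torch

def sinusoidal_encoding(seq_len, model_size):
    # Standard sin/cos positional encoding from "Attention Is All You Need".
    pos = torch.arange(seq_len, dtype=torch.float).unsqueeze(1)
    div = torch.exp(torch.arange(0, model_size, 2, dtype=torch.float)
                    * (-math.log(10000.0) / model_size))
    enc = torch.zeros(seq_len, model_size)
    enc[:, 0::2] = torch.sin(pos * div)
    enc[:, 1::2] = torch.cos(pos * div)
    return enc

# x: (batch, seq, model_size) token embeddings
# x = x + sinusoidal_encoding(x.size(1), x.size(2)).to(x.device)
# output = transformer(x)  # hypothetical instance of the Transformer module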

Options

python train.py --help
usage: train.py [-h] [--max_length MAX_LENGTH] [--model_size MODEL_SIZE]
                [--epochs EPOCHS] [--learning_rate LEARNING_RATE]
                [--device DEVICE] [--num_heads NUM_HEADS]
                [--num_blocks NUM_BLOCKS] [--dropout DROPOUT]
                [--train_word_embeddings TRAIN_WORD_EMBEDDINGS]
                [--batch_size BATCH_SIZE]

Train a Transformer network for sentiment analysis

optional arguments:
  -h, --help            show this help message and exit
  --max_length MAX_LENGTH
                        Maximum sequence length, sequences longer than this
                        are truncated
  --model_size MODEL_SIZE
                        Hidden size for all hidden layers of the model
  --epochs EPOCHS       Number of epochs to train for
  --learning_rate LEARNING_RATE
                        Learning rate for optimizer
  --device DEVICE       Device to use for training and evaluation e.g. (cpu,
                        cuda:0)
  --num_heads NUM_HEADS
                        Number of attention heads in the Transformer network
  --num_blocks NUM_BLOCKS
                        Number of blocks in the Transformer network
  --dropout DROPOUT     Dropout (not keep_prob, but probability of ZEROING
                        during training, i.e. keep_prob = 1 - dropout)
  --train_word_embeddings TRAIN_WORD_EMBEDDINGS
                        Train GloVe word embeddings
  --batch_size BATCH_SIZE
                        Batch size
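
For example, to train on a GPU with non-default settings (the flag values are illustrative; the flags themselves come from the help text above):

python train.py --device cuda:0 --epochs 5 --batch_size 32 --num_heads 4 --num_blocks 2 --dropout 0.1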

Requirements

  • Python 2.7
  • PyTorch 0.4.1
  • TorchText
  • NumPy
  • tqdm (optional)

Acknowledgements

The Transformer architecture was introduced by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser and Illia Polosukhin in Attention Is All You Need.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].