h3lio5 / linguistic-style-transfer-pytorch

Licence: other
Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer" (ACL 2019) in PyTorch


Linguistic Style Transfer

Implementation of the paper Disentangled Representation Learning for Non-Parallel Text Style Transfer (link) in PyTorch.

Abstract

This paper tackles the problem of disentangling the latent representations of style and content in language models. We propose a simple yet effective approach, which incorporates auxiliary multi-task and adversarial objectives, for style prediction and bag-of-words prediction, respectively. We show, both qualitatively and quantitatively, that the style and content are indeed disentangled in the latent space. This disentangled latent representation learning can be applied to style transfer on non-parallel corpora. We achieve high performance in terms of transfer accuracy, content preservation, and language fluency, in comparison to various previous approaches.
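The two auxiliary objectives described above can be sketched in PyTorch roughly as follows. This is a minimal illustration under assumed names, dimensions, and loss weights, not this repository's actual code: the style space gets a multi-task classifier loss (style should be predictable from it), while the content space gets an adversarial loss (the encoder is pushed to maximize the entropy of a style classifier on it, so that no style signal survives there).

```python
import torch
import torch.nn.functional as F

# Hypothetical latent codes produced by the encoder for a batch of sentences.
batch, style_dim, content_dim, num_styles = 4, 8, 32, 2
style_z = torch.randn(batch, style_dim)      # style latent space
content_z = torch.randn(batch, content_dim)  # content latent space
style_labels = torch.randint(0, num_styles, (batch,))

# Multi-task objective: a classifier on the style space should predict style.
style_clf = torch.nn.Linear(style_dim, num_styles)
multitask_loss = F.cross_entropy(style_clf(style_z), style_labels)

# Adversarial objective: a classifier on the content space is trained to
# predict style, while the encoder is trained against it by maximizing the
# classifier's entropy (i.e. minimizing the negative entropy below).
adv_clf = torch.nn.Linear(content_dim, num_styles)
adv_logits = adv_clf(content_z.detach())               # trains the adversary only
disc_loss = F.cross_entropy(adv_logits, style_labels)
probs = F.softmax(adv_clf(content_z), dim=-1)          # trains the encoder
entropy_loss = (probs * torch.log(probs + 1e-8)).sum(dim=-1).mean()

# The weighting here is a placeholder; the paper tunes such weights per corpus.
encoder_loss = multitask_loss + 1.0 * entropy_loss
```

In a full training loop these terms are added to the usual VAE reconstruction and KL losses; the sketch only shows the disentanglement terms.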

To get a basic overview of the paper, read the summary.

1. Setup Instructions and Dependencies

You may set up the repository on your local machine by either downloading it or running the following command in a terminal.

git clone https://github.com/h3lio5/linguistic-style-transfer-pytorch.git

All dependencies required by this repo can be installed by creating a virtual environment with Python 3.7 and running

python3 -m venv .env
source .env/bin/activate
pip install -r requirements.txt
pip install -e .

Note: Run all the commands from the root directory.

2. Training the Model from Scratch

To train your own model from scratch, run

python train.py 
  • The parameters for your experiment are all set by default, but you are free to change them by editing the config.py file.
  • The training script will create a checkpoints folder at the path specified in your config.py file.
  • This folder will contain all model parameters, saved after each epoch.
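The exact fields live in the repository's config.py; as a purely hypothetical sketch, the kind of settings you would edit looks something like this (all names and values here are illustrative assumptions, not the actual file):

```python
# Illustrative only -- consult the repository's config.py for the real fields.
class Config:
    epochs = 20                      # number of training epochs
    batch_size = 32
    embedding_size = 300             # word2vec embedding dimension
    style_hidden_dim = 8             # size of the style latent space
    content_hidden_dim = 128         # size of the content latent space
    checkpoint_dir = "checkpoints/"  # where per-epoch weights are saved
```

Editing such a file before running train.py is how the defaults mentioned above would be overridden.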

3. Transferring Text Style from Trained Models

To transfer the text style of a sentence using a trained model, run

python generate.py 

On running the above command, you will be prompted to enter the source sentence and the target style.
Example:

Please enter the source sentence: the book is good
Please enter the target style: pos or neg: neg
Your style transferred sentence is: the book is boring
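Under the hood, style transfer of this kind typically encodes the sentence, keeps its content latent, replaces the inferred style latent with an embedding of the target style, and decodes. A minimal sketch of that swap (function, module names, and shapes are assumptions, not this repository's API):

```python
import torch

def transfer_style(encoder, decoder, target_style_emb, sentence_ids):
    """Swap the inferred style latent for a target-style embedding."""
    # Hypothetical encoder returning separate style and content codes.
    style_z, content_z = encoder(sentence_ids)
    # Discard the source style; concatenate content with the target style.
    latent = torch.cat([target_style_emb, content_z], dim=-1)
    return decoder(latent)  # decoded (transferred) sentence

# Toy shapes only, to show the wiring:
toy_enc = lambda ids: (torch.zeros(1, 8), torch.ones(1, 32))
toy_dec = lambda z: z
out = transfer_style(toy_enc, toy_dec, torch.zeros(1, 8), None)
# out carries the 8-dim target style plus the 32-dim content code
```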

4. Repository Overview

This repository contains the following files and folders:

  1. images: Contains media for readme.md.

  2. linguistic-style-transfer-pytorch/data_loader.py: Contains helper functions that load data.

  3. linguistic-style-transfer-pytorch/model.py: Contains code to build the model.

  4. linguistic-style-transfer-pytorch/config.py: Contains information about various file paths and model configuration.

  5. linguistic-style-transfer-pytorch/utils/vocab.py: Contains code to build the vocabulary and word embeddings.

  6. linguistic-style-transfer-pytorch/utils/preprocess.py: Contains code to preprocess the data.

  7. linguistic-style-transfer-pytorch/utils/train_w2v.py: Contains code to train word2vec embeddings from scratch on the downloaded data.

  8. generate.py: Used to perform style transfer on user-provided sentences using trained models.

  9. train.py: Contains code to train models from scratch.

  10. requirements.txt: Lists dependencies for easy setup in virtual environments.

5. Training and Inference

Illustration of training and inference (image: training_and_inference).

Resources

  • Original paper: Disentangled Representation Learning for Non-Parallel Text Style Transfer (link)
  • TensorFlow implementation by the author (link)