
jaywonchung / BERT4Rec-VAE-Pytorch

License: GPL-3.0
PyTorch implementation of BERT4Rec and Netflix VAE.

Programming Languages

Python
139,335 projects; #7 most used programming language

Projects that are alternatives of or similar to BERT4Rec-VAE-Pytorch

datatheque.com
a data science blog
Stars: ✭ 12 (-94.34%)
Mutual labels:  recommendation-system
Advanced Models
Provides several well-known neural network models (DCGAN, VAE, ResNet, etc.)
Stars: ✭ 48 (-77.36%)
Mutual labels:  vae
seminar
ECNU ICA seminar materials
Stars: ✭ 14 (-93.4%)
Mutual labels:  recommendation-system
Generative-Model
Repository for implementation of generative models with Tensorflow 1.x
Stars: ✭ 66 (-68.87%)
Mutual labels:  vae
data-science-popular-algorithms
Data Science algorithms and topics that you must know. (Newly Designed) Recommender Systems, Decision Trees, K-Means, LDA, RFM-Segmentation, XGBoost in Python, R, and Scala.
Stars: ✭ 65 (-69.34%)
Mutual labels:  recommendation-system
VAENAR-TTS
PyTorch Implementation of VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis.
Stars: ✭ 66 (-68.87%)
Mutual labels:  vae
AE-Slicer
Save PNG slices in AE. Support multiple slice zones in one comp.
Stars: ✭ 21 (-90.09%)
Mutual labels:  ae
Carla-ppo
This repository hosts a customized PPO based agent for Carla. The goal of this project is to make it easier to interact with and experiment in Carla with reinforcement learning based agents -- this, by wrapping Carla in a gym like environment that can handle custom reward functions, custom debug output, etc.
Stars: ✭ 122 (-42.45%)
Mutual labels:  vae
recommender system with Python
recommender system tutorial with Python
Stars: ✭ 106 (-50%)
Mutual labels:  recommendation-system
cs6101
The Web IR / NLP Group (WING)'s public reading group at the National University of Singapore.
Stars: ✭ 17 (-91.98%)
Mutual labels:  recommendation-system
nvae
An unofficial toy implementation for NVAE 《A Deep Hierarchical Variational Autoencoder》
Stars: ✭ 83 (-60.85%)
Mutual labels:  vae
NVTabular
NVTabular is a feature engineering and preprocessing library for tabular data designed to quickly and easily manipulate terabyte scale datasets used to train deep learning based recommender systems.
Stars: ✭ 797 (+275.94%)
Mutual labels:  recommendation-system
Pytorch-RL-CPP
A Repository with C++ implementations of Reinforcement Learning Algorithms (Pytorch)
Stars: ✭ 73 (-65.57%)
Mutual labels:  vae
DiffEqDevTools.jl
Benchmarking, testing, and development tools for differential equations and scientific machine learning (SciML)
Stars: ✭ 37 (-82.55%)
Mutual labels:  dae
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (-69.34%)
Mutual labels:  recommendation-system
probabilistic nlg
Tensorflow Implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-86.79%)
Mutual labels:  vae
vqvae-2
PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2"
Stars: ✭ 65 (-69.34%)
Mutual labels:  vae
Recommendation-System-Baseline
Some common recommendation system baseline, with description and link.
Stars: ✭ 34 (-83.96%)
Mutual labels:  recommendation-system
awesome-graph-self-supervised-learning-based-recommendation
A curated list of awesome graph & self-supervised-learning-based recommendation.
Stars: ✭ 37 (-82.55%)
Mutual labels:  recommendation-system
auction-website
🏷️ An e-commerce marketplace template. An online auction and shopping website for buying and selling a wide variety of goods and services worldwide.
Stars: ✭ 44 (-79.25%)
Mutual labels:  recommendation-system

Introduction

This repository implements models from the following two papers:

BERT4Rec: Sequential Recommendation with Bidirectional Encoder Representations from Transformer (Sun et al.)

Variational Autoencoders for Collaborative Filtering (Liang et al.)

and lets you train them on MovieLens-1m and MovieLens-20m.

Usage

Overall

Run main.py with arguments to train and/or test your model. There are predefined templates for all models.

When you run main.py, it asks whether to train on MovieLens-1m or MovieLens-20m (enter 1 or 20).

After training, it also asks whether to run test set evaluation on the trained model (enter y or n).
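
As an illustration of what a template preset does, the sketch below pre-fills training arguments from the template name. The function and field names here are assumptions for illustration and may differ from the actual templates.py.

    # Illustrative sketch only: function and field names are assumptions and
    # may not match the repository's actual templates.py.
    from types import SimpleNamespace

    def apply_template(args):
        if args.template == 'train_bert':
            args.model = 'bert'        # which model to build (assumed field)
            args.batch_size = 128      # example preset value
            args.num_epochs = 100      # example preset value

    args = SimpleNamespace(template='train_bert')
    apply_template(args)
    print(vars(args))  # {'template': 'train_bert', 'model': 'bert', ...}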

BERT4Rec

python main.py --template train_bert

DAE

python main.py --template train_dae

VAE

Search for the optimal beta

python main.py --template train_vae_search_beta

Use the found optimal beta

First, fill in the optimal beta value in templates.py (a sketch of this edit follows the command below). Then run the following.

python main.py --template train_vae_give_beta
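
As a sketch of the edit described above, recording the searched beta (0.285, discussed below) in templates.py might look like the following; the attribute name is hypothetical, not necessarily the one the repository uses.

    # Illustrative sketch only: the attribute name is hypothetical.
    def apply_vae_give_beta_template(args):
        args.best_beta = 0.285  # optimal beta found by the search (see below)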

The Best_beta plot helps you determine the optimal beta value; here, the optimal beta turns out to be 0.285.

The gray curve in the Beta plot comes from a run trained with beta fixed at 0.285.

The NDCG@10 metric shows that the improvement claimed in the paper is reproduced.
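
For context, beta is the weight on the KL term in the VAE objective of Liang et al. The sketch below shows that beta-weighted loss in generic PyTorch; it is a sketch of the technique, not necessarily the repository's exact implementation.

    import torch
    import torch.nn.functional as F

    def vae_cf_loss(logits, x, mu, logvar, beta):
        """Beta-weighted ELBO from Liang et al.: multinomial log-likelihood of
        the user's interaction vector x plus a KL term scaled by beta.
        Generic sketch, not the repository's exact code."""
        # Multinomial negative log-likelihood over the reconstructed items
        neg_ll = -torch.mean(torch.sum(F.log_softmax(logits, dim=-1) * x, dim=-1))
        # KL divergence between N(mu, sigma^2) and the standard normal prior
        kl = -0.5 * torch.mean(torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1))
        return neg_ll + beta * kl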

Examples

  1. Train BERT4Rec on ML-20m and run test set inference after training

    printf '20\ny\n' | python main.py --template train_bert
  2. Search for the optimal beta for VAE on ML-1m without running test set inference

    printf '1\nn\n' | python main.py --template train_vae_search_beta

Test Set Results

Numbers under model names indicate the number of hidden layers.

MovieLens-1m

MovieLens-20m
