
yihong-chen / Neural Collaborative Filtering

pytorch version of neural collaborative filtering


Projects that are alternatives of or similar to Neural Collaborative Filtering

Recotour
A tour through recommendation algorithms in python [IN PROGRESS]
Stars: ✭ 140 (-46.77%)
Mutual labels:  jupyter-notebook, collaborative-filtering, matrix-factorization
Recsys19 hybridsvd
Accompanying code for reproducing experiments from the HybridSVD paper. Preprint is available at https://arxiv.org/abs/1802.06398.
Stars: ✭ 23 (-91.25%)
Mutual labels:  jupyter-notebook, collaborative-filtering
Vae cf
Variational autoencoders for collaborative filtering
Stars: ✭ 386 (+46.77%)
Mutual labels:  jupyter-notebook, collaborative-filtering
Cofactor
CoFactor: Regularizing Matrix Factorization with Item Co-occurrence
Stars: ✭ 160 (-39.16%)
Mutual labels:  jupyter-notebook, matrix-factorization
Implicit
Fast Python Collaborative Filtering for Implicit Feedback Datasets
Stars: ✭ 2,569 (+876.81%)
Mutual labels:  collaborative-filtering, matrix-factorization
Polara
Recommender system and evaluation framework for top-n recommendations tasks that respects polarity of feedbacks. Fast, flexible and easy to use. Written in python, boosted by scientific python stack.
Stars: ✭ 205 (-22.05%)
Mutual labels:  collaborative-filtering, matrix-factorization
Expo Mf
Exposure Matrix Factorization: modeling user exposure in recommendation
Stars: ✭ 81 (-69.2%)
Mutual labels:  jupyter-notebook, matrix-factorization
Elliot
Comprehensive and Rigorous Framework for Reproducible Recommender Systems Evaluation
Stars: ✭ 49 (-81.37%)
Mutual labels:  collaborative-filtering, matrix-factorization
matrix-completion
Lightweight Python library for in-memory matrix completion.
Stars: ✭ 94 (-64.26%)
Mutual labels:  collaborative-filtering, matrix-factorization
Quick-Data-Science-Experiments-2017
Quick-Data-Science-Experiments
Stars: ✭ 19 (-92.78%)
Mutual labels:  collaborative-filtering, matrix-factorization
Recommendation.jl
Building recommender systems in Julia
Stars: ✭ 42 (-84.03%)
Mutual labels:  collaborative-filtering, matrix-factorization
Rsparse
Fast and accurate machine learning on sparse matrices - matrix factorizations, regression, classification, top-N recommendations.
Stars: ✭ 145 (-44.87%)
Mutual labels:  collaborative-filtering, matrix-factorization
Rectorch
rectorch is a pytorch-based framework for state-of-the-art top-N recommendation
Stars: ✭ 121 (-53.99%)
Mutual labels:  collaborative-filtering, matrix-factorization
Movielens
4 different recommendation engines for the MovieLens dataset.
Stars: ✭ 265 (+0.76%)
Mutual labels:  jupyter-notebook, collaborative-filtering
Metarec
PyTorch Implementations For A Series Of Deep Learning-Based Recommendation Models (IN PROGRESS)
Stars: ✭ 120 (-54.37%)
Mutual labels:  collaborative-filtering, matrix-factorization
Recommendation-System-Baseline
Some common recommendation system baseline, with description and link.
Stars: ✭ 34 (-87.07%)
Mutual labels:  collaborative-filtering, matrix-factorization
Deeprec
An Open-source Toolkit for Deep Learning based Recommendation with Tensorflow.
Stars: ✭ 954 (+262.74%)
Mutual labels:  collaborative-filtering, matrix-factorization
Recoder
Large scale training of factorization models for Collaborative Filtering with PyTorch
Stars: ✭ 46 (-82.51%)
Mutual labels:  collaborative-filtering, matrix-factorization
Tutorials
AI-related tutorials. Access any of them for free → https://towardsai.net/editorial
Stars: ✭ 204 (-22.43%)
Mutual labels:  jupyter-notebook, collaborative-filtering
recommender system with Python
recommender system tutorial with Python
Stars: ✭ 106 (-59.7%)
Mutual labels:  collaborative-filtering, matrix-factorization

neural-collaborative-filtering

Neural collaborative filtering (NCF) is a deep-learning-based framework for making recommendations. The key idea is to model the user-item interaction with neural networks. See the following paper for details about NCF.

He, Xiangnan, et al. "Neural collaborative filtering." Proceedings of the 26th International Conference on World Wide Web. International World Wide Web Conferences Steering Committee, 2017.

The authors of NCF published a nice implementation written in TensorFlow (Keras). This repo instead provides my implementation written in PyTorch. I hope it is helpful to PyTorch fans. Have fun playing with it!
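To make the idea concrete, here is a minimal, hypothetical PyTorch sketch of the NeuMF fusion: a GMF branch (elementwise product of user and item embeddings) and an MLP branch (concatenated embeddings passed through dense layers), merged into one prediction. All names and sizes below are illustrative, not the repo's; see gmf.py, mlp.py, and neumf.py for the actual models.

```python
import torch
import torch.nn as nn

class NeuMFSketch(nn.Module):
    """Illustrative fusion of a GMF branch and an MLP branch (NCF paper, Sec. 3.4)."""

    def __init__(self, n_users, n_items, dim=8):
        super().__init__()
        # each branch keeps its own embedding tables, as in the paper
        self.user_gmf = nn.Embedding(n_users, dim)
        self.item_gmf = nn.Embedding(n_items, dim)
        self.user_mlp = nn.Embedding(n_users, dim)
        self.item_mlp = nn.Embedding(n_items, dim)
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())
        # concat(GMF vector, MLP vector) -> single interaction score
        self.out = nn.Linear(2 * dim, 1)

    def forward(self, users, items):
        gmf = self.user_gmf(users) * self.item_gmf(items)  # elementwise product
        mlp = self.mlp(torch.cat([self.user_mlp(users), self.item_mlp(items)], dim=-1))
        return torch.sigmoid(self.out(torch.cat([gmf, mlp], dim=-1)))
```

The sigmoid output is the predicted probability of interaction, so the model can be trained with binary cross-entropy on implicit feedback.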

Dataset

The MovieLens 1M dataset is used to test the repo.
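For reference, each line of the MovieLens 1M ratings.dat file has the form UserID::MovieID::Rating::Timestamp. A minimal parser sketch (the helper name is mine; data.py handles the actual preparation):

```python
def parse_ratings(lines):
    """Parse MovieLens 1M ratings.dat lines of the form
    UserID::MovieID::Rating::Timestamp into typed tuples."""
    rows = []
    for line in lines:
        user, item, rating, ts = line.strip().split("::")
        rows.append((int(user), int(item), float(rating), int(ts)))
    return rows
```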

Files

data.py: prepare train/test dataset

utils.py: some handy functions for model training etc.

metrics.py: evaluation metrics including hit ratio (HR) and NDCG

gmf.py: generalized matrix factorization model

mlp.py: multi-layer perceptron model

neumf.py: fusion of gmf and mlp

engine.py: training engine

train.py: entry point for training an NCF model
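The HR and NDCG metrics in metrics.py follow the standard leave-one-out protocol: for each user, one held-out item is ranked against a list of candidates, and the metrics check where it lands in the top k. A plain-Python sketch of the two metrics (function names are mine, not necessarily the repo's):

```python
import math

def hit_ratio(ranked_items, target, k=10):
    """HR@k: 1 if the held-out item appears in the top-k ranking, else 0."""
    return int(target in ranked_items[:k])

def ndcg(ranked_items, target, k=10):
    """NDCG@k for a single held-out item: 1/log2(rank + 2) if it appears
    in the top k (rank is 0-based), else 0. With one relevant item the
    ideal DCG is 1, so no normalizing division is needed."""
    for rank, item in enumerate(ranked_items[:k]):
        if item == target:
            return 1.0 / math.log2(rank + 2)
    return 0.0
```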

Performance

The hyperparameters are not tuned. Better performance can be achieved with careful tuning, especially for the MLP model. Pretraining the user and item embeddings might help improve the performance of the MLP model.

Experiment results with num_negative_samples = 4 and dim_latent_factor = 8 are shown below.

GMF vs. MLP

Note that the MLP model was trained from scratch, but the authors suggest that performance might be boosted by pretraining the embedding layer with the GMF model.
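The pretraining trick amounts to copying trained embedding tables from the GMF model into the MLP model before training it. A minimal sketch (the helper name is mine; see the repo's train.py/engine.py for the actual wiring):

```python
import torch
import torch.nn as nn

def warm_start_embedding(src: nn.Embedding, dst: nn.Embedding) -> None:
    """Copy a trained embedding table (e.g. from GMF) into another model's
    table (e.g. the MLP's) so training starts from learned factors."""
    with torch.no_grad():
        dst.weight.copy_(src.weight)
```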

NeuMF: pretrain vs. no pretrain

The pretrained version converges much faster.

L2 regularization for GMF model

Large L2 regularization might lead to degenerate results (HR = 0.0, NDCG = 0.0).
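In PyTorch, L2 regularization is usually applied through the optimizer's weight_decay argument. A minimal sketch; the model and the values below are illustrative, not the repo's settings:

```python
import torch

# weight_decay adds an L2 penalty on all parameters; too large a value
# can crush the embeddings and produce degenerate rankings.
model = torch.nn.Linear(8, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-6)
```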

L2 regularization for MLP model

A bit of L2 regularization seems to improve the performance of the MLP model.

L2 for MLP

MLP with pretrained user/item embedding

Pretraining the MLP model with user/item embeddings from the trained GMF model gives better results.

MLP network size = [16, 64, 32, 16, 8]

Pretrain for MLP

Implicit feedback without pretrain

Ratings are binarized to 1 (interacted) or 0 (not interacted), and the model is trained from scratch.
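Training data for implicit feedback is typically built by labeling every observed user-item pair as 1 and sampling a few unobserved items per pair as 0. A sketch of this construction, assuming num_negative_samples = 4 (the function name is mine, not the repo's):

```python
import random

def implicit_samples(interactions, n_items, num_negatives=4, seed=0):
    """Binarize interactions: each observed (user, item) pair gets label 1,
    plus `num_negatives` randomly sampled unseen items per pair with label 0."""
    rng = random.Random(seed)
    seen = set(interactions)
    samples = []
    for user, item in interactions:
        samples.append((user, item, 1))
        for _ in range(num_negatives):
            neg = rng.randrange(n_items)
            while (user, neg) in seen:  # resample until the item is unobserved
                neg = rng.randrange(n_items)
            samples.append((user, neg, 0))
    return samples
```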

PyTorch Versions

The repo works under torch 1.0. You can find the old versions working under torch 0.2 and 0.4 in the tags.

TODO

  • Batchify the test data to handle large dataset.