
huangtinglin / NGCF-PyTorch

Licence: other
PyTorch Implementation for Neural Graph Collaborative Filtering

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to NGCF-PyTorch

Alink
Alink is a machine learning algorithm platform based on Flink, developed by the PAI team of the Alibaba computing platform.
Stars: ✭ 2,936 (+1368%)
Mutual labels:  graph-algorithms, recommender-system
disentangled graph collaborative filtering
Disentangled Graph Collaborative Filtering, SIGIR 2020
Stars: ✭ 118 (-41%)
Mutual labels:  recommender-system, neural-collaborative-filtering
svae cf
[ WSDM '19 ] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (-81%)
Mutual labels:  recommender-system, pytorch-implementation
SSE-PT
Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
Stars: ✭ 103 (-48.5%)
Mutual labels:  recommender-system
ResUNetPlusPlus-with-CRF-and-TTA
ResUNet++, CRF, and TTA for segmentation of medical images (IEEE JBIHI)
Stars: ✭ 98 (-51%)
Mutual labels:  pytorch-implementation
RioGNN
Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks
Stars: ✭ 46 (-77%)
Mutual labels:  graph-algorithms
Advanced-Shortest-Paths-Algorithms
Java Code for Contraction Hierarchies Algorithm, A-Star Algorithm and Bidirectional Dijkstra Algorithm. Tested and Verified Code.
Stars: ✭ 63 (-68.5%)
Mutual labels:  graph-algorithms
pytorch-convcnp
A PyTorch Implementation of Convolutional Conditional Neural Process.
Stars: ✭ 41 (-79.5%)
Mutual labels:  pytorch-implementation
HuaweiCodeCraft2020
2020 Huawei Software Elite Challenge (2020华为软件精英挑战赛)
Stars: ✭ 14 (-93%)
Mutual labels:  graph-algorithms
RecommenderSystemsNotebooks
Set of notebooks analysing and discussing the ideas presented at Coursera's Recommender Systems course
Stars: ✭ 28 (-86%)
Mutual labels:  recommender-system
MolDQN-pytorch
A PyTorch Implementation of "Optimization of Molecules via Deep Reinforcement Learning".
Stars: ✭ 58 (-71%)
Mutual labels:  pytorch-implementation
nodegraph
NodeGraph - A simple directed graph with visualization UI.
Stars: ✭ 21 (-89.5%)
Mutual labels:  graph-algorithms
neuro-symbolic-ai-soc
Neuro-Symbolic Visual Question Answering on Sort-of-CLEVR using PyTorch
Stars: ✭ 41 (-79.5%)
Mutual labels:  pytorch-implementation
blossom
Edmonds's blossom algorithm for maximum weight matching in undirected graphs
Stars: ✭ 16 (-92%)
Mutual labels:  graph-algorithms
slopeone
PHP implementation of the Weighted Slope One rating-based collaborative filtering scheme.
Stars: ✭ 85 (-57.5%)
Mutual labels:  recommender-system
cycleGAN-PyTorch
A clean and lucid implementation of cycleGAN using PyTorch
Stars: ✭ 107 (-46.5%)
Mutual labels:  pytorch-implementation
VT-UNet
[MICCAI2022] This is an official PyTorch implementation for A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation
Stars: ✭ 151 (-24.5%)
Mutual labels:  pytorch-implementation
openpose-pytorch
🔥 OpenPose api wrapper in PyTorch.
Stars: ✭ 52 (-74%)
Mutual labels:  pytorch-implementation
InterviewPrep
A repository containing links to good interview questions
Stars: ✭ 54 (-73%)
Mutual labels:  graph-algorithms
Representation-Learning-for-Information-Extraction
PyTorch implementation of the Google Research paper "Representation Learning for Information Extraction from Form-like Documents".
Stars: ✭ 82 (-59%)
Mutual labels:  pytorch-implementation

Neural Graph Collaborative Filtering

This is my PyTorch implementation for the paper:

Xiang Wang, Xiangnan He, Meng Wang, Fuli Feng, and Tat-Seng Chua (2019). Neural Graph Collaborative Filtering, Paper in ACM DL or Paper in arXiv. In SIGIR'19, Paris, France, July 21-25, 2019.

The TensorFlow implementation can be found here.

Introduction

My implementation mainly follows the original TensorFlow implementation and reports the same evaluation metrics as the original project. Here is an example on the Gowalla dataset, followed by a brief sketch of how such top-K metrics are typically computed:

Best Iter=[38]@[32904.5]	recall=[0.15571	0.21793	0.26385	0.30103	0.33170], precision=[0.04763	0.03370	0.02744	0.02359	0.02088], hit=[0.53996	0.64559	0.70464	0.74546	0.77406], ndcg=[0.22752	0.26555	0.29044	0.30926	0.32406]
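Here recall, precision, hit, and ndcg are the usual top-K ranking metrics, reported at several cut-off values K. As a rough illustration only (the function name is made up and this is not the code from this repository), recall@K and ndcg@K for a single user can be computed as follows:

import numpy as np

def recall_ndcg_at_k(ranked_items, ground_truth, k):
    # ranked_items: item ids sorted by predicted score, best first
    # ground_truth: set of test items the user actually interacted with
    top_k = ranked_items[:k]
    hits = [1.0 if item in ground_truth else 0.0 for item in top_k]
    recall = sum(hits) / len(ground_truth)
    # DCG discounts hits by log2 of their rank; IDCG is the DCG of an ideal ranking
    dcg = sum(h / np.log2(pos + 2) for pos, h in enumerate(hits))
    idcg = sum(1.0 / np.log2(pos + 2) for pos in range(min(k, len(ground_truth))))
    ndcg = dcg / idcg if idcg > 0 else 0.0
    return recall, ndcg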

Hope it can help you!

Environment Requirement

The code has been tested under Python 3.6.9. The required packages are as follows:

  • pytorch == 1.3.1
  • numpy == 1.18.1
  • scipy == 1.3.2
  • sklearn == 0.21.3
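The pinned versions can be installed with pip, for example (note that sklearn is published on PyPI as scikit-learn, and a CUDA-enabled build of torch 1.3.1 may need to be installed following the official PyTorch instructions instead):

pip install torch==1.3.1 numpy==1.18.1 scipy==1.3.2 scikit-learn==0.21.3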

Example to Run the Codes

The instructions for all commands are clearly stated in the code (see the parser function in NGCF/utility/parser.py). A simplified sketch of the training objective that these hyperparameters control follows the example commands below.

  • Gowalla dataset
python main.py --dataset gowalla --regs [1e-5] --embed_size 64 --layer_size [64,64,64] --lr 0.0001 --save_flag 1 --pretrain 0 --batch_size 1024 --epoch 400 --verbose 1 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1]
  • Amazon-book dataset
python main.py --dataset amazon-book --regs [1e-5] --embed_size 64 --layer_size [64,64,64] --lr 0.0005 --save_flag 1 --pretrain 0 --batch_size 1024 --epoch 200 --verbose 50 --node_dropout [0.1] --mess_dropout [0.1,0.1,0.1]
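For context, NGCF optimizes the pairwise BPR (Bayesian Personalized Ranking) loss from the paper, and the --regs flag above controls the L2 regularization on the embeddings. The snippet below is only a simplified sketch of that objective under these assumptions, not the exact implementation in this repository:

import torch
import torch.nn.functional as F

def bpr_loss(user_emb, pos_item_emb, neg_item_emb, reg=1e-5):
    # Scores are inner products of user and item embeddings
    pos_scores = (user_emb * pos_item_emb).sum(dim=1)
    neg_scores = (user_emb * neg_item_emb).sum(dim=1)
    # BPR: the observed (positive) item should be ranked above the sampled negative item
    loss = -F.logsigmoid(pos_scores - neg_scores).mean()
    # L2 regularization on the embeddings, weighted by the --regs coefficient
    reg_term = reg * (user_emb.pow(2).sum() + pos_item_emb.pow(2).sum()
                      + neg_item_emb.pow(2).sum()) / user_emb.shape[0]
    return loss + reg_term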

Supplement

  • The parameter negative_slope of LeakyReLU is set to 0.2, since the default values in PyTorch (0.01) and TensorFlow (0.2) differ (see the snippet below).
  • If the argument node_dropout_flag is set to 1, it will lead to a higher computational cost.
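To illustrate the first point, PyTorch's nn.LeakyReLU defaults to negative_slope=0.01 while TensorFlow's tf.nn.leaky_relu defaults to alpha=0.2, so the slope is set explicitly to keep the two implementations consistent:

import torch.nn as nn

# Match TensorFlow's default leaky-ReLU slope (alpha = 0.2);
# PyTorch would otherwise use negative_slope = 0.01.
activation = nn.LeakyReLU(negative_slope=0.2)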