
easezyc / WSDM2022-PTUPCDR

Licence: other
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.

Programming Languages

Python

Projects that are alternatives of or similar to WSDM2022-PTUPCDR

SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-64.62%)
Mutual labels:  recommendation, transfer-learning
GNN-Recommender-Systems
An index of recommendation algorithms that are based on Graph Neural Networks.
Stars: ✭ 505 (+676.92%)
Mutual labels:  recommendation-system, recommendation
Recommenders
Best Practices on Recommendation Systems
Stars: ✭ 11,818 (+18081.54%)
Mutual labels:  recommendation-system, recommendation
MetaHeac
This is an official implementation for "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising"(KDD2021).
Stars: ✭ 36 (-44.62%)
Mutual labels:  recommendation, transfer-learning
Machine-Learning
Examples of all Machine Learning Algorithm in Apache Spark
Stars: ✭ 15 (-76.92%)
Mutual labels:  recommendation-system, recommendation
eTrust
Source code and dataset for TKDE 2019 paper “Trust Relationship Prediction in Alibaba E-Commerce Platform”
Stars: ✭ 14 (-78.46%)
Mutual labels:  recommendation
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (-60%)
Mutual labels:  transfer-learning
object detection
Implementation of object detection using TensorFlow 2.1.0 | this can be used in a car for object detection
Stars: ✭ 13 (-80%)
Mutual labels:  transfer-learning
smoke-detection-transfer-learning
use transfer learning to detect smoke in images and videos
Stars: ✭ 16 (-75.38%)
Mutual labels:  transfer-learning
seminar
ECNU ICA seminar materials
Stars: ✭ 14 (-78.46%)
Mutual labels:  recommendation-system
self-driving-car
Implementation of the paper "End to End Learning for Self-Driving Cars"
Stars: ✭ 54 (-16.92%)
Mutual labels:  transfer-learning
THACIL
Temporal Hierarchical Attention at Category- and Item-Level for Micro-Video Click-Through Prediction
Stars: ✭ 24 (-63.08%)
Mutual labels:  recommendation
tutorials
A tutorial series by Preferred.AI
Stars: ✭ 136 (+109.23%)
Mutual labels:  recommendation
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+1138.46%)
Mutual labels:  transfer-learning
ReinventCommunity
No description or website provided.
Stars: ✭ 103 (+58.46%)
Mutual labels:  transfer-learning
Transfer-Learning-for-Fault-Diagnosis
This repository is for transfer learning and domain adaptation for fault diagnosis.
Stars: ✭ 123 (+89.23%)
Mutual labels:  transfer-learning
recommender system with Python
recommender system tutorial with Python
Stars: ✭ 106 (+63.08%)
Mutual labels:  recommendation-system
cs6101
The Web IR / NLP Group (WING)'s public reading group at the National University of Singapore.
Stars: ✭ 17 (-73.85%)
Mutual labels:  recommendation-system
SAMN
This is our implementation of SAMN: Social Attentional Memory Network
Stars: ✭ 45 (-30.77%)
Mutual labels:  recommendation
data-science-popular-algorithms
Data Science algorithms and topics that you must know. (Newly Designed) Recommender Systems, Decision Trees, K-Means, LDA, RFM-Segmentation, XGBoost in Python, R, and Scala.
Stars: ✭ 65 (+0%)
Mutual labels:  recommendation-system

Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR)

This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022. Paper

The cold-start problem remains very challenging in recommender systems. Fortunately, the interactions of cold-start users in an auxiliary source domain can help cold-start recommendation in the target domain. How to transfer a user's preferences from the source domain to the target domain is the key issue in Cross-domain Recommendation (CDR), which is a promising solution to the cold-start problem. Most existing methods model a common preference bridge to transfer preferences for all users. Intuitively, since preferences vary from user to user, the preference bridges of different users should be different. Along this line, we propose a novel framework named Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR). Specifically, a meta network fed with users' characteristic embeddings is learned to generate personalized bridge functions, achieving personalized transfer of preferences for each user. To learn the meta network stably, we employ a task-oriented optimization procedure. With the meta-generated personalized bridge function, the user's preference embedding in the source domain can be transformed into the target domain, and the transformed embedding can be used as the initial embedding for the cold-start user in the target domain. Using large real-world datasets, we conduct extensive experiments to evaluate the effectiveness of PTUPCDR in both the cold-start and warm-start stages.
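
As a rough illustration of the core idea, here is a minimal PyTorch sketch (not the repository's models.py; the class name, layer sizes, and dimensions are all assumptions): a meta network maps a user's characteristic embedding to the weights of a per-user linear bridge, which then transforms the user's source-domain embedding into an initial target-domain embedding.

import torch
import torch.nn as nn

class MetaBridge(nn.Module):
    def __init__(self, emb_dim=10, meta_dim=50):
        super().__init__()
        self.emb_dim = emb_dim
        # meta network: user characteristic embedding -> flattened bridge matrix
        self.meta_net = nn.Sequential(
            nn.Linear(emb_dim, meta_dim),
            nn.ReLU(),
            nn.Linear(meta_dim, emb_dim * emb_dim),
        )

    def forward(self, char_emb, src_user_emb):
        # one personalized bridge matrix per user: (batch, emb_dim, emb_dim)
        bridge = self.meta_net(char_emb).view(-1, self.emb_dim, self.emb_dim)
        # transform the source-domain user embedding into the target domain
        return torch.bmm(src_user_emb.unsqueeze(1), bridge).squeeze(1)

The transformed embedding would then serve as the cold-start user's initialization in the target-domain model.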

Introduction

This repository provides implementations of PTUPCDR and three popular baselines: TGTOnly, CMF, and EMCDR.

Requirements

  • Python 3.6
  • PyTorch > 1.0
  • TensorFlow
  • Pandas
  • NumPy
  • tqdm
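
For example, one way to install the dependencies (versions are not pinned by the repository, so treat this as a starting point):

# Example environment setup; adjust versions as needed for Python 3.6
pip install torch tensorflow pandas numpy tqdm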

File Structure

.
├── code
│   ├── config.json         # Configurations
│   ├── entry.py            # Entry function
│   ├── models.py           # Models based on MF, GMF or Youtube DNN
│   ├── preprocessing.py    # Parsing and Segmentation
│   ├── readme.md
│   └── run.py              # Training and Evaluating 
└── data
    ├── mid                 # Mid data
    │   ├── Books.csv
    │   ├── CDs_and_Vinyl.csv
    │   └── Movies_and_TV.csv
    ├── raw                 # Raw data
    │   ├── reviews_Books_5.json.gz
    │   ├── reviews_CDs_and_Vinyl_5.json.gz
    │   └── reviews_Movies_and_TV_5.json.gz
    └── ready               # Ready to use
        ├── _2_8
        ├── _5_5
        └── _8_2

Dataset

We use the Amazon Reviews 5-core dataset. To download the Amazon dataset, you can use the following link: Amazon Reviews. Download the three 5-core domains: CDs and Vinyl, Movies and TV, and Books, and then put the files in ./data/raw.

You can use the following command to preprocess the dataset. Preprocessing has two phases: parsing the raw data into mid data and segmenting the mid data into ready-to-use data. The final data will be placed under ./data/ready.

python entry.py --process_data_mid 1 --process_data_ready 1
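
Conceptually, the parsing phase reads each gzipped review file from ./data/raw and writes a (user, item, rating) table to ./data/mid. The sketch below is a simplified illustration, not the repository's preprocessing.py; the function name and output column names are assumptions, while reviewerID, asin, and overall are the standard field names in the Amazon Reviews files.

import gzip
import json

import pandas as pd

def parse_raw(path):
    """Read one Amazon 5-core reviews file and keep (user, item, rating)."""
    records = []
    with gzip.open(path, 'rt') as f:
        for line in f:
            review = json.loads(line)
            records.append((review['reviewerID'], review['asin'], review['overall']))
    return pd.DataFrame(records, columns=['uid', 'iid', 'y'])

# e.g., parse_raw('./data/raw/reviews_Books_5.json.gz').to_csv('./data/mid/Books.csv', index=False)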

Run

Parameter Configuration:

  • task: the task to run, one of 1, 2, or 3; default 1
  • base_model: the base model, one of MF, GMF, or DNN; default MF
  • ratio: the train/test ratio, one of [0.8, 0.2], [0.5, 0.5], or [0.2, 0.8]; default [0.8, 0.2]
  • epoch: the number of pre-training and CDR mapping training epochs; default 10
  • seed: the random seed; default 2020
  • gpu: the index of the GPU to use; default 0
  • lr: the learning rate; default 0.01
  • model_name: the base model for embedding; default MF

You can run the model as follows:

# Run directly with default parameters 
python entry.py

# Reset the number of training epochs to `20`
python entry.py --epoch 20

# Reset several parameters
python entry.py --gpu 1 --lr 0.02

# Reset the seed (we use seeds in [900, 1000, 10, 2020, 500])
python entry.py --seed 900
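
You can also switch the task and the base model; the flag names below are assumed to mirror the parameter list above, so check entry.py if the exact argument names differ:

# Switch task and base model (flags assumed from the parameter list above)
python entry.py --task 2 --base_model GMF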

If you want to try a different weight decay, meta network dimension, embedding dimension, or more tasks, you can change the settings in ./code/config.json. Note that this repository contains our PTUPCDR and three baselines: TGTOnly, CMF, and EMCDR.
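
For example, to see which settings are available before editing (a trivial sketch; the actual key names depend on the repository's config.json):

import json

# inspect the available settings in ./code/config.json before editing
with open('./code/config.json', 'r') as f:
    config = json.load(f)
print(json.dumps(config, indent=2))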

Reference

Zhu, Yongchun, et al. "Personalized Transfer of User Preferences for Cross-domain Recommendation." Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining. 2022.

or in bibtex style:

@inproceedings{zhu2022personalized,
  title={Personalized Transfer of User Preferences for Cross-domain Recommendation},
  author={Zhu, Yongchun and Tang, Zhenwei and Liu, Yudan and Zhuang, Fuzhen and Xie, Ruobing and Zhang, Xu and Lin, Leyu and He, Qing},
  booktitle={Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining},
  pages={1507--1515},
  year={2022}
}