
hexiangnan / Attentional_factorization_machine

TensorFlow Implementation of Attentional Factorization Machine


Projects that are alternatives to or similar to Attentional_factorization_machine

Attentional Neural Factorization Machine
Attention, Factorization Machine, Deep Learning, Recommender System
Stars: ✭ 39 (-89.23%)
Mutual labels:  recommender-system, factorization-machines
Fastfm
fastFM: A Library for Factorization Machines
Stars: ✭ 908 (+150.83%)
Mutual labels:  recommender-system, factorization-machines
Openlearning4deeprecsys
Some deep learning based recsys for open learning.
Stars: ✭ 383 (+5.8%)
Mutual labels:  recommender-system, factorization-machines
Recommendation.jl
Building recommender systems in Julia
Stars: ✭ 42 (-88.4%)
Mutual labels:  recommender-system, factorization-machines
Fmg
KDD17_FMG
Stars: ✭ 116 (-67.96%)
Mutual labels:  recommender-system, factorization-machines
Rankfm
Factorization Machines for Recommendation and Ranking Problems with Implicit Feedback Data
Stars: ✭ 71 (-80.39%)
Mutual labels:  recommender-system, factorization-machines
Neural factorization machine
TensorFlow Implementation of Neural Factorization Machine
Stars: ✭ 422 (+16.57%)
Mutual labels:  recommender-system, factorization-machines
Flurs
🌊 FluRS: A Python library for streaming recommendation algorithms
Stars: ✭ 97 (-73.2%)
Mutual labels:  recommender-system, factorization-machines
Rsparse
Fast and accurate machine learning on sparse matrices - matrix factorizations, regression, classification, top-N recommendations.
Stars: ✭ 145 (-59.94%)
Mutual labels:  recommender-system, factorization-machines
Daisyrec
A developing recommender system in PyTorch. Algorithms: KNN, LFM, SLIM, NeuMF, FM, DeepFM, VAE and so on, aiming at fair comparison of recommender system benchmarks
Stars: ✭ 280 (-22.65%)
Mutual labels:  recommender-system, factorization-machines
Tensorflow Xnn
TensorFlow implementation of a DeepFM variant that won 4th place in the Mercari Price Suggestion Challenge on Kaggle.
Stars: ✭ 263 (-27.35%)
Mutual labels:  factorization-machines
Xlearn
High-performance, easy-to-use, and scalable machine learning (ML) package, including linear model (LR), factorization machines (FM), and field-aware factorization machines (FFM), with Python and CLI interfaces.
Stars: ✭ 2,968 (+719.89%)
Mutual labels:  factorization-machines
Recommendation Systems Paperlist
Papers about recommendation systems that I am interested in
Stars: ✭ 308 (-14.92%)
Mutual labels:  recommender-system
Ytk Learn
Ytk-learn is a distributed machine learning library which implements most popular machine learning algorithms (GBDT, GBRT, Mixture Logistic Regression, Gradient Boosting Soft Tree, Factorization Machines, Field-aware Factorization Machines, Logistic Regression, Softmax).
Stars: ✭ 337 (-6.91%)
Mutual labels:  factorization-machines
ds3-spring-2018
Materials for the third cohort of the offline Data Scientist program.
Stars: ✭ 22 (-93.92%)
Mutual labels:  recommender-system
Recsys
Code implementation of Xiang Liang's book "Recommender System Practice"
Stars: ✭ 306 (-15.47%)
Mutual labels:  recommender-system
JNSKR
This is our implementation of JNSKR: Jointly Non-Sampling Learning for Knowledge Graph Enhanced Recommendation (SIGIR 2020)
Stars: ✭ 25 (-93.09%)
Mutual labels:  recommender-system
netflix-style-recommender
A simple movie recommendation engine
Stars: ✭ 65 (-82.04%)
Mutual labels:  recommender-system
Spotify-Song-Recommendation-ML
UC Berkeley team's submission for RecSys Challenge 2018
Stars: ✭ 70 (-80.66%)
Mutual labels:  recommender-system
Lightfm
A Python implementation of LightFM, a hybrid recommendation algorithm.
Stars: ✭ 3,884 (+972.93%)
Mutual labels:  recommender-system

attentional_factorization_machine

This is our implementation for the paper:

Jun Xiao, Hao Ye, Xiangnan He, Hanwang Zhang, Fei Wu and Tat-Seng Chua (2017). Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks. In Proceedings of IJCAI 2017, Melbourne, Australia, August 19-25, 2017.

We have additionally released our TensorFlow implementation of Factorization Machines under our proposed neural network framework.
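
For readers new to the model, the sketch below illustrates the AFM scoring function described in the paper above: pairwise element-wise products of feature embeddings are weighted by a small attention network and then pooled. It is a NumPy illustration with variable names of our own choosing, not the repository's code; the actual TensorFlow implementation is AFM.py.

import numpy as np
from itertools import combinations

def afm_score(V, w, w0, W, b, h, p):
    # V: (m, k) embeddings of the m active one-hot features; w: (m,) first-order weights.
    pairs = np.array([V[i] * V[j] for i, j in combinations(range(len(V)), 2)])  # element-wise interactions
    logits = np.maximum(pairs @ W.T + b, 0.0) @ h    # attention network: one ReLU layer, then projection h
    a = np.exp(logits - logits.max()); a /= a.sum()  # softmax over the interaction pairs
    return w0 + w.sum() + p @ (a[:, None] * pairs).sum(axis=0)  # attention-weighted pooling

# Tiny smoke test with random parameters (k: embedding size, t: attention size, m: active features).
rng = np.random.default_rng(0)
k, t, m = 8, 4, 3
print(afm_score(rng.normal(size=(m, k)), rng.normal(size=m), 0.0,
                rng.normal(size=(t, k)), np.zeros(t), rng.normal(size=t), rng.normal(size=k)))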

Please cite our IJCAI'17 paper if you use our code. Thanks!

Authors: Xiangnan He ([email protected]) and Hao Ye ([email protected])

Environments

  • TensorFlow (version 1.0.1)
  • numpy
  • sklearn

Dataset

We use the same input format as the LibFM toolkit (http://www.libfm.org/). This guide uses MovieLens. The MovieLens data has been used for personalized tag recommendation and contains 668,953 tag applications of users on movies. We convert each tag application (user ID, movie ID, and tag) into a feature vector using one-hot encoding, which yields 90,445 binary features. The following examples are based on this dataset, which is referred to as ml-tag in file names and in the code; a sample of the .libfm line format follows the directory layout below. When the dataset is ready, the current directory should look like this:

  • code
    • AFM.py
    • FM.py
    • LoadData.py
  • data
    • ml-tag
      • ml-tag.train.libfm
      • ml-tag.validation.libfm
      • ml-tag.test.libfm

Quick Example with Optimal Parameters

Use the following commands to train the models with the optimal parameters:

# step into the code folder
cd code
# train FM model and save as pretrain file
python FM.py --dataset ml-tag --epoch 100 --pretrain -1 --batch_size 4096 --hidden_factor 256 --lr 0.01 --keep 0.7
# train AFM model using the pretrained weights from FM
python AFM.py --dataset ml-tag --epoch 100 --pretrain 1 --batch_size 4096 --hidden_factor [8,256] --keep [1.0,0.5] --lamda_attention 2.0 --lr 0.1

The meaning of each command-line option is documented in the code (see the parse_args function).
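
As a rough guide, such a parse_args function typically looks like the sketch below. It is a hypothetical reconstruction mirroring the flags used in the commands above, not the repository's exact code:

import argparse

def parse_args():
    # Hypothetical sketch of the options used above; see FM.py / AFM.py
    # for the authoritative definitions and defaults.
    parser = argparse.ArgumentParser(description="Run FM/AFM.")
    parser.add_argument('--dataset', default='ml-tag', help='Name of the dataset folder under data/.')
    parser.add_argument('--epoch', type=int, default=20, help='Number of training epochs.')
    parser.add_argument('--pretrain', type=int, default=-1, help='-1: train from scratch and save a pretrain file; 1: load pretrained FM weights.')
    parser.add_argument('--batch_size', type=int, default=4096, help='Mini-batch size.')
    parser.add_argument('--hidden_factor', default='16', help='Embedding size, e.g. 16 for FM or [16,16] for AFM.')
    parser.add_argument('--keep', default='0.7', help='Dropout keep probability (a list for AFM).')
    parser.add_argument('--lamda_attention', type=float, default=100.0, help='Regularizer on the attention network (AFM only).')
    parser.add_argument('--lr', type=float, default=0.1, help='Learning rate.')
    parser.add_argument('--process', default='train', help="'train' or 'evaluate'.")
    return parser.parse_args()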

The current implementation supports the regression task and optimizes RMSE.
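
For concreteness, the metric can be computed as follows (a hypothetical helper for illustration, not code from this repository):

import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error between targets and predictions.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

print(rmse([1.0, -1.0, 1.0], [0.8, -0.6, 1.2]))  # about 0.283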

Performance Comparison

Parameters

To quickly demonstrate the improvement of our AFM model over the original FM, we set the embedding dimension to 16 (instead of 256 as in our paper) and the number of epochs to 20.

Train

Step into the code folder and train FM and AFM as follows. This trains our AFM model on the ml-tag dataset, initialized from the pretrained FM model. The parameters below were tuned to be optimal in our experiments. Training loops for 20 epochs and prints the best epoch according to the validation result.

# step into the code folder
cd code
# train FM model with optimal parameters
python FM.py --dataset ml-tag --epoch 20 --pretrain -1 --batch_size 4096 --hidden_factor 16 --lr 0.01 --keep 0.7
# train AFM model with optimal parameters
python AFM.py --dataset ml-tag --epoch 20 --pretrain 1 --batch_size 4096 --hidden_factor [16,16] --keep [1.0,0.5] --lamda_attention 100.0 --lr 0.1

After the training processes finish, the trained models will be saved into the pretrain folder, which should look like this:

  • pretrain
    • afm_ml-tag_16
      • checkpoint
      • ml-tag_16.data-00000-of-00001
      • ml-tag_16.index
      • ml-tag_16.meta
    • fm_ml-tag_16
      • checkpoint
      • ml-tag_16.data-00000-of-00001
      • ml-tag_16.index
      • ml-tag_16.meta
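
These files are standard TensorFlow 1.x Saver checkpoints, so they can also be inspected manually. Below is a minimal sketch under that assumption; the paths follow the layout above, and FM.py/AFM.py already perform the restore for you when --pretrain 1 is set:

import tensorflow as tf

save_prefix = '../pretrain/fm_ml-tag_16/ml-tag_16'         # checkpoint prefix, without .data/.index/.meta
saver = tf.train.import_meta_graph(save_prefix + '.meta')  # rebuild the graph saved with the weights
with tf.Session() as sess:
    saver.restore(sess, save_prefix)                       # load the trained parameters
    print([v.name for v in tf.global_variables()][:5])     # peek at a few restored variables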

Evaluate

Now it's time to evaluate the pretrained models on the test set, which can be done by running AFM.py and FM.py with --process evaluate as follows:

# evaluate the pretrained FM model
python FM.py --dataset ml-tag --epoch 20 --batch_size 4096 --lr 0.01 --keep 0.7 --process evaluate
# evaluate the pretrained AFM model
python AFM.py --dataset ml-tag --epoch 20 --pretrain 1 --batch_size 4096 --hidden_factor [16,16] --keep [1.0,0.5] --lamda_attention 100.0 --lr 0.1 --process evaluate

Last Update Date: Aug 2, 2017
