facebookresearch / Foltr Es

Licence: other
The source code to reproduce the results reported in the 'Federated Online Learning to Rank with Evolution Strategies' paper, published at WSDM 2019.

Projects that are alternatives of or similar to Foltr Es

Madmom tutorials
Tutorials for the madmom package.
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Voice emotion
Detecting emotion in voices
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Geemap
A Python package for interactive mapping with Google Earth Engine, ipyleaflet, and folium
Stars: ✭ 959 (+2806.06%)
Mutual labels:  jupyter-notebook
Machinelearningdeeplearning
Notes, slides (PPT), and assignments for Hung-yi Lee's 2021 machine learning and deep learning course
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Pm Pyro
PyMC3-like Interface for Pyro
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Simple Ssd For Beginners
This repository contains a simple SSD (Single Shot MultiBox Detector) implementation in PyTorch that is easy to read and learn from
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Pyhat
Python Hyperspectral Analysis Tools
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Multitask Learning
MSc group project: Reproduction of 'Multi-Task Learning using Uncertainty to Weigh Losses for Scene Geometry and Semantics'; A. Kendall, Y. Gal, R. Cipolla
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Attentive Neural Processes
implementing "recurrent attentive neural processes" to forecast power usage (w. LSTM baseline, MCDropout)
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Multi Label
PyTorch code for the multi-instance multi-label problem
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Lectures2020
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Yolact Tutorial
A tutorial for using YOLACT in Google Colab
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Pytorch Softplus Normalization Uncertainty Estimation Bayesian Cnn
PyTorch code for Paper "Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference"
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Omx
Open Matrix (OMX)
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Aws Deepracer Workshops
DeepRacer workshop content
Stars: ✭ 968 (+2833.33%)
Mutual labels:  jupyter-notebook
Pydata Amsterdam 2016
Machine Learning with Scikit-Learn (material for pydata Amsterdam 2016)
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Numerical methods youtube
Stars: ✭ 32 (-3.03%)
Mutual labels:  jupyter-notebook
Sanet Keras
Implement SANet for crowd counting in Keras.
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook
Natural Language Processing
Resources for "Natural Language Processing" Coursera course.
Stars: ✭ 969 (+2836.36%)
Mutual labels:  jupyter-notebook
Object detection tools
Useful object detection tools for the TensorFlow Object Detection API
Stars: ✭ 33 (+0%)
Mutual labels:  jupyter-notebook

Federated Online Learning to Rank with Evolution Strategies

This repo contains the code used to run the experiments for the paper 'Federated Online Learning to Rank with Evolution Strategies', published at WSDM 2019.

Installation

In order to reproduce the experiments, we need to (1) set up an appropriate Python environment, and (2) download the datasets used in the paper. The steps below assume that we create a new conda environment solely for the purpose of running this code.

First, you need to install PyTorch (pytorch.org). Next, create a conda environment:

conda create --name federated python=3.6
source activate federated
# assuming you want to checkout the repo in the current directory
git clone https://github.com/facebookresearch/foltr-es.git && cd foltr-es
pip install -r requirements.txt

That's it! The next step is to download the datasets.

Getting datasets

Two datasets are used in the paper: MQ2007 and MQ2008. They can be downloaded from the Microsoft Research website.

After downloading the MQ2007.rar and MQ2008.rar files, unpack them into the ./data folder.
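The MQ2007/MQ2008 files are plain-text files in the LETOR format: each line holds a relevance label, a query id, and `fid:value` feature pairs, followed by a `#` comment. As a minimal sketch of what the loaders have to deal with, here is a hypothetical parser (`parse_letor_line` is not part of this repo, and the sample line below is illustrative, with the feature list truncated):

```python
def parse_letor_line(line):
    """Parse one line of a LETOR-format file, as used by MQ2007/MQ2008.

    Line format: <relevance> qid:<qid> <fid>:<value> ... #<comment>
    """
    # Drop the trailing comment, then split into whitespace-separated tokens.
    body = line.split("#", 1)[0].split()
    relevance = int(body[0])
    qid = body[1].split(":", 1)[1]
    # Remaining tokens are feature_id:value pairs.
    features = {int(fid): float(val)
                for fid, val in (tok.split(":", 1) for tok in body[2:])}
    return relevance, qid, features

# Illustrative line; real MQ2007/MQ2008 rows carry 46 features.
line = "2 qid:10 1:0.056537 2:0.000000 3:0.666667 #docid = GX029-35-5894638"
rel, qid, feats = parse_letor_line(line)
```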

Once the requirements are installed and the datasets are downloaded, you can check that everything is in order by running pytest:

pytest -v

All tests should pass.

Reproducing results

All notebooks are in the ./notebooks folder.

Estimating privacy loss by simulating user click behavior

To get the results that are reported in Table 2:

python eps_estimate.py

Hyperparameter investigation

Figures 1 - 3 are generated by the following Jupyter notebooks:

  • antithetic.ipynb: influence of antithetic variates on the optimization performance (Figure 1);
  • n_interactions.ipynb: batch size vs optimization performance (Figure 2);
  • privacy robustness.ipynb: privacy level vs optimization performance (Figure 3).
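To make the antithetic-variates experiment concrete: in ES, the gradient of an objective is estimated from random perturbations of the parameters, and evaluating each perturbation together with its negation (the antithetic pair) cancels much of the estimator's variance. The following is a minimal sketch on a toy quadratic objective, not the paper's implementation (the function names and hyperparameters are illustrative):

```python
import numpy as np

def es_gradient(f, theta, sigma=0.1, n_pairs=50, rng=None):
    """Antithetic Evolution Strategies gradient estimate of f at theta.

    Each Gaussian perturbation eps is evaluated at both theta + sigma*eps
    and theta - sigma*eps; the paired (antithetic) evaluations cancel much
    of the variance of one-sided sampling.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    grad = np.zeros_like(theta)
    for _ in range(n_pairs):
        eps = rng.standard_normal(theta.shape)
        grad += (f(theta + sigma * eps) - f(theta - sigma * eps)) * eps
    return grad / (2.0 * sigma * n_pairs)

# Toy objective: maximize f(theta) = -||theta - 1||^2 by ES ascent.
f = lambda th: -np.sum((th - 1.0) ** 2)
rng = np.random.default_rng(42)
theta = np.zeros(3)
for _ in range(300):
    theta = theta + 0.05 * es_gradient(f, theta, rng=rng)
# theta now sits near the maximizer (1, 1, 1).
```

In the federated setting of the paper, the per-perturbation evaluations correspond to client-side feedback, but the variance-reduction mechanism is the same.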

You can simply start a Jupyter session in the root of the repo and re-run them.

Learning to rank experiments

Figures 4 and 5 are generated by the notebooks letor-2007.ipynb and letor-2008.ipynb, respectively. However, these notebooks use pre-calculated baseline scores, stored in baselines.json. If you want to regenerate those scores, you have to:

  • download SVMRank from the author's site and put it in ./svmrank/:
mkdir svmrank && cd svmrank
# an example for the linux64 platform; please check the site for more options
wget http://download.joachims.org/svm_rank/current/svm_rank_linux64.tar.gz
tar -xzf svm_rank_linux64.tar.gz
  • then run the script to generate baselines.json:
python baselines.py

Citation

If you find this code or the ideas in the paper useful in your research, please consider citing the paper:

@inproceedings{Kharitonov2019,
    title={Federated Online Learning to Rank with Evolution Strategies},
    author={Kharitonov, Eugene},
    booktitle={WSDM},
    year={2019}
}

License

foltr-es is CC-BY-NC licensed, as found in the LICENSE file.
