
juho-lee / Set_transformer

License: MIT
PyTorch implementation of Set Transformer

Projects that are alternatives to or similar to Set Transformer

Skylift
Wi-Fi Geolocation Spoofing with the ESP8266
Stars: ✭ 223 (-0.45%)
Mutual labels:  jupyter-notebook
Rethinking Numpyro
Statistical Rethinking (2nd ed.) with NumPyro
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook
Datascienceprojects
A code repository of projects and tutorials in R and Python covering a variety of topics in data visualization, statistics, sports analytics, and general applications of probability theory.
Stars: ✭ 223 (-0.45%)
Mutual labels:  jupyter-notebook
Sohu competition
First-place solution to Sohu's 2018 content recognition competition.
Stars: ✭ 224 (+0%)
Mutual labels:  jupyter-notebook
Machinelearningwithpython
Starter files for Pluralsight course: Understanding Machine Learning with Python
Stars: ✭ 224 (+0%)
Mutual labels:  jupyter-notebook
Deeplearning cv notes
📓 Deep learning and computer vision notes.
Stars: ✭ 223 (-0.45%)
Mutual labels:  jupyter-notebook
Video to bvh
Convert human motion from video to .bvh
Stars: ✭ 222 (-0.89%)
Mutual labels:  jupyter-notebook
Poretools
A toolkit for working with Oxford Nanopore data
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook
Tutorial
A tutorial covering open-source tools for source separation.
Stars: ✭ 223 (-0.45%)
Mutual labels:  jupyter-notebook
Lstm Crf Medical
A model for medical named-entity recognition, including dictionaries and corpus annotation, built in Python.
Stars: ✭ 224 (+0%)
Mutual labels:  jupyter-notebook
Notebook
My notes
Stars: ✭ 221 (-1.34%)
Mutual labels:  jupyter-notebook
Sdc Vehicle Detection
Udacity Project - Vehicle Detection
Stars: ✭ 224 (+0%)
Mutual labels:  jupyter-notebook
Zoom Learn Zoom
computational zoom from raw sensor data
Stars: ✭ 224 (+0%)
Mutual labels:  jupyter-notebook
Dragonn
A toolkit to learn how to model and interpret regulatory sequence data using deep learning.
Stars: ✭ 222 (-0.89%)
Mutual labels:  jupyter-notebook
Ml From Scratch
Machine learning algorithms based on the "watermelon book" (Zhou Zhihua's Machine Learning) and Statistical Learning Methods, including deep learning.
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook
Machine Learning Notebooks
Machine Learning notebooks for refreshing concepts.
Stars: ✭ 222 (-0.89%)
Mutual labels:  jupyter-notebook
Gan steerability
On the "steerability" of generative adversarial networks
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook
Source separation
Deep learning-based speech source separation using PyTorch
Stars: ✭ 226 (+0.89%)
Mutual labels:  jupyter-notebook
Lrp toolbox
The LRP Toolbox provides simple and accessible stand-alone implementations of LRP for artificial neural networks, supporting Matlab and Python. It also implements LRP functionality for the Caffe deep learning framework as an extension of the Caffe source code published in 10/2015.
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook
Attention network with keras
An example attention network with a simple dataset.
Stars: ✭ 225 (+0.45%)
Mutual labels:  jupyter-notebook

set_transformer

Official PyTorch implementation of the paper Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks.

Requirements

  • Python 3
  • torch >= 1.0
  • matplotlib
  • scipy
  • tqdm
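
These can be installed with pip; the exact torch build depends on your platform and CUDA setup, so the command below is just an example:

pip install "torch>=1.0" matplotlib scipy tqdm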

Abstract

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances. Since solutions to such problems do not depend on the order of elements of the set, models used to address them should be permutation invariant. We present an attention-based neural network module, the Set Transformer, specifically designed to model interactions among elements in the input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. In an effort to reduce computational complexity, we introduce an attention scheme inspired by inducing point methods from the sparse Gaussian process literature. It reduces the computation time of self-attention from quadratic to linear in the number of elements in the set. We show that our model is theoretically attractive, and we evaluate it on a range of tasks, demonstrating state-of-the-art performance compared to recent methods for set-structured data.
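
As a rough illustration of the inducing-point idea, the sketch below routes attention through m learned inducing points so the cost is O(n·m) rather than O(n²) in the set size n. This is a minimal sketch, not the modules shipped in this repository, and nn.MultiheadAttention with batch_first requires torch >= 1.9, newer than the 1.0 minimum listed above:

import torch
import torch.nn as nn

class MAB(nn.Module):
    # Multihead attention block: Q attends to K, with residuals and layer norm.
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ln0, self.ln1 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, Q, K):
        H = self.ln0(Q + self.attn(Q, K, K)[0])
        return self.ln1(H + self.ff(H))

class ISAB(nn.Module):
    # Induced set attention: route attention through m learned inducing points,
    # reducing the cost from O(n^2) to O(n*m) in the set size n.
    def __init__(self, dim, num_inds=16, num_heads=4):
        super().__init__()
        self.I = nn.Parameter(torch.randn(1, num_inds, dim))
        self.mab0 = MAB(dim, num_heads)  # inducing points summarize the set
        self.mab1 = MAB(dim, num_heads)  # set elements read the summary back

    def forward(self, X):  # X: (batch, n, dim)
        H = self.mab0(self.I.expand(X.size(0), -1, -1), X)
        return self.mab1(X, H)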

Experiments

This repository implements the maximum value regression (Section 5.1), amortized clustering (Section 5.3), and point cloud classification (Section 5.5) experiments from the paper.

Maximum Value Regression

This experiment is reproduced in max_regression_demo.ipynb.
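
The task itself is small: regress the maximum of a set of random scalars. A self-contained sketch of the data generation (this construction is illustrative; the notebook may differ in details):

import torch

def max_regression_batch(batch_size=32, set_size=10):
    # Each example is a set of scalars; the regression target is its maximum.
    X = 100.0 * torch.rand(batch_size, set_size, 1)  # (batch, n, 1)
    y = X.max(dim=1).values.squeeze(-1)              # (batch,)
    return X, y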

Amortized Clustering

To run the amortized clustering experiment with Set Transformer, run

python run.py --net=set_transformer

To run the same experiment with Deep Sets, run

python run.py --net=deepset
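
In this experiment each input set is drawn from a random mixture of Gaussians, and the network amortizes clustering by predicting the mixture parameters in a single forward pass. A rough sketch of such set data follows; the parameter ranges are illustrative, not the defaults of run.py:

import torch

def sample_mog_set(set_size=300, k=4, dim=2):
    # One set sampled from a random k-component Gaussian mixture in `dim` dimensions.
    means = 8.0 * torch.rand(k, dim) - 4.0   # component centers in [-4, 4]^dim
    comps = torch.randint(k, (set_size,))    # component assignment per point
    X = means[comps] + 0.3 * torch.randn(set_size, dim)
    return X, means                          # the model regresses the component means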

Point Cloud Classification

We used the preprocessed ModelNet40 dataset from the DeepSets paper. We cannot publicly share this file due to copyright and license issues. To run this code, you must obtain the preprocessed dataset "ModelNet40_cloud.h5". We recommend using multiple GPUs for this experiment; we used 8 Tesla P40s.
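
Once you have the file, h5py can be used to inspect and subsample it. The key names below are placeholders for illustration, so check f.keys() against the actual file:

import h5py
import numpy as np

with h5py.File("ModelNet40_cloud.h5", "r") as f:
    print(list(f.keys()))               # inspect the real layout first
    # Key names below are assumptions; adjust to match what keys() prints.
    clouds = np.asarray(f["tr_cloud"])  # e.g. (num_shapes, points_per_shape, 3)
    labels = np.asarray(f["tr_labels"])

# Subsample each cloud to num_pts points, mirroring the --num_pts flag below.
num_pts = 1000
idx = np.random.choice(clouds.shape[1], num_pts, replace=False)
clouds = clouds[:, idx, :]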

To run the point cloud classification experiment, run

python main_pointcloud.py --batch_size 256 --num_pts 100
python main_pointcloud.py --batch_size 256 --num_pts 1000
python main_pointcloud.py --batch_size 256 --num_pts 5000

The hyperparameters here were only minimally tuned yet reproduce the results in the paper; further tuning will likely yield better results.

Reference

If you find the provided code useful, please consider citing our work.

@InProceedings{lee2019set,
    title={Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks},
    author={Lee, Juho and Lee, Yoonho and Kim, Jungtaek and Kosiorek, Adam and Choi, Seungjin and Teh, Yee Whye},
    booktitle={Proceedings of the 36th International Conference on Machine Learning},
    pages={3744--3753},
    year={2019}
}