boathit / deepgtt

Licence: other
DeepGTT: Learning Travel Time Distributions with Deep Generative Model

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language
Julia
2034 projects

Projects that are alternatives to or similar to deepgtt

Cada Vae Pytorch
Official implementation of the paper "Generalized Zero- and Few-Shot Learning via Aligned Variational Autoencoders" (CVPR 2019)
Stars: ✭ 198 (+560%)
Mutual labels:  vae
DeepSSM SysID
Official PyTorch implementation of "Deep State Space Models for Nonlinear System Identification", 2020.
Stars: ✭ 62 (+106.67%)
Mutual labels:  vae
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+3936.67%)
Mutual labels:  vae
Vq Vae
Minimalist implementation of VQ-VAE in Pytorch
Stars: ✭ 224 (+646.67%)
Mutual labels:  vae
Human body prior
VPoser: Variational Human Pose Prior
Stars: ✭ 244 (+713.33%)
Mutual labels:  vae
MIDI-VAE
No description or website provided.
Stars: ✭ 56 (+86.67%)
Mutual labels:  vae
Twostagevae
Stars: ✭ 192 (+540%)
Mutual labels:  vae
InpaintNet
Code accompanying ISMIR'19 paper titled "Learning to Traverse Latent Spaces for Musical Score Inpainting"
Stars: ✭ 48 (+60%)
Mutual labels:  vae
Video prediction
Stochastic Adversarial Video Prediction
Stars: ✭ 247 (+723.33%)
Mutual labels:  vae
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+466.67%)
Mutual labels:  vae
Tf Vqvae
Tensorflow Implementation of the paper [Neural Discrete Representation Learning](https://arxiv.org/abs/1711.00937) (VQ-VAE).
Stars: ✭ 226 (+653.33%)
Mutual labels:  vae
Vae Cvae Mnist
Variational Autoencoder and Conditional Variational Autoencoder on MNIST in PyTorch
Stars: ✭ 229 (+663.33%)
Mutual labels:  vae
language-models
Keras implementations of three language models: character-level RNN, word-level RNN and Sentence VAE (Bowman, Vilnis et al 2016).
Stars: ✭ 39 (+30%)
Mutual labels:  vae
Pytorch Vq Vae
PyTorch implementation of VQ-VAE by Aäron van den Oord et al.
Stars: ✭ 204 (+580%)
Mutual labels:  vae
EfficientMORL
EfficientMORL (ICML'21)
Stars: ✭ 22 (-26.67%)
Mutual labels:  vae
S Vae Tf
Tensorflow implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 198 (+560%)
Mutual labels:  vae
pytorch-convcnp
A PyTorch Implementation of Convolutional Conditional Neural Process.
Stars: ✭ 41 (+36.67%)
Mutual labels:  deep-generative-models
vae-concrete
Keras implementation of a Variational Auto Encoder with a Concrete Latent Distribution
Stars: ✭ 51 (+70%)
Mutual labels:  vae
probai-2019
Materials of the Nordic Probabilistic AI School 2019.
Stars: ✭ 127 (+323.33%)
Mutual labels:  deep-generative-models
Fun-with-MNIST
Playing with MNIST. Machine Learning. Generative Models.
Stars: ✭ 23 (-23.33%)
Mutual labels:  vae

DeepGTT

This repository holds the code used in our WWW-19 paper: Learning Travel Time Distributions with Deep Generative Model.

Requirements

  • Ubuntu OS (16.04 and 18.04 are tested)
  • Julia >= 1.0
  • Python >= 3.6
  • PyTorch >= 0.4 (both 0.4 and 1.0 are tested)

Please refer to the source code for the required packages in both Julia and Python. You can install the Julia packages from the shell as follows:

julia -e 'using Pkg; Pkg.add("HDF5"); Pkg.add("CSV"); Pkg.add("DataFrames"); Pkg.add("Distances"); Pkg.add("StatsBase"); Pkg.add("JSON"); Pkg.add("Lazy"); Pkg.add("JLD2"); Pkg.add("ArgParse")'
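
On the Python side, a minimal environment can likely be set up with pip. The package list below is an assumption inferred from the project's dependencies (PyTorch, HDF5 data files, and the PostgreSQL road network server); check the imports under harbin/python if anything is missing:

pip install torch h5py numpy psycopg2-binary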

Dataset

The dataset contains 1 million+ trips collected by 13,000+ taxi cabs over 5 days. This dataset is a subset of the one used in the paper, but it suffices to reproduce results very close to those reported there.

git clone https://github.com/boathit/deepgtt

cd deepgtt && mkdir -p data/h5path data/jldpath data/trainpath data/validpath data/testpath

Download the dataset and put the extracted *.h5 files into deepgtt/data/h5path.

Data format

Each .h5 file contains all trips of one day. Each trip has three fields: lon (longitude), lat (latitude), and tms (timestamp). You can read an h5 file with the readtripsh5 function in Julia. If you want to use your own data, refer to readtripsh5 to dump your trajectories into HDF5 files with the same layout.
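
If you prefer to inspect a file from Python, a sketch along these lines should work with h5py. The file name and the group/dataset paths (/trip/1/lon, etc.) are assumptions inferred from the field names above; treat readtripsh5 as the authoritative definition of the layout:

import h5py

# "150104.h5" is a hypothetical file name following the date-style
# naming used elsewhere in this README (e.g. 150106.h5).
with h5py.File("150104.h5", "r") as f:
    # Assumed layout: one group per trip holding lon/lat/tms datasets.
    # Verify against readtripsh5 before relying on these paths.
    lon = f["/trip/1/lon"][:]
    lat = f["/trip/1/lat"][:]
    tms = f["/trip/1/tms"][:]
    print(len(lon), lon[0], lat[0], tms[0])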

Preprocessing

Map matching

First, set up the map server and matching server by referring to barefoot.

Then, match the trips:

cd deepgtt/harbin/julia

julia -p 6 mapmatch.jl --inputpath ../data/h5path --outputpath ../data/jldpath

where 6 is the number of CPU cores available on your machine.

Generate training, validation and test data

julia gentraindata.jl --inputpath ../data/jldpath --outputpath ../data/trainpath

cd .. && mv data/trainpath/150106.h5 data/validpath && mv data/trainpath/150107.h5 data/testpath

Training

To run the Python code, make sure you have set up the road network PostgreSQL server by referring to the map server setup in barefoot. The road network server (see this file) is used to provide road segment features for the model.
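
Before launching training, it can help to sanity-check that the road network database is reachable from Python. This is only a sketch: the host, port, database name, and credentials here are placeholders, not values from this project; substitute the connection parameters from your own barefoot setup:

import psycopg2

# All connection parameters below are assumptions; use the values
# from your barefoot map server configuration.
conn = psycopg2.connect(host="localhost", port=5432,
                        dbname="harbin", user="osmuser",
                        password="password")
with conn.cursor() as cur:
    cur.execute("SELECT 1")  # trivial query: confirms the server responds
    print(cur.fetchone())
conn.close()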

cd deepgtt/harbin/python

python train.py -trainpath ../data/trainpath -validpath ../data/validpath -kl_decay 0.0 -use_selu -random_emit

Testing

python estimate.py -testpath ../data/testpath

Reference

@inproceedings{www19xc,
  author    = {Xiucheng Li and
               Gao Cong and
               Aixin Sun and
               Yun Cheng},
  title     = {Learning Travel Time Distributions with Deep Generative Model},
  booktitle = {Proceedings of the 2019 World Wide Web Conference on World Wide Web,
               {WWW} 2019, San Francisco, California, May 13-17, 2019},
  year      = {2019},
}