
jthickstun / Pytorch_musicnet

PyTorch DataSet and Jupyter demos for MusicNet

Projects that are alternatives to, or similar to, Pytorch musicnet

Siraj chatbot challenge
Entry for machine learning
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Bnlearn
Python package for learning the graphical structure of Bayesian networks, parameter learning, inference and sampling methods.
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Workshops
DSSG Workshops
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Sgn 41007
Supplementary materials for course "Pattern Recognition and Machine Learning" at
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Russiansuperglue
Russian SuperGLUE benchmark
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Machine Learning Asset Management
Machine Learning in Asset Management (by @firmai)
Stars: ✭ 1,060 (+1978.43%)
Mutual labels:  jupyter-notebook
Scona
Code to analyse structural covariance brain networks using Python.
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Bookcnn
The book 《深度卷积网络:原理与实践》 (Deep Convolutional Networks: Principles and Practice) is now on sale on Taobao, Tmall, JD, and Dangdang. This is the code download for the book.
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Stock Trading
Example code for the book 『파이썬과 리액트를 활용한 주식 자동 시스템』 (Automated Stock Trading System with Python and React)
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Training toolbox caffe
Training Toolbox for Caffe
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Bigartm Book
Topic modeling with BigARTM: an interactive book
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Pytorch Transfomer
My implementation of the transformer architecture from the "Attention Is All You Need" paper, applied to time series.
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Gym Continuousdoubleauction
A custom MARL (multi-agent reinforcement learning) environment where multiple agents trade against one another (self-play) in a zero-sum continuous double auction. Ray [RLlib] is used for training.
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Doubletdetection
Doublet detection in single-cell RNA-seq data.
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Phik
Phi_K correlation analyzer library
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Sketchback
Keras implementation of sketch inversion using deep convolution neural networks (synthesising photo-realistic images from pencil sketches)
Stars: ✭ 50 (-1.96%)
Mutual labels:  jupyter-notebook
Advbox
Advbox is a toolbox for generating adversarial examples that fool neural networks built with PaddlePaddle, PyTorch, Caffe2, MXNet, Keras, and TensorFlow, and it can benchmark the robustness of machine learning models. Advbox also provides a command-line tool to generate adversarial examples with zero coding.
Stars: ✭ 1,055 (+1968.63%)
Mutual labels:  jupyter-notebook
Blackbox Attack
Blackbox attacks for deep neural network models
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Pix2code Template
Build a neural network that codes a basic HTML and CSS website from a picture of a design mockup.
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook
Machine Learning Decal Spring 2019
A 2-unit decal run by [email protected]'s education team
Stars: ✭ 51 (+0%)
Mutual labels:  jupyter-notebook

pytorch_musicnet

This repository provides PyTorch DataSet and DataLoader classes for MusicNet (musicnet.py), along with two Jupyter notebooks that demonstrate how to use this dataset to train a simple MusicNet model with PyTorch. The DataSet class provides a built-in download method to acquire MusicNet. Note that MusicNet expands to ~50GB uncompressed. The download process leaves behind temporary files that can be safely removed if you need to reclaim space: the musicnet.tar.gz archive and the raw ".wav" files in the data directories.

MusicNet models will often optimize more efficiently if you can afford to load the entire dataset into memory (~21GB of RAM). If you don't have this much RAM, you can set mmap=False in the DataSet class to load data directly from disk.
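The trade-off behind the mmap option above can be sketched with Python's standard library alone. This is a hypothetical illustration, not the repository's actual musicnet.py implementation: the file name, sizes, and helper functions below are invented for the example, and it only contrasts reading a file fully into RAM against memory-mapping it and slicing windows lazily from disk.

```python
import mmap
import os
import tempfile

# Hypothetical stand-in for a raw audio file; the name and size are
# invented for this sketch and are not part of the MusicNet dataset.
def write_fake_audio(path, n_bytes=1 << 16):
    with open(path, "wb") as f:
        f.write(bytes(i % 256 for i in range(n_bytes)))

def read_in_memory(path):
    """Load the whole file into RAM: fast random access, high memory use."""
    with open(path, "rb") as f:
        return f.read()

def read_window_via_mmap(path, start, length):
    """Memory-map the file and slice one window: the OS pages in only the
    bytes actually touched, so memory use stays low."""
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            return mm[start:start + length]

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "audio.bin")
    write_fake_audio(path)
    full = read_in_memory(path)
    window = read_window_via_mmap(path, start=1024, length=512)
    assert window == full[1024:1536]  # both paths yield identical bytes
```

Both approaches return the same bytes; the difference is where they live. Loading everything into RAM (as when the full ~21GB dataset fits in memory) makes repeated random windowing during training fast, while the memory-mapped path trades speed for a much smaller resident footprint.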

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].