slSeanWU / Jazz_transformer

License: MIT
Transformer-XL for Jazz music composition. Paper: "The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-Composed Music through Quantitative Measures", ISMIR 2020


The Jazz Transformer

An adapted Transformer-XL deep learning model that composes Jazz music (lead sheets—chord progression & melody).

TensorFlow implementation of the automatic music composition model presented in our paper:

  • The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-composed Music through Quantitative Measures
    Shih-Lun Wu and Yi-Hsuan Yang
    The 21st International Society for Music Information Retrieval Conference (ISMIR), 2020.

Want to listen to some compositions by the Jazz Transformer first? Click here!

Usage Notes

Prerequisites

  • Python 3.6
  • Recommended: a working GPU with ≥2 GB of memory
  • Install dependencies (with pip or pip3, depending on your system)
pip3 install -r requirements.txt

Compose Some Songs Right Away

  • Download pretrained model
./download_model.sh
  • Inference (compose)
python3 inference.py [--model MODEL] [--temp TEMP] [--struct_csv CSV] [--n_bars N_BARS] output_midi
  • output_midi: path to the output MIDI file
  • --model MODEL: path to the trained model checkpoint (default: the downloaded checkpoint)
  • --temp TEMP: sampling temperature for generation (default: 1.2)
  • --n_bars N_BARS: number of bars to generate (default: 32)
  • --struct_csv CSV: path to the output csv file that records generated structure-related events (optional)
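The --temp flag controls the sampling temperature. As an illustrative sketch only (not the repository's actual sampling code), temperature scaling divides the model's logits by TEMP before the softmax, so higher temperatures flatten the distribution (more adventurous, less predictable music) while lower temperatures sharpen it toward the most likely events:

```python
import math
import random

def sample_with_temperature(logits, temp=1.2):
    """Sample an index from `logits` after temperature scaling.

    Higher `temp` flattens the distribution; lower `temp` sharpens it.
    This mirrors what --temp controls conceptually; the model's actual
    sampling code may differ in details.
    """
    scaled = [l / temp for l in logits]
    m = max(scaled)                              # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = random.random()                          # inverse-CDF sampling
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

idx = sample_with_temperature([2.0, 1.0, 0.1], temp=1.2)
```

With a very low temperature the sampler collapses to (near-)argmax, which for music generation tends to produce repetitive output; the paper's default of 1.2 trades some coherence for variety.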

Train from Scratch

  • Preprocess dataset
./data_preprocess.sh
  • Train the model
python3 train.py ckpt_dir log_file
  • ckpt_dir: directory to save checkpoints
  • log_file: path to the log file

Likewise, you may compose music with a model you trained yourself using inference.py (see above for instructions).
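As a sketch, the train-then-compose workflow can also be driven from a Python script. All paths below are hypothetical placeholders, and the checkpoint filename inside ckpt_dir is an assumption; check what train.py actually writes before pointing --model at it:

```python
import subprocess

# Hypothetical paths: substitute your own.
ckpt_dir = "ckpt/my_run"            # where train.py saves checkpoints
log_file = "logs/my_run.log"        # training log
model_ckpt = ckpt_dir + "/model"    # assumed checkpoint name; verify after training

train_cmd = ["python3", "train.py", ckpt_dir, log_file]
infer_cmd = ["python3", "inference.py",
             "--model", model_ckpt,
             "--temp", "1.2",
             "--n_bars", "32",
             "my_song.midi"]

for cmd in (train_cmd, infer_cmd):
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually run each step
```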

Directory Structure

├── data_preprocess.sh      (executes python scripts to build vocab and prepare data) 
├── inference.py            (generates Jazz music)
├── requirements.txt        (python dependencies)
├── train.py                (trains Transformer-XL from scratch)
├── data                    (.pkl files for training)
├── mcsv_beat               (Jazzomat dataset content---beats+chords)
├── mcsv_melody             (Jazzomat dataset content---solo melody)
├── output                  (sample generated piece)
│   ├── demo.csv
│   ├── demo.midi
├── pickles                 (houses required metadata for training)
├── remi_encs_struct        (contains training data in readable REMI event sequences)
├── src
│   ├── build_chord_profile.py   (reads and stores key templates for different chord types defined in ``chord_profile.txt``)
│   ├── build_vocab.py           (builds the vocabulary for the Jazz Transformer)
│   ├── chord_processor.py       (the class and methods for converting notes to chords and vice versa)
│   ├── chord_profile.txt        (hand-crafted key templates for each chord type)
│   ├── containers.py            (container classes for events in mcsv files)
│   ├── convert_to_remi.py       (converts Jazzomat dataset to REMI events for training)
│   ├── explore_mcsv.py          (utilities for reading events from dataset .csv files)
│   ├── mcsv_to_midi.py          (converts mcsv file to midi format)
│   ├── midi_decoder.py          (the class and methods for conversion from REMI to midi)
│   ├── mlus_events.txt          (the mlu events used by the Jazz Transformer)
│   ├── mlu_processor.py         (the class and methods for defining and parsing Mid-level Unit (MLU) events)
│   ├── prepare_data.py          (splits data into training and validation sets before training the Jazz transformer)
│   ├── remi_containers.py       (container classes for REMI events)
│   ├── utils.py                 (miscellaneous utilities)
├── transformer_xl
│   ├── model_aug.py             (Jazz Transformer model)
│   ├── modules.py               (functions for constructing Transformer-XL)

Acknowledgements

The Jazz Transformer is trained on the Weimar Jazz Database (WJazzD), a dataset meticulously annotated by the Jazzomat Research Project (@ University of Music FRANZ LISZT Weimar). Many thanks to them for their great work and for making it publicly accessible!

Also, we would like to thank Yi-Jen Shih (@ NTUEE, personal GitHub) for his help in organizing the code in this repository.
