
alexarnimueller / Lstm_peptides

Licence: other
Long short-term memory recurrent neural networks for learning peptide and protein sequences to later design new, similar examples.

Programming Languages

python

Projects that are alternatives to or similar to LSTM_peptides

Basicocr
BasicOCR is a project dedicated to research on scene-text recognition algorithms. It was initiated and is maintained by the Tongpai AI team of the Great Wall Digital Big Data Application Technology Research Institute.
Stars: ✭ 336 (+1020%)
Mutual labels:  lstm, rnn
Multi Class Text Classification Cnn Rnn
Classify Kaggle San Francisco Crime Description into 39 classes. Build the model with CNN, RNN (GRU and LSTM) and Word Embeddings on Tensorflow.
Stars: ✭ 570 (+1800%)
Mutual labels:  lstm, rnn
Thesemicolon
This repository contains IPython notebooks and datasets for the data analytics YouTube tutorials on The Semicolon.
Stars: ✭ 345 (+1050%)
Mutual labels:  lstm, rnn
Rnnsharp
RNNSharp is a toolkit of deep recurrent neural networks that is widely used for many different kinds of tasks, such as sequence labeling and sequence-to-sequence learning. It is written in C# and based on .NET Framework 4.6 or above. RNNSharp supports many different types of networks, such as forward and bi-directional networks and sequence-to-sequence networks, and different types of layers, such as LSTM, softmax, sampled softmax and others.
Stars: ✭ 277 (+823.33%)
Mutual labels:  lstm, rnn
Lstm Sentiment Analysis
Sentiment Analysis with LSTMs in Tensorflow
Stars: ✭ 886 (+2853.33%)
Mutual labels:  lstm, rnn
Unet Zoo
A collection of UNet and hybrid architectures in PyTorch for 2D and 3D Biomedical Image segmentation
Stars: ✭ 302 (+906.67%)
Mutual labels:  lstm, rnn
Video Classification
Tutorial for video classification/action recognition using 3D CNN/CNN+RNN on UCF101
Stars: ✭ 543 (+1710%)
Mutual labels:  lstm, rnn
Pytorch-POS-Tagger
Part-of-Speech Tagger and custom implementations of LSTM, GRU and Vanilla RNN
Stars: ✭ 24 (-20%)
Mutual labels:  lstm, rnn
Deep Music Genre Classification
🎵 Using Deep Learning to Categorize Music as Time Progresses Through Spectrogram Analysis
Stars: ✭ 23 (-23.33%)
Mutual labels:  lstm, rnn
Ad examples
A collection of anomaly detection methods (iid/point-based, graph and time series) including active learning for anomaly detection/discovery, bayesian rule-mining, description for diversity/explanation/interpretability. Analysis of incorporating label feedback with ensemble and tree-based detectors. Includes adversarial attacks with Graph Convolutional Network.
Stars: ✭ 641 (+2036.67%)
Mutual labels:  lstm, rnn
Deeplearning.ai Assignments
Stars: ✭ 268 (+793.33%)
Mutual labels:  lstm, rnn
Seq2seq Chatbot
Chatbot in 200 lines of code using TensorLayer
Stars: ✭ 777 (+2490%)
Mutual labels:  lstm, rnn
Lstm Human Activity Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
Stars: ✭ 2,943 (+9710%)
Mutual labels:  lstm, rnn
Tensorflow poems
A bot that automatically composes classical Chinese poetry, based on the TensorFlow 1.10 API. Under active maintenance and upgrades; star the repository to stay updated!
Stars: ✭ 3,429 (+11330%)
Mutual labels:  lstm, rnn
sgrnn
Tensorflow implementation of Synthetic Gradient for RNN (LSTM)
Stars: ✭ 40 (+33.33%)
Mutual labels:  lstm, rnn
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (+1123.33%)
Mutual labels:  lstm, rnn
tiny-rnn
Lightweight C++11 library for building deep recurrent neural networks
Stars: ✭ 41 (+36.67%)
Mutual labels:  lstm, rnn
hcn
Hybrid Code Networks https://arxiv.org/abs/1702.03274
Stars: ✭ 81 (+170%)
Mutual labels:  lstm, rnn
Telemanom
A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions.
Stars: ✭ 589 (+1863.33%)
Mutual labels:  lstm, rnn
Stockpriceprediction
Stock Price Prediction using Machine Learning Techniques
Stars: ✭ 700 (+2233.33%)
Mutual labels:  lstm, rnn

LSTM_peptides

Introduction

This repository contains scripts for training a generative long short-term memory (LSTM) recurrent neural network on peptide sequences. The user provides sets of amino acid sequences to train the model and can then sample new sequences that should be similar to the training data. As such, artificial intelligence is put in charge of the de novo design of new peptide sequences. The code in this repository relies on the Keras package by Chollet and others (https://github.com/fchollet/keras) with the TensorFlow backend (http://tensorflow.org), as well as on scikit-learn (http://scikit-learn.org) and modlamp (https://modlamp.org).
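For orientation, the sketch below shows the general shape of such a model in Keras: stacked LSTM layers that predict the next amino acid at every sequence position from a one-hot encoded input. The layer sizes, sequence length and 22-symbol alphabet (20 amino acids plus assumed start and padding tokens) are illustrative assumptions, not the repository's exact architecture.

# Minimal Keras sketch of a generative character-level LSTM
# (illustrative; not the exact model built by LSTM_peptides.py).
from keras.models import Sequential
from keras.layers import LSTM, TimeDistributed, Dense
from keras.optimizers import Adam

maxlen, n_symbols = 48, 22  # assumed sequence length and alphabet size

model = Sequential()
model.add(LSTM(256, return_sequences=True, input_shape=(maxlen, n_symbols)))
model.add(LSTM(256, return_sequences=True))
model.add(TimeDistributed(Dense(n_symbols, activation='softmax')))
model.compile(optimizer=Adam(lr=0.01), loss='categorical_crossentropy')
# model.fit(X, y, epochs=50, batch_size=128, validation_split=0.2)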

Content

  • README.md: this file
  • LSTM_peptides.py: contains the main code in the following two classes:
    • SequenceHandler: class that is used for reading amino acid sequences and translating them into a one-hot vector encoding.
    • Model: class that generates and trains the model, can perform cross-validation and plot training and validation loss.
  • requirements.txt: requirements / package dependencies
  • LICENSE: MIT open-source license
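
The one-hot encoding performed by SequenceHandler can be pictured as in the following sketch (an illustration under assumed conventions, not the repository's exact implementation): each amino acid becomes a unit vector over the alphabet, and positions beyond the end of the sequence stay zero as padding.

# Illustrative one-hot encoding of peptide sequences (assumed alphabet
# of the 20 canonical amino acids; not the exact SequenceHandler code).
import numpy as np

ALPHABET = 'ACDEFGHIKLMNPQRSTVWY'
CHAR2IDX = {c: i for i, c in enumerate(ALPHABET)}

def one_hot(seq, maxlen):
    """Encode an amino acid sequence as a (maxlen, len(ALPHABET)) matrix."""
    x = np.zeros((maxlen, len(ALPHABET)), dtype=np.float32)
    for i, aa in enumerate(seq[:maxlen]):
        x[i, CHAR2IDX[aa]] = 1.0
    return x  # rows beyond len(seq) remain all-zero (padding)

print(one_hot('GLFDIVK', maxlen=10).shape)  # (10, 20)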

How To Install And Use

Clone the repository to your computer:

git clone https://github.com/alexarnimueller/LSTM_peptides

Then install all requirements listed in requirements.txt. Inside the cloned folder, type:

pip install -r requirements.txt

Finally, run the model as follows (providing your own parameters; see the list below):

python LSTM_peptides.py --dataset $TRAINING_DATA_FILE --name $YOUR_RUN_NAME $FURTHER_OPTIONAL_PARAMETERS
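
The dataset file is expected to be plain text with one amino acid sequence per line, for example (arbitrary example sequences, not taken from the repository's training set):

GLFDIVKKVAGALGSL
KLAKLAKKLAKLAK
FLPIVGKLLSGLL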

Parameters:

  • dataset (default=training_sequences_noC.csv)
    • file containing training data with one sequence per line
  • name (default=test)
    • run name for all generated data; a new directory will be created with this name
  • batch_size (OPTIONAL, default=128)
    • Batch size to use by the model.
  • epochs (OPTIONAL, default=50)
    • Number of epochs to train the model.
  • layers (OPTIONAL, default=2)
    • Number of LSTM layers in the model.
  • neurons (OPTIONAL, default=256)
    • Number of units per LSTM layer.
  • cell (OPTIONAL, default=LSTM)
    • type of recurrent cell to use; available: LSTM, GRU
  • dropout (OPTIONAL, default=0.1)
    • Fraction of dropout to apply to the network. The fraction scales with depth: layer 1 gets 1*dropout, layer 2 gets 2*dropout, and so on.
  • train (OPTIONAL, default=True)
    • Whether to train the model (True) or just sample from a pre-trained model (False).
  • valsplit (OPTIONAL, default=0.2)
    • Fraction of the data to use for model validation. If 0, no validation is performed.
  • sample (OPTIONAL, default=100)
    • Number of sequences to sample from the model after training.
  • temp (OPTIONAL, default=1.25)
    • Temperature to use for sampling (see the sampling sketch below this list).
  • maxlen (OPTIONAL, default=0)
    • Maximum sequence length allowed when sampling new sequences. If 0, the length of the longest sequence in the training data is used.
  • startchar (OPTIONAL, default=B)
    • character used to mark the start of a sequence
  • lr (OPTIONAL, default=0.01)
    • Learning rate to be used for Adam optimizer.
  • modfile (OPTIONAL, default=None)
    • If train=False, a pre-trained model file needs to be provided, e.g. modfile=./checkpoint/model_epoch_49.hdf5.
  • cv (OPTIONAL, default=None)
    • Folds of cross-validation to use for model validation. If None, no cross-validation is performed.
  • window (OPTIONAL, default=0)
    • Size of the sliding window used to augment the training data (see the sliding-window sketch below this list). If 0, all sequences are padded to the length of the longest sequence in the data set.
  • step (OPTIONAL, default=1)
    • Step size by which to move the sliding window or the prediction target.
  • target (OPTIONAL, default=all)
    • whether to learn to predict all following characters in the sequence or just the single next character
  • padlen (OPTIONAL, default=0)
    • number of trailing padding spaces to add to the sequences. If 0, sequences are padded to the length of the longest sequence in the dataset.
  • refs (OPTIONAL, default=True)
    • whether reference sequence sets should be generated for the analysis
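
To make the temp parameter concrete, the following sketch shows standard temperature sampling (an illustration of the technique, not necessarily the script's exact code): the predicted next-character distribution is re-weighted by the temperature before a character is drawn. Higher temperatures yield more diverse sequences, lower temperatures more conservative ones.

# Illustrative temperature sampling from a next-character distribution.
import numpy as np

def sample_index(probs, temp=1.25):
    """Draw a character index from `probs` re-weighted by `temp`."""
    logits = np.log(np.asarray(probs, dtype=np.float64) + 1e-12) / temp
    p = np.exp(logits) / np.sum(np.exp(logits))
    return np.random.choice(len(p), p=p)

Similarly, the window and step parameters describe a sliding-window augmentation of the training data, which can be pictured as follows (again an illustrative sketch, not the repository's exact implementation):

# Cut a sequence into overlapping fragments of length `window`,
# shifting by `step` positions each time.
def sliding_windows(seq, window, step=1):
    return [seq[i:i + window] for i in range(0, len(seq) - window + 1, step)]

print(sliding_windows('GLFDIVKKVV', window=5, step=2))
# ['GLFDI', 'FDIVK', 'IVKKV']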

Example: training a 2-layer model with 64 neurons on new sequences for 100 epochs

python LSTM_peptides.py --name train100 --dataset new_sequences.csv --layers 2 --neurons 64 --epochs 100

Example: sampling 100 sequences from a pre-trained model

python LSTM_peptides.py --name testsample --modfile pretrained_model/checkpoint/model_epoch_99.hdf5 --train False --sample 100

Example: finetune a pre-trained model on a finetuning set for 10 epochs

python LSTM_peptides.py --name finetune10 --dataset finetune_set.csv --modfile pretrained_model/checkpoint/model_epoch_99.hdf5 --epochs 10 --train False --finetune True

Cite

When using this code for any publication, please cite the following article:

A. T. Müller, J. A. Hiss, G. Schneider, "Recurrent Neural Network Model for Constructive Peptide Design", J. Chem. Inf. Model. 2018, DOI: 10.1021/acs.jcim.7b00414.
