
compsciencelab / Ligdream

Licence: agpl-3.0
Novel molecules from a reference shape!

Programming Languages

python3

Projects that are alternatives to or similar to Ligdream

Dmm
Deep Markov Models
Stars: ✭ 103 (+119.15%)
Mutual labels:  jupyter-notebook, generative-model
First Order Model
This repository contains the source code for the paper First Order Motion Model for Image Animation
Stars: ✭ 11,964 (+25355.32%)
Mutual labels:  jupyter-notebook, generative-model
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (+112.77%)
Mutual labels:  jupyter-notebook, generative-model
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (+21.28%)
Mutual labels:  jupyter-notebook, generative-model
Tf Vqvae
Tensorflow Implementation of the paper [Neural Discrete Representation Learning](https://arxiv.org/abs/1711.00937) (VQ-VAE).
Stars: ✭ 226 (+380.85%)
Mutual labels:  jupyter-notebook, generative-model
Mine Mutual Information Neural Estimation
A pytorch implementation of MINE(Mutual Information Neural Estimation)
Stars: ✭ 167 (+255.32%)
Mutual labels:  jupyter-notebook, generative-model
Psgan
Periodic Spatial Generative Adversarial Networks
Stars: ✭ 108 (+129.79%)
Mutual labels:  jupyter-notebook, generative-model
Discogan Pytorch
PyTorch implementation of "Learning to Discover Cross-Domain Relations with Generative Adversarial Networks"
Stars: ✭ 961 (+1944.68%)
Mutual labels:  jupyter-notebook, generative-model
Neuralnetworks.thought Experiments
Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
Stars: ✭ 199 (+323.4%)
Mutual labels:  jupyter-notebook, generative-model
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (+302.13%)
Mutual labels:  jupyter-notebook, generative-model
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GANs Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+487.23%)
Mutual labels:  jupyter-notebook, generative-model
Pytorch Mnist Vae
Stars: ✭ 32 (-31.91%)
Mutual labels:  jupyter-notebook, generative-model
Deepdream pytorch
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook
Sub Gc
Code repository for our paper "Comprehensive Image Captioning via Scene Graph Decomposition" in ECCV 2020.
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook
Variantnet
A simple neural network for calling het-/hom-variants from alignments of single molecule reads to a reference
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook
Sdc System Integration
Self Driving Car Engineer Nanodegree System Integration Capstone Project
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook
Price Forecaster
Forecasting the future prices of BTC and More using Machine and Deep Learning Models
Stars: ✭ 47 (+0%)
Mutual labels:  jupyter-notebook
Animefacenotebooks
notebooks and some data for playing with animeface stylegan2 and deepdanbooru
Stars: ✭ 47 (+0%)
Mutual labels:  jupyter-notebook
Marvin Public Engines
Marvin AI has been accepted into the Apache Foundation and is now available at https://github.com/apache/incubator-marvin
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook
Ce9010 2018
Python notebooks and slides for CE9010: Introduction to Data Science, Semester 2 2017/18
Stars: ✭ 46 (-2.13%)
Mutual labels:  jupyter-notebook

LigDream: Shape-Based Compound Generation

Citing

If you are using content of this repository, please consider citing the following work:

@article{skalic2019shape,
  title={Shape-Based Generative Modeling for de-novo Drug Design},
  author={Skalic, Miha and Jim{\'e}nez Luna, Jos{\'e} and Sabbadin, Davide and De Fabritiis, Gianni},
  journal={Journal of Chemical Information and Modeling},
  year={2019},
  doi={10.1021/acs.jcim.8b00706},
  publisher={ACS Publications}
}

Requirements

Model training is written in pytorch==0.3.1 and uses keras==2.2.2 for data loaders. RDKit==2017.09.2.0 and HTMD==1.13.9 are needed for molecule manipulation.
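
If you want to check which versions your environment actually picked up, a minimal sanity check could look like the sketch below (the version attributes used are standard for these packages; HTMD's version is easiest to check from the command line with conda list htmd, so it is omitted here):

  # Quick sanity check of the pinned versions (run inside the training environment).
  import torch
  import keras
  from rdkit import rdBase

  print("pytorch:", torch.__version__)    # expected: 0.3.1
  print("keras:  ", keras.__version__)    # expected: 2.2.2
  print("rdkit:  ", rdBase.rdkitVersion)  # expected: 2017.09.2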

Add the repository to your PYTHONPATH:

  export PYTHONPATH=/path/to/ligdream/repo/:$PYTHONPATH

Before starting

Training requires a .smi file of SMILES strings. We used a subset of the ZINC15 dataset, restricted to drug-like molecules. The same cleaned dataset can be retrieved with the getDataset.sh script, which downloads the .smi file required for training.

  bash getDataset.sh

After the download, the traindataset folder will contain the zinc15_druglike_clean_canonical_max60.smi file required for the training step (see next section).
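
As a quick check that the download completed correctly, you can peek at the first few entries and make sure RDKit parses them. This is only a sketch and assumes one SMILES string per line, with the SMILES as the first whitespace-separated token; the exact format expected by prepare_data.py is defined in that script.

  from rdkit import Chem

  # Inspect the first few entries of the downloaded training set and
  # verify that RDKit can parse them.
  with open("traindataset/zinc15_druglike_clean_canonical_max60.smi") as f:
      for i, line in enumerate(f):
          smi = line.split()[0]  # assumes the SMILES is the first token on the line
          mol = Chem.MolFromSmiles(smi)
          print(smi, "OK" if mol is not None else "INVALID")
          if i >= 4:
              break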

The generation stage requires trained model files. You can either use the ones produced during the training step, or download the weights that we have already generated with the following script:

  bash getWeights.sh

The modelweights folder will then contain three model files:

  • decoder-210000.pkl
  • encoder-210000.pkl
  • vae-210000.pkl
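
A small sanity check that the weights ended up where the generation step expects them (folder and file names as listed above) could look like:

  from pathlib import Path

  # Verify that all three downloaded model files are present.
  weights_dir = Path("modelweights")
  for name in ("decoder-210000.pkl", "encoder-210000.pkl", "vae-210000.pkl"):
      path = weights_dir / name
      print(path, "found" if path.is_file() else "MISSING")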

Training

Note that training runs on a GPU and will take several days to complete.

First construct a set of training molecules:

$ python prepare_data.py -i "./path/to/my/smiles.smi" -o "./path/to/my/smiles.npy"

Second, execute the training of a model:

$ python train.py -i "./path/to/my/smiles.npy" -o "./path/to/models"
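
If you would rather train on your own molecules than on the ZINC15 subset, you first need a .smi file to pass to prepare_data.py. The sketch below canonicalizes a list of SMILES with RDKit and keeps only strings of at most 60 characters, mirroring the max60 suffix of the provided dataset; whether prepare_data.py requires exactly this format is an assumption, so check the script before relying on it.

  from rdkit import Chem

  # Hypothetical input: any iterable of SMILES strings you want to train on.
  my_smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]

  with open("my_smiles.smi", "w") as out:
      for smi in my_smiles:
          mol = Chem.MolFromSmiles(smi)
          if mol is None:
              continue  # skip unparsable entries
          canonical = Chem.MolToSmiles(mol)
          if len(canonical) <= 60:  # mirror the max60 naming of the ZINC file
              out.write(canonical + "\n")

The resulting my_smiles.smi can then be used as the -i argument of prepare_data.py above.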

Generation

Web-based compound generation is available at https://playmolecule.org/LigDream/.

For an example of local novel compound generation, please follow the notebook generate.ipynb.
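
Assuming you collect the generated compounds from the notebook as SMILES strings, a common follow-up step is to drop invalid generations and compute a few quick properties with RDKit. The list below is a hypothetical placeholder for the notebook's output:

  from rdkit import Chem
  from rdkit.Chem import Descriptors

  # Hypothetical list of generated SMILES (e.g. collected from generate.ipynb).
  generated = ["CCOc1ccccc1", "not_a_smiles", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]

  for smi in generated:
      mol = Chem.MolFromSmiles(smi)
      if mol is None:
          continue  # drop invalid generations
      print(Chem.MolToSmiles(mol),              # canonical SMILES
            round(Descriptors.MolWt(mol), 1),   # molecular weight
            Descriptors.MolLogP(mol))           # crude lipophilicity estimate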

License

Code is released under the GNU Affero General Public License (AGPL-3.0).
