
klicperajo / Dimenet

Licence: other
DimeNet and DimeNet++ models, as proposed in "Directional Message Passing for Molecular Graphs" (ICLR 2020) and "Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules" (NeurIPS-W 2020)

Programming Languages

python

Projects that are alternatives to or similar to Dimenet

Diffwave
DiffWave is a fast, high-quality neural vocoder and waveform synthesizer.
Stars: ✭ 139 (+31.13%)
Mutual labels:  paper, pretrained-models
Wavegrad
A fast, high-quality neural vocoder.
Stars: ✭ 138 (+30.19%)
Mutual labels:  paper, pretrained-models
Awesome Distributed Systems
A curated list to learn about distributed systems
Stars: ✭ 7,263 (+6751.89%)
Mutual labels:  paper, architecture
Quad Ci
A CI server written in Simple Haskell.
Stars: ✭ 98 (-7.55%)
Mutual labels:  architecture
Universal Resume
Minimal and formal résumé (CV) website template for print, mobile, and desktop. https://bit.ly/ur_demo
Stars: ✭ 1,349 (+1172.64%)
Mutual labels:  paper
Library
Collection of papers in the field of distributed systems, game theory, cryptography, cryptoeconomics, zero knowledge
Stars: ✭ 100 (-5.66%)
Mutual labels:  paper
Android Kotlin Clean Architecture
Android Sample Clean Architecture App written in Kotlin
Stars: ✭ 1,562 (+1373.58%)
Mutual labels:  architecture
Papers Literature Ml Dl Rl Ai
Highly cited and useful papers related to machine learning, deep learning, AI, game theory, reinforcement learning
Stars: ✭ 1,341 (+1165.09%)
Mutual labels:  paper
Gate
A high-performance, parallel Minecraft proxy server with scalability, flexibility & excellent server version support - ready for the cloud!
Stars: ✭ 102 (-3.77%)
Mutual labels:  paper
Covid Twitter Bert
Pretrained BERT model for analysing COVID-19 Twitter data
Stars: ✭ 101 (-4.72%)
Mutual labels:  pretrained-models
Aspnetboilerplate
ASP.NET Boilerplate - Web Application Framework
Stars: ✭ 10,061 (+9391.51%)
Mutual labels:  architecture
Awesome Transfer Learning
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Stars: ✭ 1,349 (+1172.64%)
Mutual labels:  paper
Evolution Strategies Starter
Code for the paper "Evolution Strategies as a Scalable Alternative to Reinforcement Learning"
Stars: ✭ 1,363 (+1185.85%)
Mutual labels:  paper
Log4brains
✍️ Log and publish your architecture decisions (ADR)
Stars: ✭ 98 (-7.55%)
Mutual labels:  architecture
Dev Stuff
😎 Programming stuff for everyone. Collection of articles, videos about architecture, Domain Driven Design, microservices, testing etc.
Stars: ✭ 105 (-0.94%)
Mutual labels:  architecture
Copymtl
AAAI20 "CopyMTL: Copy Mechanism for Joint Extraction of Entities and Relations with Multi-Task Learning"
Stars: ✭ 97 (-8.49%)
Mutual labels:  paper
Linter Farch
Make sure the filenames stay the same, control them! 👁
Stars: ✭ 101 (-4.72%)
Mutual labels:  architecture
Dataset
Crop/Weed Field Image Dataset
Stars: ✭ 98 (-7.55%)
Mutual labels:  paper
Cleanarchitecture
An example of how to implement various clean coding & architecture techniques. Technologies used: .Net Core, Razor Pages, EF Core, Bootstrap 4
Stars: ✭ 98 (-7.55%)
Mutual labels:  architecture
Research And Coding
A curated list of research resources
Stars: ✭ 100 (-5.66%)
Mutual labels:  paper

Directional Message Passing Neural Network (DimeNet and DimeNet++)

Reference implementation of the DimeNet model proposed in the paper:

Directional Message Passing for Molecular Graphs
by Johannes Klicpera, Janek Groß, Stephan Günnemann
Published at ICLR 2020.

As well as DimeNet++, its significantly faster successor:

Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules
by Johannes Klicpera, Shankari Giri, Johannes T. Margraf, Stephan Günnemann
Published at the ML for Molecules workshop, NeurIPS 2020.

Run the code

This repository contains a notebook for training the model (train.ipynb) and for generating predictions on the test set with a trained model (predict.ipynb). It also contains a script for training the model on a cluster with Sacred and SEML (train_seml.py). For faster experimentation we also offer two sets of pretrained models, which you can find in the pretrained folder.
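For orientation, here is a minimal sketch of restoring a pretrained checkpoint in TF2, similar in spirit to what predict.ipynb does; the placeholder model and the checkpoint path are illustrative assumptions, not the repository's exact API:

```python
import tensorflow as tf

# Placeholder standing in for a constructed DimeNet/DimeNet++ instance;
# the notebooks build the real model from the repository's config files.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])

# Restore the latest checkpoint from a pretrained directory (path is
# illustrative) and prepare the model for inference.
ckpt = tf.train.Checkpoint(model=model)
status = ckpt.restore(tf.train.latest_checkpoint("pretrained/dimenet_pp"))
status.expect_partial()  # checkpoints may contain optimizer slots not needed for inference
```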

DimeNet++ and TF2

The new DimeNet++ model is both 8x faster and 10% more accurate, so we recommend using this model instead of the original.

There are a few differences between this repository and the original (TF1) DimeNet implementation, such as slightly different training and initialization in TF2. This implementation uses orthogonal Glorot initialization in the output layer for the targets alpha, R2, U0, U, H, G, and Cv, and zero initialization for Mu, HOMO, LUMO, and ZPVE. In the paper, zero initialization was used for all output layers.
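In TF2/Keras terms, the two choices look roughly like this (a sketch; the exact gain of the repository's orthogonal Glorot initializer is an assumption):

```python
import tensorflow as tf

units = 128  # illustrative hidden size of the last embedding layer

# Orthogonal initialization with a Glorot-style scale, used here for
# alpha, R2, U0, U, H, G, and Cv.
glorot_gain = (2.0 / (units + 1)) ** 0.5  # assumed scaling
out_ortho = tf.keras.layers.Dense(
    1, use_bias=False,
    kernel_initializer=tf.keras.initializers.Orthogonal(gain=glorot_gain))

# Zero initialization, used for Mu, HOMO, LUMO, and ZPVE (and for all
# targets in the paper).
out_zero = tf.keras.layers.Dense(
    1, use_bias=False, kernel_initializer="zeros")
```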

The following table gives an overview of all MAEs:

[MAE overview table: rendered as an image in the original README]

Architecture

[Architecture diagrams of DimeNet and DimeNet++: rendered as images in the original README]
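Since the diagrams are images, here is a sketch of the core idea of directional message passing (notation approximately as in the paper; this is one update step, not the full architecture): the embedding m_ji of the message from atom j to atom i is updated from the incoming messages m_kj, the radial basis of the distance d_ji, and the spherical basis of the distance d_kj and the angle between the two messages:

$$
m_{ji}' = f_{\text{update}}\left(m_{ji},\ \sum_{k \in \mathcal{N}_j \setminus \{i\}} f_{\text{int}}\left(m_{kj},\ e_{\text{RBF}}^{(ji)},\ a_{\text{SBF}}^{(kj,ji)}\right)\right)
$$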

Requirements

The repository uses these packages:

numpy
scipy>=1.3
sympy>=1.5
tensorflow>=2.1
tensorflow_addons
tqdm
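Assuming a standard pip environment, they can be installed in one step:

```
pip install numpy "scipy>=1.3" "sympy>=1.5" "tensorflow>=2.1" tensorflow_addons tqdm
```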

Known issues

Unfortunately there are a few issues/bugs in the code (and paper) that we can't fix without retraining the models. So far, these are:

  • The second distance used for calculating the angles is switched (DimeNet).
  • The envelope function is implicitly divided by the distance. This is accounted for in the radial Bessel basis layer but leads to an incorrect spherical basis layer (DimeNet and DimeNet++); see the numerical sketch after this list.
  • DimeNet was evaluated on MD17's Benzene17 dataset, but compared to sGDML on Benzene18, which gives sGDML an unfair advantage.
  • In TensorFlow Addons <0.12 there is a bug when checkpointing: these earlier versions require explicitly passing the _optimizer variable of the MovingAverage optimizer. This is only relevant if you actually load checkpoints from disk and continue training (DimeNet and DimeNet++).
  • The radial basis functions in the interaction block actually use d_kj and not d_ji. The best way to fix this is by just using d_ji instead of d_kj in the SBF and leaving the RBF unchanged (DimeNet and DimeNet++).
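To make the envelope issue above concrete, here is a minimal numerical sketch, assuming the standard DimeNet polynomial envelope u(d) with exponent p (an illustration of the issue, not the repository's exact code):

```python
import numpy as np

p = 6  # envelope exponent (illustrative)

def envelope_over_d(d):
    # What the code effectively computes: u(d) / d, i.e. the envelope
    # implicitly divided by the distance.
    a = -(p + 1) * (p + 2) / 2
    b = p * (p + 2)
    c = -p * (p + 1) / 2
    return 1 / d + a * d ** (p - 1) + b * d ** p + c * d ** (p + 1)

def envelope(d):
    # The intended smooth cutoff u(d) itself.
    return (1 - (p + 1) * (p + 2) / 2 * d ** p
            + p * (p + 2) * d ** (p + 1)
            - p * (p + 1) / 2 * d ** (p + 2))

d = np.linspace(0.1, 1.0, 5)
# The extra 1/d cancels against the sin(n*pi*d)/d form of the radial
# Bessel basis, so the RBF layer remains correct:
print(np.allclose(envelope_over_d(d) * d, envelope(d)))  # True
# The spherical basis has no matching 1/d term, so applying the same
# envelope there introduces a spurious 1/d factor.
```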

Contact

Please contact [email protected] if you have any questions.

Cite

Please cite our papers if you use the model or this code in your own work:

@inproceedings{klicpera_dimenet_2020,
  title = {Directional Message Passing for Molecular Graphs},
  author = {Klicpera, Johannes and Gro{\ss}, Janek and G{\"u}nnemann, Stephan},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year = {2020}
}

@inproceedings{klicpera_dimenetpp_2020,
  title = {Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules},
  author = {Klicpera, Johannes and Giri, Shankari and Margraf, Johannes T. and G{\"u}nnemann, Stephan},
  booktitle = {Machine Learning for Molecules Workshop, NeurIPS},
  year = {2020}
}