
microsoft / Graphormer

License: MIT

Programming Languages

python
139,335 projects - #7 most used programming language
cython
566 projects
shell
77,523 projects

Projects that are alternatives to or similar to Graphormer

En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-89.03%)
Mutual labels:  transformer
Zero-Shot-TTS
Unofficial Implementation of Zero-Shot Text-to-Speech for Text-Based Insertion in Audio Narration
Stars: ✭ 33 (-97.24%)
Mutual labels:  transformer
php-json-api
JSON API transformer outputting valid (PSR-7) API Responses.
Stars: ✭ 68 (-94.3%)
Mutual labels:  transformer
rx-scheduler-transformer
rxjava scheduler transformer tools for android
Stars: ✭ 15 (-98.74%)
Mutual labels:  transformer
Neural-Machine-Translation
Several basic neural machine translation models implemented by PyTorch & TensorFlow
Stars: ✭ 29 (-97.57%)
Mutual labels:  transformer
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-95.9%)
Mutual labels:  transformer
svelte-jest
Jest Svelte component transformer
Stars: ✭ 37 (-96.9%)
Mutual labels:  transformer
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-89.87%)
Mutual labels:  transformer
proc-that
proc(ess)-that - easy extendable ETL tool for Node.js. Written in TypeScript.
Stars: ✭ 25 (-97.91%)
Mutual labels:  transformer
DolboNet
A Russian-language chat bot for Discord built on the Transformer architecture
Stars: ✭ 53 (-95.56%)
Mutual labels:  transformer
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-61.81%)
Mutual labels:  transformer
keyword-transformer
Official implementation of the Keyword Transformer: https://arxiv.org/abs/2104.00769
Stars: ✭ 76 (-93.63%)
Mutual labels:  transformer
Highway-Transformer
[ACL '20] Highway Transformer: A Gated Transformer.
Stars: ✭ 26 (-97.82%)
Mutual labels:  transformer
TitleStylist
Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (-93.97%)
Mutual labels:  transformer
german-sentiment
A data set and model for German sentiment classification.
Stars: ✭ 37 (-96.9%)
Mutual labels:  transformer
Transformer-MM-Explainability
[ICCV 2021- Oral] Official PyTorch implementation for Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Including examples for DETR, VQA.
Stars: ✭ 484 (-59.46%)
Mutual labels:  transformer
basis-expansions
Basis expansion transformers in sklearn style.
Stars: ✭ 74 (-93.8%)
Mutual labels:  transformer
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-95.14%)
Mutual labels:  transformer
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from Logistic Regression to BERT and GPT-3)"
Stars: ✭ 245 (-79.48%)
Mutual labels:  transformer
FasterTransformer
Transformer related optimization, including BERT, GPT
Stars: ✭ 1,571 (+31.57%)
Mutual labels:  transformer

Graphormer is a deep learning package that allows researchers and developers to train custom models for molecule modeling tasks. It aims to accelerate research on and applications of AI for molecular science, such as material discovery and drug discovery. Project website.

Hiring

We are hiring at all levels (including FTE researchers and interns)! If you are interested in working with us on AI for Molecule Science, please send your resume to [email protected].

Highlights in Graphormer v2.0

  • The model, code, and scripts used in the Open Catalyst Challenge are available.
  • Pre-trained models on PCQM4M and PCQM4Mv2 are available; more pre-trained models are coming soon (a loading sketch follows this list).
  • Supports the interfaces and datasets of PyG, DGL, OGB, and OCP.
  • Supports the fairseq backbone.
  • Documentation is online!
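
As a rough orientation for using a pre-trained model, the evaluation entry point in the published examples looks roughly like the sketch below. The script path and flag names (notably --pretrained-model-name) are assumptions drawn from the documentation's example scripts and may differ across versions:

  # Illustrative only: evaluate a pre-trained checkpoint on PCQM4Mv2.
  # Script path and flag names are assumptions based on the documented examples.
  python graphormer/evaluate/evaluate.py \
      --user-dir graphormer \
      --task graph_prediction \
      --arch graphormer_base \
      --dataset-name pcqm4mv2 \
      --pretrained-model-name pcqm4mv2_graphormer_base \
      --split valid \
      --metric mae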

What's New:

03/10/2022

  1. We uploaded a technical report describing improved benchmarks on PCQM4M and the Open Catalyst Project.

12/22/2021

  1. Graphormer v2.0 is released. Enjoy!

12/10/2021

  1. Graphormer has won the Open Catalyst Challenge. The technical talk can be found through this link.
  2. The NeurIPS 2021 slides can be found through this link.
  3. A new release of Graphormer is coming soon: a general molecule modeling toolkit with the models used on the OC dataset, a complete pre-trained model zoo, a flexible data interface, and higher training efficiency.

09/30/2021

  1. Graphormer has been accepted by NeurIPS 2021.
  2. We're hiring! Please contact shuz[at]microsoft.com for more information.

08/03/2021

  1. Codes and scripts are released.

06/16/2021

  1. Graphormer won first place in the quantum prediction track of the Open Graph Benchmark Large-Scale Challenge (KDD CUP 2021). [Competition Description] [Competition Result] [Technical Report] [Blog (English)] [Blog (Chinese)]

Get Started

Our primary documentation is at https://graphormer.readthedocs.io/ and is generated from this repository. It contains instructions for getting started, training new models, and extending Graphormer with new model types and tasks.

Next you may want to read:

  • Examples showing command-line usage of common tasks (a training sketch follows).
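
For orientation, a property-prediction run in the example scripts is launched through the fairseq CLI roughly as follows. The flag names and values below (dataset, architecture, optimizer settings) are assumptions drawn from those examples and may differ between versions, so check the documentation before running:

  # Illustrative only: train graphormer_base on PCQM4Mv2 via fairseq-train.
  # Flag names and values are assumptions based on the documented example scripts.
  fairseq-train \
      --user-dir graphormer \
      --dataset-name pcqm4mv2 \
      --dataset-source ogb \
      --task graph_prediction \
      --criterion l1_loss \
      --arch graphormer_base \
      --num-classes 1 \
      --optimizer adam --lr 2e-4 \
      --batch-size 256 \
      --save-dir ./ckpts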

Requirements and Installation

Setup with Conda

bash install.sh
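
install.sh expects to run inside an activated environment; a typical end-to-end sequence looks like the following. The environment name and Python version are illustrative assumptions, while install.sh itself pins the actual dependency versions:

  # Clone the repository (--recursive in case submodules are used; harmless otherwise).
  git clone --recursive https://github.com/microsoft/Graphormer.git
  cd Graphormer
  # Create and activate an isolated conda environment.
  # The environment name and Python version are assumptions.
  conda create -n graphormer python=3.9 -y
  conda activate graphormer
  # Install PyTorch, fairseq, and the remaining dependencies via the provided script.
  bash install.sh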

Citation

Please cite the following papers if you use the code:

@article{shi2022benchmarking,
  title={Benchmarking Graphormer on Large-Scale Molecular Modeling Datasets},
  author={Yu Shi and Shuxin Zheng and Guolin Ke and Yifei Shen and Jiacheng You and Jiyan He and Shengjie Luo and Chang Liu and Di He and Tie-Yan Liu},
  journal={arXiv preprint arXiv:2203.04810},
  year={2022},
  url={https://arxiv.org/abs/2203.04810}
}

@inproceedings{ying2021do,
  title={Do Transformers Really Perform Badly for Graph Representation?},
  author={Chengxuan Ying and Tianle Cai and Shengjie Luo and Shuxin Zheng and Guolin Ke and Di He and Yanming Shen and Tie-Yan Liu},
  booktitle={Thirty-Fifth Conference on Neural Information Processing Systems},
  year={2021},
  url={https://openreview.net/forum?id=OeWooOxFwDa}
}

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].