
anthony-wang / CrabNet

License: MIT
Predict materials properties using only the composition information!

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to CrabNet

Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+615.79%)
Mutual labels:  transformer, attention, attention-mechanism
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-28.07%)
Mutual labels:  transformer, attention, self-attention
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+266.67%)
Mutual labels:  transformer, attention, attention-mechanism
visualization
A collection of visualization functions
Stars: ✭ 189 (+231.58%)
Mutual labels:  transformer, attention, attention-mechanism
SMACT
Python package to aid materials design
Stars: ✭ 46 (-19.3%)
Mutual labels:  materials-science, materials-informatics, materials-screening
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+621.05%)
Mutual labels:  transformer, attention, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+112.28%)
Mutual labels:  transformer, attention, attention-mechanism
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-49.12%)
Mutual labels:  transformer, attention
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+87.72%)
Mutual labels:  transformer, attention-mechanism
masci-tools
Tools, utilities, and parsers useful in daily materials science work
Stars: ✭ 18 (-68.42%)
Mutual labels:  materials-science, materials-informatics
MolDQN-pytorch
A PyTorch Implementation of "Optimization of Molecules via Deep Reinforcement Learning".
Stars: ✭ 58 (+1.75%)
Mutual labels:  materials-science, materials-informatics
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+52.63%)
Mutual labels:  attention, attention-mechanism
Jddc solution 4th
4th place solution to the 2018 JDDC competition
Stars: ✭ 235 (+312.28%)
Mutual labels:  transformer, attention
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (+0%)
Mutual labels:  transformer, attention-mechanism
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-71.93%)
Mutual labels:  attention, attention-mechanism
query-selector
Long-term series forecasting with Query Selector – efficient model of sparse attention
Stars: ✭ 63 (+10.53%)
Mutual labels:  transformer, self-attention
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+5896.49%)
Mutual labels:  transformer, attention
MASTER-pytorch
Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (+361.4%)
Mutual labels:  transformer, self-attention
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+129.82%)
Mutual labels:  transformer, attention-mechanism
tilde
Materials informatics framework for ab initio data repositories
Stars: ✭ 19 (-66.67%)
Mutual labels:  materials-science, materials-informatics

Compositionally-Restricted Attention-Based Network (CrabNet)

This software package implements the Compositionally-Restricted Attention-Based Network (CrabNet), which predicts material properties from composition information alone. It also demonstrates several model-interpretability techniques that are possible with CrabNet.
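
As a rough illustration of the underlying idea (a minimal sketch, not the repository's actual implementation), a composition such as SiO2 can be treated as a short sequence of (element, fractional amount) tokens that a transformer encoder attends over before pooling to a single property prediction. All class names, hyperparameters, and the pooling choice below are illustrative assumptions:

import torch
import torch.nn as nn

class CompositionAttentionSketch(nn.Module):
    def __init__(self, n_elements=103, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        # element identity embedding (index 0 reserved for padding)
        self.element_embed = nn.Embedding(n_elements + 1, d_model, padding_idx=0)
        # project the scalar fractional amount into the same embedding space
        self.fraction_proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)   # regression head for one property

    def forward(self, elements, fractions):
        # elements:  (batch, seq) atomic numbers, 0 = padding
        # fractions: (batch, seq) normalized fractional amounts
        pad = elements == 0
        x = self.element_embed(elements) + self.fraction_proj(fractions.unsqueeze(-1))
        x = self.encoder(x, src_key_padding_mask=pad)
        x = x.masked_fill(pad.unsqueeze(-1), 0.0)
        pooled = x.sum(dim=1) / (~pad).sum(dim=1, keepdim=True)  # mean over real tokens
        return self.head(pooled).squeeze(-1)

# e.g. SiO2 -> element tokens [14, 8] with fractions [1/3, 2/3], padded to length 3
model = CompositionAttentionSketch()
elements = torch.tensor([[14, 8, 0]])
fractions = torch.tensor([[1 / 3, 2 / 3, 0.0]])
print(model(elements, fractions).shape)   # torch.Size([1])

The self-attention weights computed inside such an encoder are also the kind of quantities that attention-based interpretability techniques, like those demonstrated in this repository, visualize.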

Table of Contents

  • Publications / How to cite
  • Installation and basic use of CrabNet
  • Model interpretability with CrabNet
  • Maintainers

Publications / How to cite

This repository contains the code accompanying two CrabNet publications. Please cite them if you use CrabNet or the techniques discussed in these works:

  1. The "CrabNet publication": A. Y.-T. Wang, S. K. Kauwe, R. J. Murdock, T. D. Sparks, Compositionally restricted attention-based network for materials property predictions, npj Comput. Mater., 2021, 7: 77. DOI: 10.1038/s41524-021-00545-1.
  2. The "ExplainableGap publication": A. Y.-T. Wang, M. S. Mahmoud, M. Czasny, A. Gurlo, CrabNet for Explainable Deep Learning in Materials Science: Bridging the Gap Between Academia and Industry, Integr. Mater. Manuf. Innov., 2022, 11 (1): 41-56. DOI: 10.1007/s40192-021-00247-y.

The references in BibTeX form:

@article{Wang2021crabnet,
 author = {Wang, Anthony Yu-Tung and Kauwe, Steven K. and Murdock, Ryan J. and Sparks, Taylor D.},
 year = {2021},
 title = {Compositionally restricted attention-based network for materials property predictions},
 pages = {77},
 volume = {7},
 number = {1},
 doi = {10.1038/s41524-021-00545-1},
 publisher = {{Nature Publishing Group}},
 shortjournal = {npj Comput. Mater.},
 journal = {npj Computational Materials}
}
@article{Wang2022explainablegap,
 author = {Wang, Anthony Yu-Tung and Mahmoud, Mahamad Salah and Czasny, Mathias and Gurlo, Aleksander},
 year = {2022},
 title = {CrabNet for Explainable Deep Learning in Materials Science: Bridging the Gap Between Academia and Industry},
 url = {https://doi.org/10.1007/s40192-021-00247-y},
 pages = {41--56},
 volume = {11},
 number = {1},
 shortjournal = {Integr. Mater. Manuf. Innov.},
 journal = {Integrating Materials and Manufacturing Innovation},
 doi = {10.1007/s40192-021-00247-y},
 publisher = {{Springer International Publishing AG}}
}

Installation and basic use of CrabNet

For installation instructions and basic usage, please see README_CrabNet.md.
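
Purely for orientation (the authoritative steps live in README_CrabNet.md), training data for composition-based property models like CrabNet are typically plain CSV files pairing a chemical formula with a target property value. The column names, file name, and numbers below are illustrative assumptions, not the repository's required format:

import pandas as pd

# Hypothetical input sketch: one chemical formula and one target value per row.
# Column and file names are assumptions for illustration only;
# README_CrabNet.md defines the format the code actually expects.
train = pd.DataFrame({
    "formula": ["SiO2", "Al2O3", "NaCl"],
    "target": [1.23, 4.56, 7.89],   # placeholder numbers, not real property data
})
train.to_csv("my_training_data.csv", index=False)
print(train)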

Model interpretability with CrabNet

For the interpretability steps, please see README_ExplainableGap.md.
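
To give a generic flavor of what attention-based interpretability looks like (this is not the repository's plotting code; the actual techniques are documented in README_ExplainableGap.md), a per-element attention matrix can be rendered as a heat map. The element tokens and attention values below are stand-ins:

import matplotlib.pyplot as plt
import torch

# Generic sketch: visualize how much each element token attends to every other token.
elements = ["Si", "O", "O"]                  # tokens of one composition
attn = torch.softmax(torch.randn(len(elements), len(elements)), dim=-1)  # stand-in weights

fig, ax = plt.subplots()
im = ax.imshow(attn.numpy(), cmap="viridis", vmin=0.0, vmax=1.0)
ax.set_xticks(range(len(elements)))
ax.set_xticklabels(elements)
ax.set_yticks(range(len(elements)))
ax.set_yticklabels(elements)
ax.set_xlabel("attended element")
ax.set_ylabel("query element")
fig.colorbar(im, ax=ax, label="attention weight")
plt.show()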

Maintainers

This repository is maintained by Anthony Yu-Tung Wang (anthony-wang).