
poloclub / dodrio

License: MIT
Exploring attention weights in transformer-based models with linguistic knowledge.

Programming Languages

Svelte
JavaScript
Python

Projects that are alternatives to or similar to dodrio

NLP-paper
🎨 An NLP (natural language processing) tutorial, in Chinese 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-90.13%)
Mutual labels:  transformer, attention-mechanism
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (-29.18%)
Mutual labels:  transformer, attention-mechanism
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-54.51%)
Mutual labels:  transformer, attention-mechanism
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+324.89%)
Mutual labels:  transformer, attention-mechanism
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-75.54%)
Mutual labels:  transformer, attention-mechanism
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in PyTorch. This repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (-68.67%)
Mutual labels:  transformer, attention-mechanism
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (-36.05%)
Mutual labels:  transformer, attention-mechanism
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+76.39%)
Mutual labels:  transformer, attention-mechanism
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-54.08%)
Mutual labels:  transformer, attention-mechanism
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-10.3%)
Mutual labels:  transformer, attention-mechanism
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+143.35%)
Mutual labels:  transformer, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-48.07%)
Mutual labels:  transformer, attention-mechanism
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+115.02%)
Mutual labels:  transformer, attention-mechanism
Eqtransformer
EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-59.23%)
Mutual labels:  transformer, attention-mechanism
Transformer Tts
A PyTorch implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (+79.4%)
Mutual labels:  transformer, attention-mechanism
Transformer In Generating Dialogue
An implementation of "Attention Is All You Need" with a Chinese corpus
Stars: ✭ 121 (-48.07%)
Mutual labels:  transformer, attention-mechanism
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+1464.81%)
Mutual labels:  transformer, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+75.11%)
Mutual labels:  transformer, attention-mechanism
Linear Attention Transformer
Transformer based on a variant of attention that is linear in complexity with respect to sequence length
Stars: ✭ 205 (-12.02%)
Mutual labels:  transformer, attention-mechanism
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-43.78%)
Mutual labels:  transformer, attention-mechanism

Dodrio

An interactive visualization system designed to help NLP researchers and practitioners analyze and compare attention weights in transformer-based models with linguistic knowledge.

DOI: 10.18653/v1/2021.acl-demo.16

For more information, check out our manuscript:

Dodrio: Exploring Transformer Models with Interactive Visualization. Zijie J. Wang, Robert Turko, and Duen Horng Chau. arXiv preprint 2021. arXiv:2103.14625.

Live Demo

For a live demo, visit: http://poloclub.github.io/dodrio/

Running Locally

Clone or download this repository:

git clone git@github.com:poloclub/dodrio.git

# use degit if you don't want to download commit histories
degit poloclub/dodrio

Install the dependencies:

npm install

Then run Dodrio:

npm run dev

Navigate to localhost:5000. You should see Dodrio running in your browser :)

To see how we trained the Transformer, or to customize the visualization with a different model or dataset, visit the ./data-generation/ directory.
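If you are curious about the kind of data such a visualization consumes, the snippet below is a minimal sketch of extracting per-layer, per-head attention weights from a transformer. It assumes the Hugging Face transformers library and a BERT checkpoint, neither of which is prescribed by this repository; it is illustrative, not the exact Dodrio data pipeline.

# Minimal sketch (illustrative, not Dodrio's pipeline): extract attention
# weights from a transformer using Hugging Face transformers.
# The model name "bert-base-uncased" is an assumed example checkpoint.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

sentence = "Dodrio visualizes attention weights."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer, each shaped
# (batch, num_heads, seq_len, seq_len); these per-head matrices are the
# raw material an attention visualization works with.
attentions = outputs.attentions
print(len(attentions), attentions[0].shape)

Exporting matrices like these (for example, as JSON) alongside token and linguistic annotations is the general shape of the data-generation step.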

Credits

Dodrio was created by Jay Wang, Robert Turko, and Polo Chau.

Citation

@inproceedings{wangDodrioExploringTransformer2021,
  title = {Dodrio: {{Exploring Transformer Models}} with {{Interactive Visualization}}},
  shorttitle = {Dodrio},
  booktitle = {Proceedings of the 59th {{Annual Meeting}} of the {{Association}} for {{Computational Linguistics}} and the 11th {{International Joint Conference}} on {{Natural Language Processing}}: {{System Demonstrations}}},
  author = {Wang, Zijie J. and Turko, Robert and Chau, Duen Horng},
  year = {2021},
  pages = {132--141},
  publisher = {{Association for Computational Linguistics}},
  address = {{Online}},
  language = {en}
}

License

The software is available under the MIT License.

Contact

If you have any questions, feel free to open an issue or contact Jay Wang.
