

QuaternionTransformers

This repository contains our Tensor2Tensor implementation of Quaternion Transformers. The paper will be presented at ACL 2019 in Florence.
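The operation at the heart of Quaternion Transformers is the Hamilton product, which replaces real-valued multiplication in the attention and FFN layers and underlies the parameter savings reported in the paper. The following is a minimal explanatory sketch of the Hamilton product itself, not code from this repository (the function name and tuple representation are ours):

```python
# Minimal illustration of the Hamilton product, the quaternion
# multiplication underlying Quaternion Attention and Quaternion FFNs.
# Explanatory sketch only -- not code from this repository.

def hamilton_product(p, q):
    """Multiply two quaternions given as (r, i, j, k) tuples."""
    r1, i1, j1, k1 = p
    r2, i2, j2, k2 = q
    return (
        r1 * r2 - i1 * i2 - j1 * j2 - k1 * k2,  # real part
        r1 * i2 + i1 * r2 + j1 * k2 - k1 * j2,  # i component
        r1 * j2 - i1 * k2 + j1 * r2 + k1 * i2,  # j component
        r1 * k2 + i1 * j2 - j1 * i2 + k1 * r2,  # k component
    )

# The classic identity i * j = k (and j * i = -k, since the
# Hamilton product is non-commutative):
i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)
print(hamilton_product(i, j))  # (0.0, 0.0, 0.0, 1.0)
```

Because each quaternion packs four components that share weights under this product, a quaternion layer needs roughly a quarter of the parameters of its real-valued counterpart.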

Dependencies

  1. TensorFlow 1.12.0
  2. Tensor2Tensor 1.12.0
  3. Python 2.7

Usage

  1. Usage of this repository follows the original Tensor2Tensor workflow (i.e., t2t-datagen and t2t-trainer, followed by t2t-decoder). It helps to gain familiarity with T2T before attempting to run our code.
  2. Setting --t2t_usr_dir=./QuaternionTransformers allows T2T to register Quaternion Transformers. To verify, run t2t-trainer --registry_help and check that the Quaternion Transformer models are listed.
  3. You should then be able to set MODEL=quaternion_transformer and use the base or big setting as usual.
  4. Be sure to set --hparams="self_attention_type=quaternion_dot_product" to activate Quaternion Attention.
  5. By default, Quaternion FFNs are used for the position-wise FFN layers. To revert to standard position-wise FFNs, set --hparams="ffn_layer=raw_dense_relu_dense".
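Putting the steps above together, an end-to-end invocation might look like the sketch below. The problem name, directory paths, and hparams-set name are placeholders we chose for illustration, not values prescribed by this repository; check t2t-trainer --registry_help for the names actually registered.

```shell
# Hypothetical end-to-end sketch; PROBLEM, HPARAMS, and the directory
# paths are placeholders -- verify registered names with
# `t2t-trainer --registry_help`.
PROBLEM=translate_ende_wmt32k   # any T2T problem; placeholder
MODEL=quaternion_transformer    # registered via --t2t_usr_dir
HPARAMS=transformer_base        # assumed base setting; verify via registry

t2t-datagen \
  --data_dir=./t2t_data \
  --tmp_dir=./t2t_tmp \
  --problem=$PROBLEM

t2t-trainer \
  --t2t_usr_dir=./QuaternionTransformers \
  --data_dir=./t2t_data \
  --problem=$PROBLEM \
  --model=$MODEL \
  --hparams_set=$HPARAMS \
  --hparams="self_attention_type=quaternion_dot_product" \
  --output_dir=./t2t_train
```

Decoding then proceeds with t2t-decoder using the same --t2t_usr_dir, model, and hparams flags.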

Citation

If you find our work useful, please consider citing our paper:

@article{tay2019lightweight,
  title={Lightweight and Efficient Neural Natural Language Processing with Quaternion Networks},
  author={Tay, Yi and Zhang, Aston and Tuan, Luu Anh and Rao, Jinfeng and Zhang, Shuai and Wang, Shuohang and Fu, Jie and Hui, Siu Cheung},
  journal={arXiv preprint arXiv:1906.04393},
  year={2019}
}