
shiyuzh2007 / Asr

Licence: apache-2.0

Programming Languages

python

Projects that are alternatives of or similar to Asr

Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+251.85%)
Mutual labels:  seq2seq, asr, transformer
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+744.44%)
Mutual labels:  transformer, seq2seq, asr
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+655.56%)
Mutual labels:  seq2seq, asr, transformer
Machine Translation
Stars: ✭ 51 (-5.56%)
Mutual labels:  seq2seq, transformer
pytorch-transformer-chatbot
A simple chitchat chatbot built with the Transformer API introduced in PyTorch v1.2
Stars: ✭ 44 (-18.52%)
Mutual labels:  transformer, seq2seq
deep-molecular-optimization
Molecular optimization by capturing chemist’s intuition using the Seq2Seq with attention and the Transformer
Stars: ✭ 60 (+11.11%)
Mutual labels:  transformer, seq2seq
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-48.15%)
Mutual labels:  transformer, seq2seq
Nmtpytorch
Sequence-to-Sequence Framework in PyTorch
Stars: ✭ 392 (+625.93%)
Mutual labels:  seq2seq, asr
torch-asg
Auto Segmentation Criterion (ASG) implemented in PyTorch
Stars: ✭ 42 (-22.22%)
Mutual labels:  seq2seq, asr
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+629.63%)
Mutual labels:  seq2seq, transformer
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+762.96%)
Mutual labels:  seq2seq, transformer
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (+11.11%)
Mutual labels:  transformer, seq2seq
Embedding
A summary of embedding model code and study notes
Stars: ✭ 25 (-53.7%)
Mutual labels:  transformer, seq2seq
kosr
Korean speech recognition based on the Transformer
Stars: ✭ 25 (-53.7%)
Mutual labels:  transformer, asr
speech-transformer
Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-25.93%)
Mutual labels:  transformer, asr
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+601.85%)
Mutual labels:  seq2seq, transformer
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+677.78%)
Mutual labels:  seq2seq, transformer
Athena
an open-source implementation of sequence-to-sequence based speech processing engine
Stars: ✭ 542 (+903.7%)
Mutual labels:  asr, transformer
NLP-paper
NLP (natural language processing) tutorials https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-57.41%)
Mutual labels:  transformer, seq2seq
wenet
Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 2,384 (+4314.81%)
Mutual labels:  transformer, asr

ASR Transformer

This project was originally forked from https://github.com/Kyubyong/transformer and https://github.com/chqiwang/transformer, and adapted to the ASR task. Five modeling units are compared on Mandarin Chinese ASR with the HKUST dataset using the ASR Transformer: CI-phonemes, syllables, words, sub-words, and characters.
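To illustrate what a modeling unit is, the sketch below shows character-level target units, the simplest of the five: a Mandarin transcript is split into individual characters. The example sentence is made up for illustration and is not from the HKUST dataset.

```python
# Illustrative sketch: character-level modeling units for Mandarin ASR targets.
# The example sentence is hypothetical, not taken from the HKUST corpus.
def char_units(transcript: str) -> list:
    """Split a Mandarin transcript into character-level target units."""
    return [ch for ch in transcript if not ch.isspace()]

sentence = "我爱自然语言处理"
print(char_units(sentence))  # ['我', '爱', '自', '然', '语', '言', '处', '理']
```

The other units (CI-phonemes, syllables, words, sub-words) need a pronunciation lexicon or a segmenter/BPE model, so the tokenization step is more involved, but the training pipeline consumes the resulting unit sequences the same way.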

Usage

1) Configure your *.yaml
2) python train.py
3) python third_party/tensor2tensor/avg_checkpoints.py
4) python evaluate.py
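Step 3 averages the weights of the last few saved checkpoints before evaluation, which typically gives a slightly better final model than the single last checkpoint. A minimal sketch of the idea follows; the real tensor2tensor script reads TensorFlow checkpoint files, whereas here plain dicts of floats stand in for them.

```python
# Minimal sketch of checkpoint averaging: element-wise mean of each variable
# across several checkpoints. Real checkpoints are TensorFlow variable files;
# dicts mapping variable names to lists of floats stand in for them here.
def average_checkpoints(checkpoints):
    """Average a list of {var_name: [values]} dicts element-wise."""
    n = len(checkpoints)
    return {
        name: [sum(ckpt[name][i] for ckpt in checkpoints) / n
               for i in range(len(checkpoints[0][name]))]
        for name in checkpoints[0]
    }

ckpts = [
    {"encoder/w": [1.0, 2.0]},
    {"encoder/w": [3.0, 4.0]},
]
print(average_checkpoints(ckpts))  # {'encoder/w': [2.0, 3.0]}
```

All checkpoints must contain the same variables with the same shapes, which holds when they come from different steps of the same training run.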

Source Code for paper:

1)Zhou, S., Dong, L., Xu, S., & Xu, B. (2018). Syllable-Based Sequence-to-Sequence Speech Recognition with the Transformer in Mandarin Chinese. arXiv preprint arXiv:1804.10752.

2)Zhou, S., Dong, L., Xu, S., & Xu, B. (2018). A Comparison of Modeling Units in Sequence-to-Sequence Speech Recognition with the Transformer on Mandarin Chinese. arXiv preprint arXiv:1805.06239.

Some results are reported in the two papers above.

Contact: raise an issue on GitHub or email [email protected].
