Sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990 (-72.85%)
Nmt Keras: Neural Machine Translation with Keras.
Stars: ✭ 501 (-86.26%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). The playground.py file is additionally included for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (-88.73%)
pynmt: A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13 (-99.64%)
Witwicky: An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (-99.42%)
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (-88.81%)
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network".
Stars: ✭ 418 (-88.54%)
Eeg Dl: A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (-95.47%)
Keras Attention: Visualizing RNNs using the attention mechanism.
Stars: ✭ 697 (-80.88%)
Transformer Tensorflow: TensorFlow implementation of "Attention Is All You Need" (June 2017).
Stars: ✭ 319 (-91.25%)
Dab: Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-91.94%)
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (-84.5%)
Gpt 2 Tensorflow2.0: OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0.
Stars: ✭ 172 (-95.28%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-97.09%)
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell (see the sketch below). (LARNN)
Stars: ✭ 119 (-96.74%)
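A minimal sketch of the windowed-attention idea described above, assuming a standard `nn.MultiheadAttention` queried over a buffer of past cell states; the class, parameter names, and dimensions here are illustrative, not the LARNN repository's actual API:

```python
# Sketch (not the LARNN repo's API): an LSTM cell that attends over a
# window of its own past cell states with standard multi-head attention.
import torch
import torch.nn as nn

class WindowedAttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_heads=4, window=16):
        super().__init__()
        # The cell consumes the input concatenated with the attention context.
        self.cell = nn.LSTMCell(input_size + hidden_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        h, c = state
        # Attend from the current hidden state to the last `window` cell states.
        mem = torch.stack(past_cells[-self.window:], dim=1)  # (B, W, H)
        ctx, _ = self.attn(h.unsqueeze(1), mem, mem)         # (B, 1, H)
        h, c = self.cell(torch.cat([x, ctx.squeeze(1)], dim=-1), (h, c))
        past_cells.append(c)
        return (h, c), past_cells

# Usage: run the cell in a loop over time steps, like any other RNN cell.
B, T, D, H = 2, 5, 8, 32
cell = WindowedAttentionLSTMCell(D, H)
h, c = torch.zeros(B, H), torch.zeros(B, H)
past = [c]
for t in range(T):
    (h, c), past = cell(torch.rand(B, D), (h, c), past)
```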
Rust Bert: Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...).
Stars: ✭ 510 (-86.01%)
Onnxt5: Summarization, translation, sentiment analysis, text generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-96.08%)
transformer: A simple TensorFlow implementation of the Transformer.
Stars: ✭ 25 (-99.31%)
kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (-87.49%)
NLP-paper: 🎨🎨 Natural Language Processing (NLP) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-99.37%)
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning.
Stars: ✭ 121 (-96.68%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (-93.61%)
speech-transformer: Transformer implementation specialized in speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-98.9%)
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch.
Stars: ✭ 146 (-96%)
Image-Caption: Using an LSTM or a Transformer to solve image captioning in PyTorch.
Stars: ✭ 36 (-99.01%)
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention.
Stars: ✭ 134 (-96.32%)
EQTransformer: A Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-97.39%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for equivariant self-attention, in PyTorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (-98%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, the attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (-84.45%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-94.27%)
Linear Attention Transformer: Transformer based on a variant of attention whose complexity is linear with respect to sequence length (see the sketch below).
Stars: ✭ 205 (-94.38%)
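One common O(n) formulation of linear attention is the kernel feature-map trick of Katharopoulos et al. (2020); a minimal sketch follows, which is not necessarily the exact variant this repository implements:

```python
# Kernelized linear attention sketch: replace softmax(QK^T)V with
# phi(Q) (phi(K)^T V) / (phi(Q) sum_n phi(K_n)), linear in sequence length N.
import torch

def elu_feature_map(x):
    # Positive feature map phi(x) = elu(x) + 1.
    return torch.nn.functional.elu(x) + 1

def linear_attention(q, k, v, eps=1e-6):
    # q, k: (B, N, D); v: (B, N, Dv). Cost is O(N * D * Dv), linear in N.
    q, k = elu_feature_map(q), elu_feature_map(k)
    kv = torch.einsum('bnd,bne->bde', k, v)                       # sum_n phi(k_n) v_n^T
    z = 1.0 / (torch.einsum('bnd,bd->bn', q, k.sum(dim=1)) + eps)  # normalizer
    return torch.einsum('bnd,bde,bn->bne', q, kv, z)

q, k, v = (torch.rand(2, 128, 64) for _ in range(3))
out = linear_attention(q, k, v)   # (2, 128, 64)
```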
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-98.35%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Stars: ✭ 40 (-98.9%)
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation".
Stars: ✭ 271 (-92.57%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning".
Stars: ✭ 107 (-97.07%)
En-transformer: Implementation of the E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention.
Stars: ✭ 131 (-96.41%)
linformer: Implementation of Linformer for PyTorch.
Stars: ✭ 119 (-96.74%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (-98.44%)
transformer: A PyTorch implementation of "Attention Is All You Need" (see the sketch below).
Stars: ✭ 28 (-99.23%)
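Several entries in this list implement the same core operation from that paper, scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V. A minimal PyTorch sketch, independent of any particular repository's API:

```python
# Scaled dot-product attention as defined in "Attention Is All You Need".
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q: (B, H, N, Dk), k: (B, H, M, Dk), v: (B, H, M, Dv)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # (B, H, N, M)
    if mask is not None:
        # Positions where mask == 0 receive zero attention weight.
        scores = scores.masked_fill(mask == 0, float('-inf'))
    return torch.softmax(scores, dim=-1) @ v                  # (B, H, N, Dv)

q = torch.rand(1, 8, 10, 64)
k = torch.rand(1, 8, 12, 64)
v = torch.rand(1, 8, 12, 64)
out = scaled_dot_product_attention(q, k, v)   # (1, 8, 10, 64)
```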
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-97.34%)
visualization: A collection of visualization functions.
Stars: ✭ 189 (-94.82%)
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (-97.34%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax (see the sketch below).
Stars: ✭ 111 (-96.96%)
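The softmax-free idea can be sketched in a few lines: Galerkin-type attention layer-normalizes K and V and computes Q (K^T V) / n, which is linear in sequence length. A sketch of the idea under those assumptions, not the repository's actual module:

```python
# Galerkin-type attention sketch: drop softmax, layer-normalize K and V,
# and compute Q (K^T V) / n at O(n) cost in sequence length n.
import torch
import torch.nn.functional as F

def galerkin_attention(q, k, v):
    # q, k: (B, N, D); v: (B, N, Dv)
    n = k.size(1)
    k = F.layer_norm(k, k.shape[-1:])
    v = F.layer_norm(v, v.shape[-1:])
    return q @ (k.transpose(-2, -1) @ v) / n   # (B, N, Dv)

q, k, v = (torch.rand(2, 256, 64) for _ in range(3))
out = galerkin_attention(q, k, v)   # (2, 256, 64)
```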
Cognitive Speech Tts: Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (-91.44%)
Vedastr: A scene text recognition toolbox based on PyTorch.
Stars: ✭ 290 (-92.05%)
Lbry Desktop: A browser and wallet for LBRY, the decentralized, user-controlled content marketplace.
Stars: ✭ 3,597 (-1.34%)
Contextualized Topic Models: A Python package to run contextualized topic modeling. CTMs combine BERT with topic models to get coherent topics. Also supports multilingual tasks. The cross-lingual zero-shot model was published at EACL 2021.
Stars: ✭ 318 (-91.28%)
Localize: 🏁 Automatically clean your Localizable.strings files.
Stars: ✭ 311 (-91.47%)
Gotext: Go (Golang) GNU gettext utilities package.
Stars: ✭ 292 (-91.99%)
Js Lingui: 🌍📖 A readable, automated, and optimized (5 kb) internationalization library for JavaScript.
Stars: ✭ 3,249 (-10.89%)