Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (+3753.85%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97 (+646.15%)
Transformer: A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+27946.15%)
Sockeye: A sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet
Stars: ✭ 990 (+7515.38%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (+338.46%)
Transformer Clinic: Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (+1276.92%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (+207.69%)
Joeynmt: Minimalist NMT for educational purposes
Stars: ✭ 420 (+3130.77%)
Eqtransformer: EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (+630.77%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+1507.69%)
Linear Attention Transformer: A Transformer based on a variant of attention that is linear in complexity with respect to sequence length
Stars: ✭ 205 (+1476.92%)
Eeg Dl: A deep learning library for EEG task (signal) classification, based on TensorFlow.
Stars: ✭ 165 (+1169.23%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+830.77%)
Neural sp: End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+3038.46%)
Pytorch Original Transformer: My implementation of the original transformer model (Vaswani et al.), with an additional playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+3061.54%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for Equivariant Self-Attention, in PyTorch. This repository is geared toward integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (+461.54%)
Seq2seq chatbot: A simple TensorFlow implementation of a seq2seq-based dialogue system, with embedding, attention, and beam_search; trained on the Cornell Movie Dialogs dataset
Stars: ✭ 308 (+2269.23%)
Image-Caption: Using LSTM or Transformer to solve image captioning in PyTorch
Stars: ✭ 36 (+176.92%)
Transformer Tensorflow: TensorFlow implementation of 'Attention Is All You Need' (2017.6)
Stars: ✭ 319 (+2353.85%)
Keras Attention: Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (+5261.54%)
Rust Bert: Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Stars: ✭ 510 (+3823.08%)
Onnxt5: Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+1000%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+723.08%)
En-transformer: Implementation of the E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+907.69%)
Overlappredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (+715.38%)
NLP-paper: 🎨 NLP (natural language processing) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (+76.92%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+1692.31%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+1353.85%)
Pytorch Transformer: PyTorch implementation of 'Attention Is All You Need'
Stars: ✭ 199 (+1430.77%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (+4261.54%)
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+1023.08%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (+753.85%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (+815.38%)
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network"
Stars: ✭ 418 (+3115.38%)
Routing Transformer: A fully featured implementation of the Routing Transformer
Stars: ✭ 149 (+1046.15%)
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+930.77%)
linguist: Linguist is a powerful browser extension for translating pages and text, ready to replace your favorite translation service
Stars: ✭ 21 (+61.54%)
Xrm-Quick-Edit: A Dynamics CRM add-in for speeding up tasks such as translating fields or toggling field security on or off
Stars: ✭ 13 (+0%)
Translatio: A super-lightweight library that helps you localize strings, even directly in storyboards!
Stars: ✭ 19 (+46.15%)
co-attention: PyTorch implementation of "Dynamic Coattention Networks For Question Answering"
Stars: ✭ 54 (+315.38%)
ttslearn: Library for "Text-to-Speech with Python" (Pythonで学ぶ音声合成)
Stars: ✭ 158 (+1115.38%)
whats: 🌐 A terminal translation tool
Stars: ✭ 16 (+23.08%)
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (+546.15%)
IT-Terms-EN-CN: English-to-Chinese translation table for IT terminology, ITEC (IT術語及計算機科學術語中英文對照表)
Stars: ✭ 53 (+307.69%)
Variational-NMT: A Variational Neural Machine Translation system
Stars: ✭ 37 (+184.62%)
TeachYourselfCS-CN: A Chinese translation of TeachYourselfCS
Stars: ✭ 13,234 (+101700%)
deep-molecular-optimization: Molecular optimization by capturing the chemist's intuition using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (+361.54%)
sqlglot: A Python SQL parser and transpiler
Stars: ✭ 310 (+2284.62%)