
1318 open-source projects that are alternatives to or similar to Transformers.jl

Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+5619.65%)
CrabNet
Predict materials properties using only composition information!
Stars: ✭ 57 (-67.05%)
Mutual labels:  transformer, attention
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (-0.58%)
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-63.58%)
Mutual labels:  transformer, attention
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-84.97%)
Mutual labels:  attention, transformer
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-38.73%)
Mutual labels:  attention, transformer
Insight
Repository for Project Insight: NLP as a Service
Stars: ✭ 246 (+42.2%)
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-76.3%)
Mutual labels:  transformer, attention
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+262.43%)
Mutual labels:  attention, transformer
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+135.84%)
Mutual labels:  attention, transformer
Vietnamese Electra
ELECTRA model pre-trained on a Vietnamese corpus
Stars: ✭ 55 (-68.21%)
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-90.75%)
Mutual labels:  transformer, attention
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+227.75%)
Absa Pytorch
Aspect-Based Sentiment Analysis: PyTorch implementations.
Stars: ✭ 1,181 (+582.66%)
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-70.52%)
Mutual labels:  transformer, attention
Hardware Aware Transformers
[ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
Stars: ✭ 206 (+19.08%)
Neat Vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models on Natural Language Processing (NLP) tasks.
Stars: ✭ 213 (+23.12%)
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-83.24%)
Mutual labels:  transformer, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-83.82%)
Mutual labels:  transformer, attention
Question generation
Neural question generation using transformers
Stars: ✭ 356 (+105.78%)
Text Classification Models Pytorch
Implementation of state-of-the-art text classification models in PyTorch
Stars: ✭ 379 (+119.08%)
Mutual labels:  attention, transformer
Attention Is All You Need Pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Stars: ✭ 6,070 (+3408.67%)
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+137.57%)
Mutual labels:  attention, transformer
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-43.93%)
Mutual labels:  attention, transformer
Multimodal Toolkit
Multimodal model for text and tabular data, using HuggingFace transformers as the building block for the text data
Stars: ✭ 78 (-54.91%)
Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+32120.81%)
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+127.75%)
Mutual labels:  attention, transformer
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+57.8%)
Mutual labels:  attention, transformer
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+226.59%)
Mutual labels:  attention, transformer
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need' (June 2017)
Stars: ✭ 319 (+84.39%)
Mutual labels:  attention, transformer
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-11.56%)
Mutual labels:  attention, transformer
Sightseq
Computer vision tools for fairseq, containing PyTorch implementations of text recognition and object detection
Stars: ✭ 116 (-32.95%)
Mutual labels:  attention, transformer
Gpt2
PyTorch Implementation of OpenAI GPT-2
Stars: ✭ 64 (-63.01%)
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+8.09%)
Mutual labels:  attention, transformer
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1875.72%)
Mutual labels:  attention, transformer
Jddc solution 4th
4th-place solution in the 2018 JDDC competition
Stars: ✭ 235 (+35.84%)
Mutual labels:  attention, transformer
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+1890.17%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+20.81%)
Mutual labels:  attention, transformer
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-30.06%)
Mutual labels:  transformer, attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-71.68%)
Mutual labels:  transformer, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (+9.25%)
Mutual labels:  transformer, attention
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-63.01%)
Mutual labels:  attention, transformer
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-35.26%)
Mutual labels:  attention, transformer
Multihead Siamese Nets
Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
Stars: ✭ 144 (-16.76%)
Hey Jetson
Deep-learning-based automatic speech recognition with attention for the NVIDIA Jetson.
Stars: ✭ 161 (-6.94%)
Mutual labels:  attention
Acl Anthology
Data and software for building the ACL Anthology.
Stars: ✭ 168 (-2.89%)
Covid Papers Browser
Browse Covid-19 & SARS-CoV-2 Scientific Papers with Transformers 🦠 📖
Stars: ✭ 161 (-6.94%)
Ngx Dynamic Dashboard Framework
A JSON-driven, Angular X-based dashboard framework inspired by JIRA's dashboard implementation and https://github.com/raulgomis/angular-dashboard-framework
Stars: ✭ 160 (-7.51%)
General Store
Simple, flexible store implementation for Flux. #hubspot-open-source
Stars: ✭ 171 (-1.16%)
Mutual labels:  flux
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (-4.62%)
Mutual labels:  transformer
Nlp bahasa resources
A Curated List of Dataset and Usable Library Resources for NLP in Bahasa Indonesia
Stars: ✭ 158 (-8.67%)
Effective transformer
Running BERT without Padding
Stars: ✭ 169 (-2.31%)
Mutual labels:  transformer
Mixtext
MixText: Linguistically-Informed Interpolation of Hidden Space for Semi-Supervised Text Classification
Stars: ✭ 159 (-8.09%)
Pytorch Nlp
Basic Utilities for PyTorch Natural Language Processing (NLP)
Stars: ✭ 1,996 (+1053.76%)
Dive Into Dl Pytorch
This project converts the MXNet implementations in the book Dive into Deep Learning (《动手学深度学习》) to PyTorch.
Stars: ✭ 14,234 (+8127.75%)
Textvec
Text vectorization tool to outperform TFIDF for classification tasks
Stars: ✭ 167 (-3.47%)
Hrnet Semantic Segmentation
The OCR approach is rephrased as the Segmentation Transformer: https://arxiv.org/abs/1909.11065. This is an official implementation of semantic segmentation for HRNet: https://arxiv.org/abs/1908.07919
Stars: ✭ 2,369 (+1269.36%)
Mutual labels:  transformer
Mtbook
Machine Translation: Foundations and Models (《机器翻译:基础与模型》), by Xiao Tong and Zhu Jingbo
Stars: ✭ 2,307 (+1233.53%)
Retalk
🐤 The Simplest Redux
Stars: ✭ 168 (-2.89%)
Mutual labels:  flux
Mishkal
Mishkal is an Arabic text vocalization tool
Stars: ✭ 158 (-8.67%)
1-60 of 1318 similar projects