
362 open-source projects that are alternatives to or similar to RETRO-pytorch

Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+1421.99%)
transganformer
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (-71.04%)
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (-78.22%)
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (-76.96%)
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, debuted at ICLR 2022
Stars: ✭ 90 (-80.97%)
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+674%)
nuwa-pytorch
Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in Pytorch
Stars: ✭ 347 (-26.64%)
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+247.57%)
OpenDialog
An open-source package for a Chinese open-domain conversational chatbot (a Chinese chit-chat dialogue system with one-click deployment as a WeChat chatbot)
Stars: ✭ 94 (-80.13%)
Mutual labels:  retrieval, transformers
hexia
Mid-level PyTorch-based framework for Visual Question Answering.
Stars: ✭ 24 (-94.93%)
Mutual labels:  attention-mechanism
LanguageModel-using-Attention
PyTorch implementation of a basic language model using attention in an LSTM network
Stars: ✭ 27 (-94.29%)
Mutual labels:  attention-mechanism
Optic-Disc-Unet
Attention U-Net model with post-processing for retinal optic disc segmentation
Stars: ✭ 77 (-83.72%)
Mutual labels:  attention-mechanism
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (-53.49%)
Mutual labels:  attention-mechanism
cineast
Cineast is a multi-feature content-based multimedia retrieval engine. It is capable of retrieving images, audio and video sequences, as well as 3D models, based on edge or color sketches, textual descriptions, and example objects.
Stars: ✭ 51 (-89.22%)
Mutual labels:  retrieval
LSTM-Attention
A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series
Stars: ✭ 53 (-88.79%)
Mutual labels:  attention-mechanism
Ask2Transformers
A framework for Textual Entailment-based zero-shot text classification
Stars: ✭ 102 (-78.44%)
Mutual labels:  transformers
NARRE
This is our implementation of NARRE: Neural Attentional Regression with Review-level Explanations
Stars: ✭ 100 (-78.86%)
Mutual labels:  attention-mechanism
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-72.3%)
Mutual labels:  attention-mechanism
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (-77.17%)
Mutual labels:  transformers
Visual-Attention-Model
Chainer implementation of DeepMind's Visual Attention Model paper
Stars: ✭ 27 (-94.29%)
Mutual labels:  attention-mechanism
axial-attention
Implementation of Axial attention - attending to multi-dimensional data efficiently
Stars: ✭ 245 (-48.2%)
Mutual labels:  attention-mechanism
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks course hosted on educative.io
Stars: ✭ 33 (-93.02%)
Mutual labels:  transformers
CIAN
Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora
Stars: ✭ 17 (-96.41%)
Mutual labels:  attention-mechanism
Neural-Chatbot
A Neural Network based Chatbot
Stars: ✭ 68 (-85.62%)
Mutual labels:  attention-mechanism
amta-net
Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (-94.5%)
Mutual labels:  attention-mechanism
Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+2.33%)
Mutual labels:  transformers
Video-Description-with-Spatial-Temporal-Attention
[ACM MM 2017 & IEEE TMM 2020] This is the Theano code for the paper "Video Description with Spatial Temporal Attention"
Stars: ✭ 53 (-88.79%)
Mutual labels:  attention-mechanism
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-86.05%)
Mutual labels:  transformers
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with Bahdanau & Luong attention, implemented in TensorFlow
Stars: ✭ 18 (-96.19%)
Mutual labels:  attention-mechanism
plexus
Plexus - Interactive Emotion Visualization based on Social Media
Stars: ✭ 27 (-94.29%)
Mutual labels:  retrieval
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (-73.15%)
Mutual labels:  transformers
organic-chemistry-reaction-prediction-using-NMT
Organic chemistry reaction prediction using NMT with attention
Stars: ✭ 30 (-93.66%)
Mutual labels:  attention-mechanism
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (-60.25%)
Mutual labels:  transformers
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-95.56%)
Mutual labels:  transformers
code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from structure and context".
Stars: ✭ 130 (-72.52%)
Mutual labels:  transformers
image embeddings
Using EfficientNet to provide embeddings for retrieval
Stars: ✭ 107 (-77.38%)
Mutual labels:  retrieval
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We will leverage machine learning, deep learning and deep transfer learning to learn and solve popular tasks using NLP including NER, Classification, Recommendation / Information Retrieval, Summarization, Classification, Language Translation, Q&A and T…
Stars: ✭ 127 (-73.15%)
Mutual labels:  transformers
beir
A Heterogeneous Benchmark for Information Retrieval. Easy to use: evaluate your models across 15+ diverse IR datasets.
Stars: ✭ 738 (+56.03%)
Mutual labels:  retrieval
CVPR2020 PADS
(CVPR 2020) This repo contains code for "PADS: Policy-Adapted Sampling for Visual Similarity Learning", which proposes learnable triplet mining with Reinforcement Learning.
Stars: ✭ 57 (-87.95%)
Mutual labels:  retrieval
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (-82.03%)
Mutual labels:  transformers
danish transformers
A collection of Danish Transformers
Stars: ✭ 30 (-93.66%)
Mutual labels:  transformers
memory-compressed-attention
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
Stars: ✭ 47 (-90.06%)
Mutual labels:  attention-mechanism
tf retrieval baseline
A TensorFlow retrieval (space embedding) baseline. Metric learning baseline on CUB and Stanford Online Products.
Stars: ✭ 39 (-91.75%)
Mutual labels:  retrieval
dgcnn
Clean & Documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (-95.56%)
Mutual labels:  attention-mechanism
palladian
Palladian is a Java-based toolkit with functionality for text processing, classification, information extraction, and data retrieval from the Web.
Stars: ✭ 32 (-93.23%)
Mutual labels:  retrieval
rnn-text-classification-tf
TensorFlow implementation of attention-based bidirectional RNN text classification.
Stars: ✭ 26 (-94.5%)
Mutual labels:  attention-mechanism
SA-DL
Sentiment Analysis with Deep Learning models. Implemented with TensorFlow and Keras.
Stars: ✭ 35 (-92.6%)
Mutual labels:  attention-mechanism
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-83.93%)
Mutual labels:  transformers
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (-7.82%)
Mutual labels:  transformers
Im2LaTeX
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-96.62%)
Mutual labels:  attention-mechanism
ginza-transformers
Use custom tokenizers in spacy-transformers
Stars: ✭ 15 (-96.83%)
Mutual labels:  transformers
shrec17
Supplementary code for SHREC 2017 RGB-D Object-to-CAD Retrieval track
Stars: ✭ 27 (-94.29%)
Mutual labels:  retrieval
KB-ALBERT
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (-54.55%)
Mutual labels:  transformers
LIT
[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-83.3%)
Mutual labels:  transformers
KnowledgeEditor
Code for Editing Factual Knowledge in Language Models
Stars: ✭ 86 (-81.82%)
Mutual labels:  transformers
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (-73.36%)
Mutual labels:  transformers
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-84.36%)
Mutual labels:  transformers
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-95.98%)
Mutual labels:  transformers
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (-76.11%)
Mutual labels:  transformers
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-87.95%)
Mutual labels:  attention-mechanism
1-60 of 362 similar projects