
352 Open source projects that are alternatives of or similar to uniformer-pytorch

STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+21.11%)
RETRO-pytorch
Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
Stars: ✭ 473 (+425.56%)
keras-deep-learning
Various implementations and projects using CNNs, RNNs, LSTMs, GANs, etc.
Stars: ✭ 22 (-75.56%)
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+7898.89%)
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+1726.67%)
nuwa-pytorch
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Stars: ✭ 347 (+285.56%)
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+3967.78%)
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (+14.44%)
transganformer
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (+52.22%)
Transformer-Implementations
Library of transformer implementations: Vanilla Transformer, ViT, DeiT, BERT, GPT
Stars: ✭ 34 (-62.22%)
Mutual labels:  transformers
Visual-Attention-Model
Chainer implementation of Deepmind's Visual Attention Model paper
Stars: ✭ 27 (-70%)
Mutual labels:  attention-mechanism
KoBERT-Transformers
KoBERT on 🤗 Huggingface Transformers 🤗 (with bug fixes)
Stars: ✭ 162 (+80%)
Mutual labels:  transformers
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (-36.67%)
Mutual labels:  attention-mechanism
C3D-tensorflow
Action recognition with the C3D network, implemented in TensorFlow
Stars: ✭ 34 (-62.22%)
Mutual labels:  video-classification
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+18.89%)
Mutual labels:  attention-mechanism
LSTM-Attention
A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series
Stars: ✭ 53 (-41.11%)
Mutual labels:  attention-mechanism
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (-3.33%)
Mutual labels:  attention-mechanism
MinkLocMultimodal
MinkLoc++: Lidar and Monocular Image Fusion for Place Recognition
Stars: ✭ 65 (-27.78%)
Mutual labels:  3d-convolutional-network
nlp-papers
Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-3.33%)
Mutual labels:  transformers
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+2975.56%)
Mutual labels:  transformers
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-26.67%)
Mutual labels:  transformers
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-76.67%)
Mutual labels:  transformers
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+3465.56%)
Mutual labels:  transformers
Nn
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Stars: ✭ 5,720 (+6255.56%)
Mutual labels:  transformers
Simpletransformers
Transformers for Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
Stars: ✭ 2,881 (+3101.11%)
Mutual labels:  transformers
Video-Description-with-Spatial-Temporal-Attention
[ACM MM 2017 & IEEE TMM 2020] This is the Theano code for the paper "Video Description with Spatial Temporal Attention"
Stars: ✭ 53 (-41.11%)
Mutual labels:  attention-mechanism
Transmogrifai
TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable, strongly typed machine learning workflows on Apache Spark with minimal hand-tuning
Stars: ✭ 2,084 (+2215.56%)
Mutual labels:  transformers
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-17.78%)
Mutual labels:  transformers
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (-5.56%)
Mutual labels:  transformers
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields large improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+140%)
Mutual labels:  transformers
organic-chemistry-reaction-prediction-using-NMT
Organic chemistry reaction prediction using neural machine translation (NMT) with attention
Stars: ✭ 30 (-66.67%)
Mutual labels:  attention-mechanism
question-generation
Neural Models for Key Phrase Detection and Question Generation
Stars: ✭ 29 (-67.78%)
Mutual labels:  attention-mechanism
MiCT-Net-PyTorch
Video Recognition using Mixed Convolutional Tube (MiCT) on PyTorch with a ResNet backbone
Stars: ✭ 48 (-46.67%)
Mutual labels:  video-classification
Fengshenbang-LM
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center of the IDEA Institute, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+1914.44%)
Mutual labels:  transformers
conv3d-video-action-recognition
My experiments with action recognition in videos. Contains a Keras implementation of the C3D network based on the original paper "Learning Spatiotemporal Features with 3D Convolutional Networks" (Tran et al.), along with video-processing pipelines built with the mPyPl package. The model is benchmarked on the popular UCF101 dataset and achieves result…
Stars: ✭ 50 (-44.44%)
Mutual labels:  video-classification
DARNN
A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
Stars: ✭ 90 (+0%)
Mutual labels:  attention-mechanism
memory-compressed-attention
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
Stars: ✭ 47 (-47.78%)
Mutual labels:  attention-mechanism
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+21.11%)
Mutual labels:  transformers
UniFormer
[ICLR 2022] Official implementation of UniFormer
Stars: ✭ 574 (+537.78%)
Mutual labels:  video-classification
TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (-50%)
Mutual labels:  video-classification
Neural-Chatbot
A Neural Network based Chatbot
Stars: ✭ 68 (-24.44%)
Mutual labels:  attention-mechanism
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+25.56%)
Mutual labels:  transformers
Fast Bert
Super easy library for BERT-based NLP models
Stars: ✭ 1,678 (+1764.44%)
Mutual labels:  transformers
amta-net
Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (-71.11%)
Mutual labels:  attention-mechanism
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+2697.78%)
Mutual labels:  transformers
Optic-Disc-Unet
Attention U-Net model with post-processing for retinal optic disc segmentation
Stars: ✭ 77 (-14.44%)
Mutual labels:  attention-mechanism
Clue
Chinese Language Understanding Evaluation (CLUE) Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Stars: ✭ 2,425 (+2594.44%)
Mutual labels:  transformers
SA-DL
Sentiment analysis with deep learning models, implemented with TensorFlow and Keras.
Stars: ✭ 35 (-61.11%)
Mutual labels:  attention-mechanism
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (+144.44%)
Mutual labels:  attention-mechanism
NARRE
This is our implementation of NARRE: Neural Attentional Regression with Review-level Explanations
Stars: ✭ 100 (+11.11%)
Mutual labels:  attention-mechanism
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-15.56%)
Mutual labels:  transformers
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+3687.78%)
Mutual labels:  transformers
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-82.22%)
Mutual labels:  attention-mechanism
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+5541.11%)
Mutual labels:  transformers
CogView
Text-to-Image generation. The repo for NeurIPS 2021 paper "CogView: Mastering Text-to-Image Generation via Transformers".
Stars: ✭ 708 (+686.67%)
Mutual labels:  transformers
stconvs2s
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
Stars: ✭ 40 (-55.56%)
Mutual labels:  3d-convolutional-network
KB-ALBERT
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+138.89%)
Mutual labels:  transformers
VQGAN-CLIP-Docker
Zero-shot text-to-image generation with VQGAN+CLIP, Dockerized
Stars: ✭ 58 (-35.56%)
Mutual labels:  transformers
hashformers
Hashformers is a framework for hashtag segmentation with transformers.
Stars: ✭ 18 (-80%)
Mutual labels:  transformers
LIT
[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-12.22%)
Mutual labels:  transformers
1-60 of 352 similar projects