
807 Open source projects that are alternatives of or similar to Timesformer Pytorch

Point Transformer Pytorch
Implementation of the Point Transformer layer, in Pytorch
Stars: ✭ 199 (-11.56%)
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-90.67%)
Bottleneck Transformer Pytorch
Implementation of Bottleneck Transformer in Pytorch
Stars: ✭ 408 (+81.33%)
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-71.56%)
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+630.67%)
Simplednn
SimpleDNN is a lightweight open-source machine learning library written in Kotlin, designed to support the neural network architectures most relevant to natural language processing tasks
Stars: ✭ 81 (-64%)
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (-33.78%)
Sinkhorn Transformer
Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-30.67%)
Alphafold2
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Stars: ✭ 298 (+32.44%)
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-67.56%)
Perceiver Pytorch
Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-42.22%)
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+1527.11%)
Linear Attention Transformer
Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (-8.89%)
Slot Attention
Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-25.33%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-7.11%)
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+3099.56%)
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+142.67%)
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+565.33%)
X Transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (-6.22%)
Linformer Pytorch
My take on a practical implementation of Linformer for Pytorch.
Stars: ✭ 239 (+6.22%)
transganformer
Implementation of TransGanFormer, an all-attention GAN that combines findings from the recent GansFormer and TransGAN papers
Stars: ✭ 137 (-39.11%)
Mutual labels:  attention-mechanism
Deeplearningnotes
Hand-derived notes for the Deep Learning "Flower Book" (the Goodfellow et al. textbook)
Stars: ✭ 257 (+14.22%)
Mutual labels:  artificial-intelligence
vista-net
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (-70.22%)
Mutual labels:  attention-mechanism
attention-guided-sparsity
Attention-Based Guided Structured Sparsity of Deep Neural Networks
Stars: ✭ 26 (-88.44%)
Mutual labels:  attention-mechanism
Dreamer
Dream to Control: Learning Behaviors by Latent Imagination
Stars: ✭ 269 (+19.56%)
Mutual labels:  artificial-intelligence
Polyaxon
Machine Learning Platform for Kubernetes (MLOps tools for experimentation and automation)
Stars: ✭ 2,966 (+1218.22%)
Mutual labels:  artificial-intelligence
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-92.44%)
Mutual labels:  attention-mechanism
linformer
Implementation of Linformer for Pytorch
Stars: ✭ 119 (-47.11%)
Mutual labels:  attention-mechanism
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (+16.44%)
Mutual labels:  artificial-intelligence
ADL2019
Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-91.11%)
Mutual labels:  attention-mechanism
Caffe Hrt
Heterogeneous Run Time version of Caffe. Adds heterogeneous computing capabilities to Caffe, using a heterogeneous computing infrastructure framework to speed up deep learning on Arm-based heterogeneous embedded platforms. It also retains all the features of the original Caffe architecture, so users can deploy their applications seamlessly.
Stars: ✭ 271 (+20.44%)
Mutual labels:  artificial-intelligence
Retinal-Disease-Diagnosis-With-Residual-Attention-Networks
Using Residual Attention Networks to diagnose retinal diseases in medical images
Stars: ✭ 14 (-93.78%)
Mutual labels:  attention-mechanism
Dalle Mtf
OpenAI's DALL-E for large-scale training in Mesh TensorFlow.
Stars: ✭ 250 (+11.11%)
Mutual labels:  artificial-intelligence
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-94.22%)
Mutual labels:  attention-mechanism
Pyswip
PySwip is a Python–SWI-Prolog bridge that lets you query SWI-Prolog from your Python programs. It features a (partial) SWI-Prolog foreign-language interface, a utility class that makes Prolog queries easy, and a Pythonic interface.
Stars: ✭ 276 (+22.67%)
Mutual labels:  artificial-intelligence
co-attention
Pytorch implementation of "Dynamic Coattention Networks For Question Answering"
Stars: ✭ 54 (-76%)
Mutual labels:  attention-mechanism
Atlas
An Open Source, Self-Hosted Platform For Applied Deep Learning Development
Stars: ✭ 259 (+15.11%)
Mutual labels:  artificial-intelligence
keras attention
🔖 An Attention Layer in Keras
Stars: ✭ 43 (-80.89%)
Mutual labels:  attention-mechanism
MoChA-pytorch
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Stars: ✭ 65 (-71.11%)
Mutual labels:  attention-mechanism
Gophernotes
The Go kernel for Jupyter notebooks and nteract.
Stars: ✭ 3,100 (+1277.78%)
Mutual labels:  artificial-intelligence
Machine Learning And Ai In Trading
Applying machine learning and AI algorithms to trading for better performance and lower standard deviation.
Stars: ✭ 258 (+14.67%)
Mutual labels:  artificial-intelligence
CompareModels TRECQA
Compare six baseline deep learning models on TrecQA
Stars: ✭ 61 (-72.89%)
Mutual labels:  attention-mechanism
ttslearn
ttslearn: companion library for the book "Text-to-Speech with Python" (Pythonで学ぶ音声合成)
Stars: ✭ 158 (-29.78%)
Mutual labels:  attention-mechanism
Es Dev Stack
An on-premises, bare-metal solution for deploying GPU-powered applications in containers
Stars: ✭ 257 (+14.22%)
Mutual labels:  artificial-intelligence
Transformer-in-Transformer
An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-82.22%)
Mutual labels:  attention-mechanism
SelfAttentive
Implementation of A Structured Self-attentive Sentence Embedding
Stars: ✭ 107 (-52.44%)
Mutual labels:  attention-mechanism
Apc Vision Toolbox
MIT-Princeton Vision Toolbox for the Amazon Picking Challenge 2016 - RGB-D ConvNet-based object segmentation and 6D object pose estimation.
Stars: ✭ 277 (+23.11%)
Mutual labels:  artificial-intelligence
Olivia
💁‍♀️Your new best friend powered by an artificial neural network
Stars: ✭ 3,114 (+1284%)
Mutual labels:  artificial-intelligence
Awesome Ai Awesomeness
A curated list of awesome awesomeness about artificial intelligence
Stars: ✭ 268 (+19.11%)
Mutual labels:  artificial-intelligence
Iamdinosaur
🦄 An artificial intelligence that teaches Google's dinosaur to jump over cacti
Stars: ✭ 2,767 (+1129.78%)
Mutual labels:  artificial-intelligence
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-84%)
Mutual labels:  attention-mechanism
nuwa-pytorch
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Stars: ✭ 347 (+54.22%)
Mutual labels:  attention-mechanism
Ai Job Notes
A job-hunting guide for AI algorithm positions (covering preparation strategies, coding-interview guides, referrals, a list of AI companies, and more)
Stars: ✭ 3,191 (+1318.22%)
Mutual labels:  artificial-intelligence
QuantumForest
Fast Differentiable Forest lib with the advantages of both decision trees and neural networks
Stars: ✭ 63 (-72%)
Mutual labels:  attention-mechanism
PAM
[TPAMI 2020] Parallax Attention for Unsupervised Stereo Correspondence Learning
Stars: ✭ 62 (-72.44%)
Mutual labels:  attention-mechanism
Shogun
Shōgun
Stars: ✭ 2,859 (+1170.67%)
Mutual labels:  artificial-intelligence
Da Rnn
📃 **Unofficial** PyTorch Implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (+13.78%)
Mutual labels:  attention-mechanism
Video-Cap
🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (-80.44%)
Mutual labels:  attention-mechanism
attention-mechanism-keras
Attention mechanism layers for Keras, usable like Dense and RNN layers
Stars: ✭ 19 (-91.56%)
Mutual labels:  attention-mechanism
Amazing Python Scripts
🚀 Curated collection of amazing Python scripts, from basics to advanced, including automation task scripts.
Stars: ✭ 229 (+1.78%)
Mutual labels:  artificial-intelligence
1-60 of 807 similar projects
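Most of the repositories tagged "attention-mechanism" above are variations on the same core primitive. For orientation only, here is a minimal NumPy sketch of single-head scaled dot-product attention; it is an illustrative generic implementation, not code taken from any listed project:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d)) V for a single attention head."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # subtract max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1 (softmax)
    return weights @ v                             # (n_q, d_v) weighted values

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))   # 4 query vectors of dimension 8
k = rng.standard_normal((6, 8))   # 6 key vectors
v = rng.standard_normal((6, 8))   # 6 value vectors
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

The linear-attention projects in the list (Performer, Linformer, Linear Attention Transformer) replace the explicit `(n_q, n_k)` score matrix with approximations that avoid its quadratic cost.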