ppriyank / Video-Action-Transformer-Network-Pytorch-

Licence: other
Implementation of the paper Video Action Transformer Network

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Video-Action-Transformer-Network-Pytorch-

Transformers
πŸ€— Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+44139.68%)
Mutual labels:  transformer, pytorch-transformers
trapper
State-of-the-art NLP through transformer models in a modular design and consistent APIs.
Stars: ✭ 28 (-77.78%)
Mutual labels:  transformer, pytorch-transformers
ru-dalle
Generate images from texts. In Russian
Stars: ✭ 1,606 (+1174.6%)
Mutual labels:  transformer
transformer-ls
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+59.52%)
Mutual labels:  transformer
sister
SImple SenTence EmbeddeR
Stars: ✭ 66 (-47.62%)
Mutual labels:  transformer
project-code-py
Leetcode using AI
Stars: ✭ 100 (-20.63%)
Mutual labels:  transformer
TokenLabeling
Pytorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Stars: ✭ 385 (+205.56%)
Mutual labels:  transformer
ViTs-vs-CNNs
[NeurIPS 2021]: Are Transformers More Robust Than CNNs? (Pytorch implementation & checkpoints)
Stars: ✭ 145 (+15.08%)
Mutual labels:  transformer
dingo-serializer-switch
A middleware to switch fractal serializers in dingo
Stars: ✭ 49 (-61.11%)
Mutual labels:  transformer
TransBTS
This repo provides the official code for : 1) TransBTS: Multimodal Brain Tumor Segmentation Using Transformer (https://arxiv.org/abs/2103.04430) , accepted by MICCAI2021. 2) TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images(https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (+101.59%)
Mutual labels:  transformer
Transformer Temporal Tagger
Code and data from the paper BERT Got a Date: Introducing Transformers to Temporal Tagging
Stars: ✭ 55 (-56.35%)
Mutual labels:  transformer
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (-61.11%)
Mutual labels:  transformer
Cross-lingual-Summarization
Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention
Stars: ✭ 28 (-77.78%)
Mutual labels:  transformer
les-military-mrc-rank7
θŽ±ζ–―ζ―οΌšε…¨ε›½η¬¬δΊŒε±Šβ€œε†›δΊ‹ζ™Ίθƒ½ζœΊε™¨ι˜…θ―»β€ζŒ‘ζˆ˜θ΅› - Rank7 θ§£ε†³ζ–Ήζ‘ˆ
Stars: ✭ 37 (-70.63%)
Mutual labels:  transformer
php-serializer
Serialize PHP variables, including objects, in any format. Support to unserialize it too.
Stars: ✭ 47 (-62.7%)
Mutual labels:  transformer
VideoTransformer-pytorch
PyTorch implementation of a collections of scalable Video Transformer Benchmarks.
Stars: ✭ 159 (+26.19%)
Mutual labels:  transformer
fastT5
⚑ boost inference speed of T5 models by 5x & reduce the model size by 3x.
Stars: ✭ 421 (+234.13%)
Mutual labels:  transformer
cape
Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (-76.98%)
Mutual labels:  transformer
sparql-transformer
A more handy way to use SPARQL data in your web app
Stars: ✭ 38 (-69.84%)
Mutual labels:  transformer
text simplification
Text Simplification Model based on Encoder-Decoder (includes Transformer and Seq2Seq) model.
Stars: ✭ 66 (-47.62%)
Mutual labels:  transformer

Video-Action-Transformer-Network-Pytorch-

PyTorch and TensorFlow implementation of the paper Video Action Transformer Network by Rohit Girdhar, Joao Carreira, Carl Doersch, and Andrew Zisserman.

Re-tasked video transformer (uses a ResNet as the base network). transformer_v1.py is closer to a standard transformer, while transformer.py is more faithful to what the paper describes.

Usage:

from transformer_v1 import Semi_Transformer
model = Semi_Transformer(num_classes=num_classes, num_frames=max_seq_len)
outputs, features = model(imgs)  # outputs: classification-layer logits (use cross-entropy loss)
                                 # features: the video embedding

# -------------------- or --------------------
from transformer_v2 import Semi_Transformer
model = Semi_Transformer(num_classes=625, seq_len=max_seq_len)
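A minimal sketch of how the two returned tensors might be consumed, per the comments above (cross-entropy on `outputs`, `features` as a video embedding). The shapes, class count, and embedding dimension below are illustrative stand-ins, not values from the repo; in practice both tensors would come from `Semi_Transformer(imgs)`.

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes: 4 clips per batch, 10 action classes, 2048-d embedding.
batch_size, num_classes, feat_dim = 4, 10, 2048

# Stand-ins for the two outputs of model(imgs).
outputs = torch.randn(batch_size, num_classes)  # classification-layer logits
features = torch.randn(batch_size, feat_dim)    # video embedding
labels = torch.randint(0, num_classes, (batch_size,))

# Classification: cross-entropy loss on the logits.
loss = F.cross_entropy(outputs, labels)

# Embedding: e.g. L2-normalize for retrieval or clip-similarity use.
video_emb = F.normalize(features, dim=1)
```

The same pattern applies to both transformer_v1 and transformer_v2; only the constructor keyword (`num_frames` vs `seq_len`) differs.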

If you find any discrepancy, please raise an issue. If anyone has been able to reproduce the paper's results, kindly help with this issue and, if possible, mention the changes that still need to be made.