
leaderj1001 / Action-Localization

License: MIT
Action localization on the Atomic Visual Actions (AVA) dataset

Programming Languages

Python

Projects that are alternatives to or similar to Action-Localization

MSAF
Official implementation of the paper "MSAF: Multimodal Split Attention Fusion"
Stars: ✭ 47 (+113.64%)
Mutual labels:  action-recognition, ntu-rgbd
pose2action
Experiments on classifying actions using poses
Stars: ✭ 24 (+9.09%)
Mutual labels:  action-recognition, ntu-rgbd
MS-G3D
[CVPR 2020 Oral] PyTorch implementation of "Disentangling and Unifying Graph Convolutions for Skeleton-Based Action Recognition"
Stars: ✭ 225 (+922.73%)
Mutual labels:  action-recognition
VideoTransformer-pytorch
PyTorch implementation of a collection of scalable Video Transformer Benchmarks.
Stars: ✭ 159 (+622.73%)
Mutual labels:  action-recognition
Keras-for-Co-occurrence-Feature-Learning-from-Skeleton-Data-for-Action-Recognition
Keras implementation of "Co-occurrence Feature Learning from Skeleton Data for Action Recognition"
Stars: ✭ 44 (+100%)
Mutual labels:  action-recognition
Lintel
A Python module to decode video frames directly, using the FFmpeg C API (a usage sketch follows this list).
Stars: ✭ 240 (+990.91%)
Mutual labels:  action-recognition
weakly-action-localization
No description or website provided.
Stars: ✭ 30 (+36.36%)
Mutual labels:  action-recognition
iCAN
[BMVC 2018] iCAN: Instance-Centric Attention Network for Human-Object Interaction Detection
Stars: ✭ 225 (+922.73%)
Mutual labels:  action-recognition
conv3d-video-action-recognition
My experimentation around action recognition in videos. Contains a Keras implementation of the C3D network based on the original paper "Learning Spatiotemporal Features with 3D Convolutional Networks" (Tran et al.), and includes video-processing pipelines coded with the mPyPl package. The model is benchmarked on the popular UCF101 dataset and achieves result…
Stars: ✭ 50 (+127.27%)
Mutual labels:  action-recognition
temporal-binding-network
Implementation of "EPIC-Fusion: Audio-Visual Temporal Binding for Egocentric Action Recognition, ICCV, 2019" in PyTorch
Stars: ✭ 95 (+331.82%)
Mutual labels:  action-recognition
bLVNet-TAM
The official code for the NeurIPS 2019 paper: Quanfu Fan, Richard Chen, Hilde Kuehne, Marco Pistoia, David Cox, "More Is Less: Learning Efficient Video Representations by Temporal Aggregation Modules"
Stars: ✭ 54 (+145.45%)
Mutual labels:  action-recognition
temporal-ssl
Video Representation Learning by Recognizing Temporal Transformations. In ECCV, 2020.
Stars: ✭ 46 (+109.09%)
Mutual labels:  action-recognition
AttentionalPoolingAction
Code/model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+1027.27%)
Mutual labels:  action-recognition
MiCT-Net-PyTorch
Video Recognition using Mixed Convolutional Tube (MiCT) on PyTorch with a ResNet backbone
Stars: ✭ 48 (+118.18%)
Mutual labels:  action-recognition
AlphAction
Spatio-Temporal Action Localization System
Stars: ✭ 221 (+904.55%)
Mutual labels:  action-recognition
MUSES
[CVPR 2021] Multi-shot Temporal Event Localization: a Benchmark
Stars: ✭ 51 (+131.82%)
Mutual labels:  action-recognition
Action recognition zoo
Code for popular action recognition models, verified on the Something-Something dataset.
Stars: ✭ 227 (+931.82%)
Mutual labels:  action-recognition
two-stream-action-recognition-keras
Two-stream CNNs for video action recognition implemented in Keras (see the fusion sketch after this list)
Stars: ✭ 116 (+427.27%)
Mutual labels:  action-recognition
ViCC
[WACV'22] Code repository for the paper "Self-supervised Video Representation Learning with Cross-Stream Prototypical Contrasting", https://arxiv.org/abs/2106.10137.
Stars: ✭ 33 (+50%)
Mutual labels:  action-recognition
sparseprop
Temporal action proposals
Stars: ✭ 46 (+109.09%)
Mutual labels:  action-recognition
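
Usage sketch: Lintel

As referenced in the Lintel entry above, the following is a minimal decoding sketch modeled on the usage pattern in Lintel's README. The video path, frame indices, and 224x224 target size are illustrative assumptions, and the exact keyword signature of loadvid_frame_nums may vary between versions.

    import lintel  # decodes via the FFmpeg C API
    import numpy as np

    NUM_FRAMES, WIDTH, HEIGHT = 32, 224, 224  # assumed target shape

    with open('video.mp4', 'rb') as f:  # placeholder path
        encoded_video = f.read()

    # Decode only the requested frame indices straight from the
    # in-memory bytes, avoiding a full sequential decode.
    frame_nums = list(range(0, NUM_FRAMES * 2, 2))
    decoded = lintel.loadvid_frame_nums(
        encoded_video, frame_nums=frame_nums, width=WIDTH, height=HEIGHT)

    frames = np.frombuffer(decoded, dtype=np.uint8)
    frames = frames.reshape((NUM_FRAMES, HEIGHT, WIDTH, 3))  # N x H x W x RGB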
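
Usage sketch: two-stream fusion

The two-stream-action-recognition-keras entry above follows the classic two-stream design: a spatial CNN over RGB frames and a temporal CNN over stacked optical flow, fused at the softmax level. The sketch below illustrates that late-fusion wiring with deliberately tiny branches; the 20-channel flow stack (10 flow fields x 2 directions) and 101 classes (UCF101) are assumptions for illustration, not that repository's exact model.

    from tensorflow import keras
    from tensorflow.keras import layers

    NUM_CLASSES = 101  # assumed: UCF101

    def stream(input_shape, name):
        # One small convolutional branch; real two-stream models
        # use much deeper backbones (e.g. VGG or ResNet).
        inp = keras.Input(shape=input_shape, name=name)
        x = layers.Conv2D(32, 3, activation='relu')(inp)
        x = layers.MaxPooling2D()(x)
        x = layers.Conv2D(64, 3, activation='relu')(x)
        x = layers.GlobalAveragePooling2D()(x)
        out = layers.Dense(NUM_CLASSES, activation='softmax')(x)
        return inp, out

    rgb_in, rgb_out = stream((224, 224, 3), 'rgb')      # spatial stream: one RGB frame
    flow_in, flow_out = stream((224, 224, 20), 'flow')  # temporal stream: stacked flow

    # Late fusion: average the two softmax distributions.
    fused = layers.Average()([rgb_out, flow_out])
    model = keras.Model([rgb_in, flow_in], fused)
    model.compile(optimizer='adam', loss='categorical_crossentropy')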