
HopefulRational / DeepCaps-PyTorch

Licence: other
PyTorch Implementation of "DeepCaps: Going Deeper with Capsule Networks" by Jathushan Rajasegaran et al.

Programming Languages

Python

Projects that are alternatives of or similar to DeepCaps-PyTorch

Capsule Networks
A PyTorch implementation of the NIPS 2017 paper "Dynamic Routing Between Capsules".
Stars: ✭ 1,618 (+6123.08%)
Mutual labels:  capsule-networks
Speech-Command-Recognition-with-Capsule-Network
Speech command recognition with capsule network & various NNs / KWS on Google Speech Command Dataset.
Stars: ✭ 20 (-23.08%)
Mutual labels:  capsule-networks
cvaecaposr
Code for the Paper: "Conditional Variational Capsule Network for Open Set Recognition", Y. Guo, G. Camporese, W. Yang, A. Sperduti, L. Ballan, arXiv:2104.09159, 2021.
Stars: ✭ 29 (+11.54%)
Mutual labels:  capsule-networks
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-19.23%)
Mutual labels:  capsule-networks
Kapsul-Aglari-ile-Isaret-Dili-Tanima
Recognition of Sign Language using Capsule Networks
Stars: ✭ 42 (+61.54%)
Mutual labels:  capsule-networks
SegCaps
A clone of the original SegCaps source code, with enhancements for the MS COCO dataset.
Stars: ✭ 62 (+138.46%)
Mutual labels:  capsule-networks
glcapsnet
Global-Local Capsule Network (GLCapsNet) is a capsule-based architecture able to provide context-based eye fixation prediction for several autonomous driving scenarios, while offering interpretability both globally and locally.
Stars: ✭ 33 (+26.92%)
Mutual labels:  capsule-networks

DeepCaps PyTorch

PyTorch Implementation of "DeepCaps: Going Deeper with Capsule Networks" by J. Rajasegaran et al. [CVPR 2019]
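As background, the "squash" nonlinearity from "Dynamic Routing Between Capsules" (Sabour et al., NIPS 2017), which DeepCaps builds on, can be sketched in PyTorch. This is an illustrative sketch, not code taken from this repository; the function name, `dim` argument, and epsilon handling are assumptions:

```python
import torch

def squash(s, dim=-1, eps=1e-8):
    """Capsule squash nonlinearity (illustrative sketch).

    Shrinks short vectors toward zero length and long vectors toward
    unit length while preserving direction, so a capsule's output norm
    can be read as a presence probability in [0, 1).
    """
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)  # squared vector norms
    scale = sq_norm / (1.0 + sq_norm)              # length-dependent scaling
    # eps guards against division by zero for all-zero capsule vectors
    return scale * s / torch.sqrt(sq_norm + eps)
```

Applied to a capsule output such as `torch.tensor([[3.0, 4.0]])` (norm 5), the result keeps the same direction but has norm 25/26 ≈ 0.96, just under 1.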

