Dalle Pytorch: Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in Pytorch
Stars: ✭ 3,661 (-49.15%)
Reformer Pytorch: Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (-77.16%)
Global Self Attention Network: A Pytorch implementation of Global Self-Attention Network, a fully attentional backbone for vision tasks
Stars: ✭ 64 (-99.11%)
Lightnet: 🌓 Bringing pjreddie's DarkNet out of the shadows #yolo
Stars: ✭ 322 (-95.53%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-96.82%)
Sianet: An easy-to-use C# deep learning library with CUDA/OpenCL support
Stars: ✭ 353 (-95.1%)
Meme Generator: MemeGen is a web application that takes an image as input and generates a meme for the user in one click.
Stars: ✭ 57 (-99.21%)
The Third Eye: An AI-based application that identifies currency and gives audio feedback.
Stars: ✭ 63 (-99.12%)
Malware Classification: Towards Building an Intelligent Anti-Malware System: A Deep Learning Approach using Support Vector Machine for Malware Classification
Stars: ✭ 88 (-98.78%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (-79.21%)
Transfer Learning Suite: Transfer Learning Suite in Keras. Easily perform transfer learning with any built-in Keras image classification model!
Stars: ✭ 212 (-97.06%)
Linformer Pytorch: My take on a practical implementation of Linformer for Pytorch.
Stars: ✭ 239 (-96.68%)
Alphafold2: To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Stars: ✭ 298 (-95.86%)
RETRO-pytorch: Implementation of RETRO, Deepmind's retrieval-based attention net, in Pytorch
Stars: ✭ 473 (-93.43%)
Isab Pytorch: An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-99.71%)
Hardhat Detector: A convolutional neural network script that detects whether an individual is wearing a hardhat.
Stars: ✭ 41 (-99.43%)
Simplednn: SimpleDNN is a lightweight, open-source machine learning library written in Kotlin, designed to support relevant neural network architectures for natural language processing tasks
Stars: ✭ 81 (-98.87%)
Iresnet: Improved Residual Networks (https://arxiv.org/pdf/2004.04989.pdf)
Stars: ✭ 163 (-97.74%)
Transformer In Transformer: Implementation of Transformer in Transformer, pixel-level attention paired with patch-level attention for image classification, in Pytorch
Stars: ✭ 176 (-97.56%)
Slot Attention: Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-97.67%)
X Transformers: A simple but complete full-attention transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (-97.07%)
Linear Attention Transformer: Transformer based on a variant of attention whose complexity is linear with respect to sequence length
Stars: ✭ 205 (-97.15%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-97.1%)
STAM-pytorch: Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (-98.49%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (-98.57%)
Image classifier: CNN image classifier implemented in a Keras notebook 🖼️.
Stars: ✭ 139 (-98.07%)
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in Pytorch
Stars: ✭ 347 (-95.18%)
Artificio: Deep Learning Computer Vision Algorithms for Real-World Use
Stars: ✭ 326 (-95.47%)
Sinkhorn Transformer: Practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-97.83%)
Performer Pytorch: An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (-92.42%)
Pba: Efficient Learning of Augmentation Policy Schedules
Stars: ✭ 461 (-93.6%)
Computervision Recipes: Best practices, code samples, and documentation for computer vision.
Stars: ✭ 8,214 (+14.1%)
Caer: High-performance vision library in Python. Scale your research, not boilerplate.
Stars: ✭ 452 (-93.72%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for equivariant self-attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (-98.99%)
pytorch-vit: An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (-96.53%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-99.44%)
transganformer: Implementation of TransGanFormer, an all-attention GAN that combines findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (-98.1%)
Perceiver Pytorch: Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-98.19%)
uniformer-pytorch: Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, debuted at ICLR 2022
Stars: ✭ 90 (-98.75%)
HugsVision: HugsVision is an easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (-97.86%)
Timesformer Pytorch: Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
Stars: ✭ 225 (-96.87%)
Hub: A library for transfer learning by reusing parts of TensorFlow models.
Stars: ✭ 3,007 (-58.23%)
Strips: AI Automated Planning with STRIPS and PDDL in Node.js
Stars: ✭ 272 (-96.22%)
Atlas: An Open Source, Self-Hosted Platform For Applied Deep Learning Development
Stars: ✭ 259 (-96.4%)
Awesome Computer Vision Models: A list of popular deep learning models related to classification, segmentation, and detection problems
Stars: ✭ 278 (-96.14%)
Es Dev Stack: An on-premises, bare-metal solution for deploying GPU-powered applications in containers
Stars: ✭ 257 (-96.43%)
Iamdinosaur: 🦄 An artificial intelligence that teaches Google's Dinosaur to jump over cacti
Stars: ✭ 2,767 (-61.56%)
Caffe Hrt: Heterogeneous Run Time version of Caffe. Adds heterogeneous capabilities to Caffe, using a heterogeneous computing infrastructure framework to speed up deep learning on Arm-based heterogeneous embedded platforms. It retains all the features of the original Caffe architecture, so users can deploy their applications seamlessly.
Stars: ✭ 271 (-96.24%)
Ai Job Notes: A job-hunting guide for AI algorithm roles (covering preparation strategies, coding-problem guides, internal referrals, a list of AI companies, and more)
Stars: ✭ 3,191 (-55.67%)
Da Rnn: 📃 **Unofficial** PyTorch implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (-96.44%)
Deepkit Ml: The collaborative, real-time, open-source machine learning devtool and training suite: experiment execution, tracking, and debugging, with server and project management tools.
Stars: ✭ 286 (-96.03%)
Dreamer: Dream to Control: Learning Behaviors by Latent Imagination
Stars: ✭ 269 (-96.26%)