Linear Attention Transformer: Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (-31.21%)
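The idea behind linear attention (an illustrative numpy sketch of the general technique, not code from this repo) is to apply a positive feature map φ to the queries and keys so that, by associativity, φ(Q)(φ(K)ᵀV) can be computed right-to-left in O(n) without ever materializing the n × n score matrix:

```python
import numpy as np

def linear_attention(q, k, v, feature_map=lambda x: np.maximum(x, 0.0) + 1e-6):
    """O(n) attention: softmax(QK^T)V is replaced by phi(Q) (phi(K)^T V),
    computed right-to-left so the n x n score matrix never exists."""
    q, k = feature_map(q), feature_map(k)
    kv = k.T @ v                 # (d, d_v): independent of sequence length n
    z = q @ k.sum(axis=0)        # per-query normalizer, shape (n,)
    return (q @ kv) / z[:, None]

n, d = 128, 16
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, n, d))
out = linear_attention(q, k, v)
print(out.shape)  # (128, 16)
```

The feature map here (ReLU plus a small constant) is one common choice; different linear-attention papers use different maps.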
Global Self Attention Network: A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-78.52%)
Reformer Pytorch: Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+451.68%)
Isab Pytorch: An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-92.95%)
Vit Pytorch: Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+2315.77%)
Simplednn: SimpleDNN is a lightweight, open-source machine learning library written in Kotlin, designed to support relevant neural network architectures in natural language processing tasks
Stars: ✭ 81 (-72.82%)
Perceiver Pytorch: Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-56.38%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
Stars: ✭ 73 (-75.5%)
Dalle Pytorch: Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+1128.52%)
Sinkhorn Transformer: Practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-47.65%)
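Sinkhorn attention is built on Sinkhorn normalization, which alternately rescales the rows and columns of a matrix so it converges toward a doubly stochastic "soft permutation" matrix. A minimal numpy sketch of that core routine (an illustration of the general idea, not code from the repo):

```python
import numpy as np

def sinkhorn(logits, n_iters=20):
    # Alternate row and column normalization in log space (for numerical
    # stability); the result approaches a doubly stochastic matrix.
    z = logits
    for _ in range(n_iters):
        z = z - np.logaddexp.reduce(z, axis=1, keepdims=True)  # rows sum to 1
        z = z - np.logaddexp.reduce(z, axis=0, keepdims=True)  # cols sum to 1
    return np.exp(z)

rng = np.random.default_rng(0)
p = sinkhorn(rng.normal(size=(8, 8)))
print(p.shape)  # (8, 8)
```

In Sparse Sinkhorn Attention this kind of relaxed permutation is used to learn a reordering of sequence blocks before attending locally.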
Slot Attention: Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-43.62%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-29.87%)
Timesformer Pytorch: Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
Stars: ✭ 225 (-24.5%)
Performer Pytorch: An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+83.22%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+402.35%)
X Transformers: A simple but complete full-attention transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (-29.19%)
Linformer Pytorch: My take on a practical implementation of Linformer for Pytorch.
Stars: ✭ 239 (-19.8%)
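Linformer's trick is to project the keys and values along the sequence dimension from length n down to a fixed k, so the score matrix is n × k rather than n × n. A hedged numpy sketch of single-head Linformer-style attention (in the real model the projections E and F are learned; here they are random for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(q, k, v, E, F):
    # Project keys and values along the sequence axis: (n, d) -> (kp, d),
    # so attention costs O(n * kp) instead of O(n^2).
    k_proj = E @ k
    v_proj = F @ v
    scores = softmax(q @ k_proj.T / np.sqrt(q.shape[-1]))
    return scores @ v_proj

n, d, kp = 256, 32, 16
rng = np.random.default_rng(1)
q, k, v = rng.normal(size=(3, n, d))
E = rng.normal(size=(kp, n)) / np.sqrt(n)  # learned in the real model
F = rng.normal(size=(kp, n)) / np.sqrt(n)
out = linformer_attention(q, k, v, E, F)
print(out.shape)  # (256, 32)
```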
L2c: Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (-12.08%)
Dalle Mtf: OpenAI's DALL-E for large-scale training in Mesh TensorFlow.
Stars: ✭ 250 (-16.11%)
Atlas: An Open Source, Self-Hosted Platform For Applied Deep Learning Development
Stars: ✭ 259 (-13.09%)
Dreamerv2: Mastering Atari with Discrete World Models
Stars: ✭ 287 (-3.69%)
Apc Vision Toolbox: MIT-Princeton Vision Toolbox for the Amazon Picking Challenge 2016 - RGB-D ConvNet-based object segmentation and 6D object pose estimation.
Stars: ✭ 277 (-7.05%)
Es Dev Stack: An on-premises, bare-metal solution for deploying GPU-powered applications in containers
Stars: ✭ 257 (-13.76%)
Shufflenet: ShuffleNet in PyTorch. Based on https://arxiv.org/abs/1707.01083
Stars: ✭ 262 (-12.08%)
Multi Scale Attention: Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"
Stars: ✭ 281 (-5.7%)
Polyaxon: Machine Learning Platform for Kubernetes (MLOps tools for experimentation and automation)
Stars: ✭ 2,966 (+895.3%)
Gophersat: a SAT solver in Go
Stars: ✭ 300 (+0.67%)
Transformer: A Pytorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-9.06%)
Iamdinosaur: 🦄 An artificial intelligence that teaches Google's Dinosaur to jump over cacti
Stars: ✭ 2,767 (+828.52%)
Ai Job Notes: A job-hunting guide for AI algorithm roles (covering preparation strategies, coding-problem guides, internal referrals, a list of AI companies, and more)
Stars: ✭ 3,191 (+970.81%)
Da Rnn: 📃 **Unofficial** PyTorch Implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (-14.09%)
Pyswip: PySwip is a bridge between Python and SWI-Prolog that lets you query SWI-Prolog from your Python programs. It features an (incomplete) SWI-Prolog foreign-language interface, a utility class that makes querying Prolog easy, and a Pythonic interface.
Stars: ✭ 276 (-7.38%)
Amazing Python Scripts: 🚀 Curated collection of amazing Python scripts, from basics to advanced, including automation task scripts.
Stars: ✭ 229 (-23.15%)
Fakenewscorpus: A dataset of millions of news articles scraped from a curated list of data sources.
Stars: ✭ 255 (-14.43%)
Olivia: 💁♀️ Your new best friend, powered by an artificial neural network
Stars: ✭ 3,114 (+944.97%)
Articutapi: API of Articut, a Chinese word segmenter with semantic part-of-speech tagging. Word segmentation is the foundation of Chinese-language information processing. Articut uses no machine learning and no data models; relying only on modern vernacular Chinese grammar rules, it achieves an F1-measure above 94% and recall above 96% on SIGHAN 2005.
Stars: ✭ 252 (-15.44%)
Mirnet: Official repository for "Learning Enriched Features for Real Image Restoration and Enhancement" (ECCV 2020). SOTA results for image denoising, super-resolution, and image enhancement.
Stars: ✭ 247 (-17.11%)
Graphbrain: Language, Knowledge, Cognition
Stars: ✭ 294 (-1.34%)
Strips: AI Automated Planning with STRIPS and PDDL in Node.js
Stars: ✭ 272 (-8.72%)
galerkin-transformer: Galerkin Transformer (NeurIPS 2021), linear attention without softmax
Stars: ✭ 111 (-62.75%)
linformer: Implementation of Linformer for Pytorch
Stars: ✭ 119 (-60.07%)
transganformer: Implementation of TransGanFormer, an all-attention GAN that combines findings from the recent GanFormer and TransGAN papers
Stars: ✭ 137 (-54.03%)
Awesome Blockchain Ai: A curated list of Blockchain projects for Artificial Intelligence and Machine Learning
Stars: ✭ 283 (-5.03%)
Caffe Hrt: Heterogeneous Run Time version of Caffe. It adds heterogeneous computing capabilities to Caffe, using a heterogeneous computing infrastructure framework to speed up deep learning on Arm-based heterogeneous embedded platforms, while retaining all the features of the original Caffe architecture so users can deploy their applications seamlessly.
Stars: ✭ 271 (-9.06%)
ADL2019: Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-93.29%)
vista-net: Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (-77.52%)
Dreamer: "Dream to Control: Learning Behaviors by Latent Imagination"
Stars: ✭ 269 (-9.73%)