Isab Pytorch: An implementation of the (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-90.67%)
Global Self Attention Network: A Pytorch implementation of Global Self-Attention Network, a fully-attentional backbone for vision tasks
Stars: ✭ 64 (-71.56%)
Reformer Pytorch: Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+630.67%)
Simplednn: SimpleDNN is a lightweight, open-source machine learning library written in Kotlin, designed to support relevant neural network architectures in natural language processing tasks
Stars: ✭ 81 (-64%)
Sinkhorn Transformer: A practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-30.67%)
Alphafold2: To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Stars: ✭ 298 (+32.44%)
Se3 Transformer Pytorch: Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with the eventual Alphafold2 replication.
Stars: ✭ 73 (-67.56%)
Perceiver Pytorch: Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-42.22%)
Dalle Pytorch: Implementation / replication of DALL-E, OpenAI's text-to-image Transformer, in Pytorch
Stars: ✭ 3,661 (+1527.11%)
Linear Attention Transformer: A Transformer based on a variant of attention with linear complexity with respect to sequence length
Stars: ✭ 205 (-8.89%)
Slot Attention: Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-25.33%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-7.11%)
Vit Pytorch: Implementation of the Vision Transformer, a simple way to achieve SOTA in vision classification with only a single Transformer encoder, in Pytorch
Stars: ✭ 7,199 (+3099.56%)
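The "single transformer encoder" approach of ViT rests on one preprocessing step: cut the image into fixed-size patches and flatten each patch into a token vector, after which a standard encoder takes over. A minimal NumPy sketch of that tokenization step (the `patchify` helper is illustrative, not the repository's API):

```python
import numpy as np

def patchify(img, p):
    # Split an (H, W, C) image into non-overlapping p x p patches,
    # each flattened to a vector of length p*p*C -- ViT's "tokens".
    h, w, c = img.shape
    patches = img.reshape(h // p, p, w // p, p, c)
    patches = patches.transpose(0, 2, 1, 3, 4).reshape(-1, p * p * c)
    return patches

img = np.zeros((32, 32, 3))        # toy 32x32 RGB image
tokens = patchify(img, 8)          # 4x4 grid of 8x8 patches
print(tokens.shape)  # (16, 192)
```

In the full model each such token is then linearly projected to the embedding dimension and given a positional embedding before entering the encoder.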
Performer Pytorch: An implementation of Performer, a linear-attention-based Transformer, in Pytorch
Stars: ✭ 546 (+142.67%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+565.33%)
X Transformers: A simple but complete full-attention Transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (-6.22%)
Linformer Pytorch: My take on a practical implementation of Linformer for Pytorch.
Stars: ✭ 239 (+6.22%)
transganformer: Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (-39.11%)
vista-net: Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (-70.22%)
Dreamer: Dream to Control: Learning Behaviors by Latent Imagination
Stars: ✭ 269 (+19.56%)
Polyaxon: A machine learning platform for Kubernetes (MLOps tools for experimentation and automation)
Stars: ✭ 2,966 (+1218.22%)
Attention: Code for several different attention mechanisms
Stars: ✭ 17 (-92.44%)
linformer: Implementation of Linformer for Pytorch
Stars: ✭ 119 (-47.11%)
L2c: Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (+16.44%)
ADL2019: Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-91.11%)
Caffe Hrt: A Heterogeneous Run Time version of Caffe. It adds heterogeneous capabilities to Caffe, using a heterogeneous computing infrastructure framework to speed up deep learning on Arm-based heterogeneous embedded platforms. It also retains all the features of the original Caffe architecture, so users can deploy their applications seamlessly.
Stars: ✭ 271 (+20.44%)
Dalle Mtf: OpenAI's DALL-E for large-scale training in mesh-tensorflow.
Stars: ✭ 250 (+11.11%)
pynmt: A simple and complete Pytorch implementation of a neural machine translation system
Stars: ✭ 13 (-94.22%)
Pyswip: PySwip is a Python / SWI-Prolog bridge that enables querying SWI-Prolog from your Python programs. It features an (incomplete) SWI-Prolog foreign language interface, a utility class that makes querying with Prolog easy, and a Pythonic interface.
Stars: ✭ 276 (+22.67%)
co-attention: Pytorch implementation of "Dynamic Coattention Networks For Question Answering"
Stars: ✭ 54 (-76%)
Atlas: An open-source, self-hosted platform for applied deep learning development
Stars: ✭ 259 (+15.11%)
MoChA-pytorch: PyTorch implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Stars: ✭ 65 (-71.11%)
Gophernotes: The Go kernel for Jupyter notebooks and nteract.
Stars: ✭ 3,100 (+1277.78%)
ttslearn: Library for "Text-to-Speech with Python" (Pythonで学ぶ音声合成)
Stars: ✭ 158 (-29.78%)
Es Dev Stack: An on-premises, bare-metal solution for deploying GPU-powered applications in containers
Stars: ✭ 257 (+14.22%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-82.22%)
SelfAttentive: Implementation of "A Structured Self-attentive Sentence Embedding"
Stars: ✭ 107 (-52.44%)
Apc Vision Toolbox: The MIT-Princeton vision toolbox for the Amazon Picking Challenge 2016 - RGB-D ConvNet-based object segmentation and 6D object pose estimation.
Stars: ✭ 277 (+23.11%)
Olivia: 💁♀️ Your new best friend, powered by an artificial neural network
Stars: ✭ 3,114 (+1284%)
Awesome Ai Awesomeness: A curated list of awesome awesomeness about artificial intelligence
Stars: ✭ 268 (+19.11%)
Iamdinosaur: 🦄 An artificial intelligence that teaches Google's Dinosaur to jump over cacti
Stars: ✭ 2,767 (+1129.78%)
Image-Caption: Using an LSTM or Transformer to solve image captioning in Pytorch
Stars: ✭ 36 (-84%)
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in Pytorch
Stars: ✭ 347 (+54.22%)
Ai Job Notes: A job-hunting guide for AI algorithm positions (covering preparation strategies, coding-interview problem guides, referrals, a list of AI companies, and more)
Stars: ✭ 3,191 (+1318.22%)
QuantumForest: A fast differentiable forest library with the advantages of both decision trees and neural networks
Stars: ✭ 63 (-72%)
PAM: [TPAMI 2020] Parallax Attention for Unsupervised Stereo Correspondence Learning
Stars: ✭ 62 (-72.44%)
Shogun: Shōgun
Stars: ✭ 2,859 (+1170.67%)
Da Rnn: 📃 **Unofficial** PyTorch implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (+13.78%)
Video-Cap: 🎬 Video captioning: ICCV '15 paper implementation
Stars: ✭ 44 (-80.44%)
Amazing Python Scripts: 🚀 A curated collection of amazing Python scripts, from basics to advanced, including automation task scripts.
Stars: ✭ 229 (+1.78%)