
952 open-source projects that are alternatives to, or similar to, Lambda Networks

Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (-63.53%)
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-95.72%)
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-98.6%)
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (-86.04%)
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (-88.51%)
Mutual labels:  attention-mechanism, attention
Pytorch Original Transformer
My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (-72.55%)
Mutual labels:  attention-mechanism, attention
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+9.82%)
Slot Attention
Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-88.78%)
Point Transformer Pytorch
Implementation of the Point Transformer layer, in Pytorch
Stars: ✭ 199 (-86.71%)
Linformer Pytorch
My take on a practical implementation of Linformer for Pytorch.
Stars: ✭ 239 (-84.03%)
lstm-attention
Attention-based bidirectional LSTM for a classification task (ICASSP)
Stars: ✭ 87 (-94.19%)
Mutual labels:  attention, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-91.92%)
Mutual labels:  attention, attention-mechanism
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-98.86%)
Mutual labels:  attention, attention-mechanism
Prediction Flow
Deep-Learning based CTR models implemented by PyTorch
Stars: ✭ 138 (-90.78%)
Mutual labels:  attention-mechanism, attention
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-87.71%)
Mutual labels:  attention-mechanism, attention
Perceiver Pytorch
Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Stars: ✭ 130 (-91.32%)
Routing Transformer
Fully featured implementation of Routing Transformer
Stars: ✭ 149 (-90.05%)
X Transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (-85.91%)
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-87.58%)
Mutual labels:  attention-mechanism, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (-87.37%)
Mutual labels:  attention, attention-mechanism
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-98.66%)
Mutual labels:  attention, attention-mechanism
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-94.72%)
Mutual labels:  attention, attention-mechanism
Timesformer Pytorch
Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
Stars: ✭ 225 (-84.97%)
AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-97.8%)
Mutual labels:  attention, attention-mechanism
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+380.9%)
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-79.56%)
Mutual labels:  attention-mechanism, attention
Alphafold2
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
Stars: ✭ 298 (-80.09%)
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-98.73%)
Mutual labels:  attention, attention-mechanism
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (-69.34%)
Mutual labels:  attention-mechanism, attention
Bottleneck Transformer Pytorch
Implementation of Bottleneck Transformer in Pytorch
Stars: ✭ 408 (-72.75%)
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (-39.35%)
Mutual labels:  attention-mechanism, attention
Image Caption Generator
A neural network that generates captions for an image, using a CNN and an RNN with beam search.
Stars: ✭ 126 (-91.58%)
Mutual labels:  attention-mechanism, attention
Absa keras
Keras implementation of aspect-based sentiment analysis
Stars: ✭ 126 (-91.58%)
Mutual labels:  attention-mechanism, attention
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-96.19%)
Mutual labels:  attention, attention-mechanism
Neat Vision
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Stars: ✭ 213 (-85.77%)
Mutual labels:  attention-mechanism, attention
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository aims to apply GAIN to the fcn8 architecture used for segmentation.
Stars: ✭ 204 (-86.37%)
Mutual labels:  attention-mechanism, attention
Sinkhorn Transformer
Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-89.58%)
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (-87.17%)
Mutual labels:  attention-mechanism, attention
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Stars: ✭ 3,661 (+144.56%)
Linear Attention Transformer
A transformer based on a variant of attention whose complexity is linear with respect to sequence length
Stars: ✭ 205 (-86.31%)
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-97.13%)
Mutual labels:  attention, attention-mechanism
Simplednn
SimpleDNN is a lightweight open-source machine-learning library, written in Kotlin, designed to support relevant neural network architectures in natural language processing tasks
Stars: ✭ 81 (-94.59%)
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (-98.2%)
Mutual labels:  attention, attention-mechanism
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using word-specific models, all-word models, and hierarchical models in TensorFlow
Stars: ✭ 33 (-97.8%)
Mutual labels:  attention, attention-mechanism
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-98.93%)
Mutual labels:  attention, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-72.75%)
Mutual labels:  attention-mechanism, attention
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-95.12%)
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-94.52%)
Mutual labels:  attention-mechanism, attention
Njunmt Tf
An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-93.52%)
Mutual labels:  attention
Conformer
Implementation of the convolutional module from the Conformer paper, for use in Transformers
Stars: ✭ 103 (-93.12%)
Mutual labels:  artificial-intelligence
Rlai Exercises
Exercise Solutions for Reinforcement Learning: An Introduction [2nd Edition]
Stars: ✭ 97 (-93.52%)
Mutual labels:  artificial-intelligence
Papers Literature Ml Dl Rl Ai
Highly cited and useful papers related to machine learning, deep learning, AI, game theory, reinforcement learning
Stars: ✭ 1,341 (-10.42%)
Mutual labels:  artificial-intelligence
Awesome Ai Residency
List of AI Residency Programs
Stars: ✭ 1,653 (+10.42%)
Mutual labels:  artificial-intelligence
Talos
Hyperparameter Optimization for TensorFlow, Keras and PyTorch
Stars: ✭ 1,382 (-7.68%)
Mutual labels:  artificial-intelligence
Jupyterlab Prodigy
🧬 A JupyterLab extension for annotating data with Prodigy
Stars: ✭ 97 (-93.52%)
Mutual labels:  artificial-intelligence
Happy Transformer
A package built on top of Hugging Face's transformers library that makes it easy to utilize state-of-the-art NLP models
Stars: ✭ 97 (-93.52%)
Mutual labels:  artificial-intelligence
Milestones
The Automagic Project Planner
Stars: ✭ 102 (-93.19%)
Mutual labels:  artificial-intelligence
Ai fps
AI system to simulate combat behaviors in an FPS game using Behavior Trees (UE4)
Stars: ✭ 96 (-93.59%)
Mutual labels:  artificial-intelligence
Susi linux
Hardware for SUSI AI https://susi.ai
Stars: ✭ 1,527 (+2%)
Mutual labels:  artificial-intelligence
Blurr
Data transformations for the ML era
Stars: ✭ 96 (-93.59%)
Mutual labels:  artificial-intelligence
1-60 of 952 similar projects