
67 open source projects by lucidrains

51. h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
52. egnn-pytorch
Implementation of E(n)-Equivariant Graph Neural Networks, in Pytorch
53. RETRO-pytorch
Implementation of RETRO, DeepMind's retrieval-based attention network, in Pytorch
54. glom-pytorch
An attempt at an implementation of GLOM, Geoffrey Hinton's idea that integrates concepts from neural fields, top-down-bottom-up processing, and attention (consensus between columns) for emergent part-whole hierarchies from data
55. axial-attention
Implementation of Axial attention - attending to multi-dimensional data efficiently
56. mlp-mixer-pytorch
An All-MLP solution for Vision, from Google AI
57. rotary-embedding-torch
Implementation of Rotary Embeddings, from the RoFormer paper, in Pytorch (a brief usage sketch follows this list)
59. En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
60. uniformer-pytorch
Implementation of UniFormer, a simple attention and 3D convolution net that achieved SOTA on a number of video classification tasks, presented at ICLR 2022
61. res-mlp-pytorch
Implementation of ResMLP, an all MLP solution to image classification, in Pytorch
62. g-mlp-pytorch
Implementation of gMLP, an all-MLP replacement for Transformers, in Pytorch
63. progen
Implementation and replication of ProGen, Language Modeling for Protein Generation, in JAX
64. memory-compressed-attention
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia by Summarizing Long Sequences"
65. invariant-point-attention
Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of AlphaFold2, as a standalone Pytorch module
67. STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
51-67 of 67 user projects
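
As a brief illustration of how one of the listed libraries is typically used, below is a minimal usage sketch for rotary-embedding-torch (item 57 above), following the usage pattern documented in that repository's README. The import path, the `RotaryEmbedding` class with its `dim` argument, and the `rotate_queries_or_keys` method are assumptions based on that README and may differ between versions.

```python
import torch
from rotary_embedding_torch import RotaryEmbedding  # assumed import path per the repo's README

# rotary positional embedding; `dim` is the number of rotated feature
# dimensions (assumed to be at most the per-head dimension)
rotary_emb = RotaryEmbedding(dim = 32)

# queries and keys shaped (batch, heads, sequence length, head dimension)
q = torch.randn(1, 8, 1024, 64)
k = torch.randn(1, 8, 1024, 64)

# rotate queries and keys before computing attention scores;
# rotary embeddings encode relative position directly in the q·k dot product
q = rotary_emb.rotate_queries_or_keys(q)
k = rotary_emb.rotate_queries_or_keys(k)

# standard scaled dot-product attention scores
sim = torch.einsum('b h i d, b h j d -> b h i j', q, k) * (64 ** -0.5)
attn = sim.softmax(dim = -1)
```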