
Top 124 transformers open source projects

pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
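The title refers to splitting an image into fixed-size patches and treating each flattened patch as a token. A minimal numpy sketch of that patch-embedding idea; all shapes and the random projection are illustrative assumptions, not this repo's API:

```python
import numpy as np

# Toy ViT patch embedding: cut a 224x224 image into 16x16 "words",
# flatten each patch, and linearly project it to a token embedding.
rng = np.random.default_rng(0)

H = W = 224          # input resolution
P = 16               # patch size
C = 3                # channels
D = 64               # embedding dimension (toy value)

image = rng.standard_normal((H, W, C))

# Split into non-overlapping P x P patches and flatten each one.
patches = image.reshape(H // P, P, W // P, P, C).transpose(0, 2, 1, 3, 4)
patches = patches.reshape(-1, P * P * C)          # (196, 768)

# A learned linear projection (random here) maps patches to tokens.
W_embed = rng.standard_normal((P * P * C, D))
tokens = patches @ W_embed                        # (196, 64)

print(tokens.shape)
```

The resulting 196 tokens are what a standard transformer encoder then attends over, exactly as it would over words in a sentence.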
molecule-attention-transformer
PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
geometry-free-view-synthesis
Is a geometric model required to synthesize novel views from a single image?
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
xpandas
Universal 1D/2D data containers with transformer functionality for data analysis.
WellcomeML
Repository for Machine Learning utils at the Wellcome Trust
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
lightning-transformers
Flexible components pairing 🤗 Transformers with PyTorch Lightning
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch
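The "local" half of such a scheme restricts each query to a fixed window of nearby keys, which is what makes long sequences tractable. A hedged numpy sketch of windowed attention; the window radius and shapes are illustrative, not taken from this repo:

```python
import numpy as np

# Toy local attention: each position attends only to keys within
# +/- w positions of itself, instead of the full sequence.
rng = np.random.default_rng(0)

T, d, w = 12, 8, 2                  # sequence length, head dim, window radius
q = rng.standard_normal((T, d))
k = rng.standard_normal((T, d))
v = rng.standard_normal((T, d))

scores = q @ k.T / np.sqrt(d)       # full score matrix (T, T)

# Mask out everything outside the local window.
idx = np.arange(T)
mask = np.abs(idx[:, None] - idx[None, :]) <= w
scores = np.where(mask, scores, -np.inf)

# Softmax over the allowed positions only.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ v                   # (T, d)

print(out.shape)
```

A real long-short model combines this local pass with a coarse global component so distant positions can still exchange information.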
deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
pysentimiento
A Python multilingual toolkit for Sentiment Analysis and Social NLP tasks
transformer generalization
The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
RETRO-pytorch
Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch
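The core of a RETRO-style model is a retrieval step: embed text chunks and fetch each query chunk's nearest neighbours from a database. A minimal sketch of that lookup using cosine similarity; the random embeddings stand in for a real encoder, and all names are illustrative:

```python
import numpy as np

# Toy nearest-neighbour retrieval over chunk embeddings, as used by
# retrieval-augmented models. Random vectors replace a real encoder.
rng = np.random.default_rng(0)

n_db, n_query, d, k = 100, 4, 32, 2
db = rng.standard_normal((n_db, d))        # database chunk embeddings
queries = rng.standard_normal((n_query, d))

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

sims = normalize(queries) @ normalize(db).T       # cosine similarity (4, 100)
neighbours = np.argsort(-sims, axis=-1)[:, :k]    # top-k chunk ids per query

print(neighbours.shape)
```

In the full model, the retrieved chunks are fed to the transformer through cross-attention so generation can condition on them.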
Chinese-Minority-PLM
CINO: Pre-trained language models for Chinese minority languages
code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from structure and context".
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
transformers-lightning
A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
nlp workshop odsc europe20
Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We leverage machine learning, deep learning and deep transfer learning to solve popular NLP tasks including NER, Classification, Recommendation / Information Retrieval, Summarization, Language Translation, Q&A and T…
danish transformers
A collection of Danish Transformers
Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, presented at ICLR 2022
KnowledgeEditor
Code for Editing Factual Knowledge in Language Models
Basic-UI-for-GPT-J-6B-with-low-vram
A repository for running GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Loading the model requires 12 GB of free RAM.
text
Using Transformers from HuggingFace in R
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
TransQuest
Transformer-based translation quality estimation
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
KB-ALBERT
A Korean ALBERT model specialized for the economic/financial domain, provided by KB Kookmin Bank (KB국민은행)
LIT
[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577