
217 open source projects that are alternatives to or similar to robustness-vit

iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (-33.33%)
Mutual labels:  transformers, self-attention
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+38.46%)
Mutual labels:  transformers, jax
chef-transformer
Chef Transformer 🍲.
Stars: ✭ 29 (-62.82%)
Mutual labels:  transformers, jax
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-73.08%)
Mutual labels:  transformers, self-attention
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-67.95%)
Mutual labels:  transformers
lightning-transformers
Flexible components pairing 🤗 Transformers with PyTorch Lightning
Stars: ✭ 551 (+606.41%)
Mutual labels:  transformers
label-studio-transformers
Label data using HuggingFace's transformers and automatically get a prediction service
Stars: ✭ 117 (+50%)
Mutual labels:  transformers
ShinRL
ShinRL: A Library for Evaluating RL Algorithms from Theoretical and Practical Perspectives (Deep RL Workshop 2021)
Stars: ✭ 30 (-61.54%)
Mutual labels:  jax
TransCenter
This is the official implementation of TransCenter. The code and pretrained models are now available here: https://gitlab.inria.fr/yixu/TransCenter_official.
Stars: ✭ 82 (+5.13%)
Mutual labels:  transformers
PyTorch-Model-Compare
Compare neural networks by their feature similarity
Stars: ✭ 119 (+52.56%)
Mutual labels:  transformers
lightweight-temporal-attention-pytorch
A PyTorch implementation of the Light Temporal Attention Encoder (L-TAE) for satellite image time series classification
Stars: ✭ 43 (-44.87%)
Mutual labels:  self-attention
TIGER
Python toolbox to evaluate graph vulnerability and robustness (CIKM 2021)
Stars: ✭ 103 (+32.05%)
Mutual labels:  robustness
converse
Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+88.46%)
Mutual labels:  transformers
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch
Stars: ✭ 103 (+32.05%)
Mutual labels:  transformers
Awesome-Vision-Transformer-Collection
Variants of Vision Transformer and its downstream tasks
Stars: ✭ 124 (+58.97%)
Mutual labels:  self-attention
deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (+58.97%)
Mutual labels:  transformers
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+3525.64%)
Mutual labels:  transformers
remixer-pytorch
Implementation of the Remixer Block from the Remixer paper, in PyTorch
Stars: ✭ 37 (-52.56%)
Mutual labels:  transformers
xpandas
Universal 1d/2d data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-67.95%)
Mutual labels:  transformers
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-50%)
Mutual labels:  transformers
GPJax
A didactic Gaussian process package for researchers, written in JAX.
Stars: ✭ 159 (+103.85%)
Mutual labels:  jax
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code (see the sketch below).
Stars: ✭ 861 (+1003.85%)
Mutual labels:  transformers
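The two-line usage the description advertises looks roughly like the following minimal sketch; the checkpoint name and example sentence are placeholder assumptions, not part of the project's documentation.

```python
# Minimal sketch of transformers-interpret on a sequence classification model.
# The checkpoint and example sentence below are placeholder assumptions.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from transformers_interpret import SequenceClassificationExplainer

model_name = "distilbert-base-uncased-finetuned-sst-2-english"  # hypothetical checkpoint
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# The advertised "2 lines": build an explainer, then call it on a sentence.
explainer = SequenceClassificationExplainer(model, tokenizer)
word_attributions = explainer("The movie was surprisingly good.")
print(word_attributions)  # (token, attribution score) pairs for the predicted class
```

Under the hood the package builds on Captum to compute the per-token attributions.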
deepfrog
An NLP suite powered by deep learning
Stars: ✭ 16 (-79.49%)
Mutual labels:  transformers
WellcomeML
Repository for Machine Learning utils at the Wellcome Trust
Stars: ✭ 31 (-60.26%)
Mutual labels:  transformers
pysentimiento
A Python multilingual toolkit for Sentiment Analysis and Social NLP tasks
Stars: ✭ 274 (+251.28%)
Mutual labels:  transformers
Metric Learning Adversarial Robustness
Code for the NeurIPS 2019 paper
Stars: ✭ 44 (-43.59%)
Mutual labels:  robustness
molecule-attention-transformer
PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-41.03%)
Mutual labels:  transformers
dm pix
PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+247.44%)
Mutual labels:  jax
transformer generalization
The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-25.64%)
Mutual labels:  transformers
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-47.44%)
Mutual labels:  robustness
geometry-free-view-synthesis
Is a geometric model required to synthesize novel views from a single image?
Stars: ✭ 265 (+239.74%)
Mutual labels:  transformers
MISE
Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+174.36%)
Mutual labels:  transformers
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-28.21%)
Mutual labels:  transformers
text-classification-transformers
Easy text classification for everyone: BERT-based models via HuggingFace transformers (KR/EN)
Stars: ✭ 32 (-58.97%)
Mutual labels:  transformers
efficientnet-jax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX with Flax Linen and Objax
Stars: ✭ 114 (+46.15%)
Mutual labels:  jax
wax-ml
A Python library for machine-learning and feedback loops on streaming data
Stars: ✭ 36 (-53.85%)
Mutual labels:  jax
Text-Summarization
Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-51.28%)
Mutual labels:  transformers
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence-labeling toolkit. It supports multi-class and multi-label classification of Chinese long and short texts, as well as sequence-labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+93.59%)
Mutual labels:  transformers
anonymisation
Anonymization of French legal cases based on Flair embeddings
Stars: ✭ 85 (+8.97%)
Mutual labels:  transformers
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+111.54%)
Mutual labels:  transformers
Relational Deep Reinforcement Learning
No description or website provided.
Stars: ✭ 44 (-43.59%)
Mutual labels:  self-attention
GoEmotions-pytorch
PyTorch implementation of GoEmotions 😍😢😱
Stars: ✭ 95 (+21.79%)
Mutual labels:  transformers
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-26.92%)
Mutual labels:  self-attention
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (-25.64%)
Mutual labels:  transformers
VAENAR-TTS
PyTorch Implementation of VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis.
Stars: ✭ 66 (-15.38%)
Mutual labels:  self-attention
course-content-dl
NMA deep learning course
Stars: ✭ 537 (+588.46%)
Mutual labels:  transformers
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-16.67%)
Mutual labels:  robustness
jaxfg
Factor graphs and nonlinear optimization for JAX
Stars: ✭ 124 (+58.97%)
Mutual labels:  jax
elastic transformers
Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+96.15%)
Mutual labels:  transformers
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
Stars: ✭ 567 (+626.92%)
Mutual labels:  transformers
NeuralNetworkAnalysis.jl
Reachability analysis for closed-loop control systems
Stars: ✭ 37 (-52.56%)
Mutual labels:  robustness
Generalization-Causality
Reading notes on a wide range of research on domain generalization, domain adaptation, causality, robustness, prompting, optimization, and generative models
Stars: ✭ 482 (+517.95%)
Mutual labels:  robustness
safe-control-gym
PyBullet CartPole and Quadrotor environments—with CasADi symbolic a priori dynamics—for learning-based control and RL
Stars: ✭ 272 (+248.72%)
Mutual labels:  robustness
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (+7.69%)
Mutual labels:  transformers
CIL-ReID
Benchmarks for Corruption Invariant Person Re-identification. [NeurIPS 2021 Track on Datasets and Benchmarks]
Stars: ✭ 71 (-8.97%)
Mutual labels:  robustness
eleanor
Code used during my Chaos Engineering and Resiliency Patterns talk.
Stars: ✭ 14 (-82.05%)
Mutual labels:  robustness
cycle-confusion
Code and models for ICCV2021 paper "Robust Object Detection via Instance-Level Temporal Cycle Confusion".
Stars: ✭ 67 (-14.1%)
Mutual labels:  robustness
BERT-NER
Using pre-trained BERT models for Chinese and English NER with 🤗Transformers
Stars: ✭ 114 (+46.15%)
Mutual labels:  transformers
robust-local-lipschitz
A Closer Look at Accuracy vs. Robustness
Stars: ✭ 75 (-3.85%)
Mutual labels:  robustness
1-60 of 217 similar projects