217 Open source projects that are alternatives of or similar to robustness-vit

molecule-attention-transformer
Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-41.03%)
Mutual labels:  transformers
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (+458.97%)
Mutual labels:  transformers
dm pix
PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+247.44%)
Mutual labels:  jax
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (+62.82%)
Mutual labels:  transformers
Fengshenbang-LM
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Center for Cognitive Computing and Natural Language Research at IDEA, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Stars: ✭ 1,813 (+2224.36%)
Mutual labels:  transformers
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks Course hosted in educative.io
Stars: ✭ 33 (-57.69%)
Mutual labels:  transformers
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-47.44%)
Mutual labels:  robustness
rA9
JAX-based Spiking Neural Network framework
Stars: ✭ 60 (-23.08%)
Mutual labels:  jax
geometry-free-view-synthesis
Is a geometric model required to synthesize novel views from a single image?
Stars: ✭ 265 (+239.74%)
Mutual labels:  transformers
Multi-Hop-Knowledge-Paths-Human-Needs
Ranking and Selecting Multi-Hop Knowledge Paths to Better Predict Human Needs
Stars: ✭ 17 (-78.21%)
Mutual labels:  self-attention
Awesome-Vision-Transformer-Collection
Variants of Vision Transformer and its downstream tasks
Stars: ✭ 124 (+58.97%)
Mutual labels:  self-attention
MISE
Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+174.36%)
Mutual labels:  transformers
eleanor
Code used during my Chaos Engineering and Resiliency Patterns talk.
Stars: ✭ 14 (-82.05%)
Mutual labels:  robustness
uvadlc notebooks
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022
Stars: ✭ 901 (+1055.13%)
Mutual labels:  jax
danish transformers
A collection of Danish Transformers
Stars: ✭ 30 (-61.54%)
Mutual labels:  transformers
efficientnet-jax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax
Stars: ✭ 114 (+46.15%)
Mutual labels:  jax
aileen-core
Sensor data aggregation tool for any numerical sensor data. Robust and privacy-friendly.
Stars: ✭ 15 (-80.77%)
Mutual labels:  robustness
wax-ml
A Python library for machine-learning and feedback loops on streaming data
Stars: ✭ 36 (-53.85%)
Mutual labels:  jax
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3d convolutional net that achieved SOTA in a number of video classification tasks, debuted in ICLR 2022
Stars: ✭ 90 (+15.38%)
Mutual labels:  transformers
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (-51.28%)
Mutual labels:  transformers
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+44.87%)
Mutual labels:  transformers
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence-labeling toolkit. Supports multi-class and multi-label classification of Chinese long and short text, and sequence-labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+93.59%)
Mutual labels:  transformers
adversarial-robustness-public
Code for AAAI 2018 accepted paper: "Improving the Adversarial Robustness and Interpretability of Deep Neural Networks by Regularizing their Input Gradients"
Stars: ✭ 49 (-37.18%)
Mutual labels:  robustness
anonymisation
Anonymization of legal cases (Fr) based on Flair embeddings
Stars: ✭ 85 (+8.97%)
Mutual labels:  transformers
POPQORN
An Algorithm to Quantify Robustness of Recurrent Neural Networks
Stars: ✭ 44 (-43.59%)
Mutual labels:  robustness
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+111.54%)
Mutual labels:  transformers
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-15.38%)
Mutual labels:  transformers
Relational Deep Reinforcement Learning
No description or website provided.
Stars: ✭ 44 (-43.59%)
Mutual labels:  self-attention
GoEmotions-pytorch
Pytorch Implementation of GoEmotions 😍😢😱
Stars: ✭ 95 (+21.79%)
Mutual labels:  transformers
revisiting rainbow
Revisiting Rainbow
Stars: ✭ 71 (-8.97%)
Mutual labels:  jax
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-26.92%)
Mutual labels:  self-attention
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+193.59%)
Mutual labels:  transformers
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm Machine Learning Model.
Stars: ✭ 58 (-25.64%)
Mutual labels:  transformers
jax-cfd
Computational Fluid Dynamics in JAX
Stars: ✭ 399 (+411.54%)
Mutual labels:  jax
VAENAR-TTS
PyTorch Implementation of VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis.
Stars: ✭ 66 (-15.38%)
Mutual labels:  self-attention
jax-rl
JAX implementations of core Deep RL algorithms
Stars: ✭ 61 (-21.79%)
Mutual labels:  jax
course-content-dl
NMA deep learning course
Stars: ✭ 537 (+588.46%)
Mutual labels:  transformers
ADAM
ADAM implements a collection of algorithms for calculating rigid-body dynamics in Jax, CasADi, PyTorch, and Numpy.
Stars: ✭ 51 (-34.62%)
Mutual labels:  jax
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-16.67%)
Mutual labels:  robustness
ViTs-vs-CNNs
[NeurIPS 2021]: Are Transformers More Robust Than CNNs? (Pytorch implementation & checkpoints)
Stars: ✭ 145 (+85.9%)
Mutual labels:  robustness
jaxfg
Factor graphs and nonlinear optimization for JAX
Stars: ✭ 124 (+58.97%)
Mutual labels:  jax
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-2.56%)
Mutual labels:  transformers
elastic transformers
Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+96.15%)
Mutual labels:  transformers
KB-ALBERT
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank.
Stars: ✭ 215 (+175.64%)
Mutual labels:  transformers
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy to use hardware optimization tools
Stars: ✭ 567 (+626.92%)
Mutual labels:  transformers
omd
JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (-44.87%)
Mutual labels:  jax
NeuralNetworkAnalysis.jl
Reachability analysis for closed-loop control systems
Stars: ✭ 37 (-52.56%)
Mutual labels:  robustness
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+61.54%)
Mutual labels:  transformers
Generalization-Causality
Reading notes on a wide range of research in domain generalization, domain adaptation, causality, robustness, prompting, optimization, and generative models.
Stars: ✭ 482 (+517.95%)
Mutual labels:  robustness
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-5.13%)
Mutual labels:  transformers
safe-control-gym
PyBullet CartPole and Quadrotor environments—with CasADi symbolic a priori dynamics—for learning-based control and RL
Stars: ✭ 272 (+248.72%)
Mutual labels:  robustness
query-selector
Long-Term Series Forecasting with Query Selector: an efficient model of sparse attention
Stars: ✭ 63 (-19.23%)
Mutual labels:  self-attention
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (+7.69%)
Mutual labels:  transformers
CIL-ReID
Benchmarks for Corruption Invariant Person Re-identification. [NeurIPS 2021 Track on Datasets and Benchmarks]
Stars: ✭ 71 (-8.97%)
Mutual labels:  robustness
cycle-confusion
Code and models for ICCV2021 paper "Robust Object Detection via Instance-Level Temporal Cycle Confusion".
Stars: ✭ 67 (-14.1%)
Mutual labels:  robustness
BERT-NER
Using pre-trained BERT models for Chinese and English NER with 🤗Transformers
Stars: ✭ 114 (+46.15%)
Mutual labels:  transformers
robust-local-lipschitz
A Closer Look at Accuracy vs. Robustness
Stars: ✭ 75 (-3.85%)
Mutual labels:  robustness
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (+220.51%)
Mutual labels:  transformers
Object-and-Semantic-Part-Detection-pyTorch
Joint detection of Object and its Semantic parts using Attention-based Feature Fusion on PASCAL Parts 2010 dataset
Stars: ✭ 18 (-76.92%)
Mutual labels:  self-attention
RETRO-pytorch
Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch
Stars: ✭ 473 (+506.41%)
Mutual labels:  transformers