
217 Open source projects that are alternatives of or similar to robustness-vit

graphsignal
Graphsignal Python agent
Stars: ✭ 158 (+102.56%)
Mutual labels:  jax
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+4014.1%)
Mutual labels:  transformers
Chinese-Minority-PLM
CINO: Pre-trained Language Models for Chinese Minority Languages
Stars: ✭ 133 (+70.51%)
Mutual labels:  transformers
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's text-to-image Transformer, in PyTorch
Stars: ✭ 3,661 (+4593.59%)
Mutual labels:  transformers
Metric Learning Adversarial Robustness
Code for the NeurIPS 2019 paper "Metric Learning for Adversarial Robustness"
Stars: ✭ 44 (-43.59%)
Mutual labels:  robustness
Spark Nlp
State of the Art Natural Language Processing
Stars: ✭ 2,518 (+3128.21%)
Mutual labels:  transformers
BottleneckTransformers
Bottleneck Transformers for Visual Recognition
Stars: ✭ 231 (+196.15%)
Mutual labels:  transformers
Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Stars: ✭ 2,425 (+3008.97%)
Mutual labels:  transformers
molecule-attention-transformer
PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-41.03%)
Mutual labels:  transformers
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+4270.51%)
Mutual labels:  transformers
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (+458.97%)
Mutual labels:  transformers
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+6408.97%)
Mutual labels:  transformers
dm pix
PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+247.44%)
Mutual labels:  jax
CogView
Text-to-image generation. The repo for the NeurIPS 2021 paper "CogView: Mastering Text-to-Image Generation via Transformers".
Stars: ✭ 708 (+807.69%)
Mutual labels:  transformers
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (+62.82%)
Mutual labels:  transformers
hashformers
Hashformers is a framework for hashtag segmentation with transformers.
Stars: ✭ 18 (-76.92%)
Mutual labels:  transformers
policy-data-analyzer
A model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India, bringing NLP to policy analysis through an extensible framework with scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (-71.79%)
Mutual labels:  transformers
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks course hosted on educative.io
Stars: ✭ 33 (-57.69%)
Mutual labels:  transformers
BangalASR
Transformer based Bangla Speech Recognition
Stars: ✭ 20 (-74.36%)
Mutual labels:  transformers
ATMC
[NeurIPS 2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: A Unified Optimization Framework"
Stars: ✭ 41 (-47.44%)
Mutual labels:  robustness
erc
Emotion recognition in conversation
Stars: ✭ 34 (-56.41%)
Mutual labels:  transformers
rA9
JAX-based Spiking Neural Network framework
Stars: ✭ 60 (-23.08%)
Mutual labels:  jax
bangla-bert
Bangla-Bert is a pretrained BERT model for the Bengali language
Stars: ✭ 41 (-47.44%)
Mutual labels:  transformers
geometry-free-view-synthesis
Is a geometric model required to synthesize novel views from a single image?
Stars: ✭ 265 (+239.74%)
Mutual labels:  transformers
knowledge-neurons
A library for finding knowledge neurons in pretrained transformer models.
Stars: ✭ 72 (-7.69%)
Mutual labels:  transformers
Multi-Hop-Knowledge-Paths-Human-Needs
Ranking and Selecting Multi-Hop Knowledge Paths to Better Predict Human Needs
Stars: ✭ 17 (-78.21%)
Mutual labels:  self-attention
HVT
[ICCV 2021] Official implementation of "Scalable Vision Transformers with Hierarchical Pooling"
Stars: ✭ 26 (-66.67%)
Mutual labels:  transformers
Awesome-Vision-Transformer-Collection
Variants of Vision Transformer and its downstream tasks
Stars: ✭ 124 (+58.97%)
Mutual labels:  self-attention
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-21.79%)
Mutual labels:  transformers
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+141.03%)
Mutual labels:  transformers
MISE
Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+174.36%)
Mutual labels:  transformers
nlp-papers
Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (+11.54%)
Mutual labels:  transformers
smaller-transformers
Load What You Need: Smaller Multilingual Transformers for PyTorch and TensorFlow 2.0.
Stars: ✭ 66 (-15.38%)
Mutual labels:  transformers
danish transformers
A collection of Danish Transformers
Stars: ✭ 30 (-61.54%)
Mutual labels:  transformers
text2class
Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-80.77%)
Mutual labels:  transformers
efficientnet-jax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX with Flax Linen and Objax
Stars: ✭ 114 (+46.15%)
Mutual labels:  jax
minicons
Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-64.1%)
Mutual labels:  transformers
aileen-core
Sensor data aggregation tool for any numerical sensor data. Robust and privacy-friendly.
Stars: ✭ 15 (-80.77%)
Mutual labels:  robustness
nuwa-pytorch
Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in PyTorch
Stars: ✭ 347 (+344.87%)
Mutual labels:  transformers
wax-ml
A Python library for machine learning and feedback loops on streaming data
Stars: ✭ 36 (-53.85%)
Mutual labels:  jax
spark-transformers
Spark-Transformers: library for exporting Apache Spark MLlib models for use in any Java application with no other dependencies.
Stars: ✭ 39 (-50%)
Mutual labels:  transformers
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks, presented at ICLR 2022
Stars: ✭ 90 (+15.38%)
Mutual labels:  transformers
Product-Categorization-NLP
Multi-class text classification for products based on their descriptions with machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
Stars: ✭ 30 (-61.54%)
Mutual labels:  transformers
Text-Summarization
Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-51.28%)
Mutual labels:  transformers
golgotha
Contextualised embeddings and language modelling using BERT and friends, in R
Stars: ✭ 39 (-50%)
Mutual labels:  transformers
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+44.87%)
Mutual labels:  transformers
small-text
Active Learning for Text Classification in Python
Stars: ✭ 241 (+208.97%)
Mutual labels:  transformers
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence labeling toolkit. It supports multi-class and multi-label classification for long and short Chinese text, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging and word segmentation.
Stars: ✭ 151 (+93.59%)
Mutual labels:  transformers
ParsBigBird
Persian BERT for Long-Range Sequences
Stars: ✭ 58 (-25.64%)
Mutual labels:  transformers
adversarial-robustness-public
Code for AAAI 2018 accepted paper: "Improving the Adversarial Robustness and Interpretability of Deep Neural Networks by Regularizing their Input Gradients"
Stars: ✭ 49 (-37.18%)
Mutual labels:  robustness
square-attack
Square Attack: a query-efficient black-box adversarial attack via random search [ECCV 2020]
Stars: ✭ 89 (+14.1%)
Mutual labels:  robustness
anonymisation
Anonymization of French legal cases based on Flair embeddings
Stars: ✭ 85 (+8.97%)
Mutual labels:  transformers
POPQORN
An Algorithm to Quantify Robustness of Recurrent Neural Networks
Stars: ✭ 44 (-43.59%)
Mutual labels:  robustness
cycle-confusion
Code and models for the ICCV 2021 paper "Robust Object Detection via Instance-Level Temporal Cycle Confusion".
Stars: ✭ 67 (-14.1%)
Mutual labels:  robustness
BERT-NER
Using pre-trained BERT models for Chinese and English NER with 🤗Transformers
Stars: ✭ 114 (+46.15%)
Mutual labels:  transformers
robust-local-lipschitz
A Closer Look at Accuracy vs. Robustness
Stars: ✭ 75 (-3.85%)
Mutual labels:  robustness
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (+220.51%)
Mutual labels:  transformers
Object-and-Semantic-Part-Detection-pyTorch
Joint detection of Object and its Semantic parts using Attention-based Feature Fusion on PASCAL Parts 2010 dataset
Stars: ✭ 18 (-76.92%)
Mutual labels:  self-attention
RETRO-pytorch
Implementation of RETRO, DeepMind's retrieval-based attention network, in PyTorch
Stars: ✭ 473 (+506.41%)
Mutual labels:  transformers
Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+3448.72%)
Mutual labels:  transformers
121-180 of 217 similar projects