thermostat: A collection of NLP model explanations and accompanying analysis tools.
Stars: ✭ 126 (+75.00%)
Transformer-MM-Explainability: [ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.
Stars: ✭ 484 (+572.22%)
transformers-interpret: Model explainability that works seamlessly with 🤗 Transformers. Explain your transformer model in just 2 lines of code (sketched below).
Stars: ✭ 861 (+1095.83%)
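A minimal sketch of the advertised two-line usage, following the pattern in the project's README; the checkpoint name is only an example, and exact signatures may vary between versions.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from transformers_interpret import SequenceClassificationExplainer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint
model = AutoModelForSequenceClassification.from_pretrained(name)
tokenizer = AutoTokenizer.from_pretrained(name)

# The advertised two lines: build an explainer, then call it on text.
explainer = SequenceClassificationExplainer(model, tokenizer)
word_attributions = explainer("Attribution in two lines is a nice touch.")
print(word_attributions)  # list of (token, attribution score) pairs
```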
iPerceive: Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering. Python 3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self-Attention | Published at the IEEE Winter Conference on Applications of Computer Vision (WACV) 2021.
Stars: ✭ 52 (-27.78%)
Product-Categorization-NLP: Multi-class text classification for products based on their descriptions, using machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
Stars: ✭ 30 (-58.33%)
pytorch-vit: "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale" (the paper's patch-embedding idea is sketched below).
Stars: ✭ 250 (+247.22%)
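The paper's core idea is to cut an image into 16x16 patches and treat each linearly projected patch as a token. A minimal PyTorch illustration of that step (not code from the pytorch-vit repo):

```python
import torch
import torch.nn as nn

class PatchEmbedding(nn.Module):
    """Split an image into fixed-size patches and linearly project each one."""
    def __init__(self, image_size=224, patch_size=16, in_channels=3, dim=768):
        super().__init__()
        self.num_patches = (image_size // patch_size) ** 2
        # A strided convolution is equivalent to flattening each patch
        # and applying a shared linear projection.
        self.proj = nn.Conv2d(in_channels, dim, kernel_size=patch_size, stride=patch_size)

    def forward(self, x):
        x = self.proj(x)                     # (B, dim, H/16, W/16)
        return x.flatten(2).transpose(1, 2)  # (B, num_patches, dim)

tokens = PatchEmbedding()(torch.randn(1, 3, 224, 224))
print(tokens.shape)  # torch.Size([1, 196, 768])
```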
text2keywords: Trained T5 and T5-large models for generating keywords from text (a usage sketch follows).
Stars: ✭ 53 (-26.39%)
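Since the models are fine-tuned T5 checkpoints, keyword generation reduces to ordinary seq2seq inference with 🤗 Transformers; a hedged sketch, where "your-org/keyt5" is a placeholder for whichever checkpoint the repo publishes:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

name = "your-org/keyt5"  # hypothetical checkpoint id; substitute the repo's released model
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

text = "Transformers have become the dominant architecture for NLP tasks."
inputs = tokenizer(text, return_tensors="pt")
out = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(out[0], skip_special_tokens=True))  # e.g. comma-separated keywords
```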
remixer-pytorch: Implementation of the Remixer block from the Remixer paper, in PyTorch.
Stars: ✭ 37 (-48.61%)
spark-transformers: A library for exporting Apache Spark MLlib models for use in any Java application with no other dependencies.
Stars: ✭ 39 (-45.83%)
Pytorch-NLU: A Chinese text classification and sequence labeling toolkit. It supports multi-class and multi-label classification of long and short Chinese texts, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+109.72%)
zennit: Zennit is a high-level Python framework, built on PyTorch, for explaining and exploring neural networks with attribution methods such as LRP.
Stars: ✭ 57 (-20.83%)
ttt: A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+.
Stars: ✭ 35 (-51.39%)
molecule-attention-transformer: PyTorch reimplementation of the Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules.
Stars: ✭ 46 (-36.11%)
C-Tran: General Multi-label Image Classification with Transformers.
Stars: ✭ 106 (+47.22%)
eve-bot: EVE, a customer-service chatbot built to enhance virtual engagement for Apple Support on Twitter.
Stars: ✭ 31 (-56.94%)
TorchBlocks: A PyTorch-based toolkit for natural language processing.
Stars: ✭ 85 (+18.06%)
DocSum: A tool for automatic abstractive document summarization using the BART or PreSumm machine learning models.
Stars: ✭ 58 (-19.44%)
sage: A library for calculating global feature importance using Shapley values (a toy version of the idea is sketched below).
Stars: ✭ 129 (+79.17%)
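The underlying idea, crediting each feature with the average loss reduction it contributes over random feature orderings, can be illustrated without sage's own API (the sketch below is not it); a toy Monte Carlo estimator under a simple marginal-imputation assumption:

```python
import numpy as np

def shapley_importance(predict, X, y, loss, n_perms=100, rng=None):
    """Toy global Shapley importance: average loss drop when each feature
    is revealed, over random orderings. 'Removed' features are imputed
    with values drawn from other rows (marginal imputation)."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    phi = np.zeros(d)
    for _ in range(n_perms):
        order = rng.permutation(d)
        masked = X[rng.integers(n, size=n)].copy()  # start with every feature imputed
        prev = loss(y, predict(masked))
        for j in order:
            masked[:, j] = X[:, j]                  # reveal feature j
            cur = loss(y, predict(masked))
            phi[j] += prev - cur                    # credit the loss reduction to j
            prev = cur
    return phi / n_perms
```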
WellcomeML: Repository of machine learning utilities at the Wellcome Trust.
Stars: ✭ 31 (-56.94%)
lightning-transformers: Flexible components pairing 🤗 Transformers with PyTorch Lightning.
Stars: ✭ 551 (+665.28%)
MISE: "Multimodal Image Synthesis and Editing: A Survey".
Stars: ✭ 214 (+197.22%)
robo-vln: PyTorch code for the ICRA 2021 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation".
Stars: ✭ 34 (-52.78%)
optimum: 🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools (an inference sketch follows).
Stars: ✭ 567 (+687.5%)
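A hedged sketch of accelerated inference via ONNX Runtime; the class name follows optimum's documentation, but the export flag and available backends have changed across versions.

```python
# pip install optimum[onnxruntime]  (assumed extra)
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer, pipeline

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint
# export=True converts the PyTorch weights to ONNX on the fly
# (older optimum versions used from_transformers=True instead).
model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("Transformers, but faster."))
```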
smaller-transformers: "Load What You Need": smaller multilingual transformers for PyTorch and TensorFlow 2.0.
Stars: ✭ 66 (-8.33%)
TransCenter: The official implementation of TransCenter. The code and pretrained models are available at https://gitlab.inria.fr/yixu/TransCenter_official.
Stars: ✭ 82 (+13.89%)
classy: A simple-to-use library for building high-performance machine learning models in NLP.
Stars: ✭ 61 (-15.28%)
deepfrog: An NLP suite powered by deep learning.
Stars: ✭ 16 (-77.78%)
summit: 🏔️ "Summit: Scaling Deep Learning Interpretability by Visualizing Activation and Attribution Summarizations".
Stars: ✭ 95 (+31.94%)
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents".
Stars: ✭ 84 (+16.67%)
text2class: Multi-class text categorization using state-of-the-art pre-trained contextualized language models such as BERT.
Stars: ✭ 15 (-79.17%)
ProtoTree: "Neural Prototype Trees for Interpretable Fine-grained Image Recognition", published at CVPR 2021.
Stars: ✭ 47 (-34.72%)
golgotha: Contextualised embeddings and language modelling with BERT and friends, in R.
Stars: ✭ 39 (-45.83%)
converse: Conversational text analysis using various NLP techniques.
Stars: ✭ 147 (+104.17%)
HVT: [ICCV 2021] Official implementation of "Scalable Vision Transformers with Hierarchical Pooling".
Stars: ✭ 26 (-63.89%)
modules: The official repository for the paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". It develops a method for analyzing emergent functional modularity in neural networks based on differentiable weight masks, and uses it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-65.28%)
small-text: Active learning for text classification in Python (the query step is sketched below).
Stars: ✭ 241 (+234.72%)
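The heart of any such loop is the query strategy that picks which unlabeled texts to send to an annotator; a generic least-confidence sketch (not small-text's own API):

```python
import numpy as np

def least_confidence_query(predict_proba, X_pool, batch_size=10):
    """Return indices of the pool examples the classifier is least sure
    about: a classic active-learning query strategy."""
    proba = predict_proba(X_pool)                # (n_pool, n_classes)
    confidence = proba.max(axis=1)               # probability of the top class
    return np.argsort(confidence)[:batch_size]   # least confident first
```

Each round, the selected examples are labeled, added to the training set, and the classifier is retrained before the next query.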
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022.
Stars: ✭ 165 (+129.17%)
minicons: A utility for analyzing Transformer-based representations of language (a scoring sketch follows).
Stars: ✭ 28 (-61.11%)
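A hedged sketch of one typical use, scoring sentences with an autoregressive LM; scorer.IncrementalLMScorer follows minicons' documented pattern, but exact names and defaults may differ by version.

```python
from minicons import scorer

# Wrap a causal LM (here distilgpt2) as an incremental scorer.
lm = scorer.IncrementalLMScorer("distilgpt2", "cpu")
scores = lm.sequence_score([
    "The keys to the cabinet are on the table.",
    "The keys to the cabinet is on the table.",
])
print(scores)  # expect a higher score for the grammatical sentence
```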
ParsBigBird: Persian BERT for long-range sequences.
Stars: ✭ 58 (-19.44%)
xpandas: Universal 1D/2D data containers with transformer functionality for data analysis.
Stars: ✭ 25 (-65.28%)
n-grammer-pytorch: Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch.
Stars: ✭ 50 (-30.56%)
bert-squeeze: 🛠️ Tools for Transformer compression using PyTorch Lightning ⚡.
Stars: ✭ 56 (-22.22%)
yggdrasil-decision-forests: A collection of state-of-the-art algorithms for training, serving, and interpreting decision forest models.
Stars: ✭ 156 (+116.67%)
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in PyTorch.
Stars: ✭ 347 (+381.94%)
long-short-transformer: Implementation of the Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch.
Stars: ✭ 103 (+43.06%)
robustness-vit: Code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+8.33%)
label-studio-transformers: Label data using Hugging Face's transformers and automatically get a prediction service.
Stars: ✭ 117 (+62.5%)
deepconsensus: DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (+72.22%)
TermiNetwork: 🌏 A zero-dependency networking solution for building modern and secure iOS, watchOS, macOS, and tvOS applications.
Stars: ✭ 80 (+11.11%)
OpenDialog: An open-source package for Chinese open-domain conversational chatbots (a Chinese chitchat dialogue system with one-click deployment of a WeChat chatbot).
Stars: ✭ 94 (+30.56%)