iPerceive: Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (-33.33%)
jax-models: Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+38.46%)
Transformer-in-PyTorch: Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-73.08%)
modules: The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-67.95%)
lightning-transformers: Flexible components pairing 🤗 Transformers with PyTorch Lightning
Stars: ✭ 551 (+606.41%)
label-studio-transformers: Label data using HuggingFace's transformers and automatically get a prediction service
Stars: ✭ 117 (+50%)
ShinRL: A Library for Evaluating RL Algorithms from Theoretical and Practical Perspectives (Deep RL Workshop 2021)
Stars: ✭ 30 (-61.54%)
TransCenter: The official implementation of TransCenter. Code and pretrained models are available at https://gitlab.inria.fr/yixu/TransCenter_official.
Stars: ✭ 82 (+5.13%)
TIGER: Python toolbox to evaluate graph vulnerability and robustness (CIKM 2021)
Stars: ✭ 103 (+32.05%)
converse: Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+88.46%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch
Stars: ✭ 103 (+32.05%)
deepconsensus: DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (+58.97%)
Transformers-Tutorials: This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+3525.64%)
remixer-pytorch: Implementation of the Remixer Block from the Remixer paper, in PyTorch
Stars: ✭ 37 (-52.56%)
xpandas: Universal 1d/2d data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-67.95%)
wechsel: Code for "WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models".
Stars: ✭ 39 (-50%)
GPJax: A didactic Gaussian process package for researchers in JAX.
Stars: ✭ 159 (+103.85%)
transformers-interpret: Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Stars: ✭ 861 (+1003.85%)
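The "2 lines" in the entry above refers to the project's documented explainer classes: you wrap an existing model and tokenizer in an explainer, then call it on raw text. A minimal sketch (the checkpoint name is only an example, not something this listing prescribes):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from transformers_interpret import SequenceClassificationExplainer

# Any fine-tuned sequence-classification checkpoint works; this one is just an example.
model_name = "distilbert-base-uncased-finetuned-sst-2-english"
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# The two advertised lines: build the explainer, then call it on text.
cls_explainer = SequenceClassificationExplainer(model, tokenizer)
word_attributions = cls_explainer("I loved almost everything about this film.")

print(word_attributions)  # list of (token, attribution score) pairs
```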
deepfrog: An NLP suite powered by deep learning
Stars: ✭ 16 (-79.49%)
WellcomeML: Repository for Machine Learning utils at the Wellcome Trust
Stars: ✭ 31 (-60.26%)
pysentimiento: A multilingual Python toolkit for sentiment analysis and social NLP tasks
Stars: ✭ 274 (+251.28%)
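For orientation, pysentimiento's documented analyzer interface reduces most tasks to a factory call plus `predict`; the Spanish sentiment task below is just one of the supported task/language combinations:

```python
from pysentimiento import create_analyzer

# Build an analyzer for one of the supported task/language pairs.
analyzer = create_analyzer(task="sentiment", lang="es")

result = analyzer.predict("Qué gran jugador es Messi")
print(result.output)   # predicted label, e.g. "POS"
print(result.probas)   # per-class probabilities
```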
molecule-attention-transformer: PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-41.03%)
dm_pix: PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+247.44%)
transformer_generalization: The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-25.64%)
ATMC: [NeurIPS 2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: A Unified Optimization Framework"
Stars: ✭ 41 (-47.44%)
MISE: "Multimodal Image Synthesis and Editing: A Survey"
Stars: ✭ 214 (+174.36%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-28.21%)
efficientnet-jax: EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX w/ Flax Linen and Objax
Stars: ✭ 114 (+46.15%)
wax-ml: A Python library for machine learning and feedback loops on streaming data
Stars: ✭ 36 (-53.85%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-51.28%)
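As a point of reference, abstractive summarization with Transformers typically comes down to a few lines through the generic 🤗 pipeline API; this sketch uses a public BART checkpoint as an example and is not taken from the repository itself:

```python
from transformers import pipeline

# facebook/bart-large-cnn is an example checkpoint; any seq2seq summarization model works.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = "Long input text to be condensed goes here ..."
summary = summarizer(article, max_length=130, min_length=30, do_sample=False)
print(summary[0]["summary_text"])
```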
Pytorch-NLU: A Chinese text classification and sequence-labeling toolkit. It supports multi-class and multi-label classification of long and short Chinese texts, as well as sequence-labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+93.59%)
anonymisation: Anonymization of French legal cases based on Flair embeddings
Stars: ✭ 85 (+8.97%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+111.54%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (-26.92%)
DocSum: A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (-25.64%)
VAENAR-TTS: PyTorch implementation of "VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis".
Stars: ✭ 66 (-15.38%)
DUN: Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-16.67%)
jaxfg: Factor graphs and nonlinear optimization for JAX
Stars: ✭ 124 (+58.97%)
elastic_transformers: Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+96.15%)
optimum: 🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
Stars: ✭ 567 (+626.92%)
Generalization-Causality: Reading notes on a wide range of research in domain generalization, domain adaptation, causality, robustness, prompting, optimization, and generative models
Stars: ✭ 482 (+517.95%)
safe-control-gym: PyBullet CartPole and Quadrotor environments, with CasADi symbolic a priori dynamics, for learning-based control and RL
Stars: ✭ 272 (+248.72%)
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (+7.69%)
CIL-ReID: Benchmarks for Corruption Invariant Person Re-identification. [NeurIPS 2021 Track on Datasets and Benchmarks]
Stars: ✭ 71 (-8.97%)
eleanor: Code used during my Chaos Engineering and Resiliency Patterns talk.
Stars: ✭ 14 (-82.05%)
cycle-confusion: Code and models for the ICCV 2021 paper "Robust Object Detection via Instance-Level Temporal Cycle Confusion".
Stars: ✭ 67 (-14.1%)
BERT-NER: Using pre-trained BERT models for Chinese and English NER with 🤗 Transformers
Stars: ✭ 114 (+46.15%)
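For reference, running NER with a pre-trained BERT checkpoint through the generic 🤗 Transformers pipeline looks roughly like this; the model id is an example checkpoint from the Hub, not necessarily the one that repository trains:

```python
from transformers import pipeline

# dslim/bert-base-NER is an example English NER checkpoint from the Hub.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

entities = ner("Hugging Face is based in New York City.")
for ent in entities:
    # Each entry contains the entity type, the matched span, and a confidence score.
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```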