lightning-hydra-template: PyTorch Lightning + Hydra. A very user-friendly template for rapid and reproducible ML experimentation with best practices. ⚡🔥⚡
Stars: ✭ 1,905 (+245.74%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-89.84%)
lightning-asr: Modular and extensible speech recognition library leveraging pytorch-lightning and hydra.
Stars: ✭ 36 (-93.47%)
pytorch tempest: My repo for training neural nets using pytorch-lightning and hydra.
Stars: ✭ 124 (-77.5%)
classy: A simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (-88.93%)
Ask2Transformers: A framework for textual-entailment-based zero-shot text classification.
Stars: ✭ 102 (-81.49%)
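Entailment-based zero-shot classification (the idea behind Ask2Transformers) treats each candidate label as an NLI hypothesis such as "This example is about {label}." and converts the model's entailment scores into label probabilities. A minimal sketch of just that scoring step, with a made-up score table standing in for the NLI model (all names and scores here are hypothetical, not the library's API):

```python
import math

def zero_shot_classify(entailment_scores):
    """Turn per-label entailment scores into a probability
    distribution via a numerically stable softmax."""
    m = max(entailment_scores.values())
    exps = {label: math.exp(s - m) for label, s in entailment_scores.items()}
    z = sum(exps.values())
    return {label: e / z for label, e in exps.items()}

# Stand-in scores an NLI model might assign to the hypothesis
# "This example is about {label}." for some premise text.
scores = {"politics": 2.1, "sports": -0.3, "science": 0.4}
probs = zero_shot_classify(scores)
best = max(probs, key=probs.get)
```

In the real library, the per-label scores come from a pretrained NLI model rather than a hand-written table; everything downstream of those scores works as above.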
pysentimiento: A Python multilingual toolkit for Sentiment Analysis and Social NLP tasks.
Stars: ✭ 274 (-50.27%)
Transformer-QG-on-SQuAD: Implements a question generator with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.).
Stars: ✭ 28 (-94.92%)
hydra-hpp: Hydra Hot Potato Player (game).
Stars: ✭ 12 (-97.82%)
pytorch-lightning-template: An easy-to-adapt PyTorch Lightning template. A wrapper template that is simple to use: with only minor changes, your existing PyTorch code can be adapted to Lightning. You can port your previous PyTorch code much more easily using this template while keeping the freedom to edit all the functions, and it is big-project-friendly as well.
Stars: ✭ 555 (+0.73%)
Transformer-MM-Explainability: [ICCV 2021 Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network, including examples for DETR and VQA.
Stars: ✭ 484 (-12.16%)
transformers-lightning: A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.
Stars: ✭ 45 (-91.83%)
transformers-interpret: Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Stars: ✭ 861 (+56.26%)
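Token-attribution explainability of the kind transformers-interpret exposes can be illustrated without any model at all via occlusion: a token's attribution is how much the prediction score drops when that token is masked out. A toy sketch with a stand-in scorer (the scorer and mask token are invented for illustration; the actual library computes gradient-based attributions against a real 🤗 model):

```python
def occlusion_attributions(tokens, score_fn, mask="[MASK]"):
    """Attribution for each token = score drop when that token is masked."""
    base = score_fn(tokens)
    attrs = []
    for i in range(len(tokens)):
        masked = tokens[:i] + [mask] + tokens[i + 1:]
        attrs.append(base - score_fn(masked))
    return attrs

# Stand-in scorer: pretends 'great' alone drives a positive-sentiment score.
def toy_score(tokens):
    return 1.0 if "great" in tokens else 0.2

attrs = occlusion_attributions(["this", "movie", "is", "great"], toy_score)
```

With the toy scorer, only masking "great" changes the score, so the last token receives all the attribution mass.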
hydra-js: DOES NOT WORK WITH VERSIONS > 0.10.0. A simple library to help you build node-based identity providers that work with Hydra.
Stars: ✭ 17 (-96.91%)
skillful nowcasting: Implementation of DeepMind's Deep Generative Model of Radar (DGMR), https://arxiv.org/abs/2104.00954.
Stars: ✭ 117 (-78.77%)
weasel: Weakly Supervised End-to-End Learning (NeurIPS 2021).
Stars: ✭ 117 (-78.77%)
KnowledgeEditor: Code for Editing Factual Knowledge in Language Models.
Stars: ✭ 86 (-84.39%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-58.44%)
Basic-UI-for-GPT-J-6B-with-low-vram: A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Model loading requires 12 GB of free RAM.
Stars: ✭ 90 (-83.67%)
text: Using Transformers from HuggingFace in R.
Stars: ✭ 66 (-88.02%)
labml: 🔎 Monitor deep learning model training and hardware usage from your mobile phone 📱
Stars: ✭ 1,213 (+120.15%)
BrainMaGe: Brain extraction in the presence of abnormalities, using single and multiple MRI modalities.
Stars: ✭ 23 (-95.83%)
VideoTransformer-pytorch: PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (-71.14%)
awesome-huggingface: 🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (-20.87%)
specification: RDF vocabulary and specification.
Stars: ✭ 21 (-96.19%)
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification.
Stars: ✭ 127 (-76.95%)
Transformers-Tutorials: This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+413.25%)
hyku: A multi-tenant Hyrax application built on the latest and greatest Samvera community components. Brought to you by the Hydra-in-a-Box project partners and IMLS; maintained by the Hyku Interest Group.
Stars: ✭ 83 (-84.94%)
elastic transformers: Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers.
Stars: ✭ 153 (-72.23%)
presentations: Presentations at the Tokyo NixOS Meetup.
Stars: ✭ 57 (-89.66%)
deepconsensus: DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (-77.5%)
oreilly-bert-nlp: This repository contains code for the O'Reilly Live Online Training for BERT.
Stars: ✭ 19 (-96.55%)
transformer generalization: The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-89.47%)
question generator: An NLP system for generating reading comprehension questions.
Stars: ✭ 188 (-65.88%)
hydra-antlia: A collection of functions for Hydra.
Stars: ✭ 45 (-91.83%)
nlp workshop odsc europe20: Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We leverage machine learning, deep learning and deep transfer learning to learn and solve popular NLP tasks including NER, classification, recommendation / information retrieval, summarization, language translation, Q&A and T…
Stars: ✭ 127 (-76.95%)
uvadlc notebooks: Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022.
Stars: ✭ 901 (+63.52%)
gatling: Hydra-enabled GPU path tracer that supports MaterialX and MDL.
Stars: ✭ 159 (-71.14%)
uniformer-pytorch: Implementation of UniFormer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks; debuted at ICLR 2022.
Stars: ✭ 90 (-83.67%)
clip-italian: CLIP (Contrastive Language–Image Pre-training) for Italian.
Stars: ✭ 113 (-79.49%)
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-92.92%)
hydra-router: A service-aware router for Hydra services. Implements an API gateway and can route WebSocket messages.
Stars: ✭ 59 (-89.29%)
deepfillv2-pylightning: Clean, minimal implementation of Free-Form Image Inpainting with Gated Convolutions in PyTorch Lightning. Inspired by the PyTorch implementation by @avalonstrel.
Stars: ✭ 13 (-97.64%)
Transformer-in-PyTorch: Transformer/Transformer-XL/R-Transformer examples and explanations.
Stars: ✭ 21 (-96.19%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-93.1%)
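The extractive side of summarization can be sketched without any Transformer at all: score each sentence by the document-wide frequency of its words, then keep the top-scoring sentences in their original order. A minimal frequency-heuristic sketch (this classic baseline is illustrative only and is not the repo's Transformer-based method; abstractive summarization additionally requires a seq2seq model):

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Score each sentence by the total document frequency of its words,
    then keep the top-n sentences in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
    )
    keep = sorted(ranked[:n_sentences])
    return " ".join(sentences[i] for i in keep)

text = ("Transformers power modern NLP. Transformers use attention. "
        "I had lunch today.")
summary = extractive_summary(text, n_sentences=1)
```

Sentences that repeat the document's dominant vocabulary ("Transformers") outrank off-topic ones, so the one-sentence summary drops the lunch remark.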
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch.
Stars: ✭ 473 (-14.16%)
ConSSL: PyTorch implementation of SOTA SSL methods.
Stars: ✭ 61 (-88.93%)
jax-models: Unofficial JAX implementations of deep learning research papers.
Stars: ✭ 108 (-80.4%)
TransQuest: Transformer-based translation quality estimation.
Stars: ✭ 85 (-84.57%)
map-floodwater-satellite-imagery: This repository focuses on training semantic segmentation models to predict the presence of floodwater for disaster prevention. Models were trained using SageMaker and Colab.
Stars: ✭ 21 (-96.19%)
Chinese-Minority-PLM: CINO, pre-trained language models for Chinese minority languages.
Stars: ✭ 133 (-75.86%)
hififace: Unofficial PyTorch implementation of HifiFace (https://arxiv.org/abs/2106.09965).
Stars: ✭ 227 (-58.8%)
Fast-AgingGAN: A deep learning model to age faces in the wild; currently runs at 60+ fps on GPUs.
Stars: ✭ 133 (-75.86%)
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in PyTorch.
Stars: ✭ 103 (-81.31%)
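The local-plus-global attention pattern behind Long-Short Transformer can be sketched in a few lines of NumPy: each query attends to keys in a small local window (the "short" term) plus a handful of pooled global keys (a mean-pooling stand-in for the paper's learned low-rank projection, the "long" term). A toy single-head sketch, not the repo's actual implementation:

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def long_short_attention(q, k, v, window=2, n_global=2):
    """Toy single-head attention: each query sees (a) keys inside a local
    window and (b) a few global keys made by mean-pooling fixed segments."""
    n, d = q.shape
    # Global keys/values: mean-pool contiguous segments of the sequence.
    segments = np.array_split(np.arange(n), n_global)
    gk = np.stack([k[idx].mean(axis=0) for idx in segments])  # (n_global, d)
    gv = np.stack([v[idx].mean(axis=0) for idx in segments])
    out = np.zeros_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        keys = np.concatenate([k[lo:hi], gk])  # local + global keys
        vals = np.concatenate([v[lo:hi], gv])
        weights = softmax(keys @ q[i] / np.sqrt(d))
        out[i] = weights @ vals
    return out

rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(8, 4)) for _ in range(3))
out = long_short_attention(q, k, v)
```

Per query this costs O(window + n_global) instead of O(n), which is the point of mixing the two inductive biases over long sequences.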