bert-AAD: Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-67.47%)
Amazon Forest Computer Vision: Satellite image tagging code using PyTorch / Keras with lots of PyTorch tricks
Stars: ✭ 346 (+316.87%)
game-feature-learning: Code for the paper "Cross-Domain Self-supervised Multi-task Feature Learning using Synthetic Imagery", Ren et al., CVPR'18
Stars: ✭ 68 (-18.07%)
Gradfeat20: Gradients as Features for Deep Representation Learning
Stars: ✭ 30 (-63.86%)
Mmt: [ICLR 2020] Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification
Stars: ✭ 345 (+315.66%)
Xlearn: Transfer Learning Library
Stars: ✭ 406 (+389.16%)
fetch: A set of deep learning models for FRB/RFI binary classification
Stars: ✭ 19 (-77.11%)
Syn2Real: Repository for transfer learning using deep CNNs trained with synthetic images
Stars: ✭ 16 (-80.72%)
Pathnet Pytorch: PyTorch implementation of PathNet: Evolution Channels Gradient Descent in Super Neural Networks
Stars: ✭ 63 (-24.1%)
MetaHeac: Official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising" (KDD 2021)
Stars: ✭ 36 (-56.63%)
Rexnet: Official PyTorch implementation of ReXNet (Rank eXpansion Network) with pretrained models
Stars: ✭ 319 (+284.34%)
Domainadaptation: Repository for the article "Unsupervised domain adaptation for medical imaging segmentation with self-ensembling"
Stars: ✭ 27 (-67.47%)
tamnun-ml: An easy-to-use open-source library for advanced deep learning and natural language processing
Stars: ✭ 109 (+31.33%)
Pytorch Adda: A PyTorch implementation of Adversarial Discriminative Domain Adaptation
Stars: ✭ 329 (+296.39%)
Cross Domain Detection: Cross-Domain Weakly-Supervised Object Detection through Progressive Domain Adaptation [Inoue+, CVPR 2018]
Stars: ✭ 320 (+285.54%)
super-gradients: Easily train or fine-tune SOTA computer vision models with one open-source training library
Stars: ✭ 429 (+416.87%)
Bert language understanding: Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (+1024.1%)
CPCE-3D: Low-Dose CT via Transfer Learning from a 2D Trained Network, IEEE TMI 2018
Stars: ✭ 40 (-51.81%)
task-transferability: Data and code for the paper "Exploring and Predicting Transferability across NLP Tasks", EMNLP 2020
Stars: ✭ 35 (-57.83%)
Pytorch Nlp Notebooks: Learn how to use PyTorch to solve common NLP problems with deep learning
Stars: ✭ 293 (+253.01%)
VisDA2020: 4th Visual Domain Adaptation Challenge in ECCV'20
Stars: ✭ 53 (-36.14%)
paper annotations: A place to keep track of all the annotated papers
Stars: ✭ 96 (+15.66%)
Transfer Nlp: NLP library designed for reproducible experimentation management
Stars: ✭ 287 (+245.78%)
MoeFlow: Repository for an anime character recognition website, powered by TensorFlow
Stars: ✭ 113 (+36.14%)
TransTQA: Transfer Learning for Technical Question Answering, EMNLP'20. Author: Wenhao Yu ([email protected])
Stars: ✭ 12 (-85.54%)
Bigdata18: Transfer learning for time series classification
Stars: ✭ 284 (+242.17%)
favorite-research-papers: Listing my favorite research papers 📝 from different fields as I read them
Stars: ✭ 12 (-85.54%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (+7.23%)
Hub: A library for transfer learning by reusing parts of TensorFlow models
Stars: ✭ 3,007 (+3522.89%)
ACAN: Code for the NAACL 2019 paper "Adversarial Category Alignment Network for Cross-domain Sentiment Classification"
Stars: ✭ 23 (-72.29%)
Big transfer: Official repository for the paper "Big Transfer (BiT): General Visual Representation Learning"
Stars: ✭ 1,096 (+1220.48%)
WSDM2022-PTUPCDR: Official implementation of "Personalized Transfer of User Preferences for Cross-domain Recommendation" (PTUPCDR), accepted at WSDM 2022
Stars: ✭ 65 (-21.69%)
robustness: Robustness and adaptation of ImageNet-scale models. Pre-release; stay tuned for updates
Stars: ✭ 63 (-24.1%)
Cutmix Pytorch: Official PyTorch implementation of the CutMix regularizer
Stars: ✭ 754 (+808.43%)
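For context on what such a regularizer does: CutMix pastes a random patch from one training image into another and mixes the labels in proportion to the pasted area. The following is a minimal illustrative sketch of the idea in PyTorch, not code from the repository above; the function name `cutmix` and its return convention are my own.

```python
import numpy as np
import torch

def cutmix(images, labels, alpha=1.0):
    """Mix a batch by pasting a random patch from a shuffled copy.

    Returns mixed images plus (labels_a, labels_b, lam) so the loss can be
    computed as lam * loss(pred, labels_a) + (1 - lam) * loss(pred, labels_b).
    """
    lam = np.random.beta(alpha, alpha)
    index = torch.randperm(images.size(0))  # pairing of source images
    h, w = images.shape[2], images.shape[3]
    # Patch side lengths follow the mixing ratio: area fraction = 1 - lam.
    cut_h, cut_w = int(h * np.sqrt(1 - lam)), int(w * np.sqrt(1 - lam))
    cy, cx = np.random.randint(h), np.random.randint(w)
    y1, y2 = np.clip(cy - cut_h // 2, 0, h), np.clip(cy + cut_h // 2, 0, h)
    x1, x2 = np.clip(cx - cut_w // 2, 0, w), np.clip(cx + cut_w // 2, 0, w)
    mixed = images.clone()
    mixed[:, :, y1:y2, x1:x2] = images[index, :, y1:y2, x1:x2]
    # Recompute lam from the exact pasted area after clipping at the border.
    lam = 1.0 - ((y2 - y1) * (x2 - x1) / (h * w))
    return mixed, labels, labels[index], lam
```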
digital peter aij2020: Materials for the AI Journey 2020 competition dedicated to the recognition of Peter the Great's manuscripts, https://ai-journey.ru/contest/task01
Stars: ✭ 61 (-26.51%)
AITQA: Resources for the IBM Airlines Table-Question-Answering Benchmark
Stars: ✭ 12 (-85.54%)
Dann: PyTorch implementation of Domain-Adversarial Training of Neural Networks
Stars: ✭ 400 (+381.93%)
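The core building block of domain-adversarial training is the gradient reversal layer: an identity in the forward pass whose backward pass multiplies gradients by -lambda, so the feature extractor learns to fool a domain classifier. A minimal PyTorch sketch of that layer, assuming nothing from the repository above:

```python
import torch

class GradientReversal(torch.autograd.Function):
    """Identity forward; scales gradients by -lambda on the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back to the features;
        # None corresponds to the non-tensor lambd argument.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradientReversal.apply(x, lambd)
```

In a DANN-style model, `grad_reverse` sits between the shared feature extractor and the domain classifier head, while the label classifier sees the features directly.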
MNIST-multitask: 6️⃣6️⃣6️⃣ Reproduces the ICLR '18 under-review paper "Multi-Task Learning on MNIST Image Datasets"
Stars: ✭ 34 (-59.04%)
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models
Stars: ✭ 39 (-53.01%)
lidar transfer: Code for Langer et al., "Domain Transfer for Semantic Segmentation of LiDAR Data using Deep Neural Networks", IROS 2020
Stars: ✭ 54 (-34.94%)
Naacl transfer learning tutorial: Code for the tutorial on Transfer Learning in NLP held at NAACL 2019 in Minneapolis, MN, USA
Stars: ✭ 687 (+727.71%)
neuralBlack: A multi-class brain tumor classifier built with a convolutional neural network, reaching 99% accuracy through transfer learning using Python and the PyTorch deep learning framework
Stars: ✭ 36 (-56.63%)
Teacher Student Training: Files from a summer internship project on "teacher-student learning", an experimental method for training deep neural networks using a trained teacher model
Stars: ✭ 34 (-59.04%)
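Teacher-student training is usually implemented as knowledge distillation: the student is trained to match the teacher's softened output distribution at a temperature T, blended with the ordinary hard-label loss. A generic sketch of such a loss in PyTorch, not taken from the repository above (the name `distillation_loss` and the defaults are my own):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Blend hard-label cross-entropy with soft-label KL at temperature T."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescales soft-loss gradients to the hard-loss magnitude
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```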
Trainyourownyolo: Train a state-of-the-art YOLOv3 object detector from scratch!
Stars: ✭ 399 (+380.72%)
pytorch-arda: A PyTorch implementation of Adversarial Representation Learning for Domain Adaptation
Stars: ✭ 49 (-40.96%)
Eanet: EANet: Enhancing Alignment for Cross-Domain Person Re-identification
Stars: ✭ 380 (+357.83%)
Transfer-Learning: Inception V3 for transfer learning on cats and dogs
Stars: ✭ 17 (-79.52%)
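The standard recipe behind repos like this one is to freeze a pretrained backbone and replace its classification head with a new one for the target classes. The sketch below illustrates that pattern with a tiny stand-in backbone; in practice the backbone would be Inception V3 with ImageNet weights (e.g. from torchvision), which is omitted here to keep the example self-contained.

```python
import torch
import torch.nn as nn

# Stand-in "pretrained" backbone; a real setup would load Inception V3
# with ImageNet weights instead of this toy network.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
head = nn.Linear(16, 2)  # new head: 2 classes (cats vs. dogs)

for p in backbone.parameters():
    p.requires_grad = False  # freeze the pretrained feature extractor

model = nn.Sequential(backbone, head)
# Only the new head's parameters are passed to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
```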
NTUA-slp-nlp: 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA
Stars: ✭ 19 (-77.11%)