
thuml / Transfer-Learning-Library

License: MIT

Programming Languages

python

Projects that are alternatives to or similar to Transfer-Learning-Library

Transfer-learning-materials
resource collection for transfer learning!
Stars: ✭ 213 (-68.58%)
Mutual labels:  transfer-learning, domain-adaptation
transfer-learning-algorithms
Implementation of many transfer learning algorithms in Python with Jupyter notebooks
Stars: ✭ 42 (-93.81%)
Mutual labels:  transfer-learning, domain-adaptation
BA3US
code for our ECCV 2020 paper "A Balanced and Uncertainty-aware Approach for Partial Domain Adaptation"
Stars: ✭ 31 (-95.43%)
Mutual labels:  transfer-learning, domain-adaptation
TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (-93.36%)
Mutual labels:  transfer-learning, domain-adaptation
Meta-SelfLearning
Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark
Stars: ✭ 157 (-76.84%)
Mutual labels:  benchmark, domain-adaptation
transfertools
Python toolbox for transfer learning.
Stars: ✭ 22 (-96.76%)
Mutual labels:  transfer-learning, domain-adaptation
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (-96.17%)
Mutual labels:  transfer-learning, domain-adaptation
Pytorch Retraining
Transfer Learning Shootout for PyTorch's model zoo (torchvision)
Stars: ✭ 167 (-75.37%)
Mutual labels:  transfer-learning, benchmark
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-96.76%)
Mutual labels:  benchmark, transfer-learning
KD3A
Here is the official implementation of the model KD3A in paper "KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation".
Stars: ✭ 63 (-90.71%)
Mutual labels:  transfer-learning, domain-adaptation
Clan
( CVPR2019 Oral ) Taking A Closer Look at Domain Shift: Category-level Adversaries for Semantics Consistent Domain Adaptation
Stars: ✭ 248 (-63.42%)
Mutual labels:  transfer-learning, domain-adaptation
adapt
Awesome Domain Adaptation Python Toolbox
Stars: ✭ 46 (-93.22%)
Mutual labels:  transfer-learning, domain-adaptation
Awesome Domain Adaptation
A collection of AWESOME things about domain adaptation
Stars: ✭ 3,357 (+395.13%)
Mutual labels:  transfer-learning, domain-adaptation
pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem
Stars: ✭ 381 (-43.81%)
Mutual labels:  transfer-learning, domain-adaptation
Seg Uncertainty
IJCAI2020 & IJCV 2020 🌇 Unsupervised Scene Adaptation with Memory Regularization in vivo
Stars: ✭ 202 (-70.21%)
Mutual labels:  transfer-learning, domain-adaptation
Transformers-Domain-Adaptation
Adapt Transformer-based language models to new text domains
Stars: ✭ 67 (-90.12%)
Mutual labels:  transfer-learning, domain-adaptation
Shot
code released for our ICML 2020 paper "Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation"
Stars: ✭ 134 (-80.24%)
Mutual labels:  transfer-learning, domain-adaptation
Transferlearning Tutorial
LaTeX source of 《迁移学习简明手册》 (A Concise Handbook of Transfer Learning)
Stars: ✭ 2,122 (+212.98%)
Mutual labels:  transfer-learning, domain-adaptation
cmd
Central Moment Discrepancy for Domain-Invariant Representation Learning (ICLR 2017, keras)
Stars: ✭ 53 (-92.18%)
Mutual labels:  transfer-learning, domain-adaptation
SHOT-plus
code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer"
Stars: ✭ 46 (-93.22%)
Mutual labels:  transfer-learning, domain-adaptation

Introduction

Trans-Learn is an open-source, well-documented library for transfer learning. It is based on pure PyTorch, with high performance and a friendly API. Our code is Pythonic, and the design is consistent with torchvision. You can easily develop new algorithms or readily apply existing ones.

On July 24th, 2020, we released v0.1 (a preview version); its first sub-library is for Domain Adaptation (DALIB). The currently supported algorithms are listed in the documentation.

The performance of these algorithms has been fairly evaluated in this benchmark.
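
For instance, applying an existing algorithm can look roughly like the sketch below (class names and arguments follow the DALIB documentation and should be checked against the API reference; the feature dimension and batch size are arbitrary illustration values):

import torch
from dalib.modules.domain_discriminator import DomainDiscriminator
from dalib.adaptation.dann import DomainAdversarialLoss

# a small classifier that tries to tell source features from target features
discriminator = DomainDiscriminator(in_feature=1024, hidden_size=1024)
dann_loss = DomainAdversarialLoss(discriminator, reduction='mean')

# f_s, f_t: feature batches from the source and target domains (random tensors here, for illustration only)
f_s, f_t = torch.randn(20, 1024), torch.randn(20, 1024)
transfer_loss = dann_loss(f_s, f_t)  # add this to the task loss and backpropagate as usual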

Installation

For flexible use and modification, please git clone the library.
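
A typical setup might look like the following (the repository URL is taken from the citation below; the requirements file name and the need to pre-install PyTorch/torchvision are assumptions):

git clone https://github.com/thuml/Transfer-Learning-Library.git
cd Transfer-Learning-Library
# install the remaining Python dependencies; the file name below is an assumption
pip install -r requirements.txt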

Documentation

You can find the tutorial and API documentation on the website: DALIB API

We also provide examples in the directory examples. A typical usage is:

# Train DANN on the Office-31 Amazon -> Webcam task using ResNet-50.
# Assume the datasets have been put under the path `data/office31`,
# or allow them to be downloaded automatically from the Internet to this path.
python examples/dann.py data/office31 -d Office31 -s A -t W -a resnet50  --epochs 20

In the directory examples, you can find all the scripts needed to reproduce the benchmarks with the specified hyper-parameters.

Contributing

We appreciate all contributions. If you are planning to contribute back bug-fixes, please do so without any further discussion. If you plan to contribute new features, utility functions or extensions, please first open an issue and discuss the feature with us.

You can find the latest code on the dev branch.

Disclaimer on Datasets

This is a utility library that downloads and prepares public datasets. We do not host or distribute these datasets, vouch for their quality or fairness, or claim that you have licenses to use the dataset. It is your responsibility to determine whether you have permission to use the dataset under the dataset's license.

If you're a dataset owner and wish to update any part of it (description, citation, etc.), or do not want your dataset to be included in this library, please get in touch through a GitHub issue. Thanks for your contribution to the ML community!

Contact

If you have any problems with our code or any suggestions, including requests for future features, feel free to contact us

or describe them in Issues.

For Q&A in Chinese, you can ask questions here before sending an email: Transfer Learning Library Q&A section (迁移学习算法库答疑专区).

Citation

If you use this toolbox or benchmark in your research, please cite this project.

@misc{dalib,
  author = {Junguang Jiang and Bo Fu and Mingsheng Long},
  title = {Transfer-Learning-Library},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/thuml/Transfer-Learning-Library}},
}

Acknowledgment

We would like to thank the School of Software, Tsinghua University, and the National Engineering Laboratory for Big Data Software for providing such an excellent ML research platform.
