
Vincent-Vercruyssen / transfertools

License: Apache-2.0
Python toolbox for transfer learning.

Programming Languages

Jupyter Notebook (11,667 projects)
Python (139,335 projects - #7 most used programming language)

Projects that are alternatives to or similar to transfertools

Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc.: papers, code, datasets, applications, and tutorials.
Stars: ✭ 8,481 (+38450%)
Mutual labels:  transfer-learning, domain-adaptation
Awesome Domain Adaptation
A collection of AWESOME things about domain adaptation
Stars: ✭ 3,357 (+15159.09%)
Mutual labels:  transfer-learning, domain-adaptation
Cross Domain ner
Cross-domain NER using cross-domain language modeling, code for ACL 2019 paper
Stars: ✭ 67 (+204.55%)
Mutual labels:  transfer-learning, domain-adaptation
TA3N
[ICCV 2019 Oral] TA3N: https://github.com/cmhungsteve/TA3N (Most updated repo)
Stars: ✭ 45 (+104.55%)
Mutual labels:  transfer-learning, domain-adaptation
Convolutional Handwriting Gan
ScrabbleGAN: Semi-Supervised Varying Length Handwritten Text Generation (CVPR20)
Stars: ✭ 107 (+386.36%)
Mutual labels:  transfer-learning, domain-adaptation
Transfer Learning Library
Transfer-Learning-Library
Stars: ✭ 678 (+2981.82%)
Mutual labels:  transfer-learning, domain-adaptation
Clan
(CVPR 2019 Oral) Taking A Closer Look at Domain Shift: Category-level Adversaries for Semantics Consistent Domain Adaptation
Stars: ✭ 248 (+1027.27%)
Mutual labels:  transfer-learning, domain-adaptation
KD3A
Here is the official implementation of the model KD3A in paper "KD3A: Unsupervised Multi-Source Decentralized Domain Adaptation via Knowledge Distillation".
Stars: ✭ 63 (+186.36%)
Mutual labels:  transfer-learning, domain-adaptation
Awesome Transfer Learning
Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)
Stars: ✭ 1,349 (+6031.82%)
Mutual labels:  transfer-learning, domain-adaptation
Ddc Transfer Learning
A simple implementation of Deep Domain Confusion: Maximizing for Domain Invariance
Stars: ✭ 83 (+277.27%)
Mutual labels:  transfer-learning, domain-adaptation
Multitask Learning
Awesome Multitask Learning Resources
Stars: ✭ 361 (+1540.91%)
Mutual labels:  transfer-learning, domain-adaptation
Transferlearning Tutorial
LaTeX source of the handbook 迁移学习简明手册 (A Concise Handbook of Transfer Learning)
Stars: ✭ 2,122 (+9545.45%)
Mutual labels:  transfer-learning, domain-adaptation
adapt
Awesome Domain Adaptation Python Toolbox
Stars: ✭ 46 (+109.09%)
Mutual labels:  transfer-learning, domain-adaptation
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+3254.55%)
Mutual labels:  transfer-learning, anomaly-detection
SHOT-plus
code for our TPAMI 2021 paper "Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer"
Stars: ✭ 46 (+109.09%)
Mutual labels:  transfer-learning, domain-adaptation
Deep Transfer Learning
Deep Transfer Learning Papers
Stars: ✭ 68 (+209.09%)
Mutual labels:  transfer-learning, domain-adaptation
transfer-learning-algorithms
Implementation of many transfer learning algorithms in Python with Jupyter notebooks
Stars: ✭ 42 (+90.91%)
Mutual labels:  transfer-learning, domain-adaptation
cmd
Central Moment Discrepancy for Domain-Invariant Representation Learning (ICLR 2017, keras)
Stars: ✭ 53 (+140.91%)
Mutual labels:  transfer-learning, domain-adaptation
Libtlda
Library of transfer learners and domain-adaptive classifiers.
Stars: ✭ 71 (+222.73%)
Mutual labels:  transfer-learning, domain-adaptation
Shot
code released for our ICML 2020 paper "Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation"
Stars: ✭ 134 (+509.09%)
Mutual labels:  transfer-learning, domain-adaptation

transfertools

transfertools is a small Python package containing recent transfer learning algorithms. Transfer learning strives to transfer information from one dataset, the source domain, to a related dataset, the target domain. Several constraints and assumptions can be placed on the domains, inspiring different algorithms to do the information transfer. The package contains four transfer learning algorithms.
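
All four algorithms can be imported from transfertools.models, as the usage examples further down show:

from transfertools.models import LocIT, CBIT, CORAL, TCA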

Installation

Install the package directly from PyPI with the following command:

pip install transfertools

OR install the package using the setup.py file:

python setup.py install

OR install it directly from GitHub:

pip install git+https://github.com/Vincent-Vercruyssen/transfertools.git@master

Contents and usage

Transfer learning aims to transfer information from a source domain Ds to a related target domain Dt. A domain consists of a dataset with attributes X and labels Y. Thus, the source domain is Ds = {Xs, Ys} and the target domain is Dt = {Xt, Yt}. The fundamental assumption is that the source and target domain live in the same feature space. Different flavors of transfer learning methodologies exist. Unsupervised transfer learning, for instance, disregards label information and only uses Xs and Xt to determine what information to transfer. Supervised transfer learning uses the full domains Ds and Dt to do the transfer. Semi-supervised transfer learning uses the full source domain Ds and the target attributes Xt to do the transfer.
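
As a purely illustrative example (not part of the package), such a pair of domains can be represented as NumPy arrays that share the same number of feature columns:

import numpy as np

rng = np.random.default_rng(0)

# source domain Ds = {Xs, Ys}: 200 instances with 5 features and binary labels
Xs = rng.normal(loc=0.0, scale=1.0, size=(200, 5))
Ys = rng.integers(0, 2, size=200)

# target domain Dt = {Xt, Yt}: fewer instances, same 5 features (shared feature space)
Xt = rng.normal(loc=0.5, scale=1.2, size=(50, 5))
Yt = rng.integers(0, 2, size=50)

# unsupervised transfer uses only Xs and Xt,
# semi-supervised transfer uses Xs, Ys, and Xt,
# supervised transfer uses all four arrays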

The actual information that is transferred also differs between methods. Domain adaptation techniques transform the source (and target) domains such that they match more closely (according to different criteria) and then combine all the data points to construct Dcombo. Instance selection techniques select a subset of the source data that should be transferred to the target data to construct Dcombo. After transfer, a classifier can be constructed using Dcombo.
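
For instance, assuming a transfer method has already produced an array Xs_trans holding the transferred (or adapted) source instances, Dcombo can be assembled with NumPy and passed to any downstream model. The sketch below uses synthetic stand-in arrays and a scikit-learn anomaly detector purely for illustration:

import numpy as np
from sklearn.ensemble import IsolationForest

# stand-ins for the transferred source instances and the target attributes
# (in practice these come from one of the transfertools models shown below)
rng = np.random.default_rng(0)
Xs_trans = rng.normal(size=(150, 5))
Xt = rng.normal(size=(50, 5))

# combine the transferred source data with the target data to form Dcombo
X_combo = np.vstack((Xs_trans, Xt))

# train a downstream model on the combined dataset, e.g. an anomaly detector
detector = IsolationForest(random_state=0).fit(X_combo)
target_scores = detector.score_samples(Xt)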

Instance selection techniques

The transfertools package contains two instance selection transfer techniques tailored to anomaly detection:

  1. The LocIT (localized instance transfer) algorithm works in a completely unsupervised manner. It transfers the instances in Ds that have matching localized distributions in both domains [1]. This algorithm can also be used in applications other than anomaly detection.
  2. The CBIT (cluster-based instance transfer) algorithm works in a semi-supervised manner. It transfers the instances in Ds that fall inside a cluster defined on the target data [2] (a conceptual sketch of this idea appears at the end of this subsection).

Given a source domain {Xs, Ys} and a target domain {Xt, Yt}, the algorithms are applied as follows:

from transfertools.models import LocIT, CBIT

# fit the transfer model on the source and target attributes
transfor = LocIT()
transfor.fit(Xs, Xt)

# select the source instances to transfer
Xs_trans = transfor.transfer(Xs)

# ... or immediately
Xs_trans = transfor.fit_transfer(Xs, Xt)
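
The cluster-based idea behind CBIT can be illustrated with a short, self-contained sketch: cluster the target data and keep only the source instances that fall within a target cluster. This is a conceptual illustration under simplifying assumptions (k-means clusters, radius taken as the farthest cluster member), not the implementation shipped with the package:

import numpy as np
from sklearn.cluster import KMeans

def cluster_based_selection(Xs, Xt, n_clusters=3):
    # conceptual illustration of cluster-based instance selection,
    # NOT the CBIT implementation in transfertools
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(Xt)
    # radius of each target cluster = distance of its farthest member to the centre
    dist_t = np.linalg.norm(Xt - km.cluster_centers_[km.labels_], axis=1)
    radii = np.array([dist_t[km.labels_ == c].max() for c in range(n_clusters)])
    # assign every source instance to its nearest target cluster
    labels_s = km.predict(Xs)
    dist_s = np.linalg.norm(Xs - km.cluster_centers_[labels_s], axis=1)
    # keep only the source instances that fall inside the cluster radius
    return Xs[dist_s <= radii[labels_s]]

Xs_selected = cluster_based_selection(Xs, Xt)  # Xs, Xt: source/target attribute matrices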

Domain adaptation techniques

The transfertools package contains two domain adaptation techniques:

  1. The CORAL (correlation alignment) algorithm is an unsupervised transfer learning technique that aligns the first and second order statistics of the source and target data [3] (a conceptual sketch appears at the end of this subsection).
  2. The TCA (transfer component analysis) algorithm is an unsupervised transfer learning technique that projects the source and target data onto a shared lower-dimensional subspace [4].

Given a source domain {Xs, Ys} and a target domain {Xt, Yt}, the algorithms are applied as follows:

from transfertools.models import TCA, CORAL

# fit the transfer model on the source and target attributes
transfor = CORAL()
transfor.fit(Xs, Xt)

# transform the source instances
Xs_trans = transfor.transfer(Xs)

# ... or immediately
Xs_trans = transfor.fit_transfer(Xs, Xt)
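
For intuition, the correlation alignment step used by CORAL-style methods can be written in a few lines of NumPy/SciPy: recentre the source data on the target mean and re-colour its covariance with the target covariance. The snippet below is a conceptual sketch with a small regularization term, not the implementation shipped in the package:

import numpy as np
from scipy import linalg

def coral_align(Xs, Xt, reg=1e-6):
    # conceptual correlation alignment, NOT the package's CORAL implementation
    d = Xs.shape[1]
    # regularized second-order statistics of both domains
    Cs = np.cov(Xs, rowvar=False) + reg * np.eye(d)
    Ct = np.cov(Xt, rowvar=False) + reg * np.eye(d)
    # whiten the centred source data, then re-colour it with the target covariance
    whiten = linalg.fractional_matrix_power(Cs, -0.5)
    recolour = linalg.fractional_matrix_power(Ct, 0.5)
    Xs_aligned = (Xs - Xs.mean(axis=0)) @ whiten @ recolour
    # align the first-order statistics by shifting to the target mean
    return np.real(Xs_aligned) + Xt.mean(axis=0)

Xs_aligned = coral_align(Xs, Xt)  # Xs, Xt: source/target attribute matrices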

Package structure

The transfer learning algorithms are located in: transfertools/models/

For further examples of how to use the algorithms, see the notebooks: transfertools/notebooks/

Dependencies

The transfertools package requires the following Python packages to be installed:

Contact

Contact the author of the package: [email protected]

References

[1] Vercruyssen, V., Meert, W., and Davis, J. (2020) Transfer Learning for Anomaly Detection through Localized and Unsupervised Instance Selection. In 34th AAAI Conference on Artificial Intelligence, New York.

[2] Vercruyssen, V., Meert, W., and Davis, J. (2017) Transfer Learning for Time Series Anomaly Detection. In CEUR Workshop Proceedings, vol. 1924, pp. 27-37.

[3] Sun, B., Feng, J., and Saenko, K. (2016) Return of Frustratingly Easy Domain Adaptation. In 30th AAAI Conference on Artificial Intelligence.

[4] Pan, S.J., Tsang, I.W., Kwok, J.T., and Yang, Q. (2011) Domain Adaptation via Transfer Component Analysis. IEEE Transactions on Neural Networks, 22(2), 199-210.
