
thuml / DAN

Licence: other
Code release of "Learning Transferable Features with Deep Adaptation Networks" (ICML 2015)

Projects that are alternatives of or similar to DAN

speech-recognition-transfer-learning
Speech command recognition DenseNet transfer learning from UrbanSound8k in keras tensorflow
Stars: ✭ 18 (-87.92%)
Mutual labels:  transfer-learning
aml-keras-image-recognition
A sample Azure Machine Learning project for Transfer Learning-based custom image recognition by utilizing Keras.
Stars: ✭ 14 (-90.6%)
Mutual labels:  transfer-learning
EntityTargetedActiveLearning
No description or website provided.
Stars: ✭ 17 (-88.59%)
Mutual labels:  transfer-learning
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (-29.53%)
Mutual labels:  transfer-learning
Keras-MultiClass-Image-Classification
Multiclass image classification using Convolutional Neural Network
Stars: ✭ 48 (-67.79%)
Mutual labels:  transfer-learning
paper annotations
A place to keep track of all the annotated papers.
Stars: ✭ 96 (-35.57%)
Mutual labels:  transfer-learning
ReinventCommunity
No description or website provided.
Stars: ✭ 103 (-30.87%)
Mutual labels:  transfer-learning
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+187.92%)
Mutual labels:  transfer-learning
TransTQA
Author: Wenhao Yu ([email protected]). EMNLP'20. Transfer Learning for Technical Question Answering.
Stars: ✭ 12 (-91.95%)
Mutual labels:  transfer-learning
CPCE-3D
Low-dose CT via Transfer Learning from a 2D Trained Network, In IEEE TMI 2018
Stars: ✭ 40 (-73.15%)
Mutual labels:  transfer-learning
deep-learning
Projects include the application of transfer learning to build a convolutional neural network (CNN) that identifies the artist of a painting, the building of predictive models for Bitcoin price data using Long Short-Term Memory recurrent neural networks (LSTMs) and a tutorial explaining how to build two types of neural network using as input the…
Stars: ✭ 43 (-71.14%)
Mutual labels:  transfer-learning
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-91.95%)
Mutual labels:  transfer-learning
LegoBrickClassification
Repository to identify Lego bricks automatically only using images
Stars: ✭ 57 (-61.74%)
Mutual labels:  transfer-learning
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (-56.38%)
Mutual labels:  transfer-learning
Open set domain adaptation
Tensorflow Implementation of open set domain adaptation by backpropagation
Stars: ✭ 27 (-81.88%)
Mutual labels:  transfer-learning
TransforLearning TensorFlow
Classify your own data with a pre-trained InceptionV3 model; if you use this code, please give it a star.
Stars: ✭ 58 (-61.07%)
Mutual labels:  transfer-learning
MoeFlow
Repository for anime characters recognition website, powered by TensorFlow
Stars: ✭ 113 (-24.16%)
Mutual labels:  transfer-learning
Deep-Learning-Experiments-implemented-using-Google-Colab
Colab Compatible FastAI notebooks for NLP and Computer Vision Datasets
Stars: ✭ 16 (-89.26%)
Mutual labels:  transfer-learning
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (-86.58%)
Mutual labels:  transfer-learning
task-transferability
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-76.51%)
Mutual labels:  transfer-learning

Deep Adaptation Network (DAN)

This is a Caffe repository for Deep Adaptation Network (DAN). We forked Caffe at commit 29cdee7 and made our modifications. The main modifications are as follows:

  • Add an MMD layer as described in the paper "Learning Transferable Features with Deep Adaptation Networks".
  • Emit a SOLVER_ITER_CHANGE message in solver.cpp whenever iter_ changes.

The value of the MMD loss can be negative because we use the linear-time unbiased estimate of MMD, which reduces the cost to O(n) but may produce negative values on some mini-batches. These negative values do not affect the final classification performance. Please refer to our paper for the technical details.
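As a rough illustration of why the estimate can go negative, the linear-time estimator averages a signed kernel statistic over consecutive sample pairs rather than comparing all pairs. The sketch below is an assumption-laden NumPy rendering (Gaussian kernel, fixed bandwidth), not the repository's C++ implementation:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # RBF kernel between two feature vectors
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

def linear_mmd(xs, xt, sigma=1.0):
    """Linear-time unbiased MMD^2 estimate between source features xs
    and target features xt (each an (n, d) array).

    Consecutive samples are paired, so only O(n) kernel evaluations are
    needed; the price is that individual h-terms are signed, so the
    mini-batch estimate can dip below zero.
    """
    n = min(len(xs), len(xt)) // 2 * 2  # use an even number of samples
    total = 0.0
    for i in range(0, n, 2):
        x1, x2 = xs[i], xs[i + 1]
        y1, y2 = xt[i], xt[i + 1]
        # h(z, z') = k(x,x') + k(y,y') - k(x,y') - k(x',y)
        total += (gaussian_kernel(x1, x2, sigma) + gaussian_kernel(y1, y2, sigma)
                  - gaussian_kernel(x1, y2, sigma) - gaussian_kernel(x2, y1, sigma))
    return 2.0 * total / n  # average over the n/2 pairs
```

With identical inputs the signed terms cancel exactly, while well-separated distributions give a clearly positive value.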

If you have any problems with this code, feel free to contact us by email.

Change Note

We have moved the implementations of Residual Transfer Network (NIPS '16) and Joint Adaptation Network (ICML '17) to the Xlearn library, our actively maintained library for deep transfer learning.

Data Preparation

In data/office/*.txt, we provide the image lists of the three domains of the Office dataset.
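These lists presumably follow Caffe's standard image-list format (as consumed by the ImageData layer): one image path and an integer class label per line. The paths and labels below are illustrative only:

```text
amazon/images/back_pack/frame_0001.jpg 0
amazon/images/bike/frame_0001.jpg 1
amazon/images/calculator/frame_0002.jpg 2
```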

Training Model

In models/DAN/amazon_to_webcam, we provide an example model based on AlexNet that shows how to transfer from amazon to webcam. In this model, we insert MMD layers after fc7 and fc8.

The bvlc_reference_caffenet model is used as the pre-trained model. Once the Office dataset and the pre-trained caffemodel are prepared, the example can be run with the following command:

./build/tools/caffe train -solver models/DAN/amazon_to_webcam/solver.prototxt -weights models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel

A ResNet pre-trained model is available here; we use ResNet-50.

Parameter Tuning

In the mmd and jmmd layers, the loss_weight parameter can be tuned to give the MMD loss a different weight.
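Conceptually, loss_weight scales each MMD penalty before it is added to the classification loss, one weight per adapted layer (fc7 and fc8 in the example model). The helper below is a hypothetical sketch of that weighting, not code from this repository:

```python
def dan_objective(cls_loss, mmd_losses, loss_weights):
    """Total training objective: classification loss plus a weighted
    MMD penalty for each adapted layer.

    cls_loss     -- scalar source-domain classification loss
    mmd_losses   -- list of per-layer MMD estimates (may be negative)
    loss_weights -- list of loss_weight values, one per MMD layer
    """
    return cls_loss + sum(w * m for w, m in zip(loss_weights, mmd_losses))
```

Larger weights enforce stronger feature alignment between domains, at the possible expense of source-domain accuracy; setting a weight to zero disables adaptation for that layer.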

Citation

@inproceedings{DBLP:conf/icml/LongC0J15,
  author    = {Mingsheng Long and
               Yue Cao and
               Jianmin Wang and
               Michael I. Jordan},
  title     = {Learning Transferable Features with Deep Adaptation Networks},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning,
               {ICML} 2015, Lille, France, 6-11 July 2015},
  pages     = {97--105},
  year      = {2015},
  crossref  = {DBLP:conf/icml/2015},
  url       = {http://jmlr.org/proceedings/papers/v37/long15.html},
  timestamp = {Tue, 12 Jul 2016 21:51:15 +0200},
  biburl    = {http://dblp2.uni-trier.de/rec/bib/conf/icml/LongC0J15},
  bibsource = {dblp computer science bibliography, http://dblp.org}
}