
GT-RIPL / L2c

License: MIT
Learning to Cluster. A deep clustering strategy.

Programming Languages

python

Projects that are alternatives of or similar to L2c

Free Ai Resources
🚀 FREE AI Resources - 🎓 Courses, 👷 Jobs, 📝 Blogs, 🔬 AI Research, and many more - for everyone!
Stars: ✭ 192 (-26.72%)
Mutual labels:  artificial-intelligence, deep-neural-networks, unsupervised-learning, artificial-neural-networks, supervised-learning
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+2915.27%)
Mutual labels:  artificial-intelligence, unsupervised-learning, supervised-learning
All Classifiers 2019
A collection of computer vision projects for Acute Lymphoblastic Leukemia classification/early detection.
Stars: ✭ 22 (-91.6%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
machine-learning-course
Machine Learning Course @ Santa Clara University
Stars: ✭ 17 (-93.51%)
Mutual labels:  clustering, supervised-learning, unsupervised-learning
Deeplearning.ai
deeplearning.ai, by Andrew Ng - all video links
Stars: ✭ 625 (+138.55%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
Easypr
An easy, flexible, and accurate plate recognition project for Chinese licenses in unconstrained situations.
Stars: ✭ 6,046 (+2207.63%)
Mutual labels:  artificial-intelligence, supervised-learning, artificial-neural-networks
Malware Classification
Towards Building an Intelligent Anti-Malware System: A Deep Learning Approach using Support Vector Machine for Malware Classification
Stars: ✭ 88 (-66.41%)
Mutual labels:  artificial-intelligence, artificial-neural-networks, supervised-learning
Complete Life Cycle Of A Data Science Project
Complete-Life-Cycle-of-a-Data-Science-Project
Stars: ✭ 140 (-46.56%)
Mutual labels:  unsupervised-learning, transfer-learning, supervised-learning
Mariana
The Cutest Deep Learning Framework which is also a wonderful Declarative Language
Stars: ✭ 151 (-42.37%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
Dynamics
A Compositional Object-Based Approach to Learning Physical Dynamics
Stars: ✭ 159 (-39.31%)
Mutual labels:  artificial-intelligence, deep-neural-networks, unsupervised-learning
Cnn Svm
An Architecture Combining Convolutional Neural Network (CNN) and Linear Support Vector Machine (SVM) for Image Classification
Stars: ✭ 170 (-35.11%)
Mutual labels:  artificial-intelligence, artificial-neural-networks, supervised-learning
Trending Deep Learning
Top 100 trending deep learning repositories sorted by the number of stars gained on a specific day.
Stars: ✭ 543 (+107.25%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
First Steps Towards Deep Learning
This is an open sourced book on deep learning.
Stars: ✭ 376 (+43.51%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
Gans In Action
Companion repository to GANs in Action: Deep learning with Generative Adversarial Networks
Stars: ✭ 748 (+185.5%)
Mutual labels:  artificial-intelligence, deep-neural-networks, semi-supervised-learning
He4o
和 (he for objective-c) - an "information entropy reduction system"
Stars: ✭ 284 (+8.4%)
Mutual labels:  artificial-intelligence, unsupervised-learning, transfer-learning
Gru Svm
[ICMLC 2018] A Neural Network Architecture Combining Gated Recurrent Unit (GRU) and Support Vector Machine (SVM) for Intrusion Detection
Stars: ✭ 76 (-70.99%)
Mutual labels:  artificial-intelligence, artificial-neural-networks, supervised-learning
Susi
SuSi: Python package for unsupervised, supervised and semi-supervised self-organizing maps (SOM)
Stars: ✭ 42 (-83.97%)
Mutual labels:  unsupervised-learning, semi-supervised-learning, supervised-learning
Top Deep Learning
Top 200 deep learning Github repositories sorted by the number of stars.
Stars: ✭ 1,365 (+420.99%)
Mutual labels:  artificial-intelligence, deep-neural-networks, artificial-neural-networks
Face.evolve.pytorch
🔥🔥High-Performance Face Recognition Library on PaddlePaddle & PyTorch🔥🔥
Stars: ✭ 2,719 (+937.79%)
Mutual labels:  artificial-intelligence, transfer-learning, supervised-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (-69.08%)
Mutual labels:  clustering, transfer-learning, unsupervised-learning

L2C: Learning to Cluster

A clustering strategy with deep neural networks. This blog article provides a generic overview.

Introduction

This repository provides the PyTorch implementation of the transfer-learning scheme (L2C) and two learning criteria useful for deep clustering: Meta Classification Likelihood (MCL)* and KL-divergence-based Contrastive Loss (KCL); see the references below.

*MCL was renamed from CCL.
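
The exact loss implementations live in this repository; the following is only a rough, minimal PyTorch sketch of the two pairwise criteria as described in the papers cited below. The tensor names, the {0,1} pairwise-label convention, and the KCL margin value are assumptions for illustration, not the repository's API.

import torch
import torch.nn.functional as F

def mcl_loss(prob1, prob2, similarity, eps=1e-7):
    # Meta Classification Likelihood (Hsu et al., ICLR 2019).
    # prob1, prob2: (N, K) softmax cluster posteriors of the two samples in each pair.
    # similarity:   (N,) binary pairwise labels (1 = same class, 0 = different class).
    # The probability that a pair is "similar" is the inner product of the posteriors;
    # MCL is binary cross-entropy on that quantity.
    p_same = (prob1 * prob2).sum(dim=1).clamp(eps, 1.0 - eps)
    return F.binary_cross_entropy(p_same, similarity.float())

def kcl_loss(prob1, prob2, similarity, margin=2.0, eps=1e-7):
    # KL-divergence-based Contrastive Loss (Hsu & Kira, ICLR workshop 2016).
    # Similar pairs pull their cluster distributions together via the KL divergence;
    # dissimilar pairs are pushed apart with a hinge (the margin here is illustrative).
    kl = (prob1.clamp(min=eps) * (prob1.clamp(min=eps) / prob2.clamp(min=eps)).log()).sum(dim=1)
    sim = similarity.float()
    return (sim * kl + (1.0 - sim) * F.relu(margin - kl)).mean()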

This repository covers the following references:

@inproceedings{Hsu19_MCL,
	title =	    {Multi-class classification without multi-class labels},
	author =    {Yen-Chang Hsu and Zhaoyang Lv and Joel Schlosser and Phillip Odom and Zsolt Kira},
	booktitle = {International Conference on Learning Representations (ICLR)},
	year =      {2019},
	url =       {https://openreview.net/forum?id=SJzR2iRcK7}
}

@inproceedings{Hsu18_L2C,
	title =     {Learning to cluster in order to transfer across domains and tasks},
	author =    {Yen-Chang Hsu and Zhaoyang Lv and Zsolt Kira},
	booktitle = {International Conference on Learning Representations (ICLR)},
	year =      {2018},
	url =       {https://openreview.net/forum?id=ByRWCqvT-}
}

@inproceedings{Hsu16_KCL,
	title =	    {Neural network-based clustering using pairwise constraints},
	author =    {Yen-Chang Hsu and Zsolt Kira},
	booktitle = {ICLR workshop},
	year =      {2016},
	url =       {https://arxiv.org/abs/1511.06321}
}

Preparation

This repository supports PyTorch 1.0 and Python 2.7, 3.6, and 3.7.

pip install -r requirements.txt

Demo

Supervised Classification/Clustering with only pairwise similarity

# A quick trial:
python demo.py  # Default dataset: MNIST, network: LeNet, loss: MCL
python demo.py --loss KCL

# Lookup available options:
python demo.py -h

# For more examples:
./scripts/exp_supervised_MCL_vs_KCL.sh
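
In this supervised demo the only training signal is pairwise similarity. Conceptually, those pairwise targets can be derived from the class labels of a mini-batch; the helper below is a hypothetical illustration of that reduction, not part of the repository's data pipeline.

import torch

def pairwise_similarity_from_labels(labels):
    # labels: (B,) integer class labels of a mini-batch.
    # Returns a (B, B) binary matrix whose (i, j) entry is 1 when samples i and j
    # share a class, and 0 otherwise -- the only supervision the pairwise losses need.
    labels = labels.view(-1, 1)
    return (labels == labels.t()).float()

# Example: classes [0, 1, 0] give
# [[1, 0, 1],
#  [0, 1, 0],
#  [1, 0, 1]]
print(pairwise_similarity_from_labels(torch.tensor([0, 1, 0])))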

Unsupervised Clustering (Cross-task Transfer Learning)

# Learn the Similarity Prediction Network (SPN) with Omniglot_background and then transfer to the 20 alphabets in Omniglot_evaluation.
# Default loss is MCL with an unknown number of clusters (Set a large cluster number, i.e., k=100)
# It takes about half an hour to finish.
python demo_omniglot_transfer.py

# An example of using KCL with k set to the ground-truth number of clusters
python demo_omniglot_transfer.py --loss KCL --num_cluster -1

# Lookup available options:
python demo_omniglot_transfer.py -h

# Other examples:
./scripts/exp_unsupervised_transfer_Omniglot.sh
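
In the cross-task transfer setting, the ground-truth pairwise labels are replaced by the Similarity Prediction Network's predictions on the target alphabets. The step below condenses that idea into a sketch; the spn and cluster_net interfaces, the 0.5 threshold, and the reuse of mcl_loss from the earlier sketch are all assumptions rather than the actual code of demo_omniglot_transfer.py.

import torch

def transfer_clustering_step(spn, cluster_net, images, threshold=0.5):
    # spn: a frozen Similarity Prediction Network trained on Omniglot_background,
    #      assumed to return a (B, B) matrix of pairwise similarity scores in [0, 1].
    # cluster_net: the clustering network trained on Omniglot_evaluation with a
    #      large output dimension (e.g. k=100 when the number of clusters is unknown).
    with torch.no_grad():
        sim_scores = spn(images)
        sim_labels = (sim_scores > threshold).float()  # binarized pseudo pairwise labels

    probs = torch.softmax(cluster_net(images), dim=1)  # (B, K) cluster posteriors
    B = probs.size(0)
    # Form all pairs within the batch and apply a pairwise criterion such as MCL.
    p1 = probs.unsqueeze(1).expand(B, B, -1).reshape(B * B, -1)
    p2 = probs.unsqueeze(0).expand(B, B, -1).reshape(B * B, -1)
    return mcl_loss(p1, p2, sim_labels.reshape(B * B))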

Notes

  • The clustering results are highly dependent on the performance of the Similarity Prediction Network (SPN). To make a fair comparison, the same SPN must be used. Our script trains an SPN with random initialization and random data sampling; once the SPN model is trained, the script reuses the saved SPN instead of training a new one.
  • The table below presents the clustering performance with the reference SPN [download]. Put the model file into the /outputs folder and run demo_omniglot_transfer.py directly to generate the "MCL(k=100)" column.
  • The performance metric is clustering accuracy (for details, please see the L2C paper; a short sketch of the metric follows the table below). Each value in the table is the average of 3 clustering runs. This repository reuses most of the utilities in PyTorch and differs from the Lua-based implementation used in the reference papers. The results (the "--Average--" row) show the same trend as the papers, although the absolute values differ mildly; the MCL results here are better than those in the paper.
Dataset | GT #classes | KCL (k=100) | MCL (k=100) | KCL (k=gt) | MCL (k=gt)
Angelic | 20 | 73.2% | 82.2% | 89.0% | 91.7%
Atemayar_Qelisayer | 26 | 73.3% | 89.2% | 82.5% | 86.0%
Atlantean | 26 | 65.5% | 83.3% | 89.4% | 93.5%
Aurek_Besh | 26 | 88.4% | 92.8% | 91.5% | 92.4%
Avesta | 26 | 79.0% | 85.8% | 85.4% | 86.1%
Ge_ez | 26 | 77.1% | 84.0% | 85.4% | 86.6%
Glagolitic | 45 | 83.9% | 85.3% | 84.9% | 87.4%
Gurmukhi | 45 | 78.8% | 78.7% | 77.0% | 78.0%
Kannada | 41 | 64.6% | 81.1% | 73.3% | 81.2%
Keble | 26 | 91.4% | 95.1% | 94.7% | 94.3%
Malayalam | 47 | 73.5% | 75.0% | 72.7% | 73.0%
Manipuri | 40 | 82.8% | 81.2% | 85.8% | 81.5%
Mongolian | 30 | 84.7% | 89.0% | 88.3% | 90.2%
Old_Church_Slavonic_Cyrillic | 45 | 89.9% | 90.7% | 88.7% | 89.8%
Oriya | 46 | 56.5% | 73.4% | 63.2% | 75.3%
Sylheti | 28 | 61.8% | 68.2% | 69.8% | 80.6%
Syriac_Serto | 23 | 72.1% | 82.0% | 85.8% | 89.8%
Tengwar | 25 | 67.7% | 76.4% | 82.5% | 85.5%
Tibetan | 42 | 81.8% | 80.2% | 84.3% | 81.9%
ULOG | 26 | 53.3% | 77.1% | 73.0% | 89.1%
--Average-- | | 75.0% | 82.5% | 82.4% | 85.7%
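
As noted above, the metric in this table is clustering accuracy, conventionally computed by finding the best one-to-one matching between predicted clusters and ground-truth classes. The function below is a generic sketch of that metric using the Hungarian algorithm; it is not necessarily identical to the repository's evaluation code.

import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    # y_true: ground-truth class labels; y_pred: predicted cluster ids (same length).
    # Build the cluster-vs-class confusion matrix, then pick the assignment of
    # clusters to classes that maximizes the number of matched samples.
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = int(max(y_pred.max(), y_true.max())) + 1
    confusion = np.zeros((n, n), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        confusion[p, t] += 1
    row_ind, col_ind = linear_sum_assignment(-confusion)  # negate to maximize matches
    return confusion[row_ind, col_ind].sum() / y_true.size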

Compare MCL and KCL

The loss surface of MCL is more similar to that of cross-entropy (CE) than KCL's is. Empirically, MCL converges faster than KCL. For details, please refer to the ICLR paper.

Related Applications

Lane detection for autonomous driving / Instance segmentation

@inproceedings{Hsu18_InsSeg,
	title =     {Learning to Cluster for Proposal-Free Instance Segmentation},
	author =    {Yen-Chang Hsu and Zheng Xu and Zsolt Kira and Jiawei Huang},
	booktitle = {International Joint Conference on Neural Networks (IJCNN)},
	year =      {2018},
	url =       {https://arxiv.org/abs/1803.06459}
}

Acknowledgments

This work was supported by the National Science Foundation and National Robotics Initiative (grant # IIS-1426998) and DARPA’s Lifelong Learning Machines (L2M) program, under Cooperative Agreement HR0011-18-2-001.
