DESOM: Deep Embedded Self-Organizing Map

This is the official Keras implementation of the Deep Embedded Self-Organizing Map (DESOM) model.

DESOM is an unsupervised learning model that jointly learns representations and the code vectors of a self-organizing map (SOM) in order to survey, cluster and visualize large, high-dimensional datasets. Our model is composed of an autoencoder and a custom SOM layer that are optimized in a joint training procedure, motivated by the idea that the SOM prior could help learn SOM-friendly representations. Training is fast, end-to-end and requires no pre-training.
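For intuition, the objective can be pictured as a reconstruction term plus a SOM term in which each latent code is attracted to every map prototype, weighted by a Gaussian neighborhood around its best-matching unit (BMU). The NumPy sketch below is illustrative only and is not the repository's implementation; the function names, the squared-Euclidean grid distance and the gamma value are assumptions:

import numpy as np

def som_loss(z, prototypes, grid_positions, T):
    # z: (n, latent_dim) latent codes; prototypes: (k, latent_dim) SOM code vectors;
    # grid_positions: (k, 2) map coordinates; T: neighborhood temperature
    d = ((z[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)  # (n, k) squared latent distances
    bmu = d.argmin(axis=1)                                       # best-matching unit per sample
    # squared map-grid distance between each unit and every sample's BMU
    grid_d2 = ((grid_positions[None, :, :] - grid_positions[bmu][:, None, :]) ** 2).sum(-1)
    weights = np.exp(-grid_d2 / (2.0 * T ** 2))                  # Gaussian neighborhood weights
    return (weights * d).sum(axis=1).mean()

def desom_loss(x, x_rec, z, prototypes, grid_positions, T, gamma=1e-3):
    # joint objective: autoencoder reconstruction + gamma * SOM term (gamma value illustrative)
    reconstruction = ((x - x_rec) ** 2).sum(axis=1).mean()
    return reconstruction + gamma * som_loss(z, prototypes, grid_positions, T)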

When using this code, please cite the following works:

Forest, Florent, Mustapha Lebbah, Hanene Azzag and Jérôme Lacaille (2019). Deep Embedded SOM: Joint Representation Learning and Self-Organization. In European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2019).

Forest, Florent, Mustapha Lebbah, Hanene Azzag and Jérôme Lacaille (2019). Deep Architectures for Joint Clustering and Visualization with Self-Organizing Maps. In Workshop on Learning Data Representations for Clustering (LDRC), PAKDD 2019.

Forest, Florent, Mustapha Lebbah, Hanene Azzag, and Jérôme Lacaille (2021). Deep Embedded Self-Organizing Maps for Joint Representation Learning and Topology-Preserving Clustering. Neural Computing and Applications 33, no. 24 (December 1, 2021): 17439–69. https://doi.org/10.1007/s00521-021-06331-w.

(see also http://florentfo.rest/publications)

Quick start

The implementation is divided into several scripts:

  • train.py: main training script.
  • DESOM.py: DESOM model class.
  • ConvDESOM.py: Convolutional DESOM model class.
  • SOM.py: SOM layer class.
  • AE.py: autoencoder models (mlp and conv2d).
  • Kerasom.py: a standard SOM in Keras, without the autoencoder part.
  • datasets.py: script for loading the datasets benchmarked in the paper (MNIST, Fashion-MNIST, USPS and REUTERS-10k).
  • evaluation.py: PerfLogger class evaluating many clustering/SOM quality metrics. Requires the external dependency SOMperf.
  • metrics.py: functions to compute the metrics used in desom_benchmark.py (purity, unsupervised clustering accuracy, quantization and topographic errors); two of these are sketched just after this list.
  • desom_benchmark.py: script to perform benchmark runs of DESOM on 4 datasets and save results in a CSV file.
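
For reference, here is a minimal, self-contained sketch of two of the label-based metrics listed above: purity and unsupervised clustering accuracy (the latter solved with the Hungarian algorithm via SciPy). This is an illustrative re-implementation, not the code in metrics.py:

import numpy as np
from scipy.optimize import linear_sum_assignment

def purity(y_true, y_pred):
    # each predicted cluster votes for its majority ground-truth class
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    total = 0
    for c in np.unique(y_pred):
        _, counts = np.unique(y_true[y_pred == c], return_counts=True)
        total += counts.max()
    return total / y_pred.size

def clustering_accuracy(y_true, y_pred):
    # best one-to-one mapping between cluster labels and classes
    y_true = np.asarray(y_true).astype(np.int64)
    y_pred = np.asarray(y_pred).astype(np.int64)
    D = max(y_pred.max(), y_true.max()) + 1
    w = np.zeros((D, D), dtype=np.int64)   # contingency matrix
    for t, p in zip(y_true, y_pred):
        w[p, t] += 1
    rows, cols = linear_sum_assignment(-w)  # maximize matched counts
    return w[rows, cols].sum() / y_pred.size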

The data directory contains the USPS and REUTERS-10k datasets.

Prerequisites

First, clone and install the SOMperf module, required to evaluate the quality metrics during training:

git clone https://github.com/FlorentF9/SOMperf
cd SOMperf
python3 setup.py install
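
Alternatively, since SOMperf ships a setup.py, pip should be able to install it directly from GitHub in a single step (assuming the package installs cleanly this way):

pip3 install git+https://github.com/FlorentF9/SOMperf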

Training instructions

The main script has several command-line arguments that are explained with:

python3 train.py --help

All arguments have default values, so a DESOM training run can be started simply with:

python3 train.py

For example, to train DESOM on Fashion-MNIST with a 20x20 map, the command is:

python3 train.py --model desom --dataset fmnist --map_size 20 20

Training generates several outputs:

  • an image of the DESOM map to visualize the prototypes
  • a graph of the model architecture
  • a folder containing a log of training metrics and the model weights (by default, results/tmp)

Behavior is similar for the ConvDESOM and Kerasom models.
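
For example, a plain Kerasom model could be trained with a command along these lines (the kerasom and usps argument values are assumptions based on the script and dataset names above):

python3 train.py --model kerasom --dataset usps --map_size 10 10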

For reference, one training run on MNIST with 10,000 iterations and a batch size of 256 takes around 2 minutes on a laptop GPU.

A full benchmark of DESOM on the 4 datasets can be started by calling the script desom_benchmark.py. The parameters, the number of runs and the save directories are specified inside the script. The paper results were obtained with this script and 10 runs per configuration. Similar scripts were used for the other compared models (convdesom, minisom, kerasom, and variants with pre-trained autoencoder weights).

The main dependencies are keras, scikit-learn, numpy, pandas, matplotlib and somperf.
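
All of them except somperf (installed from source above) are available on PyPI, so something like the following one-liner should cover them (versions are not pinned in this README):

pip3 install keras scikit-learn numpy pandas matplotlib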
