
mahdiabavisani / Deep-multimodal-subspace-clustering-networks

Licence: other
Tensorflow implementation of "Deep Multimodal Subspace Clustering Networks"

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Deep-multimodal-subspace-clustering-networks

MVGL
TCyb 2018: Graph learning for multiview clustering
Stars: ✭ 26 (-58.06%)
Mutual labels:  clustering, multimodal
iMIX
A framework for Multimodal Intelligence research from Inspur HSSLAB.
Stars: ✭ 21 (-66.13%)
Mutual labels:  multimodal
algorithms
The All ▲lgorithms documentation website.
Stars: ✭ 114 (+83.87%)
Mutual labels:  clustering
lipnet
LipNet with gluon
Stars: ✭ 16 (-74.19%)
Mutual labels:  multimodal
RAE
A recursive autoencoder neural network built with TensorFlow, used for sentence clustering
Stars: ✭ 12 (-80.65%)
Mutual labels:  clustering
acoustic-keylogger
Pipeline of a keylogging attack using just an audio signal and unsupervised learning.
Stars: ✭ 80 (+29.03%)
Mutual labels:  clustering
quartz-scheduler-hazelcast-jobstore
An implementation of a Quartz Scheduler JobStore using Hazelcast distributed Collections
Stars: ✭ 42 (-32.26%)
Mutual labels:  clustering
autoplait
Python implementation of AutoPlait (SIGMOD'14) without smoothing algorithm. NOTE: This repository is for my personal use.
Stars: ✭ 24 (-61.29%)
Mutual labels:  clustering
opensvc
The OpenSVC node agent
Stars: ✭ 27 (-56.45%)
Mutual labels:  clustering
Leaflet.MarkerCluster.LayerSupport
Sub-plugin for Leaflet.markercluster plugin; brings compatibility with Layers Control and other plugins
Stars: ✭ 53 (-14.52%)
Mutual labels:  clustering
AnnA Anki neuronal Appendix
Using machine learning on your anki collection to enhance the scheduling via semantic clustering and semantic similarity
Stars: ✭ 39 (-37.1%)
Mutual labels:  clustering
BetaML.jl
Beta Machine Learning Toolkit
Stars: ✭ 64 (+3.23%)
Mutual labels:  clustering
dtw-python
Python port of R's Comprehensive Dynamic Time Warp algorithms package
Stars: ✭ 139 (+124.19%)
Mutual labels:  clustering
subspace
Subspace Network reference implementation
Stars: ✭ 164 (+164.52%)
Mutual labels:  subspace
topometry
A comprehensive dimensional reduction framework to recover the latent topology from high-dimensional data.
Stars: ✭ 64 (+3.23%)
Mutual labels:  clustering
scarf
Toolkit for highly memory efficient analysis of single-cell RNA-Seq, scATAC-Seq and CITE-Seq data. Analyze atlas scale datasets with millions of cells on laptop.
Stars: ✭ 54 (-12.9%)
Mutual labels:  clustering
postsack
Visually cluster your emails by sender, domain, and more to identify waste
Stars: ✭ 288 (+364.52%)
Mutual labels:  clustering
Machine-learning
This repository will contain all the material required for beginners in ML and DL; follow and star this repo for regular updates
Stars: ✭ 27 (-56.45%)
Mutual labels:  clustering
pyclustertend
A python package to assess cluster tendency
Stars: ✭ 38 (-38.71%)
Mutual labels:  clustering
inet ssh dist
SSH distribution for erlang
Stars: ✭ 46 (-25.81%)
Mutual labels:  clustering

Deep multimodal subspace clustering networks

fig1

Overview

This repository contains a TensorFlow implementation of the paper "Deep Multimodal Subspace Clustering Networks" by Mahdi Abavisani and Vishal M. Patel. The paper was posted on arXiv in May 2018.

"Deep multimodal subspace clustering networks" (DMSC) investigated various fusion methods for the task of multimodal subspace clustering, and suggested a new fusion technique called "affinity fusion" as the idea of integrating complementary information from two modalities with respect to the similarities between datapoints across different modalities.

fig1
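As in deep subspace clustering networks, the learned self-expressive coefficient matrix is ultimately converted into an affinity matrix and clustered spectrally. Below is a minimal sketch of that final step only, not the authors' implementation; the post-processing in affinity_fusion.py may differ, and the matrix C here is a random placeholder standing in for the trained self-expressive weights shared across modalities.

import numpy as np
from sklearn.cluster import SpectralClustering

def affinity_from_coefficients(C):
    # Symmetrize the absolute self-expressive coefficients into an affinity matrix.
    C = np.abs(C)
    return 0.5 * (C + C.T)

def cluster_from_affinity(A, n_clusters):
    # Spectral clustering on a precomputed affinity matrix.
    sc = SpectralClustering(n_clusters=n_clusters, affinity='precomputed',
                            assign_labels='discretize', random_state=0)
    return sc.fit_predict(A)

# C would normally come from the trained self-expressive layer; this random
# placeholder only makes the sketch runnable.
rng = np.random.default_rng(0)
C = 0.01 * rng.standard_normal((100, 100))
labels = cluster_from_affinity(affinity_from_coefficients(C), n_clusters=10)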

Citation

Please use the following to refer to this work in publications:


@ARTICLE{8488484,
  author={M. {Abavisani} and V. M. {Patel}},
  journal={IEEE Journal of Selected Topics in Signal Processing},
  title={Deep Multimodal Subspace Clustering Networks},
  year={2018},
  volume={12},
  number={6},
  pages={1601-1614},
  doi={10.1109/JSTSP.2018.2875385},
  ISSN={1932-4553},
  month={Dec},
}

Setup:

Dependencies:

Tensorflow, numpy, sklearn, munkres, scipy.
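One possible way to install these (PyPI package names are an assumption, and since this code predates TensorFlow 2, a 1.x release is likely required):

pip install "tensorflow<2" numpy scipy scikit-learn munkres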

Data preprocessing:

Resize the input images of all modalities to 32 × 32 and rescale them to have pixel values between 0 and 255. This keeps the hyperparameter selections suggested in "Deep Subspace Clustering Networks" valid.

Save the data in a .mat file that includes the vectorized modalities as separate matrices named modality_0, modality_1, ...; the labels in a vector named Labels; and the number of modalities in the variable num_modalities.

A sample preprocessed dataset is available in: Data/EYB_fc.mat
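The sketch below shows one possible way to produce such a .mat file; it is not part of this repository. It assumes Pillow for resizing (not in the dependency list above), hypothetical variables rgb_images, depth_images, and labels, and samples stored as rows; the exact matrix orientation and grayscale/color handling should be checked against Data/EYB_fc.mat.

import numpy as np
import scipy.io as sio
from PIL import Image  # assumption: Pillow is used for resizing

def to_vectors(images):
    # Resize each image to 32x32, rescale to the 0..255 range, and vectorize.
    rows = []
    for img in images:
        arr = np.asarray(Image.fromarray(img).resize((32, 32)), dtype=np.float64)
        arr = 255.0 * (arr - arr.min()) / max(arr.max() - arr.min(), 1e-12)
        rows.append(arr.reshape(-1))  # 1024 values for a grayscale image
    return np.stack(rows)             # shape: (num_samples, 1024)

def save_dataset(path, modalities, labels):
    data = {'modality_%d' % i: to_vectors(m) for i, m in enumerate(modalities)}
    data['Labels'] = np.asarray(labels).reshape(-1, 1)  # check the expected shape against EYB_fc.mat
    data['num_modalities'] = len(modalities)
    sio.savemat(path, data)

# Hypothetical usage:
# save_dataset('Data/my_dataset.mat', [rgb_images, depth_images], labels)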

Running the code

Affinity fusion:

Run affinity_fusion.py to perform multimodal subspace clustering. For a demo, a pretrained model trained on EYB_fc is available at models/EYBfc_af.ckpt

Run the demo as:

python affinity_fusion.py --mat EYB_fc --model EYBfc_af

Pretraining:

Run pretrain_affinity_fusion.py to pretrain your networks.

For example:

python pretrain_affinity_fusion.py --mat EYB_fc --model mymodel --epoch 100000