
rymc / N2d

Licence: GPL-3.0
A deep clustering algorithm. Code to reproduce results for the paper N2D: (Not Too) Deep Clustering via Clustering the Local Manifold of an Autoencoded Embedding.

Programming Languages

Python

Projects that are alternatives to or similar to N2D

Bottleneck
Job scheduler and rate limiter, supports Clustering
Stars: ✭ 1,113 (+1164.77%)
Mutual labels:  clustering
Hazelcast Cpp Client
Hazelcast IMDG C++ Client
Stars: ✭ 67 (-23.86%)
Mutual labels:  clustering
Icellr
Single (i) Cell R package (iCellR) is an interactive R package for working with high-throughput single-cell sequencing technologies (i.e., scRNA-seq, scVDJ-seq, ST and CITE-seq).
Stars: ✭ 80 (-9.09%)
Mutual labels:  clustering
Multilingual Latent Dirichlet Allocation Lda
A Multilingual Latent Dirichlet Allocation (LDA) Pipeline with Stop Words Removal, n-gram features, and Inverse Stemming, in Python.
Stars: ✭ 64 (-27.27%)
Mutual labels:  clustering
Text Analytics With Python
Learn how to process, classify, cluster, summarize, understand syntax, semantics and sentiment of text data with the power of Python! This repository contains code and datasets used in my book, "Text Analytics with Python" published by Apress/Springer.
Stars: ✭ 1,132 (+1186.36%)
Mutual labels:  clustering
Self Supervised Learning Overview
📜 Self-Supervised Learning from Images: Up-to-date reading list.
Stars: ✭ 73 (-17.05%)
Mutual labels:  clustering
Spyking Circus
Fast and scalable spike sorting in python
Stars: ✭ 55 (-37.5%)
Mutual labels:  clustering
Ml
A high-level machine learning and deep learning library for the PHP language.
Stars: ✭ 1,270 (+1343.18%)
Mutual labels:  clustering
Contrastive Clustering
Code for the paper "Contrastive Clustering" (AAAI 2021)
Stars: ✭ 67 (-23.86%)
Mutual labels:  clustering
Ml code
A repository for recording machine learning code
Stars: ✭ 75 (-14.77%)
Mutual labels:  clustering
Ekka
Autocluster and Autoheal for EMQ X Broker
Stars: ✭ 63 (-28.41%)
Mutual labels:  clustering
Cluster
Easy Map Annotation Clustering 📍
Stars: ✭ 1,132 (+1186.36%)
Mutual labels:  clustering
Tgcontest
Telegram Data Clustering contest solution by Mindful Squirrel
Stars: ✭ 74 (-15.91%)
Mutual labels:  clustering
Mnesiac
Mnesia autoclustering made easy!
Stars: ✭ 62 (-29.55%)
Mutual labels:  clustering
Supercluster
A very fast geospatial point clustering library for browsers and Node.
Stars: ✭ 1,246 (+1315.91%)
Mutual labels:  clustering
Dnc
Discriminative Neural Clustering for Speaker Diarisation
Stars: ✭ 60 (-31.82%)
Mutual labels:  clustering
Pt Sdae
PyTorch implementation of SDAE (Stacked Denoising AutoEncoder)
Stars: ✭ 72 (-18.18%)
Mutual labels:  clustering
Libcluster
Automatic cluster formation/healing for Elixir applications
Stars: ✭ 1,280 (+1354.55%)
Mutual labels:  clustering
Stringlifier
Stringlifier is an open-source ML library for detecting random strings in raw text. It can be used for sanitising logs, detecting accidentally exposed credentials, and as a pre-processing step in unsupervised ML-based analysis of application text data.
Stars: ✭ 85 (-3.41%)
Mutual labels:  clustering
Lithosphere Docker
The Docker setup for the lithosphere project
Stars: ✭ 76 (-13.64%)
Mutual labels:  clustering

N2D: (Not Too) Deep Clustering via Clustering the Local Manifold of an Autoencoded Embedding.

Abstract

Deep clustering has increasingly been demonstrating superiority over conventional shallow clustering algorithms. Deep clustering algorithms usually combine representation learning with deep neural networks to achieve this performance, typically optimizing a clustering and non-clustering loss. In such cases, an autoencoder is typically connected with a clustering network, and the final clustering is jointly learned by both the autoencoder and clustering network. Instead, we propose to learn an autoencoded embedding and then search this further for the underlying manifold. For simplicity, we then cluster this with a shallow clustering algorithm, rather than a deeper network. We study a number of local and global manifold learning methods on both the raw data and autoencoded embedding, concluding that UMAP in our framework is able to find the best clusterable manifold of the embedding. This suggests that local manifold learning on an autoencoded embedding is effective for discovering higher quality clusters. We quantitatively show across a range of image and time-series datasets that our method has competitive performance against the latest deep clustering algorithms, including out-performing current state-of-the-art on several. We postulate that these results show a promising research direction for deep clustering.
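
For orientation, the pipeline described above (learn an autoencoded embedding, search it with a manifold learner such as UMAP, then apply a shallow clustering algorithm) can be sketched roughly as follows. This is a simplified illustration rather than the repository's code: the 500-500-2000-d dense architecture, the UMAP settings and the choice of a Gaussian mixture model as the shallow clusterer are assumptions made here for brevity, and the data is a random stand-in.

# Rough sketch of the N2D idea (illustrative only; see n2d.py for the actual implementation).
import numpy as np
import umap
from sklearn.mixture import GaussianMixture
from tensorflow.keras import layers, models

x = np.random.rand(1000, 784).astype('float32')   # stand-in for a flattened dataset
n_clusters = 10

# 1. Autoencoder: learn a low-dimensional embedding of x.
inp = layers.Input(shape=(x.shape[1],))
z = layers.Dense(500, activation='relu')(inp)
z = layers.Dense(500, activation='relu')(z)
z = layers.Dense(2000, activation='relu')(z)
z = layers.Dense(n_clusters)(z)                    # embedding layer
out = layers.Dense(2000, activation='relu')(z)
out = layers.Dense(500, activation='relu')(out)
out = layers.Dense(500, activation='relu')(out)
out = layers.Dense(x.shape[1])(out)
autoencoder = models.Model(inp, out)
encoder = models.Model(inp, z)
autoencoder.compile(optimizer='adam', loss='mse')
autoencoder.fit(x, x, batch_size=256, epochs=10)   # epoch count is illustrative only

# 2. Manifold learning: run UMAP on the autoencoded embedding (cf. the --umap_dim argument).
embedding = encoder.predict(x)
manifold = umap.UMAP(n_components=n_clusters, n_neighbors=20, min_dist=0.0).fit_transform(embedding)

# 3. Shallow clustering: cluster the learned manifold with a simple algorithm (here a GMM).
pred = GaussianMixture(n_components=n_clusters).fit_predict(manifold)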

Results

N2D results

Visualizations

MNIST

HAR (Human Activity Recognition)

Note: based on clustering metrics, the clusters 'look' better in the higher-dimensional space than they do here in 2D; the intended use of N2D is clustering, not visualization. Only the first 5000 points are shown.
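
Here, 'clustering metrics' refers to standard measures such as normalized mutual information (NMI) and adjusted Rand index (ARI) computed against ground-truth labels. A minimal scikit-learn sketch with hypothetical label arrays (not part of this repository):

# Sketch: scoring a cluster assignment against ground-truth labels (hypothetical data).
import numpy as np
from sklearn.metrics import normalized_mutual_info_score, adjusted_rand_score

y_true = np.array([0, 0, 1, 1, 2, 2])   # ground-truth labels
y_pred = np.array([1, 1, 0, 0, 2, 2])   # predicted cluster assignments

print("NMI:", normalized_mutual_info_score(y_true, y_pred))
print("ARI:", adjusted_rand_score(y_true, y_pred))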

Paper

https://arxiv.org/abs/1908.05968

Install

Install Anaconda

wget https://repo.anaconda.com/archive/Anaconda3-2019.07-Linux-x86_64.sh
bash Anaconda3-2019.07-Linux-x86_64.sh
source anaconda3/bin/activate

Create environment

conda create -n n2d python=3.7  
conda activate n2d

Clone repo

git clone https://github.com/rymc/n2d.git

Install packages

pip install -r requirements.txt

Reproduce results

bash run.sh

For training a new network

If you remove the --ae_weights argument when running n2d, it will train a new network rather than load the pretrained weights.

To add a new dataset, add a load function to datasets.py (the existing ones show the expected format) and a function in n2d.py that calls your data loading function, as sketched below.
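
As a rough illustration, a new loader in datasets.py could look like the following; the function name, file paths, and preprocessing are hypothetical and should be adapted to match the existing loaders.

# Hypothetical loader for datasets.py (adapt to the existing loaders in the repo).
import numpy as np

def load_mydata(data_path='data/mydata'):
    # Load features and labels, flatten the features, and scale them to [0, 1].
    x = np.load(data_path + '/x.npy').astype('float32')
    y = np.load(data_path + '/y.npy')
    x = x.reshape((x.shape[0], -1)) / 255.0
    return x, y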

I used the following packages to train the networks on a GPU.

conda install tensorflow-gpu=1.13.1 cudatoolkit=9.0

Visualization

If you would like to produce some plots for visualization purposes, add the argument '--visualize'. I also recommend setting the argument '--umap_dim' to 2.
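
If you prefer to plot a saved 2-D embedding yourself, a small matplotlib sketch along these lines works; the arrays below are random stand-ins for the embedding and cluster assignments produced by n2d.

# Sketch: scatter-plot a 2-D embedding coloured by cluster assignment (assumes --umap_dim 2).
import numpy as np
import matplotlib.pyplot as plt

manifold = np.random.rand(5000, 2)            # stand-in for the 2-D UMAP embedding
pred = np.random.randint(0, 10, 5000)         # stand-in for predicted cluster labels

plt.scatter(manifold[:, 0], manifold[:, 1], c=pred, s=2, cmap='tab10')
plt.title('Clusters in the 2-D UMAP space')
plt.show()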

Citation

@inproceedings{McConville2020,
  author = {Ryan McConville and Raul Santos-Rodriguez and Robert J Piechocki and Ian Craddock},
  title = {N2D: (Not Too) Deep Clustering via Clustering the Local Manifold of an Autoencoded Embedding},
  booktitle = {25th International Conference on Pattern Recognition, {ICPR} 2020},
  publisher = {{IEEE} Computer Society},
  year = {2020},
}