
bdy9527 / Sdcn

Licence: apache-2.0
Structural Deep Clustering Network

Programming Languages

python

Projects that are alternatives to or similar to Sdcn

Recoder
Large-scale training of factorization models for Collaborative Filtering with PyTorch
Stars: ✭ 46 (-55.34%)
Mutual labels:  autoencoder
Pt Sdae
PyTorch implementation of SDAE (Stacked Denoising AutoEncoder)
Stars: ✭ 72 (-30.1%)
Mutual labels:  autoencoder
Deepdepthdenoising
This repo includes the source code of the fully convolutional depth denoising model presented in https://arxiv.org/pdf/1909.01193.pdf (ICCV19)
Stars: ✭ 96 (-6.8%)
Mutual labels:  autoencoder
Frame Level Anomalies In Videos
Frame-level anomaly detection and localization in videos using autoencoders
Stars: ✭ 50 (-51.46%)
Mutual labels:  autoencoder
Codeslam
Implementation of CodeSLAM — Learning a Compact, Optimisable Representation for Dense Visual SLAM paper (https://arxiv.org/pdf/1804.00874.pdf)
Stars: ✭ 64 (-37.86%)
Mutual labels:  autoencoder
Image similarity
PyTorch Blog Post On Image Similarity Search
Stars: ✭ 80 (-22.33%)
Mutual labels:  autoencoder
Pytorch Mnist Vae
Stars: ✭ 32 (-68.93%)
Mutual labels:  autoencoder
Segmentation
TensorFlow implementation: U-Net and FCN with global convolution
Stars: ✭ 101 (-1.94%)
Mutual labels:  autoencoder
Molencoder
Molecular AutoEncoder in PyTorch
Stars: ✭ 69 (-33.01%)
Mutual labels:  autoencoder
Pytorch sac ae
PyTorch implementation of Soft Actor-Critic + Autoencoder (SAC+AE)
Stars: ✭ 94 (-8.74%)
Mutual labels:  autoencoder
Basic nns in frameworks
Several basic neural networks (MLP, autoencoder, CNN, recurrent NN, recursive NN) implemented in several NN frameworks (TensorFlow, PyTorch, Theano, Keras)
Stars: ✭ 58 (-43.69%)
Mutual labels:  autoencoder
Repo 2017
Python code for Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+990.29%)
Mutual labels:  autoencoder
Sdne Keras
Keras implementation of Structural Deep Network Embedding, KDD 2016
Stars: ✭ 83 (-19.42%)
Mutual labels:  autoencoder
Lipreading
Stars: ✭ 49 (-52.43%)
Mutual labels:  autoencoder
Zerospeech Tts Without T
A Pytorch implementation for the ZeroSpeech 2019 challenge.
Stars: ✭ 100 (-2.91%)
Mutual labels:  autoencoder
Rnn Vae
Variational Autoencoder with Recurrent Neural Network based on Google DeepMind's "DRAW: A Recurrent Neural Network For Image Generation"
Stars: ✭ 39 (-62.14%)
Mutual labels:  autoencoder
Aialpha
Use unsupervised and supervised learning to predict stocks
Stars: ✭ 1,191 (+1056.31%)
Mutual labels:  autoencoder
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (-0.97%)
Mutual labels:  autoencoder
Deep Autoencoders For Collaborative Filtering
Using Deep Autoencoders to predict movie ratings.
Stars: ✭ 101 (-1.94%)
Mutual labels:  autoencoder
Niftynet
[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
Stars: ✭ 1,276 (+1138.83%)
Mutual labels:  autoencoder

SDCN

Structural Deep Clustering Network

Paper

https://arxiv.org/abs/2002.01633

https://github.com/461054993/SDCN/blob/master/SDCN.pdf

Dataset

Due to file size limits, the complete data can be found on Baidu Netdisk:

graph: link: https://pan.baidu.com/s/1MEWr1KyrtBQndVNy8_y2Lw password: opc1

data: link: https://pan.baidu.com/s/1kqoWlElbWazJyrTdv1sHNg password: 1gd4

Code

python sdcn.py --name [usps|hhar|reut|acm|dblp|cite]

Q&A

  • Q: Why not use distribution Q to supervise distribution P directly?
    A: The reasons are twofold. 1) Previous methods, e.g., DeepCluster, have used the clustering assignments as pseudo-labels to re-train the encoder in a supervised manner. However, in our experiments we found that the gradient of the cross-entropy loss is too aggressive, which disturbs the embedding space. 2) Although we could replace the cross-entropy loss with KL divergence, one problem would remain: there would be no clustering information. The original motivation of our research on deep clustering is to integrate the clustering objective into the powerful representation ability of deep learning. Therefore, we introduce the distribution P to increase the cohesion of the clusters; the details can be found in DEC. A sketch of the relationship between P and Q is given below.
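
For concreteness, here is a minimal sketch of how the target distribution P can be derived from the soft assignment Q and used in a KL-divergence loss, following DEC; the function name and the dummy tensors are illustrative, not taken verbatim from sdcn.py.

import torch
import torch.nn.functional as F

def target_distribution(q):
    # Square each soft assignment and normalize by cluster frequency, so that
    # confident assignments are emphasized (the target distribution from DEC).
    weight = q ** 2 / q.sum(dim=0)
    return (weight.t() / weight.sum(dim=1)).t()

q = torch.softmax(torch.randn(8, 3), dim=1)   # dummy soft assignments (n_samples, n_clusters)
p = target_distribution(q).detach()           # P is treated as a fixed target
kl_loss = F.kl_div(q.log(), p, reduction='batchmean')   # KL(P || Q) clustering loss

Because P is computed from Q itself but sharpened toward high-confidence clusters, minimizing KL(P || Q) pulls the embedding toward more cohesive clusters instead of merely reproducing the current assignments.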

  • Q: How can SDCN be applied to other datasets?
    A: In general, applying our model to another dataset requires three steps.

    1. Construct the KNN graph based on the similarity of the features. Details can be found in calcu_graph.py; a sketch is given after this list.
    2. Pretrain the autoencoder and save the pre-trained model. Details can be found in data/pretrain.py; see the second sketch below.
    3. Update the arguments in sdcn.py for the new dataset and run the code.
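
As a reference for step 1, here is a minimal sketch of KNN-graph construction; calcu_graph.py may differ in its similarity measure and output format, so the cosine similarity and the edge-list file used here are assumptions.

import numpy as np

def construct_knn_graph(features, k=10, out_path='my_dataset_graph.txt'):
    # Cosine similarity between all pairs of samples.
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)                 # exclude self-loops
    neighbors = np.argsort(-sim, axis=1)[:, :k]    # k most similar neighbors per node
    with open(out_path, 'w') as f:
        for i, nbrs in enumerate(neighbors):
            for j in nbrs:
                f.write(f'{i} {j}\n')

construct_knn_graph(np.random.rand(1000, 561), k=5)   # dummy features for illustration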
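
For step 2, a minimal sketch of autoencoder pretraining with a plain reconstruction loss follows. The 500-500-2000-10 layer sizes mirror the encoder described in the paper; data/pretrain.py may differ in optimizer settings, batching, and the exact save path.

import torch
import torch.nn as nn

class AE(nn.Module):
    def __init__(self, n_input, n_z=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_input, 500), nn.ReLU(),
            nn.Linear(500, 500), nn.ReLU(),
            nn.Linear(500, 2000), nn.ReLU(),
            nn.Linear(2000, n_z))
        self.decoder = nn.Sequential(
            nn.Linear(n_z, 2000), nn.ReLU(),
            nn.Linear(2000, 500), nn.ReLU(),
            nn.Linear(500, 500), nn.ReLU(),
            nn.Linear(500, n_input))

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

x = torch.rand(256, 561)                       # dummy features for illustration
model = AE(n_input=561)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(30):
    x_hat, _ = model(x)
    loss = nn.functional.mse_loss(x_hat, x)    # reconstruction loss
    opt.zero_grad()
    loss.backward()
    opt.step()
torch.save(model.state_dict(), 'my_dataset.pkl')   # pre-trained weights, loaded later by sdcn.py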

Reference

If you make use of the SDCN model in your research, please cite the following in your manuscript:

@inproceedings{sdcn2020,
  author    = {Deyu Bo and
               Xiao Wang and
               Chuan Shi and
               Meiqi Zhu and
               Emiao Lu and
               Peng Cui},
  title     = {Structural Deep Clustering Network},
  booktitle = {{WWW}},
  pages     = {1400--1410},
  publisher = {{ACM} / {IW3C2}},
  year      = {2020}
}