
smousavi05 / Unsupervised_Deep_Learning

Licence: other
Unsupervised (Self-Supervised) Clustering of Seismic Signals Using Deep Convolutional Autoencoders

Programming Languages

Jupyter Notebook

Projects that are alternatives to or similar to Unsupervised_Deep_Learning

Tensorflow Tutorials
Source code for practicing TensorFlow step by step, from the basics to applications
Stars: ✭ 2,096 (+5722.22%)
Mutual labels:  autoencoder
DESOM
🌐 Deep Embedded Self-Organizing Map: Joint Representation Learning and Self-Organization
Stars: ✭ 76 (+111.11%)
Mutual labels:  autoencoder
eForest
This is the official implementation for the paper 'AutoEncoder by Forest'
Stars: ✭ 71 (+97.22%)
Mutual labels:  autoencoder
Timeseries Clustering Vae
Variational Recurrent Autoencoder for timeseries clustering in pytorch
Stars: ✭ 190 (+427.78%)
Mutual labels:  autoencoder
Link Prediction
Representation learning for link prediction within social networks
Stars: ✭ 245 (+580.56%)
Mutual labels:  autoencoder
pytorch integrated cell
Integrated Cell project implemented in pytorch
Stars: ✭ 40 (+11.11%)
Mutual labels:  autoencoder
Deep image prior
Image reconstruction done with untrained neural networks.
Stars: ✭ 168 (+366.67%)
Mutual labels:  autoencoder
dltf
Hands-on in-person workshop for Deep Learning with TensorFlow
Stars: ✭ 14 (-61.11%)
Mutual labels:  autoencoder
Awesome Tensorlayer
A curated list of dedicated resources and applications
Stars: ✭ 248 (+588.89%)
Mutual labels:  autoencoder
EZyRB
Easy Reduced Basis method
Stars: ✭ 49 (+36.11%)
Mutual labels:  autoencoder
Deepinfomaxpytorch
Learning deep representations by mutual information estimation and maximization
Stars: ✭ 212 (+488.89%)
Mutual labels:  autoencoder
Pytorch Vae
A Variational Autoencoder (VAE) implemented in PyTorch
Stars: ✭ 237 (+558.33%)
Mutual labels:  autoencoder
seq2seq-autoencoder
Theano implementation of Sequence-to-Sequence Autoencoder
Stars: ✭ 12 (-66.67%)
Mutual labels:  autoencoder
Deep white balance
Reference code for the paper: Deep White-Balance Editing, CVPR 2020 (Oral). Our method is a deep learning multi-task framework for white-balance editing.
Stars: ✭ 184 (+411.11%)
Mutual labels:  autoencoder
tensorflow-mnist-AAE
Tensorflow implementation of adversarial auto-encoder for MNIST
Stars: ✭ 86 (+138.89%)
Mutual labels:  autoencoder
Tensorflow 101
Chinese-language TensorFlow tutorial with Jupyter notebooks
Stars: ✭ 172 (+377.78%)
Mutual labels:  autoencoder
Unsupervised-Classification-with-Autoencoder
Using Autoencoders for classification as unsupervised machine learning algorithms with Deep Learning.
Stars: ✭ 43 (+19.44%)
Mutual labels:  autoencoder
Face-Landmarking
Real time face landmarking using decision trees and NN autoencoders
Stars: ✭ 73 (+102.78%)
Mutual labels:  autoencoder
Image-Retrieval
Image retrieval program made in Tensorflow supporting VGG16, VGG19, InceptionV3 and InceptionV4 pretrained networks and own trained Convolutional autoencoder.
Stars: ✭ 56 (+55.56%)
Mutual labels:  autoencoder
adversarial-autoencoder
Tensorflow 2.0 implementation of Adversarial Autoencoders
Stars: ✭ 17 (-52.78%)
Mutual labels:  autoencoder

Demo code (see the Jupyter notebook) for:


Unsupervised (Self-Supervised) Discrimination of Seismic Signals Using Deep Convolutional Autoencoders


You can get the paper from here:

Link 1:

https://ieeexplore.ieee.org/document/8704258

Link 2:

https://www.researchgate.net/publication/332814555_Unsupervised_Clustering_of_Seismic_Signals_Using_Deep_Convolutional_Autoencoders


You can get the training dataset from here:

https://drive.google.com/file/d/16itT_IZpM8w8KyFN8eL8iEfYX66Hk6Xb/view?usp=sharing


Reference:

Mousavi, S. M., W. Zhu, W. Ellsworth, and G. Beroza (2019). Unsupervised Clustering of Seismic Signals Using Deep Convolutional Autoencoders, IEEE Geoscience and Remote Sensing Letters, 1-5, doi:10.1109/LGRS.2019.2909218.

BibTeX:

@article{mousavi2019unsupervised,
  title={Unsupervised Clustering of Seismic Signals Using Deep Convolutional Autoencoders},
  author={Mousavi, S Mostafa and Zhu, Weiqiang and Ellsworth, William and Beroza, Gregory},
  journal={IEEE Geoscience and Remote Sensing Letters},
  year={2019},
  publisher={IEEE}
}

Abstract:

In this paper, we use deep neural networks for unsupervised clustering of seismic data. We perform the clustering in a feature space that is simultaneously optimized with the clustering assignment, resulting in learned feature representations that are effective for a specific clustering task. To demonstrate the application of this method in seismic signal processing, we design two different neural networks consisting primarily of fully convolutional and pooling layers and apply them to: (1) discriminate waveforms recorded at different hypocentral distances and (2) discriminate waveforms with different first-motion polarities. Our method yields precisions comparable to those recently achieved by supervised methods, but without the need for labeled data, manual feature engineering, or large training sets. The applications presented here can be used in standard single-site earthquake early warning systems to reduce false alerts at the individual-station level. The presented technique is general, however, and is suitable for a variety of applications, including quality control of the labeling and classification results of other supervised methods.
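The joint optimization described above (learning the feature space together with the clustering assignment) is commonly implemented with a deep-embedded-clustering style objective: the flattened latent features are softly assigned to learnable centroids and the assignments are sharpened via a KL-divergence target. The following is a minimal sketch of that idea in TensorFlow/Keras; the Student's t kernel, the `ClusteringLayer` name, and all hyperparameters are illustrative assumptions, not necessarily the exact formulation used in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers


class ClusteringLayer(layers.Layer):
    """Soft assignment of latent features to learnable cluster centroids
    using a Student's t kernel (deep-embedded-clustering style sketch)."""

    def __init__(self, n_clusters, **kwargs):
        super().__init__(**kwargs)
        self.n_clusters = n_clusters

    def build(self, input_shape):
        latent_dim = int(input_shape[-1])
        self.centroids = self.add_weight(
            shape=(self.n_clusters, latent_dim),
            initializer="glorot_uniform", name="centroids")

    def call(self, z):
        # q[i, j]: similarity between embedding z_i and centroid mu_j
        dist = tf.reduce_sum(tf.square(tf.expand_dims(z, 1) - self.centroids), axis=2)
        q = 1.0 / (1.0 + dist)
        return q / tf.reduce_sum(q, axis=1, keepdims=True)


def target_distribution(q):
    """Sharpened target distribution; training minimizes KL(P || Q)."""
    weight = q ** 2 / q.sum(axis=0)
    return (weight.T / weight.sum(axis=1)).T
```

In this kind of workflow, the autoencoder is typically pretrained on reconstruction alone, after which the clustering objective is optimized jointly with the reconstruction loss.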


Figure: Sample data. (a) and (b) are two examples of seismograms with different first-motion polarities; (c) and (d) are examples of local and teleseismic waveforms, respectively, while (e) and (f) are the associated short-time Fourier transforms.
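Time-frequency panels like (e) and (f) can be reproduced with a standard short-time Fourier transform, for example via SciPy; the sampling rate and window length below are placeholder values, not the settings used in the paper.

```python
import numpy as np
from scipy import signal


def waveform_to_stft(trace, fs=100.0, nperseg=80):
    """Log-magnitude STFT of a 1-D seismogram.
    fs and nperseg are illustrative values only."""
    f, t, Zxx = signal.stft(trace, fs=fs, nperseg=nperseg)
    return np.log1p(np.abs(Zxx))  # shape: (freq bins, time frames)
```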

Figure: The architecture of the fully convolutional autoencoder used in our study.
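As a rough illustration of such a network, below is a minimal Keras sketch of a fully convolutional autoencoder that reconstructs spectrogram images; the input size, filter counts, and latent depth are assumptions for illustration and do not reproduce the paper's exact architecture.

```python
from tensorflow.keras import layers, Model


def build_conv_autoencoder(input_shape=(64, 64, 1)):
    """Encoder of conv + pooling blocks with a mirrored upsampling decoder.
    All layer sizes are illustrative, not the paper's configuration."""
    inputs = layers.Input(shape=input_shape)

    # Encoder: two conv/pool blocks, then a small latent feature map
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D(2)(x)
    latent = layers.Conv2D(8, 3, padding="same", activation="relu", name="latent")(x)

    # Decoder: mirror of the encoder
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(latent)
    x = layers.UpSampling2D(2)(x)
    x = layers.Conv2D(16, 3, padding="same", activation="relu")(x)
    x = layers.UpSampling2D(2)(x)
    outputs = layers.Conv2D(1, 3, padding="same", activation="linear")(x)

    autoencoder = Model(inputs, outputs, name="conv_autoencoder")
    encoder = Model(inputs, latent, name="encoder")
    autoencoder.compile(optimizer="adam", loss="mse")
    return autoencoder, encoder
```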

Figure: Clustering results.

Figure: Visualization of the embedded features.
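A plot like this is typically produced by projecting the encoder's latent features to two dimensions, for example with t-SNE. The snippet below is a generic sketch; `encoder`, `spectrograms`, and `cluster_labels` are hypothetical objects carried over from the sketches above, not variables defined in this repository.

```python
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# Latent features from the (hypothetical) encoder above, flattened
# to one feature vector per waveform.
z = encoder.predict(spectrograms)      # e.g. shape (n_samples, 16, 16, 8)
z = z.reshape(len(z), -1)

# Project to 2-D and color points by their cluster assignment.
z_2d = TSNE(n_components=2, perplexity=30).fit_transform(z)
plt.scatter(z_2d[:, 0], z_2d[:, 1], s=5, c=cluster_labels, cmap="viridis")
plt.title("t-SNE of embedded features")
plt.show()
```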

