Deep Graph Infomax

Deep Graph Infomax (DGI) is an unsupervised algorithm for finding representations of graphs that can be used in downstream tasks like node classification.

This is a TensorFlow implementation of DGI, based on the Graph Convolutional Network implementation by Thomas Kipf.
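For intuition, here is a minimal NumPy sketch of the DGI objective (illustrative only, not this repository's code): an encoder produces node embeddings, a summary vector pools them, and a bilinear discriminator is trained to score real (embedding, summary) pairs higher than pairs coming from a corrupted graph.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dgi_loss(H, H_corrupt, W):
    """Binary cross-entropy form of the DGI objective (sketch).

    H         : (N, d) node embeddings from the real graph
    H_corrupt : (N, d) embeddings from a corrupted graph (e.g. shuffled features)
    W         : (d, d) bilinear discriminator weights
    """
    # Graph-level summary: sigmoid of the mean node embedding.
    s = sigmoid(H.mean(axis=0))
    # Discriminator scores: positives should be high, negatives low.
    pos = sigmoid(H @ W @ s)
    neg = sigmoid(H_corrupt @ W @ s)
    eps = 1e-8  # numerical safety for the logs
    return -(np.log(pos + eps).mean() + np.log(1.0 - neg + eps).mean())

# Toy usage with random data: 5 nodes, 4 embedding dimensions.
rng = np.random.default_rng(0)
H, Hc = rng.normal(size=(5, 4)), rng.normal(size=(5, 4))
W = rng.normal(size=(4, 4))
print(dgi_loss(H, Hc, W))

In the actual model, H comes from a graph convolutional encoder and the encoder and discriminator weights are learned jointly by minimizing this loss.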

Installation

python setup.py install

Requirements

  • tensorflow (>0.12)
  • networkx

Run

First train a DGI model:

python train.py --model dgi

Once the model is trained, the graph embeddings are saved as a pickle file in the runs folder. Take note of its path (e.g. runs/2018-11-04-164053/embeddings.p) and use it to train a logistic regression model on the node classification task:

python train.py --model logreg --embeddings_path runs/2018-11-04-164053/embeddings.p
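If you want to inspect the saved embeddings yourself, a sketch like the following should work, assuming the pickle file holds an array-like object with one row per node (the exact contents depend on what train.py stores):

import pickle

# Example path from above; substitute the run directory from your own training run.
with open("runs/2018-11-04-164053/embeddings.p", "rb") as f:
    embeddings = pickle.load(f)

# Assuming an (N, d) array: N nodes, d embedding dimensions.
print(type(embeddings))
print(getattr(embeddings, "shape", None))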

Data

In order to use your own data, you have to provide

  • an N by N adjacency matrix (N is the number of nodes),
  • an N by D feature matrix (D is the number of features per node), and
  • an N by E binary label matrix (E is the number of classes).

Have a look at the load_data() function in utils.py for an example.
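As a rough illustration of the expected shapes (the exact types used by load_data() may differ; a scipy sparse matrix is a common choice for the adjacency), a toy graph with N = 3 nodes, D = 2 features and E = 2 classes could look like:

import numpy as np
import scipy.sparse as sp

N, D, E = 3, 2, 2  # nodes, features per node, classes

# N by N adjacency matrix (an undirected 3-node path graph).
adj = sp.csr_matrix(np.array([[0, 1, 0],
                              [1, 0, 1],
                              [0, 1, 0]]))

# N by D feature matrix: one feature row per node.
features = np.array([[0.1, 0.9],
                     [0.5, 0.5],
                     [0.9, 0.1]])

# N by E binary label matrix: one-hot class membership per node.
labels = np.array([[1, 0],
                   [0, 1],
                   [1, 0]])

assert adj.shape == (N, N) and features.shape == (N, D) and labels.shape == (N, E)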

In this example, we load citation network data (Cora, Citeseer or Pubmed). The original datasets can be found here: http://linqs.cs.umd.edu/projects/projects/lbc/. In our version (see data folder) we use dataset splits provided by https://github.com/kimiyoung/planetoid (Zhilin Yang, William W. Cohen, Ruslan Salakhutdinov, Revisiting Semi-Supervised Learning with Graph Embeddings, ICML 2016).

You can specify a dataset as follows:

python train.py --dataset citeseer

(or by editing train.py)

Models

You can choose between the following models (passed via the --model flag, as in the examples above):

  • dgi: trains the Deep Graph Infomax encoder and saves the resulting node embeddings to the runs folder.
  • logreg: trains a logistic regression classifier on previously saved embeddings for node classification (requires --embeddings_path).
