
benedekrozemberczki / Graph2vec

License: GPL-3.0
A parallel implementation of "graph2vec: Learning Distributed Representations of Graphs" (MLGWorkshop 2017).

Programming Languages

Python

Projects that are alternatives to or similar to Graph2vec

Tadw
An implementation of "Network Representation Learning with Rich Text Information" (IJCAI '15).
Stars: ✭ 43 (-92.89%)
Mutual labels:  unsupervised-learning, word2vec, matrix-factorization
NMFADMM
A sparsity aware implementation of "Alternating Direction Method of Multipliers for Non-Negative Matrix Factorization with the Beta-Divergence" (ICASSP 2014).
Stars: ✭ 39 (-93.55%)
Mutual labels:  word2vec, matrix-factorization, unsupervised-learning
Gemsec
The TensorFlow reference implementation of 'GEMSEC: Graph Embedding with Self Clustering' (ASONAM 2019).
Stars: ✭ 210 (-65.29%)
Mutual labels:  unsupervised-learning, word2vec, matrix-factorization
RolX
An alternative implementation of Recursive Feature and Role Extraction (KDD11 & KDD12)
Stars: ✭ 52 (-91.4%)
Mutual labels:  word2vec, matrix-factorization, unsupervised-learning
Bagofconcepts
Python implementation of bag-of-concepts
Stars: ✭ 18 (-97.02%)
Mutual labels:  unsupervised-learning, word2vec
Text Summarizer
Python Framework for Extractive Text Summarization
Stars: ✭ 96 (-84.13%)
Mutual labels:  unsupervised-learning, word2vec
Danmf
A sparsity aware implementation of "Deep Autoencoder-like Nonnegative Matrix Factorization for Community Detection" (CIKM 2018).
Stars: ✭ 161 (-73.39%)
Mutual labels:  unsupervised-learning, word2vec
Awesome Community Detection
A curated list of community detection research papers with implementations.
Stars: ✭ 1,874 (+209.75%)
Mutual labels:  unsupervised-learning, matrix-factorization
Graphwavemachine
A scalable implementation of "Learning Structural Node Embeddings via Diffusion Wavelets" (KDD 2018).
Stars: ✭ 151 (-75.04%)
Mutual labels:  unsupervised-learning, word2vec
M-NMF
An implementation of "Community Preserving Network Embedding" (AAAI 2017)
Stars: ✭ 119 (-80.33%)
Mutual labels:  matrix-factorization, unsupervised-learning
altair
Assessing Source Code Semantic Similarity with Unsupervised Learning
Stars: ✭ 42 (-93.06%)
Mutual labels:  word2vec, unsupervised-learning
Attentionwalk
A PyTorch Implementation of "Watch Your Step: Learning Node Embeddings via Graph Attention" (NeurIPS 2018).
Stars: ✭ 266 (-56.03%)
Mutual labels:  word2vec, matrix-factorization
Sns auth
A universal third-party login SDK supporting WeChat, WeChat QR code, QQ, Weibo, and Alipay login, plus Facebook, Line, Twitter, and Google.
Stars: ✭ 520 (-14.05%)
Mutual labels:  line
Helenos
A portable microkernel-based multiserver operating system written from scratch.
Stars: ✭ 553 (-8.6%)
Mutual labels:  kernel
Hidt
Official repository for the paper "High-Resolution Daytime Translation Without Domain Labels" (CVPR2020, Oral)
Stars: ✭ 513 (-15.21%)
Mutual labels:  unsupervised-learning
Flashtext
Extract keywords from sentences or replace keywords in sentences.
Stars: ✭ 5,012 (+728.43%)
Mutual labels:  word2vec
Wavelineview
A performance- and memory-friendly recording wave animation.
Stars: ✭ 597 (-1.32%)
Mutual labels:  line
Build Linux
A short tutorial about building Linux-based operating systems.
Stars: ✭ 4,960 (+719.83%)
Mutual labels:  kernel
Buffalo
TOROS Buffalo: A fast and scalable production-ready open source project for recommender systems
Stars: ✭ 498 (-17.69%)
Mutual labels:  matrix-factorization
Autovc
AutoVC: Zero-Shot Voice Style Transfer with Only Autoencoder Loss
Stars: ✭ 485 (-19.83%)
Mutual labels:  unsupervised-learning

Graph2Vec


Abstract

Recent works on representation learning for graph structured data predominantly focus on learning distributed representations of graph substructures such as nodes and subgraphs. However, many graph analytics tasks such as graph classification and clustering require representing entire graphs as fixed-length feature vectors. While the aforementioned approaches are naturally unequipped to learn such representations, graph kernels remain the most effective way of obtaining them. However, these graph kernels use handcrafted features (e.g., shortest paths, graphlets, etc.) and are hence hampered by problems such as poor generalization. To address this limitation, in this work we propose a neural embedding framework named graph2vec to learn data-driven distributed representations of arbitrary-sized graphs. graph2vec's embeddings are learnt in an unsupervised manner and are task agnostic. Hence, they can be used for any downstream task such as graph classification, clustering, and even seeding supervised representation learning approaches. Our experiments on several benchmark and large real-world datasets show that graph2vec achieves significant improvements in classification and clustering accuracies over substructure representation learning approaches and is competitive with state-of-the-art graph kernels.

The model is now also available in the Karate Club package.
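For quick experimentation, the Karate Club interface can be more convenient than the command line script. Below is a minimal sketch, assuming the `karateclub` package is installed and following its `fit`/`get_embedding` pattern; the toy graphs and the 32-dimensional setting are illustrative choices, not defaults.

import networkx as nx
from karateclub import Graph2Vec

# A toy dataset: small graphs whose nodes are labeled 0..n-1,
# which is the indexing Karate Club expects.
graphs = [nx.cycle_graph(5), nx.path_graph(6), nx.complete_graph(4)]

model = Graph2Vec(dimensions=32)
model.fit(graphs)
embedding = model.get_embedding()  # one 32-dimensional vector per graph
print(embedding.shape)             # expected: (3, 32)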

This repository provides an implementation for graph2vec as it is described in:

graph2vec: Learning Distributed Representations of Graphs. Annamalai Narayanan, Mahinthan Chandramohan, Rajasekar Venkatesan, Lihui Chen, and Yang Liu. MLG 2017, 13th International Workshop on Mining and Learning with Graphs (MLGWorkshop 2017).

The original TensorFlow implementation is available [here].

Requirements

The codebase is implemented in Python 3.5.2 | Anaconda 4.2.0 (64-bit). The package versions used for development are listed below.

jsonschema        2.6.0
tqdm              4.28.1
numpy             1.15.4
pandas            0.23.4
texttable         1.5.0
gensim            3.6.0
networkx          2.4
joblib            0.13.0
logging           0.4.9.6  
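
Assuming pip is available, the pinned versions above can be installed in one step; `logging` ships with the Python standard library and does not need to be installed separately.

$ pip install jsonschema==2.6.0 tqdm==4.28.1 numpy==1.15.4 pandas==0.23.4 texttable==1.5.0 gensim==3.6.0 networkx==2.4 joblib==0.13.0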

Datasets

The code takes an input folder of JSON files. Each file describes a single graph, and the file name is a numeric index. Each JSON file has two keys. The first key, "edges", corresponds to the edge list of the graph. The second key, "features", corresponds to the node features. If the second key is not present, the Weisfeiler-Lehman (WL) feature extractor defaults to using node degrees as features. A sample graph dataset from NCI1 is included in the `dataset/` directory.
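
As an illustration, a minimal input file such as `dataset/0.json` could look like the snippet below; the edge list and the string-valued feature labels are hypothetical, not taken from NCI1.

{"edges": [[0, 1], [1, 2], [2, 3], [3, 0]],
 "features": {"0": "1", "1": "2", "2": "1", "3": "2"}}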

Options

Learning the embedding is handled by the `src/graph2vec.py` script, which provides the following command line arguments.

Input and output options

  --input-path   STR    Input folder.           Default is `dataset/`.
  --output-path  STR    Embeddings path.        Default is `features/nci1.csv`.

Model options

  --dimensions     INT          Number of dimensions.                             Default is 128.
  --workers        INT          Number of workers.                                Default is 4.
  --epochs         INT          Number of training epochs.                        Default is 1.
  --min-count      INT          Minimal feature count to keep.                    Default is 5.
  --wl-iterations  INT          Number of feature extraction recursions.          Default is 2.
  --learning-rate  FLOAT        Initial learning rate.                            Default is 0.025.
  --down-sampling  FLOAT        Down sampling rate for frequent features.         Default is 0.0001.
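
These options closely mirror the hyperparameters of gensim's Doc2Vec, which the script trains over the extracted WL subtree features (gensim is among the requirements above). The following is a minimal sketch of that training step, assuming the gensim 3.6 API; the two TaggedDocument objects and their feature words are hypothetical stand-ins for real extracted features, and min_count is lowered to 1 so the toy corpus survives vocabulary pruning.

from gensim.models.doc2vec import Doc2Vec, TaggedDocument

# One document per graph: the words are its WL subtree features,
# the tag is the graph's numeric index.
documents = [TaggedDocument(words=["feat_a", "feat_b", "feat_c"], tags=["0"]),
             TaggedDocument(words=["feat_b", "feat_c", "feat_d"], tags=["1"])]

model = Doc2Vec(documents,
                vector_size=128,  # --dimensions
                window=0,         # features are unordered, so no context window
                min_count=1,      # --min-count (default is 5; lowered for the toy corpus)
                dm=0,             # PV-DBOW, the skip-gram-style variant
                sample=0.0001,    # --down-sampling
                workers=4,        # --workers
                epochs=1,         # --epochs
                alpha=0.025)      # --learning-rate

print(model.docvecs["0"])  # the 128-dimensional embedding of graph 0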

Examples

The following commands learn an embedding of the graphs and write it to disk. The graph representations are ordered by their ID. The first example creates a graph2vec embedding of the default dataset with the default hyperparameter settings and saves the embedding to the default path.

$ python src/graph2vec.py

Creating an embedding of another dataset and saving the output to a custom location:

$ python src/graph2vec.py --input-path new_data/ --output-path features/nci2.csv

Creating an embedding of the default dataset in 32 dimensions:

$ python src/graph2vec.py --dimensions 32
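
The result is a plain CSV with one embedding per row, ordered by graph ID. Below is a minimal sketch of loading it for a downstream task, assuming the first column identifies the graph and the remaining columns hold the embedding dimensions (the exact column names may differ):

import pandas as pd

embeddings = pd.read_csv("features/nci1.csv")
X = embeddings.iloc[:, 1:].values  # drop the identifier column, keep the vectors
print(X.shape)                     # (number of graphs, number of dimensions)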
