thunlp / TransNet

License: MIT

TransNet

Source code and datasets of IJCAI2017 paper "TransNet: Translation-Based Network Representation Learning for Social Relation Extraction".

This work was selected as an example for the "MLTrain" training event at UAI 2017 (the Conference on Uncertainty in Artificial Intelligence). We release an IPython notebook demonstrating the TransNet algorithm; for details, please refer to the "ipynb" directory.

Datasets

The "data" folder contains three datasets of different scales, extracted from AMiner. Please unzip the "data.zip" file before using them.

  • aminer_s: 187,939 vertices, 1,619,278 edges and 100 labels.
  • aminer_m: 268,037 vertices, 2,747,386 edges and 500 labels.
  • aminer_l: 945,589 vertices, 5,056,050 edges and 500 labels.

The mapping from authors to identifiers in aminer_s/m/l has been lost. As a substitute, we offer a raw dataset extracted from AMiner that contains 1,712,433 authors and 5,000 edge labels. Please unzip the "aminer_raw.zip" file before using it.
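After unzipping, it can be useful to sanity-check a dataset against the vertex/edge/label counts listed above. The sketch below assumes a whitespace-separated "head tail label..." edge-list file; the file name and format are illustrative assumptions, not the repository's documented layout.

```python
def dataset_stats(path):
    """Count distinct vertices, edges, and edge labels in an edge-list file.

    Assumes each line is whitespace-separated: head, tail, then zero or
    more labels. This format is an assumption for illustration only.
    """
    vertices, labels, edges = set(), set(), 0
    with open(path) as f:
        for line in f:
            head, tail, *edge_labels = line.split()
            vertices.update((head, tail))
            labels.update(edge_labels)
            edges += 1
    return len(vertices), edges, len(labels)
```

Comparing the returned triple against the counts above (e.g. 187,939 vertices for aminer_s) catches a truncated or partially unzipped download early.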

Run

Run the following command to train TransNet:

python train.py name_of_dataset alpha beta warm_up_to_reload transnet_to_reload

Here is an example:

python train.py aminer_s/ 0.5 20 -1 -1

Explanations of the parameters:

  • name_of_dataset: the dataset directory ("aminer_s/", "aminer_m/" or "aminer_l/")
  • alpha: the weight of the autoencoder loss
  • beta: the weight on non-zero elements in the autoencoder reconstruction loss
  • warm_up_to_reload: if >= 0, reload the saved autoencoder parameters and skip the warm-up stage; -1 (as in the example) warms up from scratch
  • transnet_to_reload: if >= 0, reload the saved TransNet parameters; -1 trains from scratch
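To make the roles of alpha and beta concrete, here is a minimal NumPy sketch of the two loss terms they weight, following the paper's translation idea (head embedding + edge embedding ≈ tail embedding) and an autoencoder whose reconstruction loss up-weights non-zero entries of the edge-label vector. All function names and shapes are illustrative assumptions, not the API of train.py.

```python
import numpy as np

def translation_loss(h, r, t):
    # ||h + r - t||^2: the head embedding translated by the edge
    # representation should land near the tail embedding.
    return np.sum((h + r - t) ** 2)

def autoencoder_loss(x, x_hat, beta):
    # Weighted reconstruction of the binary edge-label vector x:
    # non-zero entries are penalized beta times more heavily, so the
    # autoencoder does not collapse to reconstructing all zeros.
    weights = np.where(x != 0, beta, 1.0)
    return np.sum(weights * (x_hat - x) ** 2)

def total_loss(h, r, t, x, x_hat, alpha, beta):
    # alpha trades off the autoencoder term against the translation
    # term, mirroring the alpha/beta command-line arguments above.
    return translation_loss(h, r, t) + alpha * autoencoder_loss(x, x_hat, beta)
```

This is only a sketch of the objective's structure under stated assumptions; the actual model (deep encoder/decoder, margin-based ranking, regularization) is defined in train.py.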

Dependencies

  • TensorFlow == 0.12
  • SciPy == 0.18.1
  • NumPy == 1.11.2

Cite

If you use the code, please cite this paper:

Cunchao Tu, Zhengyan Zhang, Zhiyuan Liu, Maosong Sun. TransNet: Translation-Based Network Representation Learning for Social Relation Extraction. The 26th International Joint Conference on Artificial Intelligence (IJCAI 2017).

For more related works on network representation learning, please refer to my homepage.
