
MrVPlusOne / LambdaNet

Licence: other
Probabilistic Type Inference using Graph Neural Networks

Programming Languages

scala: 5932 projects
javascript: 184084 projects (#8 most used programming language)
typescript: 32286 projects
Mathematica: 289 projects
python: 139335 projects (#7 most used programming language)
java: 68154 projects (#9 most used programming language)

Projects that are alternatives to or similar to LambdaNet

awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+1964.1%)
Mutual labels:  graph-neural-networks
InfoGraph
Official code for "InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization" (ICLR 2020, spotlight)
Stars: ✭ 222 (+469.23%)
Mutual labels:  graph-neural-networks
DiGCL
The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS-2021
Stars: ✭ 27 (-30.77%)
Mutual labels:  graph-neural-networks
GNN-Recommender-Systems
An index of recommendation algorithms that are based on Graph Neural Networks.
Stars: ✭ 505 (+1194.87%)
Mutual labels:  graph-neural-networks
phpstan-dba
PHPStan based SQL static analysis and type inference for the database access layer
Stars: ✭ 163 (+317.95%)
Mutual labels:  type-inference
LPGNN
Locally Private Graph Neural Networks (ACM CCS 2021)
Stars: ✭ 30 (-23.08%)
Mutual labels:  graph-neural-networks
mdgrad
Pytorch differentiable molecular dynamics
Stars: ✭ 127 (+225.64%)
Mutual labels:  graph-neural-networks
Representation Learning on Graphs with Jumping Knowledge Networks
Representation Learning on Graphs with Jumping Knowledge Networks
Stars: ✭ 31 (-20.51%)
Mutual labels:  graph-neural-networks
graphml-tutorials
Tutorials for Machine Learning on Graphs
Stars: ✭ 125 (+220.51%)
Mutual labels:  graph-neural-networks
pcdarts-tf2
PC-DARTS (PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search, published in ICLR 2020) implemented in Tensorflow 2.0+. This is an unofficial implementation.
Stars: ✭ 25 (-35.9%)
Mutual labels:  iclr2020
Graph-Embeddding
Reimplementation of Graph Embedding methods by Pytorch.
Stars: ✭ 113 (+189.74%)
Mutual labels:  graph-neural-networks
deepsphere-weather
A spherical CNN for weather forecasting
Stars: ✭ 44 (+12.82%)
Mutual labels:  graph-neural-networks
awesome-graph-self-supervised-learning-based-recommendation
A curated list of awesome graph & self-supervised-learning-based recommendation.
Stars: ✭ 37 (-5.13%)
Mutual labels:  graph-neural-networks
GAug
AAAI'21: Data Augmentation for Graph Neural Networks
Stars: ✭ 139 (+256.41%)
Mutual labels:  graph-neural-networks
QGNN
Quaternion Graph Neural Networks (ACML 2021) (Pytorch and Tensorflow)
Stars: ✭ 31 (-20.51%)
Mutual labels:  graph-neural-networks
BGCN
A Tensorflow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
Stars: ✭ 129 (+230.77%)
Mutual labels:  graph-neural-networks
GraphScope
🔨 🍇 💻 🚀 GraphScope: A One-Stop Large-Scale Graph Computing System from Alibaba (graph analytics, graph queries, graph machine learning)
Stars: ✭ 1,899 (+4769.23%)
Mutual labels:  graph-neural-networks
graphchem
Graph-based machine learning for chemical property prediction
Stars: ✭ 21 (-46.15%)
Mutual labels:  graph-neural-networks
vanilla-lang
An implementation of a predicative polymorphic language with bidirectional type inference and algebraic data types
Stars: ✭ 73 (+87.18%)
Mutual labels:  type-inference
cwn
Message Passing Neural Networks for Simplicial and Cell Complexes
Stars: ✭ 97 (+148.72%)
Mutual labels:  graph-neural-networks

LambdaNet

This is the source code repository for the ICLR 2020 paper LambdaNet: Probabilistic Type Inference using Graph Neural Networks. For an overview of how LambdaNet works, see our video from ICLR 2020.

This branch contains the latest improvements and features. To reproduce the results presented in the paper, please see the ICLR20 branch.

Instructions

After cloning this repo, here are the steps to reproduce our experimental results:

  1. Install all the dependencies (Java, sbt, TypeScript, etc.). See the "Using Docker" section below.
    1. If you are not using Docker, you will need to either set the environment variable OMP_NUM_THREADS=1 or prefix all the commands in the following steps with export OMP_NUM_THREADS=1;.
  2. To run the pre-trained model
    1. Download the model using this link (it predicts user-defined types) and unzip the file.
    2. To run the model in interactive mode, which outputs (source code position, predicted type) pairs for the specified files (see the example session after this list):
      1. If it does not exist, create the file <project root>/configs/modelPath.txt and write the location of the model directory into it. The location should be the directory that directly contains the model.serialized file.
      2. Under the project root, run sbt "runMain lambdanet.TypeInferenceService".
      3. After the model has finished loading into memory, simply enter the path of a directory containing the TypeScript files of interest. Note that in this version of LambdaNet, the model takes any existing user type annotations in the source files as part of its input and only predicts types for the locations where a type annotation is missing.
      4. Note that LambdaNet currently only works with TypeScript files and assumes they follow a TypeScript project structure, so if you want to run it on JavaScript files, you will need to change the file extensions to .ts.
    3. Alternatively, to run the model in batched mode, which outputs human-readable HTML files and accuracy statistics, use the code from the ICLR20 Git branch.
  3. To train LambdaNet from scratch (a command sketch follows this list)
    1. Download the TypeScript projects used in our experiments and prepare them into a serialized format. This can be done using the main function defined in src/main/scala/lambdanet/PrepareRepos.scala. Check the source code and make sure you uncomment all the required steps (some steps may have been commented out), then run it using sbt "runMain lambdanet.PrepareRepos".
    2. Check the main function defined in src/main/scala/lambdanet/train/Training.scala to adjust any training configuration if necessary, then run it using sbt "runMain lambdanet.train.Training".
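
For concreteness, here is a minimal example session for step 2 above. The model path and the queried directory are illustrative placeholders, and the exact console output may differ from what the comments sketch.

    # Run the pre-trained model in interactive mode (paths below are illustrative).
    export OMP_NUM_THREADS=1                  # needed when not running inside Docker

    # Point LambdaNet at the unzipped model; the directory must directly contain model.serialized.
    mkdir -p configs
    echo "/path/to/unzipped/model" > configs/modelPath.txt

    # Start the interactive type-inference service; it loads the model and then reads
    # directory paths from standard input.
    sbt "runMain lambdanet.TypeInferenceService"

    # Once the model is loaded, type the path of a directory containing .ts files, e.g.
    #   data/comparison
    # and LambdaNet prints (source code position, predicted type) pairs for every missing annotation.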
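
Similarly, a sketch of the training workflow in step 3, assuming the TypeScript project data has already been downloaded and PrepareRepos.scala has been adjusted as described above:

    export OMP_NUM_THREADS=1

    # Parse the downloaded TypeScript projects and serialize them into the format LambdaNet expects.
    sbt "runMain lambdanet.PrepareRepos"

    # Train the model; adjust the configuration in src/main/scala/lambdanet/train/Training.scala first if needed.
    sbt "runMain lambdanet.train.Training"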

The TypeScript files used for the manual comparison with JSNice are located under the directory data/comparison/.

Using Docker

We also provide a Dockerfile to automatically download and install all the dependencies. (If you are on Linux, you can also install the dependencies directly on your system by manually running the commands defined inside the Dockerfile.) Here are the steps to run the pre-trained LambdaNet model inside a Docker container:

  1. First, make sure you have installed Docker.

  2. Put the pre-trained model weights somewhere accessible to the Docker container (e.g., inside the lambdanet project root). Then configure the model path following the instructions above (step 2.2.1).

  3. Under the project root, run docker build -t lambdanet:v1 . && docker run --name lambdanet --memory 14g -t -i lambdanet:v1. (Make sure the machine you are using has enough memory for the docker run command; see the combined sketch after this list.)

  4. After the Docker container has successfully started, follow the steps described above.
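
Putting these steps together, here is a minimal sketch of the Docker workflow. It assumes the Dockerfile copies the project root (including the model weights and configs/modelPath.txt) into the image; the model location is an illustrative placeholder.

    # Make the pre-trained weights visible inside the build context and point modelPath.txt at them.
    cp -r /path/to/unzipped/model ./pretrained-model
    mkdir -p configs
    echo "pretrained-model" > configs/modelPath.txt   # adjust so the path is valid inside the container

    # Build the image and start an interactive container.
    docker build -t lambdanet:v1 .
    docker run --name lambdanet --memory 14g -t -i lambdanet:v1

    # Inside the container, continue with the interactive-mode steps from the Instructions section.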
