uclnlp / Ntp

Licence: apache-2.0
End-to-End Differentiable Proving

Programming Languages

prolog
421 projects

Projects that are alternatives of or similar to Ntp

Deep Steganography
Hiding Images within other images using Deep Learning
Stars: ✭ 136 (+83.78%)
Mutual labels:  nips-2017
large-scale-OT-mapping-TF
Tensorflow Implementation of "Large-scale Optimal Transport and Mapping Estimation"(ICLR2018/NIPS 2017 OTML)
Stars: ✭ 18 (-75.68%)
Mutual labels:  nips-2017
Img2imggan
Implementation of the paper : "Toward Multimodal Image-to-Image Translation"
Stars: ✭ 49 (-33.78%)
Mutual labels:  nips-2017
Attentionalpoolingaction
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+235.14%)
Mutual labels:  nips-2017
NIPS-Global-Paper-Implementation-Challenge
Selective Classification For Deep Neural Networks.
Stars: ✭ 11 (-85.14%)
Mutual labels:  nips-2017
Prototypical Networks
Code for the NeurIPS 2017 Paper "Prototypical Networks for Few-shot Learning"
Stars: ✭ 705 (+852.7%)
Mutual labels:  nips-2017
Spherenet
Implementation for <Deep Hyperspherical Learning> in NIPS'17.
Stars: ✭ 111 (+50%)
Mutual labels:  nips-2017
Alphacsc
Convolution dictionary learning for time-series
Stars: ✭ 66 (-10.81%)
Mutual labels:  nips-2017
deep-steg
Global NIPS Paper Implementation Challenge of "Hiding Images in Plain Sight: Deep Steganography"
Stars: ✭ 43 (-41.89%)
Mutual labels:  nips-2017
Accurate Binary Convolution Network
Binary Convolution Network for faster real-time processing in ASICs
Stars: ✭ 49 (-33.78%)
Mutual labels:  nips-2017
nips rl
Code for NIPS 2017 learning to run challenge
Stars: ✭ 37 (-50%)
Mutual labels:  nips-2017
pytorch-deep-sets
PyTorch re-implementation of parts of "Deep Sets" (NIPS 2017)
Stars: ✭ 60 (-18.92%)
Mutual labels:  nips-2017
Nips2017
A list of resources for all invited talks, tutorials, workshops and presentations at NIPS 2017
Stars: ✭ 907 (+1125.68%)
Mutual labels:  nips-2017
Dynamic routing between capsules
Implementation of Dynamic Routing Between Capsules, Sara Sabour, Nicholas Frosst, Geoffrey E Hinton, NIPS 2017
Stars: ✭ 202 (+172.97%)
Mutual labels:  nips-2017
Learning2run
Our NIPS 2017: Learning to Run source code
Stars: ✭ 57 (-22.97%)
Mutual labels:  nips-2017
Prototypical Networks Tensorflow
Tensorflow implementation of NIPS 2017 Paper "Prototypical Networks for Few-shot Learning"
Stars: ✭ 122 (+64.86%)
Mutual labels:  nips-2017
Hardnet
Hardnet descriptor model - "Working hard to know your neighbor's margins: Local descriptor learning loss"
Stars: ✭ 350 (+372.97%)
Mutual labels:  nips-2017
Group Sparsity Sbp
Structured Bayesian Pruning, NIPS 2017
Stars: ✭ 72 (-2.7%)
Mutual labels:  nips-2017
Mean Teacher
A state-of-the-art semi-supervised method for image recognition
Stars: ✭ 1,130 (+1427.03%)
Mutual labels:  nips-2017
Wgan Lp Tensorflow
Reproduction code for WGAN-LP
Stars: ✭ 13 (-82.43%)
Mutual labels:  nips-2017

End-to-End Differentiable Proving

This is an implementation of the paper End-to-End Differentiable Proving. For a high-level introduction, see the NIPS oral, slides and poster.

Disclaimer

Please note that this software is not maintained. It is highly experimental research code that is not well documented, and we provide no warranty of any kind. Use at your own risk!

Data Format

Data for the NTP is in nl format, which is essentially Prolog syntax:

ntp$ head data/countries/countries.nl
locatedIn(palau,micronesia).
locatedIn(palau,oceania).
locatedIn(maldives,southern_asia).
locatedIn(maldives,asia).
locatedIn(brunei,south-eastern_asia).
locatedIn(brunei,asia).
neighborOf(brunei,malaysia).
locatedIn(japan,eastern_asia).
locatedIn(japan,asia).
locatedIn(netherlands,western_europe).
  • *.nl files represent facts and rules (example of a rule: isa(X,Y) :- isa(X,Z), isa(Z,Y))

  • *.nlt files represent rule templates (example of a rule template: #1(X,Y) :- #2(X,Z), #3(Z,Y))

ntp$ cat data/ntp/simpsons.nlt
5   #1(X, Y) :- #2(X, Y).

5   #1(X, Y) :- #1(Y, X).

5   #1(X, Y) :-
    #2(X, Z),
    #2(Z, Y).
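As a rough illustration of the fact syntax above, here is a minimal Python sketch (not part of the NTP codebase; the helper name is made up) that parses ground facts into (predicate, arguments) tuples:

```python
import re

# Hypothetical helper, not from the repo: parse a Prolog-style ground
# fact such as "locatedIn(palau,micronesia)." into a (predicate, args)
# tuple. Rules and rule templates would need a fuller parser.
FACT_RE = re.compile(r"^(\w+)\(([^)]*)\)\.$")

def parse_fact(line):
    match = FACT_RE.match(line.strip())
    if match is None:
        return None  # not a ground fact (e.g. a rule or a blank line)
    predicate, args = match.groups()
    return predicate, tuple(arg.strip() for arg in args.split(","))

parse_fact("locatedIn(palau,micronesia).")
# → ('locatedIn', ('palau', 'micronesia'))
```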

Running

The main file for running NTP is ntp/experiments/learn.py, which takes the path to a configuration file as an argument.

Code Structure

The core implementation of the NTP can be found here.

The base models (neural link predictors) are implemented here.

Important "modules" are unify, this one and this one. They closely mirror the pseudocode in the paper.
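The key idea behind these modules is soft unification: rather than requiring symbols to match exactly, their embedding similarity is scored. The sketch below is illustrative only; the function name is made up and the exact kernel in the paper/code may differ:

```python
import numpy as np

# Illustrative sketch, not the repo's API: score how well two symbol
# embeddings unify, so that e.g. "grandpaOf" can partially unify with
# "grandfatherOf" if their learned embeddings are close.
def soft_unify_score(theta_s, theta_t, mu=1.0):
    distance = np.linalg.norm(theta_s - theta_t)
    return float(np.exp(-distance / (2.0 * mu ** 2)))

grandpa_of = np.array([1.0, 0.0, 0.0])
grandfather_of = np.array([0.9, 0.1, 0.0])
score = soft_unify_score(grandpa_of, grandfather_of)  # close to 1.0
```

Proof scores in NTP are then aggregated from such unification scores along the proof, so the whole proving process stays differentiable.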

The tricky part is the tiling of batched representations for batch proving; check out this.

However, this tiling needs to happen at various points in the code, e.g. here.

The implementation of tiling (and multiplexing) is here and here.
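A minimal sketch of what this tiling does, with NumPy standing in for the TensorFlow ops used in the repo (shapes and names are illustrative):

```python
import numpy as np

# To unify a batch of B goal representations against N candidate
# rule/fact representations in one batched op, both sides are tiled
# so that every goal is paired with every candidate.
B, N, d = 2, 3, 4
goals = np.arange(B * d, dtype=float).reshape(B, d)
candidates = np.arange(N * d, dtype=float).reshape(N, d)

tiled_goals = np.repeat(goals, N, axis=0)        # shape (B*N, d)
tiled_candidates = np.tile(candidates, (B, 1))   # shape (B*N, d)
# Row i now pairs goal i // N with candidate i % N, so all pairwise
# unification scores can be computed in a single vectorized op.
```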

An important trick in NTP for proving in larger KBs and using complex rules is the Kmax heuristic, implemented here.
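The idea behind Kmax can be sketched as follows (the function name and selection mechanics are illustrative, not the repo's implementation):

```python
import numpy as np

# Kmax heuristic sketch: when a goal could unify with every fact in a
# large KB, keep only the K highest-scoring candidates and continue
# proving with those, pruning the rest of the proof tree.
def kmax_indices(scores, k):
    k = min(k, len(scores))
    return np.argsort(scores)[-k:][::-1]  # best candidates first

scores = np.array([0.1, 0.9, 0.4, 0.8])
top = kmax_indices(scores, 2)
# → array([1, 3]): only the two most promising candidates survive
```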

There is a symbolic prover implementation here; it is probably worthwhile to look at it first and compare it to NTP.
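For intuition, a toy symbolic backward-chaining prover can be sketched in a few lines of Python. This is an illustrative stand-in, not the repo's prover; NTP replaces the exact matching in unify below with the soft, embedding-based variant:

```python
# Toy backward chainer (illustrative): facts are ground tuples, rules
# are (head, body) pairs, and variables start with an uppercase letter.
FACTS = {("locatedIn", "palau", "micronesia"),
         ("locatedIn", "micronesia", "oceania")}
# locatedIn(X,Y) :- locatedIn(X,Z), locatedIn(Z,Y).
RULES = [(("locatedIn", "X", "Y"),
          [("locatedIn", "X", "Z"), ("locatedIn", "Z", "Y")])]

def is_var(t):
    return t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    # Exact symbolic unification; NTP makes this step soft.
    subst = dict(subst)
    for x, y in zip(a, b):
        x, y = walk(x, subst), walk(y, subst)
        if x == y:
            continue
        if is_var(x):
            subst[x] = y
        elif is_var(y):
            subst[y] = x
        else:
            return None  # two different constants: unification fails
    return subst

def prove(goals, subst, depth):
    # Yield every substitution under which all goals are provable.
    if not goals:
        yield subst
        return
    goal, rest = goals[0], goals[1:]
    for fact in FACTS:
        s = unify(goal, fact, subst)
        if s is not None:
            yield from prove(rest, s, depth)
    if depth > 0:  # bound rule applications to guarantee termination
        for head, body in RULES:
            s = unify(goal, head, subst)
            if s is not None:
                yield from prove(body + rest, s, depth - 1)

# Where is palau (transitively) located?
answers = {walk("W", s) for s in prove([("locatedIn", "palau", "W")], {}, 1)}
# → {'micronesia', 'oceania'}
```

Comparing this to the NTP modules makes clear what changes: the control flow of proving is the same, but failure/success becomes a continuous proof score.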

Test

nosetests

Contributors

Citation

@inproceedings{rocktaschel2017end,
  author    = {Tim Rockt{\"{a}}schel and
               Sebastian Riedel},
  title     = {End-to-end Differentiable Proving},
  booktitle = {Advances in Neural Information Processing Systems 30: Annual Conference
               on Neural Information Processing Systems 2017, 4-9 December 2017,
               Long Beach, CA, {USA}},
  pages     = {3791--3803},
  year      = {2017},
  url       = {http://papers.nips.cc/paper/6969-end-to-end-differentiable-proving},
}