lil-lab / nccg

License: GPL-3.0
Neural Shift Reduce Parser for CCG Semantic Parsing (Misra and Artzi, EMNLP 2016)

Programming Languages

Java, C++, Scala, Lex, Python, PHP

Projects that are alternatives of or similar to nccg

ami-spec
Acceptance testing your AMIs
Stars: ✭ 47 (+193.75%)
Mutual labels:  ami
amiws queue
Asterisk Queues Dashboard with amiws
Stars: ✭ 40 (+150%)
Mutual labels:  ami
tupa
Transition-based UCCA Parser
Stars: ✭ 72 (+350%)
Mutual labels:  semantic-parser
ssm-ami-automation
Automated AMI creation using SSM
Stars: ✭ 14 (-12.5%)
Mutual labels:  ami
cfn-ami-to-mapping
Generate your CloudFormation RegionMap automatically
Stars: ✭ 34 (+112.5%)
Mutual labels:  ami
circleci-packer-example
Example: packer image builder on CircleCI
Stars: ✭ 20 (+25%)
Mutual labels:  ami
Laravel Aws Eb
Ready-to-deploy configuration to run Laravel on AWS Elastic Beanstalk.
Stars: ✭ 247 (+1443.75%)
Mutual labels:  ami
xml-semantic-external-parser
A semantic external parser for XML files that can be used together with GMaster, PlasticSCM or SemanticMerge. Supports various XML formats, such as the Visual Studio project format.
Stars: ✭ 15 (-6.25%)
Mutual labels:  semantic-parser
aws-tag-sched-ops
Retired, please see https://github.com/sqlxpert/lights-off-aws
Stars: ✭ 24 (+50%)
Mutual labels:  ami
aws-utils
This repository provides utilities which are used at MiQ.
Stars: ✭ 20 (+25%)
Mutual labels:  ami
amr
Cornell AMR Semantic Parser (Artzi et al., EMNLP 2015)
Stars: ✭ 23 (+43.75%)
Mutual labels:  semantic-parser
amigen7
Set of tools to provide automation of tasks for creating STIG-partitioned EL7 AMIs
Stars: ✭ 33 (+106.25%)
Mutual labels:  ami
efi shell flash bios
Use the EFI shell to flash the BIOS: force-flash the BIOS from a UEFI/GRUB EFI shell and unlock hidden AMI BIOS menu options
Stars: ✭ 38 (+137.5%)
Mutual labels:  ami
spring
SPRING is a seq2seq model for Text-to-AMR and AMR-to-Text (AAAI2021).
Stars: ✭ 103 (+543.75%)
Mutual labels:  semantic-parser
callme
No description or website provided.
Stars: ✭ 45 (+181.25%)
Mutual labels:  ami
digital champions deeplearning r mxnet
Showcase for using R + MXNET along with AWS and bitfusion for deep learning.
Stars: ✭ 20 (+25%)
Mutual labels:  ami
ami-io
Use node.js or io.js to manage Asterisk through AMI
Stars: ✭ 28 (+75%)
Mutual labels:  ami
ami
integration asterisk manager interface (AMI) in laravel
Stars: ✭ 25 (+56.25%)
Mutual labels:  ami
amiws
Asterisk Management Interface (AMI) to Web-socket proxy
Stars: ✭ 60 (+275%)
Mutual labels:  ami
goami
Asterisk Manager Interface (AMI) client in Go
Stars: ✭ 36 (+125%)
Mutual labels:  ami

Neural Shift Reduce CCG Semantic Parser for AMR Parsing

Contains the implementation of the neural shift-reduce parser for CCG semantic parsing of Misra and Artzi (EMNLP 2016).

Author

Developed and maintained by Dipendra Misra ([email protected])

Uses the following external codebases:

  1. Cornell SPF and AMR code maintained by Yoav Artzi (Artzi, 2016).
  2. DeepLearning4j.
  3. EasyCCG (Lewis and Steedman, 2014) for CCGBank categories.
  4. SMATCH metric (Cai and Knight, 2013).
  5. Illinois NER (Ratinov and Roth, 2009).
  6. Stanford CoreNLP POS Tagger (Manning et al., 2014).

You don't have to install items 1-5 above.

Prerequisite

  • Java 8

Section 1: Using Amazon AMI to do AMR Parsing

In this section, we describe how to use the publicly available Amazon AMI to perform parsing on the test set. Later sections describe how to do customized training and testing. First, log in to https://aws.amazon.com/ and find the NCCG public AMI (ID: ami-ce5387d8). You can find the AMI by clicking on launch an EC2 instance, then going to community AMIs and searching for the above AMI ID.
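
If you prefer the command line, the AMI can also be looked up with the AWS CLI. This is a hypothetical sketch: the region is an assumption (public AMIs are region-specific), and the command is printed rather than executed so it can be checked without AWS credentials.

```shell
# Placeholder region -- public AMIs are region-specific, so adjust REGION
# to wherever ami-ce5387d8 was published.
AMI_ID="ami-ce5387d8"
REGION="us-east-1"
# Printed (not executed) so no AWS credentials are needed for this sketch.
echo aws ec2 describe-images --image-ids "$AMI_ID" --region "$REGION"
```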

We will launch a master instance and several worker instances to do test-time parsing on the test set.

  • Run the master instance

    Launch a master instance using the above AMI and, after sshing in, run the following commands:

    cd /home/ubuntu/amr_exp/
    java -Xmx110g -jar NeuralAmrParser_Test.jar ./experiment_ff/dev.proxy/dev.proxy.dist.exp
    

    You can find the log file in /home/ubuntu/nccg/nn-amr-dev/experiment_ff/dev.proxy/logs/global.log. It may take some time for the master to start the distributed job when running the code for the first time.

  • Run the worker instances

    Launch a number of worker instances (say, 20) using the same public AMI. Paste the code below to run when the instances launch. It is a good idea to use spot instances for running workers.

    cd /home/ubuntu/amr_exp/
    screen -L
    java -Xmx110g -jar ./NeuralAmrParser_Test.jar ./experiment_ff/worker1/worker.exp hostname=ip-address master=ip-address masterPort=4444
    

    Supply the public IP address of the master in place of ip-address. The command above runs with a 110GB heap (-Xmx110g), which can be changed to any other value within the instance's RAM limit.

The results will be printed to dev.proxy/logs/test.log. The final number should match the number reported in the paper (~66.1 SMATCH).

Section 2: Using the source code with Eclipse

The instructions below assume you are using Eclipse, a powerful Java IDE.

Import the code in Eclipse

  • Import all the java projects in nn-amr-dev.

    • To import projects, first open eclipse and change the workspace folder to the root folder of ./nccg.
    • Now go to File->Import->General->Existing Projects Into Workspace and select the nn-amr-dev folder in the root.
    • You should see the amr project. Select it. You should now see amr in the project explorer. Ignore any errors for now.
    • Now import all the java projects in nn-ccg-dev in a similar fashion.
    • Close the following projects: tiny, learn.ubl, and learn.weakp (right-click on the project and click Close Project).
    • If you see any errors, please see the FAQ section or raise an issue.

Understanding the code structure

  • Neural Shift Reduce CCG Semantic parser (NCCG) is developed on top of SPF (CCG Semantic Parsing Framework). Please see the [SPF documentation](https://github.com/cornell-lic/spf) to learn more about SPF.

  • The NCCG is contained in the java project ./parser.ccg.ff.shiftreduce.

  • There are three major components that create NCCG.

    • Parser: The NCCG model is a feed-forward neural network that generates a probability distribution over actions given a parsing configuration. For technical details, please see Section 3 in the paper. The model file is located in: ./edu.cornell.cs.nlp.spf.parser.ff.shiftreduce.neuralparser/NeuralDotProductShiftReduceParser.java

    • Creating the dataset: The NCCG parser is trained on configuration and gold-action pairs that are generated using a CKY parser. For technical details, please see Section 4 in the paper. This is described in the following package: ./edu.cornell.cs.nlp.spf.parser.ff.shiftreduce.dataset

    • Learning: NCCG is trained using backpropagation. For technical details, please see Section 4 in the paper. This is described in the following file: ./edu.cornell.cs.nlp.spf.parser.ff.shiftreduce.learner/NeuralFeedForwardDotProductLearner.java
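
The parser component above can be pictured as a loop that repeatedly asks a scorer for the best action given the current configuration. The sketch below is a hypothetical, heavily simplified illustration of that shift-reduce loop: the ShiftReduceSketch class, its toy bestAction policy, and the bracket-string output are all invented for illustration, standing in for the actual feed-forward network and CCG operations in NeuralDotProductShiftReduceParser.java.

```java
import java.util.ArrayDeque;
import java.util.Arrays;
import java.util.Deque;
import java.util.List;

public class ShiftReduceSketch {
    enum Action { SHIFT, REDUCE }

    // Stand-in for the feed-forward network: in NCCG this scores the valid
    // actions given the parsing configuration; here it is a fixed toy policy.
    static Action bestAction(Deque<String> stack, Deque<String> buffer) {
        return stack.size() >= 2 ? Action.REDUCE : Action.SHIFT;
    }

    // Greedy shift-reduce loop: shift tokens from the buffer onto the stack,
    // reduce the top two stack items, until a single structure remains.
    static String parse(List<String> tokens) {
        Deque<String> stack = new ArrayDeque<>();
        Deque<String> buffer = new ArrayDeque<>(tokens);
        while (!buffer.isEmpty() || stack.size() > 1) {
            if (bestAction(stack, buffer) == Action.SHIFT && !buffer.isEmpty()) {
                stack.push(buffer.pollFirst());
            } else {
                // Combine the top two items; a real parser would apply a CCG
                // combinator here instead of building a bracket string.
                String right = stack.pop();
                String left = stack.pop();
                stack.push("(" + left + " " + right + ")");
            }
        }
        return stack.peek();
    }

    public static void main(String[] args) {
        System.out.println(parse(Arrays.asList("the", "boy", "runs")));
        // prints ((the boy) runs)
    }
}
```

In the real parser the policy is learned, so bestAction would run the trained network over features of the stack and buffer rather than applying a fixed rule.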

Section 3: Custom Testing and Training

In order to perform testing or learning with NCCG, you will have to build a jar file. In this section, we will describe how to do this.

Build the jar file

To build the jar file do the following:

  1. Right click on amr.neural/src/edu.uw.cs.lil.amr.neural.exp/AmrNeuralGenericExp.java and select export-->java-->Runnable jar file. Give the jar a file name such as NeuralAMRParsing.jar

    Select the following option for Library Handling: Copy required libraries into a sub-folder next to the generated JAR

  2. You should now see a generated JAR file :)

Text Interface

  1. The AMR package comes with a clean text interface that allows one to modify hyperparameters, file names, etc. without having to rebuild the JAR file. The text interface is defined in nccg/nn-amr-dev/experiment_ff

  2. There are two kinds of files -- one with the .exp extension and one with the .inc extension. .exp files are the experiment files that define the chief experiment setup, and .inc files are support files that define individual components.

  3. nccg/nn-amr-dev/experiment_ff/dev.proxy/dev.proxy.exp defines one such experimental setup. The .exp file defines several variables, e.g., globalLog=logs1/global.log defines the location of the global log file. dev.proxy.inc defines other components such as the parser and learning modules. params.inc defines several hyperparameters such as the number of epochs.

  4. Finally, dev.proxy.exp references a job.inc, which defines the job to run. This job can be a learning job, a testing job, or another user-defined job.
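
As a rough illustration, an .exp file of the kind described above might look like the following. This is a hypothetical sketch based only on the variables named in this section (globalLog and the .inc support files); the include lines are invented for illustration, so consult dev.proxy.exp in the repository for the actual syntax.

```
globalLog=logs1/global.log
include=params.inc
include=dev.proxy.inc
include=job.inc
```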

Perform Testing

Use the JAR file that was created and do testing as described in Section 1. You will, of course, have to copy the JAR and its library folder to the server, and also ensure that the worker instances have access to the JAR. This can be done by adding an rsync operation when running workers, which copies the JAR file from the master, or by creating a new AMI and launching workers using that AMI.
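
One way to script the rsync approach on each worker is sketched below. Everything here -- the master IP, the user name, the jar name, and the paths -- is a placeholder to adapt to your setup, and the command is printed rather than executed so the sketch stays side-effect free.

```shell
# All values below are placeholders -- substitute your own master IP,
# user, jar name, and directories before using this on a real worker.
MASTER_IP="203.0.113.10"                          # public IP of the master
REMOTE="ubuntu@${MASTER_IP}:/home/ubuntu/amr_exp/"
LOCAL="/home/ubuntu/amr_exp/"

# Pull the jar and its library folder from the master before starting the
# worker. Printed (not run) so the sketch has no side effects.
echo rsync -az "${REMOTE}NeuralAMRParsing.jar" "${REMOTE}NeuralAMRParsing_lib" "$LOCAL"
```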

Perform Learning

To Come
