
Alex-Fabbri / lang2logic-PyTorch

License: MIT
PyTorch port of the paper "Language to Logical Form with Neural Attention"

Programming Languages

python
shell

Projects that are alternatives to or similar to lang2logic-PyTorch

parse seq2seq
A tensorflow implementation of neural sequence-to-sequence parser for converting natural language queries to logical form.
Stars: ✭ 26 (-23.53%)
Mutual labels:  seq2seq, semantic-parsing
SRB
Code for "Improving Semantic Relevance for Sequence-to-Sequence Learning of Chinese Social Media Text Summarization"
Stars: ✭ 41 (+20.59%)
Mutual labels:  seq2seq
spring
SPRING is a seq2seq model for Text-to-AMR and AMR-to-Text (AAAI2021).
Stars: ✭ 103 (+202.94%)
Mutual labels:  semantic-parsing
NeuralCodeTranslator
Neural Code Translator provides instructions, datasets, and a deep learning infrastructure (based on seq2seq) that aims at learning code transformations
Stars: ✭ 32 (-5.88%)
Mutual labels:  seq2seq
TextSumma
reimplementing Neural Summarization by Extracting Sentences and Words
Stars: ✭ 16 (-52.94%)
Mutual labels:  seq2seq
r2sql
🌶️ R²SQL: "Dynamic Hybrid Relation Network for Cross-Domain Context-Dependent Semantic Parsing." (AAAI 2021)
Stars: ✭ 60 (+76.47%)
Mutual labels:  semantic-parsing
MTA-LSTM-TensorFlow
TensorFlow reimplementation of Topic-to-Essay Generation with Neural Networks.
Stars: ✭ 67 (+97.06%)
Mutual labels:  seq2seq
beam search
Beam search for neural network sequence to sequence (encoder-decoder) models.
Stars: ✭ 31 (-8.82%)
Mutual labels:  seq2seq
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (+61.76%)
Mutual labels:  seq2seq
avsr-tf1
Audio-Visual Speech Recognition using Sequence to Sequence Models
Stars: ✭ 76 (+123.53%)
Mutual labels:  seq2seq
chinese ancient poetry
seq2seq attention tensorflow textrank context
Stars: ✭ 30 (-11.76%)
Mutual labels:  seq2seq
seq2seq-autoencoder
Theano implementation of Sequence-to-Sequence Autoencoder
Stars: ✭ 12 (-64.71%)
Mutual labels:  seq2seq
gap-text2sql
GAP-text2SQL: Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training
Stars: ✭ 83 (+144.12%)
Mutual labels:  semantic-parsing
Seq2Seq-chatbot
TensorFlow Implementation of Twitter Chatbot
Stars: ✭ 18 (-47.06%)
Mutual labels:  seq2seq
convolutional seq2seq
fairseq: Convolutional Sequence to Sequence Learning (Gehring et al. 2017) in Chainer
Stars: ✭ 63 (+85.29%)
Mutual labels:  seq2seq
Naver-AI-Hackathon-Speech
2019 Clova AI Hackathon : Speech - Rank 12 / Team Kai.Lib
Stars: ✭ 26 (-23.53%)
Mutual labels:  seq2seq
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with bahdanau & luong attention implementation in Tensorflow
Stars: ✭ 18 (-47.06%)
Mutual labels:  seq2seq
skt
Sanskrit compound segmentation using seq2seq model
Stars: ✭ 21 (-38.24%)
Mutual labels:  seq2seq
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+1241.18%)
Mutual labels:  seq2seq
sede
Text-to-SQL in the Wild: A Naturally-Occurring Dataset Based on Stack Exchange Data
Stars: ✭ 83 (+144.12%)
Mutual labels:  semantic-parsing

This repo contains a PyTorch port of the Lua code here for the paper "Language to Logical Form with Neural Attention." This code was written last year as part of a project with Jack Koch and is not being actively worked on or maintained. Nevertheless, I am putting the code here in case it is useful to anyone. The code runs on PyTorch 0.4.1, although it was written for an earlier version. Let me know if you encounter any errors.
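The model from the paper is a sequence-to-sequence network that attends over the input utterance while decoding a logical form token by token. As a rough, hypothetical sketch (not code from this repo, and written against a current PyTorch API rather than 0.4.1), a single dot-product attention decoding step might look like the following; all class and variable names are illustrative:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttnDecoderStep(nn.Module):
        """Illustrative sketch of one attention-based decoder step (not the repo's code)."""

        def __init__(self, hidden_size, vocab_size):
            super().__init__()
            self.cell = nn.LSTMCell(hidden_size, hidden_size)
            self.combine = nn.Linear(2 * hidden_size, hidden_size)
            self.out = nn.Linear(hidden_size, vocab_size)

        def forward(self, emb, state, enc_outputs):
            # emb: (batch, hidden) embedding of the previous output token
            # state: (h, c) LSTM state; enc_outputs: (batch, src_len, hidden)
            h, c = self.cell(emb, state)
            # Dot-product attention scores over the encoded input utterance
            scores = torch.bmm(enc_outputs, h.unsqueeze(2)).squeeze(2)        # (batch, src_len)
            weights = F.softmax(scores, dim=1)                                 # attention distribution
            context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)  # (batch, hidden)
            # Combine context and decoder state, then score the next logical-form token
            attn_h = torch.tanh(self.combine(torch.cat([context, h], dim=1)))
            logits = self.out(attn_h)                                          # (batch, vocab_size)
            return logits, (h, c)

This is only meant to convey the general shape of an attention decoder for semantic parsing; see the repo itself for the actual encoder/decoder implementation and training loop.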

For more recent PyTorch code by Li Dong, check out the GitHub repo for the paper "Coarse-to-Fine Decoding for Neural Semantic Parsing."
