
Sshanu / Relation Classification Using Bidirectional Lstm Tree

License: MIT
TensorFlow implementation of the papers "End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures" and "Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths" for relation classification

Projects that are alternatives to or similar to Relation Classification Using Bidirectional Lstm Tree

Image Caption Generator
[DEPRECATED] A neural-network-based generative model for captioning images using TensorFlow
Stars: ✭ 141 (-15.57%)
Mutual labels:  jupyter-notebook, lstm
Rnn For Human Activity Recognition Using 2d Pose Input
Activity Recognition from 2D pose using an LSTM RNN
Stars: ✭ 165 (-1.2%)
Mutual labels:  jupyter-notebook, lstm
Array To Tree
Convert a plain array of nodes (with pointers to parent nodes) to a nested data structure
Stars: ✭ 141 (-15.57%)
Mutual labels:  tree, tree-structure
Handwriting Synthesis
Implementation of "Generating Sequences With Recurrent Neural Networks" https://arxiv.org/abs/1308.0850
Stars: ✭ 135 (-19.16%)
Mutual labels:  jupyter-notebook, lstm
Ruijin round2
Second round of the Ruijin Hospital MMC AI-assisted knowledge graph construction competition
Stars: ✭ 159 (-4.79%)
Mutual labels:  jupyter-notebook, relation-extraction
Deeplearningfornlpinpytorch
An IPython Notebook tutorial on deep learning for natural language processing, including structure prediction.
Stars: ✭ 1,744 (+944.31%)
Mutual labels:  jupyter-notebook, lstm
Stock Price Predictor
This project uses a deep learning model, a Long Short-Term Memory (LSTM) neural network, to predict stock prices.
Stars: ✭ 146 (-12.57%)
Mutual labels:  jupyter-notebook, lstm
Containers
This library provides various containers. Each container has utility functions to manipulate the data it holds. This is an abstraction so that memory does not have to be manually managed and reallocated.
Stars: ✭ 125 (-25.15%)
Mutual labels:  tree, tree-structure
Tensorflow On Android For Human Activity Recognition With Lstms
IPython notebook and Android app showing how to build an LSTM model in TensorFlow and deploy it on Android
Stars: ✭ 157 (-5.99%)
Mutual labels:  jupyter-notebook, lstm
Tensorflow Multi Dimensional Lstm
Multi-dimensional LSTM as described in Alex Graves' paper https://arxiv.org/pdf/0705.2011.pdf
Stars: ✭ 154 (-7.78%)
Mutual labels:  jupyter-notebook, lstm
Deep Learning With Python
Example projects I completed to understand deep learning techniques with TensorFlow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-19.76%)
Mutual labels:  jupyter-notebook, lstm
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (-4.79%)
Mutual labels:  jupyter-notebook, lstm
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (-23.35%)
Mutual labels:  jupyter-notebook, lstm
Ethnicolr
Predict Race and Ethnicity Based on the Sequence of Characters in a Name
Stars: ✭ 137 (-17.96%)
Mutual labels:  jupyter-notebook, lstm
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs with an attention mechanism; it generates a meaningful reply to most everyday questions. The trained model is uploaded and can be run directly; if it doesn't run, I'll livestream myself eating my keyboard.
Stars: ✭ 124 (-25.75%)
Mutual labels:  jupyter-notebook, lstm
Bertem
Implementation of the ACL 2019 paper "Matching the Blanks: Distributional Similarity for Relation Learning"
Stars: ✭ 146 (-12.57%)
Mutual labels:  jupyter-notebook, relation-extraction
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN.
Stars: ✭ 119 (-28.74%)
Mutual labels:  jupyter-notebook, lstm
Multilstm
Keras attentional bi-LSTM-CRF for joint NLU (slot filling and intent detection) with ATIS
Stars: ✭ 122 (-26.95%)
Mutual labels:  jupyter-notebook, lstm
Graphview
Flutter GraphView displays data as graph structures. It supports tree, directed, and layered graph layouts, useful for family trees and hierarchy views.
Stars: ✭ 152 (-8.98%)
Mutual labels:  tree, tree-structure
Amazon Product Recommender System
Sentiment analysis on the Amazon Review Dataset, available at http://snap.stanford.edu/data/web-Amazon.html
Stars: ✭ 158 (-5.39%)
Mutual labels:  jupyter-notebook, lstm

Relation Classification

MIT License

Relation classification aims to assign the relation between a pair of given entities in a text to one of a set of predefined classes. Two families of deep neural networks are commonly used to represent relations between entities: recurrent neural networks (RNNs) and convolutional neural networks (CNNs). We have implemented three LSTM-RNN architectures for this task, following the two papers referenced below.

Of the three, the best performance comes from the last approach, relation classification using LSTMs on sequences and tree structures.
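For orientation, here is a minimal sketch, not the repository's actual code, of the simplest building block these models share: a bidirectional sequential LSTM classifier over the word sequence, in TensorFlow/Keras. The vocabulary size, embedding and hidden dimensions, and dropout rate below are illustrative assumptions; the class count follows SemEval-2010 Task 8 (9 relations in both directions plus Other).

```python
import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary size
EMBED_DIM = 100      # assumed word-embedding dimension
HIDDEN_DIM = 128     # assumed LSTM hidden size
NUM_CLASSES = 19     # SemEval-2010 Task 8: 9 relations x 2 directions + Other

# Token ids for one sentence per example; 0 is reserved for padding.
tokens = tf.keras.Input(shape=(None,), dtype="int32", name="tokens")
embedded = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(tokens)
# Bidirectional LSTM over the word sequence; the final forward and backward
# states are concatenated into a fixed-size sentence representation.
encoded = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(HIDDEN_DIM))(embedded)
encoded = tf.keras.layers.Dropout(0.5)(encoded)
probs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(encoded)

model = tf.keras.Model(tokens, probs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

The two stronger models augment this sentence encoder with dependency structure: the shortest dependency path in the second approach and the full dependency tree in the third, as described in the references below.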

References:

End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures
Makoto Miwa, Mohit Bansal
http://www.aclweb.org/anthology/P/P16/P16-1105.pdf

Abstract: We present a novel end-to-end neural model to extract entities and relations between them. Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs. This allows our model to jointly represent both entities and relations with shared parameters in a single model. We further encourage detection of entities during training and use of entity information in relation extraction via entity pretraining and scheduled sampling. Our model improves over the state-of-the-art feature-based model on end-to-end relation extraction, achieving 12.1% and 5.7% relative error reductions in F1-score on ACE2005 and ACE2004, respectively. We also show that our LSTM-RNN based model compares favorably to the state-of-the-art CNN based model (in F1-score) on nominal relation classification (SemEval-2010 Task 8). Finally, we present an extensive ablation analysis of several model components.
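The dependency-tree layer in this model is a tree-structured LSTM stacked on the sequential BiLSTM states. Below is a minimal sketch of a child-sum Tree-LSTM cell in the style of Tai et al. (2015), the variant this line of work builds on; the hidden size and the toy inputs are illustrative assumptions, not the paper's configuration.

```python
import tensorflow as tf

class ChildSumTreeLSTMCell(tf.keras.layers.Layer):
    """Child-sum Tree-LSTM cell: a node's state is computed from its own
    input and the (unordered) states of its children."""

    def __init__(self, units):
        super().__init__()
        self.units = units
        # Input, output, and candidate gates over [x; sum of child h].
        self.W_iou = tf.keras.layers.Dense(3 * units)
        self.U_iou = tf.keras.layers.Dense(3 * units, use_bias=False)
        # A separate forget gate per child.
        self.W_f = tf.keras.layers.Dense(units)
        self.U_f = tf.keras.layers.Dense(units, use_bias=False)

    def call(self, x, child_h, child_c):
        # x: (batch, input_dim); child_h, child_c: (batch, n_children, units)
        h_sum = tf.reduce_sum(child_h, axis=1)
        i, o, u = tf.split(self.W_iou(x) + self.U_iou(h_sum), 3, axis=-1)
        i, o, u = tf.sigmoid(i), tf.sigmoid(o), tf.tanh(u)
        # Each child gets its own forget gate, conditioned on that child.
        f = tf.sigmoid(self.W_f(x)[:, None, :] + self.U_f(child_h))
        c = i * u + tf.reduce_sum(f * child_c, axis=1)
        h = o * tf.tanh(c)
        return h, c

# Toy usage: one node with two children (leaf nodes would pass zero states).
cell = ChildSumTreeLSTMCell(units=64)
x = tf.random.normal([1, 100])           # node input, e.g. a word embedding
child_h = tf.random.normal([1, 2, 64])   # hidden states of two children
child_c = tf.random.normal([1, 2, 64])   # cell states of two children
h, c = cell(x, child_h, child_c)         # each of shape (1, 64)
```

Evaluating such a cell bottom-up over the dependency tree, and again top-down for the bidirectional variant, yields per-node representations that the relation classifier reads at the entity nodes.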

Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths
Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, Zhi Jin
http://www.emnlp2015.org/proceedings/EMNLP/pdf/EMNLP206.pdf

Abstract: Relation classification is an important research arena in the field of natural language processing (NLP). In this paper, we present SDP-LSTM, a novel neural network to classify the relation of two entities in a sentence. Our neural architecture leverages the shortest dependency path (SDP) between two entities; multichannel recurrent neural networks, with long short term memory (LSTM) units, pick up heterogeneous information along the SDP. Our proposed model has several distinct features: (1) The shortest dependency paths retain most relevant information (to relation classification), while eliminating irrelevant words in the sentence. (2) The multichannel LSTM networks allow effective information integration from heterogeneous sources over the dependency paths. (3) A customized dropout strategy regularizes the neural network to alleviate overfitting. We test our model on the SemEval 2010 relation classification task, and achieve an F1-score of 83.7%, higher than competing methods in the literature.
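To make the shortest-dependency-path idea concrete, here is a minimal sketch of extracting the SDP between two entity head words with networkx; the hand-written parse edges are an illustrative assumption standing in for the output of a real dependency parser.

```python
import networkx as nx

# Dependency parse of "A burst of noise came from the kitchen", written as
# (head, dependent) edges. These tokens stand in for real parser output.
edges = [
    ("came", "burst"), ("burst", "A"), ("burst", "of"), ("of", "noise"),
    ("came", "from"), ("from", "kitchen"), ("kitchen", "the"),
]
graph = nx.Graph(edges)  # undirected, so the path can climb and descend

# Shortest path between the two entity head words.
sdp = nx.shortest_path(graph, source="noise", target="kitchen")
print(sdp)  # ['noise', 'of', 'burst', 'came', 'from', 'kitchen']
```

In SDP-LSTM, this token path, together with channels such as POS tags and dependency relations, is what the multichannel LSTMs consume instead of the full sentence.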
