
Graph Convolution on Structured Documents

This repo contains code to convert structured documents to graphs and implement a Graph Convolutional Neural Network (incomplete) for node classification, where each node is an entity in the document.

For an intuitive explanation, check out the Towards Data Science article: Using Graph Convolutional Neural Networks on Structured Documents for Information Extraction.

Code

The grapher.py file contains the code to convert a structured document to a graph.
The required input is an object map produced by a commercial OCR tool, which provides the bounding-box coordinates of each entity in the image along with its recognized text. The script joins each object to its nearest object to the right and its nearest object below, thus generating a graph (a rough sketch of this joining rule follows the figure below), and outputs an object_tree.png file and a connections.csv file.
Here is what the generated graph looks like: [graph visualization image]
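To make the joining rule concrete, here is a minimal sketch of nearest-right/nearest-below linking. The field names (id, text, x1, y1, x2, y2) and the sample object map are illustrative assumptions; the repo's actual input schema comes from the OCR tool and may differ.

```python
import csv

def nearest_right_and_below(objects):
    """For each object (a dict with an id and bounding box x1,y1,x2,y2),
    link it to its nearest neighbor to the right and its nearest
    neighbor below, yielding (source_id, target_id) edges."""
    edges = []
    for a in objects:
        right, below = None, None
        for b in objects:
            if b is a:
                continue
            # Candidate to the right: starts past a's right edge, rows overlap vertically.
            if b["x1"] >= a["x2"] and b["y1"] < a["y2"] and b["y2"] > a["y1"]:
                if right is None or b["x1"] < right["x1"]:
                    right = b
            # Candidate below: starts past a's bottom edge, columns overlap horizontally.
            if b["y1"] >= a["y2"] and b["x1"] < a["x2"] and b["x2"] > a["x1"]:
                if below is None or b["y1"] < below["y1"]:
                    below = b
        for neighbor in (right, below):
            if neighbor is not None:
                edges.append((a["id"], neighbor["id"]))
    return edges

# Hypothetical object map; the real one comes from the OCR tool.
objects = [
    {"id": 0, "text": "Invoice No.", "x1": 10,  "y1": 10, "x2": 90,  "y2": 30},
    {"id": 1, "text": "12345",       "x1": 100, "y1": 10, "x2": 160, "y2": 30},
    {"id": 2, "text": "Date",        "x1": 10,  "y1": 40, "x2": 50,  "y2": 60},
]

with open("connections.csv", "w", newline="") as f:
    csv.writer(f).writerows(nearest_right_and_below(objects))
```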

Graph Convolution Model

The implementation is still in progress and is being built using TensorFlow 1.8. The implementation details can be found in [1].
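Since the model code is incomplete, here is a minimal TensorFlow 1.x sketch of a single graph convolution layer in the style of Kipf and Welling's GCN (H' = ReLU(Â·H·W)), assuming a precomputed normalized dense adjacency matrix. All names, shapes, and sizes are illustrative assumptions, not the repo's actual API.

```python
import tensorflow as tf  # TensorFlow 1.x (the repo targets 1.8)

def graph_convolution(features, adj_norm, out_dim, name):
    """One graph convolution layer: H' = ReLU(A_hat @ H @ W).

    features: [num_nodes, in_dim] per-entity feature matrix
    adj_norm: [num_nodes, num_nodes] normalized adjacency (assumed precomputed)
    """
    in_dim = features.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        weights = tf.get_variable(
            "weights", [in_dim, out_dim],
            initializer=tf.truncated_normal_initializer(stddev=0.1))
        support = tf.matmul(features, weights)  # H @ W: transform node features
        output = tf.matmul(adj_norm, support)   # A_hat @ (H W): aggregate neighbors
        return tf.nn.relu(output)

# Illustrative two-layer network for node (entity) classification.
NUM_NODES, FEAT_DIM, NUM_CLASSES = 64, 16, 5  # hypothetical sizes
X = tf.placeholder(tf.float32, [NUM_NODES, FEAT_DIM], name="node_features")
A = tf.placeholder(tf.float32, [NUM_NODES, NUM_NODES], name="adjacency")

hidden = graph_convolution(X, A, 32, "gcn1")
with tf.variable_scope("gcn2"):
    w2 = tf.get_variable(
        "weights", [32, NUM_CLASSES],
        initializer=tf.truncated_normal_initializer(stddev=0.1))
    logits = tf.matmul(A, tf.matmul(hidden, w2))  # no ReLU on the output layer
```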

References

  1. Riba, P., Dutta, A., et al. Table Detection in Invoice Documents by Graph Neural Networks.
  2. Harley, A. W., Ufkes, A., Derpanis, K. G. Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval.
  3. Garcia, V., Bruna, J. Few-Shot Learning with Graph Neural Networks.