Laia: A deep learning toolkit for HTR

Build Status

Laia is a deep learning toolkit to transcribe handwritten text images.

If you find this toolkit useful in your research, please cite:

@misc{laia2016,
  author = {Joan Puigcerver and
            Daniel Martin-Albo and
            Mauricio Villegas},
  title = {Laia: A deep learning toolkit for HTR},
  year = {2016},
  publisher = {GitHub},
  note = {GitHub repository},
  howpublished = {\url{https://github.com/jpuigcerver/Laia}},
}

Installation

Laia is implemented in Torch and depends on a number of external packages.

Note that currently only GPU execution is supported, and NVIDIA's cuDNN library is required. Register for the CUDA Developer Program (it is free) and download the library from NVIDIA's website.

Once Torch is installed, the required luarocks need to be installed as well. Then execute:

luarocks install https://raw.githubusercontent.com/jpuigcerver/Laia/master/rocks/laia-scm-1.rockspec
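
As a rough sketch, a full installation from scratch might look like the following (this assumes the standard Torch distro install from torch.ch and a working CUDA/cuDNN setup; paths and shell specifics may differ on your system):

# Install Torch via the standard distro (see http://torch.ch/docs/getting-started.html)
git clone https://github.com/torch/distro.git ~/torch --recursive
cd ~/torch && bash install-deps && ./install.sh
source ~/.bashrc   # reload the shell so that th and luarocks are on the PATH

# Install the Laia rockspec
luarocks install https://raw.githubusercontent.com/jpuigcerver/Laia/master/rocks/laia-scm-1.rockspec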

Installation via docker

To ease the installation, there is a public docker image for Laia. To use it, first install docker and nvidia-docker, and configure docker so that it can be run without sudo (see the docker Linux post-install instructions). The installation of Laia then consists of pulling the image and tagging it as laia:active.

docker pull mauvilsa/laia:[SOME_TAG]
docker tag mauvilsa/laia:[SOME_TAG] laia:active

Replace SOME_TAG with one of the available tags. Then copy the command-line interface script to some directory in your PATH so that it can be easily used from the host.

mkdir -p $HOME/bin
docker run --rm -u $(id -u):$(id -g) -v $HOME:$HOME laia:active bash -c "cp /usr/local/bin/laia-docker $HOME/bin"
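
If $HOME/bin is not already in your PATH, add it (a bash example; adjust for your shell):

export PATH="$HOME/bin:$PATH"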

After this, all Laia commands can be executed through the laia-docker command. For further details, run:

laia-docker --help

Usage

Training a Laia model using CTC:

Create an "empty" model using:

laia-create-model \
    "$CHANNELS" "$INPUT_HEIGHT" "$((NUM_SYMB+1))" "$MODEL_DIR/model.t7";

Or if installed via docker:

laia-docker create-model \
    "$CHANNELS" "$INPUT_HEIGHT" "$((NUM_SYMB+1))" "$MODEL_DIR/model.t7";

Positional arguments:

  • $CHANNELS: number of channels of the input images.
  • $INPUT_HEIGHT: height of the input images. Note: ALL images must have the same height.
  • $((NUM_SYMB+1)): number of output symbols. Note: Include ONE additional element for the CTC blank symbol.
  • $MODEL_DIR/model.t7: path to the output model.

For optional arguments check laia-create-model -h or laia-create-model -H.
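
As a purely illustrative example (the values and paths below are hypothetical, not defaults of Laia), a model for 1-channel grayscale images of height 64 with 79 symbols plus the CTC blank could be created with:

CHANNELS=1; INPUT_HEIGHT=64; NUM_SYMB=79; MODEL_DIR=model;
mkdir -p "$MODEL_DIR";
laia-create-model \
    "$CHANNELS" "$INPUT_HEIGHT" "$((NUM_SYMB+1))" "$MODEL_DIR/model.t7";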

Train the model using:

laia-train-ctc \
    "$MODEL_DIR/model.t7" \
    "$SYMBOLS_TABLE" \
    "$TRAIN_LST" "$TRAIN_GT" "$VALID_LST" "$VALID_GT";

Or if installed via docker:

laia-docker train-ctc \
    "$MODEL_DIR/model.t7" \
    "$SYMBOLS_TABLE" \
    "$TRAIN_LST" "$TRAIN_GT" "$VALID_LST" "$VALID_GT";

Positional arguments:

  • $MODEL_DIR/model.t7 is the path to the input model or checkpoint for training.
  • $SYMBOLS_TABLE is the list of training symbols and their IDs.
  • $TRAIN_LST is a file containing a list of images for training.
  • $TRAIN_GT is a file containing the list of training transcripts.
  • $VALID_LST is a file containing a list of images for validation.
  • $VALID_GT is a file containing the list of validation transcripts.

For optional arguments check laia-train-ctc -h or laia-train-ctc -H.
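
A hypothetical invocation with illustrative file names (data/syms.txt, data/train.lst, data/train.txt, data/valid.lst and data/valid.txt are placeholders, not files provided by Laia):

laia-train-ctc \
    model/model.t7 \
    data/syms.txt \
    data/train.lst data/train.txt data/valid.lst data/valid.txt;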

Transcribing

laia-decode "$MODEL_DIR/model.t7" "$TEST_LST";

Or if installed via docker:

laia-docker decode "$MODEL_DIR/model.t7" "$TEST_LST";

Positional arguments:

  • $MODEL_DIR/model.t7 is the path to the model.
  • $TEST_LST is a file containing a list of images for testing.

For optional arguments check laia-decode -h.
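
A hypothetical decoding run with an illustrative test list, redirecting the hypotheses to a file (this assumes the transcripts are written to standard output):

laia-decode model/model.t7 data/test.lst > hypotheses.txt;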

Example

For a more detailed example, see the Spanish Numbers README.md in the egs/spanish-numbers folder, or the IAM README.md in the egs/iam folder.
