Handwriting generation

Implementation of handwriting generation using recurrent neural networks in TensorFlow. Based on Alex Graves' paper (https://arxiv.org/abs/1308.0850).

How to train a model and generate handwriting

1. Download dataset

First you need to download the dataset. This requires registering on this page ("Download" section). After registration you will be able to download data/original-xml-part.tar.gz. Unpack it in the repository directory.
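Unpacking works with plain tar, or equivalently with a couple of lines of Python (assuming the archive was saved as original-xml-part.tar.gz in the repository root):

import tarfile

# Extract the IAM archive into the current (repository) directory;
# `tar -xzf original-xml-part.tar.gz` does the same.
with tarfile.open("original-xml-part.tar.gz", "r:gz") as archive:
    archive.extractall(".")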

2. Preprocess dataset

python preprocess.py

This script searches the local directory for XML files with handwriting data and does some preprocessing, such as normalizing the data and splitting strokes into lines. As a result, it should create a data directory with the preprocessed dataset.
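The exact steps live in preprocess.py, but the usual treatment of IAM stroke data looks roughly like this: convert absolute pen positions to offsets and normalize their scale. A sketch with a hypothetical normalize_strokes helper (the actual script may differ):

import numpy as np

def normalize_strokes(points):
    # `points` is an (N, 3) array: absolute x, y and an
    # end-of-stroke flag, as parsed from the IAM XML files.
    offsets = np.zeros_like(points, dtype=np.float32)
    offsets[1:, :2] = points[1:, :2] - points[:-1, :2]  # deltas
    offsets[:, 2] = points[:, 2]                        # keep the flag
    offsets[:, :2] /= np.std(offsets[:, :2])            # scale-normalize
    return offsets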

3. Train model

python train.py

This will launch training with default settings (see the argparse options if you want to experiment). By default it creates a summary directory with a separate experiment directory for each run. If you want to resume training, provide a path to the experiment you want to continue, like:

python train.py --restore=summary\experiment-0

You can monitor losses in the command line or with TensorBoard. Example loss plot:

Loss plot

With default settings, training took about 5 hours (using TensorFlow 1.2 on a GTX 1080).
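For context, the loss being plotted is the sequence negative log-likelihood from the paper: each pen offset is scored under a mixture of bivariate Gaussians, plus a Bernoulli term for the end-of-stroke bit. A minimal NumPy sketch of the per-step loss (eqs. 24-26 in the paper; not this repository's actual code):

import numpy as np

def bivariate_gaussian(x1, x2, mu1, mu2, sigma1, sigma2, rho):
    # Density of a bivariate Gaussian, vectorized over mixture components.
    z = ((x1 - mu1) / sigma1) ** 2 + ((x2 - mu2) / sigma2) ** 2 \
        - 2.0 * rho * (x1 - mu1) * (x2 - mu2) / (sigma1 * sigma2)
    norm = 2.0 * np.pi * sigma1 * sigma2 * np.sqrt(1.0 - rho ** 2)
    return np.exp(-z / (2.0 * (1.0 - rho ** 2))) / norm

def step_loss(pi, mu1, mu2, sigma1, sigma2, rho, e, x1, x2, eos):
    # Negative log-likelihood of one offset (x1, x2) and its
    # end-of-stroke bit under the predicted mixture.
    mixture = np.sum(pi * bivariate_gaussian(x1, x2, mu1, mu2,
                                             sigma1, sigma2, rho))
    bernoulli = e if eos == 1 else 1.0 - e
    return -np.log(mixture + 1e-8) - np.log(bernoulli + 1e-8)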

4. Generate handwriting!

python generate.py --model=path_to_model

Once the model is trained, you can use the generate.py script to test how it works. Without the --text argument, the script will repeatedly ask you what to generate.
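Under the hood, generation draws one pen offset at a time from the mixture the network predicts at each step. A rough sketch of a single sampling step (an assumed illustration, not this repository's actual code):

import numpy as np

def sample_step(pi, mu1, mu2, sigma1, sigma2, rho, e, rng=np.random):
    # Pick a mixture component, draw a pen offset from the chosen
    # bivariate Gaussian, and flip the end-of-stroke coin.
    k = rng.choice(len(pi), p=pi)
    cov = [[sigma1[k] ** 2, rho[k] * sigma1[k] * sigma2[k]],
           [rho[k] * sigma1[k] * sigma2[k], sigma2[k] ** 2]]
    x1, x2 = rng.multivariate_normal([mu1[k], mu2[k]], cov)
    eos = rng.binomial(1, e)
    return x1, x2, eos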

Additional options for generation:

  • --bias (float) - a higher bias makes the generated handwriting cleaner, so to speak (see the paper for details; a sketch of how the bias is applied follows this list)
  • --noinfo - plots only the generated handwriting (without the attention window)
  • --animation - animates the writing process
  • --style - handwriting style, an int from 0 to 7 (functionality added thanks to @kristofbc; you can see how each style looks in the imgs folder)
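For reference, the bias trick comes from section 5.4 of the paper: before sampling, the predicted standard deviations are shrunk and the mixture weights sharpened, which trades diversity for legibility. A minimal NumPy sketch (sigma1_hat, sigma2_hat and pi_hat stand for the network's pre-activation outputs; not this repository's actual code):

import numpy as np

def apply_bias(pi_hat, sigma1_hat, sigma2_hat, b):
    # Biased sampling (Graves, 2013, section 5.4): b = 0 recovers
    # unbiased sampling; larger b gives cleaner, more regular strokes.
    sigma1 = np.exp(sigma1_hat - b)          # shrink std deviations
    sigma2 = np.exp(sigma2_hat - b)
    logits = pi_hat * (1.0 + b)              # sharpen mixture weights
    pi = np.exp(logits - logits.max())       # stable softmax
    pi /= pi.sum()
    return pi, sigma1, sigma2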

Examples

python generate.py --noinfo --text="this was generated by computer" --bias=1.

example-1

python generate.py --noinfo --animation --text="example of animation " --bias=1.

example-2

Any feedback is welcome 😄
