
SyntaxColoring / Markov-Word-Generator

License: MIT
A web app that uses Markov chains to generate pseudorandom words.

Programming Languages

  • CoffeeScript
  • HTML
  • CSS

Projects that are alternatives to or similar to Markov-Word-Generator

Vietnamese-Accent-Prediction
A simple/fast/accurate accent prediction for non-accented Vietnamese text
Stars: ✭ 31 (-6.06%)
Mutual labels:  markov-chain
porn-description-generator
Generates new porn descriptions based on an edited dataset of xhamster video descriptions uploaded between 2007-2016.
Stars: ✭ 40 (+21.21%)
Mutual labels:  markov-chain
insobot
C99 modular IRC bot with markov chains
Stars: ✭ 71 (+115.15%)
Mutual labels:  markov-chain
neworder
A dynamic microsimulation framework for Python
Stars: ✭ 15 (-54.55%)
Mutual labels:  markov-chain
TwitchMarkovChain
Twitch Bot for generating messages based on what it learned from chat
Stars: ✭ 87 (+163.64%)
Mutual labels:  markov-chain
markovifyR
Markovify wrapper for R
Stars: ✭ 81 (+145.45%)
Mutual labels:  markov-chain
py-simple-lyric-generator
A simple Markov chains lyric generator written in Python.
Stars: ✭ 17 (-48.48%)
Mutual labels:  markov-chain
markovclick
Python package to model clickstream data as a Markov chain. Inspired by R package clickstream.
Stars: ✭ 29 (-12.12%)
Mutual labels:  markov-chain
walrus
This is the best fembot on this planet // A feminist neural network
Stars: ✭ 38 (+15.15%)
Mutual labels:  markov-chain
DISCOTRESS
🦜 DISCOTRESS 🦜 is a software package to simulate and analyse the dynamics on arbitrary Markov chains
Stars: ✭ 20 (-39.39%)
Mutual labels:  markov-chain
timeline
Takes tweets from a bot's followings and markovifies them. Ruby port of sneaksnake/timeline
Stars: ✭ 13 (-60.61%)
Mutual labels:  markov-chain
presidential-rnn
Project 4 for Metis bootcamp. Objective was generation of character-level RNN trained on Donald Trump's statements using Keras. Also generated Markov chains, and quick pyTorch RNN as baseline. Attempted semi-supervised GAN, but was unable to test in time.
Stars: ✭ 26 (-21.21%)
Mutual labels:  markov-chain
MMCAcovid19.jl
Microscopic Markov Chain Approach to model the spreading of COVID-19
Stars: ✭ 15 (-54.55%)
Mutual labels:  markov-chain
markov-discord
A Markov chain Discord chat bot. Generates unique messages by learning from past messages. Also occasionally attaches images to messages.
Stars: ✭ 35 (+6.06%)
Mutual labels:  markov-chain
mchmm
Markov Chains and Hidden Markov Models in Python
Stars: ✭ 89 (+169.7%)
Mutual labels:  markov-chain
Mathematical-Modeling
Shared notes from learning mathematical modeling: commonly used tools, models, and algorithms; outstanding mathematical modeling competition papers; LaTeX paper templates; and SPSS tips.
Stars: ✭ 30 (-9.09%)
Mutual labels:  markov-chain
Nonlinear-Systems-and-Control
Files for my Nonlinear Systems and Controls class.
Stars: ✭ 16 (-51.52%)
Mutual labels:  markov-chain
markovipy
Yet another markov chain sentence generator
Stars: ✭ 24 (-27.27%)
Mutual labels:  markov-chain
Deep-Learning-Mahjong---
Reinforcement learning (RL) implementation of imperfect information game Mahjong using markov decision processes to predict future game states
Stars: ✭ 45 (+36.36%)
Mutual labels:  markov-chain
bayseg
An unsupervised machine learning algorithm for the segmentation of spatial data sets.
Stars: ✭ 46 (+39.39%)
Mutual labels:  markov-chain

Markov Word Generator

It's trivial for a computer program to generate random words. The tricky part is creating words that humans perceive as legible and pronounceable instead of mangled and cryptic. This web app solves the problem by applying a Markov chain.

View the live site here.

Why?

Because Markov chains are cool!

I originally wanted a program to help me generate science fiction names. There were plenty of websites out there that used Markov chains to generate random paragraphs, but I couldn't find any good ones to generate random words.

I probably got a little carried away with CoffeeScript and Bootstrap.

Contributions

Please contribute. I'm so lonely.

Report issues here and I'll gladly look into them.

Markov.coffee

The back-end logic for managing the Markov chain is available as a standalone CoffeeScript module. It's flexible enough for you to use in your own project, if you want to. Like the rest of this project, it's distributed under the permissive MIT license.

Below is a light introduction to the module. For more detail, see the documentation comments in the source.

Markov.coffee-specific terminology

  • element: In Markov.coffee, an "element" is a basic, indivisible building block of the material that we are working with. For example, if the end goal is to generate random sentences, then the individual words are the elements.
  • sequence: A "sequence" is simply a list of elements. Markov.coffee can accept sequences both as arrays and as strings. (If you pass a string as a sequence, then the individual characters are the elements.)

Although not strictly necessary, some familiarity with Markov chains and n-grams is also helpful here.
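To make the terminology concrete, here is a small sketch, in plain JavaScript (the language CoffeeScript compiles to), of how n-grams can be sliced out of a string sequence. This is only an illustration of the idea: the `ngrams` function below is not the module's actual implementation, and the real `.ngrams()` output may differ in details such as end-of-sequence handling.

```javascript
// Illustrative sketch only -- not the module's actual code.
// Slice a string sequence into n-grams of length n + 1, so each n-gram
// pairs an n-element context with the element that follows it.
function ngrams(sequence, n) {
  const grams = [];
  for (let i = 0; i + n < sequence.length; i++) {
    grams.push(sequence.slice(i, i + n + 1));
  }
  return grams;
}

console.log(ngrams("sassafras", 1));
// → ["sa", "as", "ss", "sa", "af", "fr", "ra", "as"]
```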

Setup

After including the script, Markov objects can be constructed like this:

markov = new Markov ["sassafras", "mississippi"], 1

# Or, on CommonJS:
# Markov = require "./markov"
# markov = new Markov ["sassafras", "mississippi"], 1

The first parameter to the constructor is an array of sequences. The sequences are combined together to form the corpus. The generator takes care not to link elements across sequence boundaries. In the example above, the last S in sassafras is not associated with the M in mississippi. If you really do want those letters to be associated, here's how to do it:

markov = new Markov ["sassafrasmississippi"], 1

The second parameter to the constructor is n, the Markov order: roughly, how many previous elements the next element depends on. Low values make the Markov chain more random, while high values make it stick closer to the corpus.

If left unspecified, the array of sequences defaults to [] and the Markov order defaults to 2.

You can directly modify these properties later in your code, if you need to. They're not private variables.

markov.sequences.push "foo"
markov.n = 3
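To illustrate what the Markov order controls, here is a hedged sketch, again in plain JavaScript, of an order-n transition table: each n-element context maps to the list of elements observed after it. This is a conceptual illustration only, not Markov.coffee's internal representation.

```javascript
// Conceptual illustration only -- not Markov.coffee's internals.
// Map each n-element context to the elements observed after it.
function transitions(sequences, n) {
  const table = {};
  for (const seq of sequences) {
    for (let i = 0; i + n < seq.length; i++) {
      const context = seq.slice(i, i + n); // the previous n elements
      const next = seq[i + n];             // the element that follows them
      (table[context] = table[context] || []).push(next);
    }
  }
  return table;
}

// With n = 1, "s" has three observed successors, so generation has lots
// of freedom; a higher n leaves fewer options per context, so the output
// sticks closer to the corpus.
console.log(transitions(["sassafras"], 1));
// → { s: ["a", "s", "a"], a: ["s", "f", "s"], f: ["r"], r: ["a"] }
```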

Generation

Make the Markov chain do something useful with .generate(). Note that it returns an array, so if you want a string you'll have to use .join("").

markov = new Markov ["sassafras", "mississippi"]
alert markov.generate().join "" # Alerted "rassippi".
alert markov.generate().join "" # Alerted "frassissafrassippi".

.generate() takes an optional maximum length parameter, e.g. markov.generate(10) to limit generated words to at most 10 elements. If unspecified, it defaults to 20 elements. There always needs to be a maximum length because, otherwise, inputs like this could cause infinite loops:

markov = new Markov ["abba"], 1
alert markov.generate().join "" # "bbababbabbbbababababbbabababababbabbbbabbbabababab..."
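For intuition, here is a minimal JavaScript sketch of the generation step: a random walk over observed successors, cut off at a maximum length so that corpora like ["abba"] cannot loop forever. The real .generate() may choose starting contexts and stop conditions differently; this is an assumption-laden sketch, not the module's code.

```javascript
// Hedged sketch of generation as a random walk over observed successors.
function generate(sequences, n, maxLength = 20) {
  const table = {}; // context -> observed next elements
  for (const seq of sequences) {
    for (let i = 0; i + n < seq.length; i++) {
      const ctx = seq.slice(i, i + n);
      (table[ctx] = table[ctx] || []).push(seq[i + n]);
    }
  }
  // Assumption: start from the opening context of a random corpus sequence.
  const starts = sequences.map(s => s.slice(0, n));
  const out = starts[Math.floor(Math.random() * starts.length)].split("");
  while (out.length < maxLength) {
    const nexts = table[out.slice(-n).join("")];
    if (!nexts) break; // dead end: no observed successor
    out.push(nexts[Math.floor(Math.random() * nexts.length)]);
  }
  return out; // an array of elements, like the real .generate()
}

console.log(generate(["sassafras", "mississippi"], 1).join(""));
```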

Other Stuff

The Markov class supports other methods, too. I won't describe them all in detail here, but you can read about how they work in the source if you're interested.

  • .ngrams() gives you the raw list of n-grams used to build the chain.
  • .tree() gives you a probability tree representing the n-grams.
  • .continue(sequence) gives you a single next element to continue sequence.
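As a rough illustration of what a probability tree over the n-grams could look like, the hypothetical probabilityTree function below (plain JavaScript, not the module's code) normalizes successor counts per context into relative frequencies. The structure returned by the real .tree() may differ.

```javascript
// Hypothetical illustration -- the real .tree() structure may differ.
// Count each context's successors, then normalize into probabilities.
function probabilityTree(sequences, n) {
  const tree = {};
  for (const seq of sequences) {
    for (let i = 0; i + n < seq.length; i++) {
      const ctx = seq.slice(i, i + n);
      const next = seq[i + n];
      tree[ctx] = tree[ctx] || {};
      tree[ctx][next] = (tree[ctx][next] || 0) + 1;
    }
  }
  for (const ctx in tree) {
    const total = Object.values(tree[ctx]).reduce((a, b) => a + b, 0);
    for (const next in tree[ctx]) tree[ctx][next] /= total;
  }
  return tree;
}

// For ["sassafras"] with n = 1: after "s", "a" has probability 2/3 and
// "s" has 1/3; after "f", "r" is certain.
console.log(probabilityTree(["sassafras"], 1));
```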