
maxhumber / marc

License: MIT
Markov chain generator for Python and/or Swift

Programming Languages

swift: 15916 projects
python: 139335 projects (#7 most used programming language)

Projects that are alternatives of or similar to marc

TwitchMarkovChain
Twitch Bot for generating messages based on what it learned from chat
Stars: ✭ 87 (+42.62%)
Mutual labels:  markov-chain, markov
DISCOTRESS
🦜 DISCOTRESS 🦜 is a software package to simulate and analyse the dynamics on arbitrary Markov chains
Stars: ✭ 20 (-67.21%)
Mutual labels:  markov-chain, markov-chains
porn-description-generator
Generates new porn descriptions based on an edited dataset of xhamster video descriptions uploaded between 2007-2016.
Stars: ✭ 40 (-34.43%)
Mutual labels:  markov-chain, markov
markovifyR
Markovify wrapper for R
Stars: ✭ 81 (+32.79%)
Mutual labels:  markov-chain
MMCAcovid19.jl
Microscopic Markov Chain Approach to model the spreading of COVID-19
Stars: ✭ 15 (-75.41%)
Mutual labels:  markov-chain
4ti2
A software package for algebraic, geometric and combinatorial problems on linear spaces. By R. Hemmecke, R. Hemmecke, M. Köppe, P. Malkin, M. Walter
Stars: ✭ 21 (-65.57%)
Mutual labels:  markov-chains
PyBorg
Fork of PyBorg AI bot for cutie578 on EFNet
Stars: ✭ 45 (-26.23%)
Mutual labels:  markov-chain
Nonlinear-Systems-and-Control
Files for my Nonlinear Systems and Controls class.
Stars: ✭ 16 (-73.77%)
Mutual labels:  markov-chain
markovipy
Yet another markov chain sentence generator
Stars: ✭ 24 (-60.66%)
Mutual labels:  markov-chain
markov-chain
No description or website provided.
Stars: ✭ 34 (-44.26%)
Mutual labels:  markov-chains
NEMO
Modeling Password Guessability Using Markov Models
Stars: ✭ 46 (-24.59%)
Mutual labels:  markov
bayseg
An unsupervised machine learning algorithm for the segmentation of spatial data sets.
Stars: ✭ 46 (-24.59%)
Mutual labels:  markov-chain
Deep-Learning-Mahjong---
Reinforcement learning (RL) implementation of imperfect information game Mahjong using markov decision processes to predict future game states
Stars: ✭ 45 (-26.23%)
Mutual labels:  markov-chain
vk-markovify-chatbot
A VKontakte bot that generates messages with a Markov process based on messages from a chat. A bare-bones take on Witless and сглыпа.
Stars: ✭ 31 (-49.18%)
Mutual labels:  markov
Awesome-Neural-Logic
Awesome Neural Logic and Causality: MLN, NLRL, NLM, etc. Causal inference, neural logic, and frontier areas of logical reasoning for strong AI.
Stars: ✭ 106 (+73.77%)
Mutual labels:  markov
mchmm
Markov Chains and Hidden Markov Models in Python
Stars: ✭ 89 (+45.9%)
Mutual labels:  markov-chain
markovclick
Python package to model clickstream data as a Markov chain. Inspired by R package clickstream.
Stars: ✭ 29 (-52.46%)
Mutual labels:  markov-chain
insobot
C99 modular IRC bot with markov chains
Stars: ✭ 71 (+16.39%)
Mutual labels:  markov-chain
AALpy
An Active Automata Learning Library Written in Python
Stars: ✭ 60 (-1.64%)
Mutual labels:  markov-chain
Markov-Word-Generator
A web app that uses Markov chains to generate pseudorandom words.
Stars: ✭ 33 (-45.9%)
Mutual labels:  markov-chain
marc

About

marc is a Markov chain generator for Python and/or Swift

Python

Install

pip install marc

Quickstart:

from marc import MarkovChain

player_throws = "RRRSRSRRPRPSPPRPSSSPRSPSPRRRPSSPRRPRSRPRPSSSPRPRPSSRPSRPRSSPRP"
sequence = [throw for throw in player_throws]
# ['R', 'R', 'R', 'S', 'R', 'S', 'R', ...]

chain = MarkovChain(sequence)
chain.update("R", "S")

chain["R"]
# {'P': 0.5, 'R': 0.25, 'S': 0.25}

player_last_throw = "R"
player_predicted_next_throw = chain.next(player_last_throw)
# 'P'

counters = {"R": "P", "P": "S", "S": "R"}
counter_throw = counters[player_predicted_next_throw]
# 'S'

For more inspiration see the python/examples/ directory
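
The same calls can also be strung together into a simple rock-paper-scissors opponent that keeps learning between rounds. Here is a minimal sketch built on the quickstart above; the respond and record_round helpers are illustrative and not part of marc:

from marc import MarkovChain

counters = {"R": "P", "P": "S", "S": "R"}
chain = MarkovChain(list("RRRSRSRRPRPSPPRPSSSPRSPSP"))

def respond(player_last_throw):
    # predict the player's next throw from the chain, then counter it
    predicted = chain.next(player_last_throw)
    return counters[predicted]

def record_round(player_previous_throw, player_actual_throw):
    # feed the observed transition back in so the chain keeps learning
    chain.update(player_previous_throw, player_actual_throw)

counter_throw = respond("R")
record_round("R", "P")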

Swift

SPM:

dependencies: [
    .package(url: "https://github.com/maxhumber/marc.git", .upToNextMajor(from: "22.5.0"))
]

Quickstart:

import Marc

let playerThrows = "RRRSRSRRPRPSPPRPSSSPRSPSPRRRPSSPRRPRSRPRPSSSPRPRPSSRPSRPRSSPRP"
let sequence = playerThrows.map { String($0) }

let chain = MarkovChain(sequence)
chain.update("R", "S")

print(chain["R"])
// [("P", 0.5), ("R", 0.25), ("S", 0.25)]

let playerLastThrow = "R"
let playerPredictedNextThrow = chain.next(playerLastThrow)!

let counters = ["R": "P", "P": "S", "S": "R"]
let counterThrow = counters[playerPredictedNextThrow]!
print(counterThrow)
// "S"

For more inspiration see the swift/Examples/ directory

API/Comparison

|                    | Python                                 | Swift                                      |
|--------------------|----------------------------------------|--------------------------------------------|
| Import             | from marc import MarkovChain           | import Marc                                |
| Initialize A       | chain = MarkovChain()                  | chain = MarkovChain<String>()              |
| Initialize B       | chain = MarkovChain(["R", "P", "S"])   | let chain = MarkovChain(["R", "P", "S"])   |
| Update chain       | chain.update("R", "P")                 | chain.update("R", "P")                     |
| Lookup transitions | chain["R"]                             | chain["R"]                                 |
| Generate next      | chain.next("R")                        | chain.next("R")!                           |
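
The Initialize A path starts from an empty chain, which is handy when transitions arrive one at a time rather than as a full sequence. A minimal Python sketch, assuming update and next behave as in the table above:

from marc import MarkovChain

chain = MarkovChain()  # Initialize A: start with no observed transitions

# feed transitions one at a time as they are observed
for previous, current in [("R", "P"), ("P", "S"), ("S", "R"), ("R", "P")]:
    chain.update(previous, current)

chain["R"]       # transition probabilities out of 'R'
chain.next("R")  # probably 'P' (the 'R' -> 'P' transition was seen twice)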

Why

I built the first versions of marc in the Fall of 2019. Back then I created it, and used it, as a teaching tool (for how to build and upload a PyPI package). Since March 2020 I've been spending less and less time with Python and more and more time with Swift... and so I just kind of forgot about marc.

Recently, I had an iOS project come up that needed some Markov chains. After surveying GitHub and not finding any implementations that I liked (forgetting that I had already rolled my own in Python), I started from scratch on a new implementation in Swift.

Just as I was finishing the Swift package I re-discovered marc... I had a good laugh looking back through the original Python library. My feelings about the code I wrote and my abilities in 2019 can be summarized in a picture:

[meme image]

Unable to resist a good procrasticode™ project, I cross-ported the finished Swift package to Python and polished up both codebases and documentation into this mono repo.

Honestly, I had a lot of fun trying to mirror the APIs as closely as possible while doing my best to keep the Python code "Pythonic" and the Swift code "Schwifty". The whole project/exercise was incredibly rewarding, interesting, and insightful. Crudely, here's how I found working on both packages:

Python

| Like                                 | Dislike                                   |
|--------------------------------------|-------------------------------------------|
| defaultdict !!                       | Clunky setup.py packaging                 |
| random.choice !                      | Setting up and working with environments  |
| Dictionary comprehensions + sorting  | __init__.py and directory issues          |
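
For a sense of why those three show up in the Like column, here is a rough sketch of how defaultdict, a dictionary comprehension with sorting, and the random module can be combined into a tiny Markov chain; it illustrates the general technique and is not necessarily how marc itself is implemented:

import random
from collections import defaultdict

def build_chain(sequence):
    # count transitions with a nested defaultdict: counts[state][next] += 1
    counts = defaultdict(lambda: defaultdict(int))
    for state, nxt in zip(sequence, sequence[1:]):
        counts[state][nxt] += 1
    # normalize each row into probabilities, sorted most-likely first
    return {
        state: dict(sorted(
            ((nxt, n / sum(row.values())) for nxt, n in row.items()),
            key=lambda item: item[1], reverse=True,
        ))
        for state, row in counts.items()
    }

def next_state(chain, state):
    # weighted random draw over the outgoing transitions
    nexts = list(chain[state])
    weights = list(chain[state].values())
    return random.choices(nexts, weights=weights)[0]

chain = build_chain(list("RRRSRSRRPRPSPPRPSSSPRSPSP"))
next_state(chain, "R")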

Swift

| Like                                        | Dislike                                        |
|---------------------------------------------|------------------------------------------------|
| Package.swift and packaging in general      | Dictionary performance sucks... (surprising!!) |
| Don't have to think about environments      | Need randomness? Too bad. Go roll it yourself  |
| XCTest is nicer/easier than unittest/pytest | Playgrounds aren't as good as Hydrogen/Jupyter |

So why? For fun! And procrastination. And, more seriously, because I needed some chains in Swift. And then, because I thought it could be interesting to create a Rosetta Stone for Python and Swift... So if you, Dear Reader, are looking to use Markov chains in your Python or Swift project, or are looking to jump to or from either language, I hope you find this useful.

Warning

marc 22.5+ is incompatible with marc 2.x
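
If an existing project still depends on the 2.x API, one option (assuming the older releases remain available on PyPI) is to pin below the new series:

pip install "marc<22"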
