
MaxwellRebo / ittk

License: MIT
Information Theory Toolkit

Programming Languages

Python

Projects that are alternatives of or similar to ittk

LinearCorex
Fast, linear version of CorEx for covariance estimation, dimensionality reduction, and subspace clustering with very under-sampled, high-dimensional data
Stars: ✭ 39 (+129.41%)
Mutual labels:  information-theory
ProbQA
Probabilistic question-asking system: the program asks, the users answer. The minimal goal of the program is to identify what the user needs (a target), even if the user is not aware of the existence of such a thing/product/service.
Stars: ✭ 43 (+152.94%)
Mutual labels:  information-theory
T-CorEx
Implementation of linear CorEx and temporal CorEx.
Stars: ✭ 31 (+82.35%)
Mutual labels:  information-theory
raisin
A simple lightweight set of implementations and bindings for compression algorithms written in Go.
Stars: ✭ 17 (+0%)
Mutual labels:  information-theory
P4J
Periodic time series analysis tools based on information theory
Stars: ✭ 42 (+147.06%)
Mutual labels:  information-theory
alchemy
Generate any a-by-( b + c ) finite rectangle SVG containing potentially infinitely many a-by-( 2 * b ) finite rectangles animated along a number line of ( ( c - b ) / a )^n scale symmetry.
Stars: ✭ 29 (+70.59%)
Mutual labels:  information-theory
information-dropout
Implementation of Information Dropout
Stars: ✭ 36 (+111.76%)
Mutual labels:  information-theory
HSIC-bottleneck
The HSIC Bottleneck: Deep Learning without Back-Propagation
Stars: ✭ 56 (+229.41%)
Mutual labels:  information-theory
einet
Uncertainty and causal emergence in complex networks
Stars: ✭ 77 (+352.94%)
Mutual labels:  information-theory
Math Php
Powerful modern math library for PHP: Features descriptive statistics and regressions; Continuous and discrete probability distributions; Linear algebra with matrices and vectors; Numerical analysis; Special mathematical functions; Algebra
Stars: ✭ 2,009 (+11717.65%)
Mutual labels:  information-theory
limit-label-memorization
Improving generalization by controlling label-noise information in neural network weights.
Stars: ✭ 34 (+100%)
Mutual labels:  information-theory
Awesome-Math-Learning
📜 Collection of the most awesome Math learning resources in the form of notes, videos and cheatsheets.
Stars: ✭ 73 (+329.41%)
Mutual labels:  information-theory
Robust-Log-Optimal-Strategy-with-Reinforcement-Learning
We propose a new Portfolio Management strategy combining Log-Optimal based Strategy and Reinforcement-Learning based Strategy.
Stars: ✭ 55 (+223.53%)
Mutual labels:  information-theory
information-theory-deep-learning
Resources and Implementations (PyTorch) for Information Theoretical concepts in Deep Learning
Stars: ✭ 30 (+76.47%)
Mutual labels:  information-theory

ittk: Information Theory Toolkit

Join the chat at https://gitter.im/MaxwellRebo/ittk

Information-theoretic methods in Python. Intended for use in data analysis and machine learning applications.

Kept lean and simple for transparency and ease of integration into other projects. Just import it and call the appropriate methods wherever you need them - no need to delve into esoteric math books.

Please ping me if you find any errors. These functions have been tested against equivalent Matlab and R implementations and found to be generally sound as of this writing.

If you have a suggestion for an algorithm or metric you'd like to see added, please let me know and I'll be happy to add it.

Testing

To run tests, simply do:

nosetests

This will automatically run all of the tests in the tests/ directory.
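
For illustration, here is the kind of minimal test nose would pick up from tests/ (a sketch - the actual file names and tests in the repository may differ):

# tests/test_entropy.py (hypothetical file name)
import numpy
import ittk

def test_constant_variable_has_zero_entropy():
    # A variable that always takes the same value carries no information.
    assert ittk.entropy(numpy.array([7, 7, 7])) == 0.0

def test_uniform_variable_entropy_is_log2_of_size():
    # Four equally likely values -> log2(4) = 2 bits.
    assert ittk.entropy(numpy.array([0, 1, 2, 3])) == 2.0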

Usage examples

All of these examples assume discrete variables.

Make sure you're using numpy arrays.

Get the probability of each value occurring:

import numpy
import ittk

ittk.probs([1,2,3])
array([ 0.,  0.33333333,  0.33333333,  0.33333333])
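
The array is indexed by value: probabilities are reported for every integer from 0 up to the largest observed value, so the unseen value 0 gets probability 0. For intuition, a minimal numpy sketch that reproduces the output above (an assumption about how probs works, not taken from the source):

def probs_sketch(x):
    # Count occurrences of each integer 0..max(x), then normalize
    # by the number of observations.
    x = numpy.asarray(x)
    return numpy.bincount(x) / float(len(x))

probs_sketch([1, 2, 3])
# array([ 0.        ,  0.33333333,  0.33333333,  0.33333333])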

Get the entropy of a variable from some discrete observations:

X = numpy.array([7, 7, 7])
ittk.entropy(X)
0.0

Y = numpy.array([0,1,2,3])
ittk.entropy(Y)
2.0
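
These values match the Shannon entropy in bits, H(X) = -sum over x of p(x) * log2(p(x)): a constant variable has zero entropy, and four equally likely values give log2(4) = 2 bits. A pure-numpy sketch of the same computation (assuming base-2 logarithms, which the 2.0 result suggests):

def entropy_sketch(x):
    # Shannon entropy in bits over the observed value distribution.
    p = numpy.bincount(numpy.asarray(x)) / float(len(x))
    p = p[p > 0]  # drop zero-probability values; 0 * log(0) is taken as 0
    return -numpy.sum(p * numpy.log2(p))

entropy_sketch([7, 7, 7])     # 0.0
entropy_sketch([0, 1, 2, 3])  # 2.0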

Get the mutual information and variation of information between two variables:

X = numpy.array([7,7,7,3])
Y = numpy.array([0,1,2,3])
ittk.mutual_information(X, Y)
0.8112781244591329

ittk.information_variation(X, Y)
1.1887218755408671
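
The two quantities obey the standard identity VI(X, Y) = H(X) + H(Y) - 2 * I(X; Y), and the numbers above check out: 0.811278 + 2.0 - 2 * 0.811278 ≈ 1.188722. You can verify it directly with the functions already shown:

h_x = ittk.entropy(X)               # 0.8112781244591329
h_y = ittk.entropy(Y)               # 2.0
mi = ittk.mutual_information(X, Y)  # 0.8112781244591329
h_x + h_y - 2 * mi                  # 1.1887218755408671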

A = numpy.array([1,2,3,4])
B = numpy.array([1,2,3,4])
ittk.mutual_information(A, B)
2.0

Note that the above is not normalized. To normalize it to [0, 1], pass normalized=True:

ittk.mutual_information(A, B, normalized=True)
1.0
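
The README doesn't say which normalization is applied; one common convention (an assumption here, not confirmed against the code) divides by the larger of the two marginal entropies, which keeps the result in [0, 1] and gives 1.0 for identical variables:

def normalized_mi_sketch(x, y):
    # Hypothetical normalization: I(X;Y) / max(H(X), H(Y)).
    # Other conventions divide by the joint entropy or by
    # sqrt(H(X) * H(Y)); all agree on 1.0 when x == y.
    return ittk.mutual_information(x, y) / max(ittk.entropy(x), ittk.entropy(y))

normalized_mi_sketch(A, B)  # 1.0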

Dependencies

  • numpy

  • nose

To install both, do:

    pip install -r requirements.txt
    
License: MIT