
CarsonScott / Online-Category-Learning

Licence: other
ML algorithm for real-time classification

Programming Languages

python

Projects that are alternatives of or similar to Online-Category-Learning

Hypergan
Composable GAN framework with api and user interface
Stars: ✭ 1,104 (+1547.76%)
Mutual labels:  unsupervised-learning, online-learning
machine learning from scratch matlab python
Vectorized Machine Learning in Python 🐍 From Scratch
Stars: ✭ 28 (-58.21%)
Mutual labels:  classification, unsupervised-learning
AverageShiftedHistograms.jl
⚡ Lightning fast density estimation in Julia ⚡
Stars: ✭ 52 (-22.39%)
Mutual labels:  density-estimation, online-algorithms
Competitive-Feature-Learning
Online feature-extraction and classification algorithm that learns representations of input patterns.
Stars: ✭ 32 (-52.24%)
Mutual labels:  classification, online-learning
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+11691.04%)
Mutual labels:  classification, unsupervised-learning
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+13.43%)
Mutual labels:  unsupervised-learning, density-estimation
SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Stars: ✭ 165 (+146.27%)
Mutual labels:  classification, online-learning
MachineLearningSeries
Videos and code from Universo Discreto teaching the fundamentals of Machine Learning in Python. For more details, follow the listed playlist.
Stars: ✭ 20 (-70.15%)
Mutual labels:  classification, classification-algorithm
All About The Gan
All About the GANs(Generative Adversarial Networks) - Summarized lists for GAN
Stars: ✭ 630 (+840.3%)
Mutual labels:  classification, unsupervised-learning
Remixautoml
R package for automation of machine learning, forecasting, feature engineering, model evaluation, model interpretation, data generation, and recommenders.
Stars: ✭ 159 (+137.31%)
Mutual labels:  classification, unsupervised-learning
salt iccv2017
SALT (iccv2017) based Video Denoising Codes, Matlab implementation
Stars: ✭ 26 (-61.19%)
Mutual labels:  unsupervised-learning, online-learning
pyMCR
pyMCR: Multivariate Curve Resolution for Python
Stars: ✭ 55 (-17.91%)
Mutual labels:  unsupervised-learning
SESF-Fuse
SESF-Fuse: An Unsupervised Deep Model for Multi-Focus Image Fusion
Stars: ✭ 47 (-29.85%)
Mutual labels:  unsupervised-learning
VALIS
Vote ALlocating Immune System, an immune-inspired classification algorithm
Stars: ✭ 21 (-68.66%)
Mutual labels:  classification-algorithm
amr
Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-53.73%)
Mutual labels:  unsupervised-learning
awesome-text-classification
Text classification meets word embeddings.
Stars: ✭ 27 (-59.7%)
Mutual labels:  classification
EC-GAN
EC-GAN: Low-Sample Classification using Semi-Supervised Algorithms and GANs (AAAI 2021)
Stars: ✭ 29 (-56.72%)
Mutual labels:  classification
mmselfsup
OpenMMLab Self-Supervised Learning Toolbox and Benchmark
Stars: ✭ 2,315 (+3355.22%)
Mutual labels:  unsupervised-learning
VAENAR-TTS
PyTorch Implementation of VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis.
Stars: ✭ 66 (-1.49%)
Mutual labels:  unsupervised-learning
Hierarchical-attention-network
My implementation of "Hierarchical Attention Networks for Document Classification" in Keras
Stars: ✭ 26 (-61.19%)
Mutual labels:  classification-algorithm

Online Category-Learning and Classification

Overview

The following is an outline of an unsupervised machine-learning algorithm that classifies and adapts to input patterns in real time. It accomplishes this in two main steps: first, a probability density function is approximated from a collection of samples; second, a category is assigned to each new sample according to its location in the probability space.

More generally, a multidimensional probability space is generated from a set of previous observations and used to interpret each incoming observation. A sample represents a point in this space, and the location of that point determines how it is classified. A gradient-ascent algorithm traverses the space from each sample point to a local maximum, yielding a category estimate for the sample.
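As an illustration of the gradient-ascent step (a minimal sketch, not the repository's implementation), the following hill-climb on a discretized one-dimensional density returns the index of the local maximum a sample ascends to; samples that reach the same peak share a category. The function name and step logic are assumptions of this sketch:

    def ascend_to_peak(density, start):
        # Hill-climb on a discretized 1-D density until a local maximum is reached.
        # Samples that climb to the same peak receive the same category label.
        i = start
        while True:
            neighbors = [j for j in (i - 1, i + 1) if 0 <= j < len(density)]
            best = max(neighbors, key=lambda j: density[j])
            if density[best] <= density[i]:
                return i  # local maximum reached: its index names the category
            i = best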

Online learning provides a method of optimization in real time, enabling a system to adapt to environmental changes indefinitely. However, it introduces a number of problems, particularly the need for a process that disposes of irrelevant data in order to make room for important information. This matters not only when useless information is learned, but also (and perhaps more importantly) when previously relevant information is no longer useful. In either case, the irrelevant information must be filtered out or forgotten, so that the finite memory available to the system at any given time is used as efficiently as possible.

When a sample is received, a Gaussian function is multiplied by a learning rate and added to the space surrounding the sample point. Every non-zero point in the space gradually approaches zero, so a point decays back to zero unless it "grows" via sampling at a faster rate than it "decays". The decay rate at any given point is negatively correlated with the value at that point: a high-density area decays more slowly than a low-density area, and is therefore preserved more strongly and for a longer period of time. This causes the system to effectively "forget" the less relevant information, resolving the optimization problem stated above.
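A minimal sketch of this update rule, assuming a one-dimensional grid; the learning rate, kernel width, and decay constant below are illustrative values, not the repository's parameters:

    import numpy as np

    def update_density(density, sample, rate=0.05, sigma=2.0, decay=0.01):
        # Add a Gaussian bump centered on the sample, scaled by the learning rate.
        x = np.arange(len(density))
        density = density + rate * np.exp(-((x - sample) ** 2) / (2 * sigma ** 2))
        # Density-dependent decay: the fraction removed at each point shrinks as
        # the value there grows, so high-density regions are preserved longer.
        return density * (1.0 - decay / (1.0 + density))

Calling this once per incoming sample keeps the estimate current while letting rarely-visited regions fade back toward zero.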

Density Estimation

Examples

The probability space and its derivatives are used to identify the subspaces that represent features or classes, as well as the boundaries between them. Classes are learned based on the relative frequency with which a sample at any given time is an instance of that class. The goal, therefore, is to generate a space of simple boundary points or lines.

The sample set was generated by choosing 1000 random values between 0 and 100. Each time a new value is selected, there is a moderate chance that, instead of choosing between 0 and 100 at random, we select an element from a subset of values that represent 'attractors', or states the system tends toward. In other words, an attractor has a greater probability of being chosen at any given time than any one of the non-attractor values; this procedure is sketched below.

Attractors: {7, 54, 87}
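A sketch of this sampling procedure; the attractor probability used below is an assumed value, since the text only calls the chance "moderate":

    import random

    ATTRACTORS = [7, 54, 87]

    def generate_samples(n=1000, p_attractor=0.5, low=0, high=100):
        # With probability p_attractor, draw one of the attractor states;
        # otherwise draw uniformly at random from [low, high].
        samples = []
        for _ in range(n):
            if random.random() < p_attractor:
                samples.append(random.choice(ATTRACTORS))
            else:
                samples.append(random.uniform(low, high))
        return samples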

[Figure: Probability Space]

Identifying these subspaces cannot be done efficiently without an intermediate step: converting the probability space into a binary space, using a function that takes a local subset of the probability space at any given time and returns its variance passed through a threshold.
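A minimal sketch of this thresholded local-variance test; the window size and threshold are assumptions of the illustration:

    import numpy as np

    def binarize(density, window=5, threshold=1e-4):
        # Slide a window across the space and mark each point whose local
        # variance exceeds the threshold: 1 for sloped/peaked regions, 0 for flat.
        half = window // 2
        binary = np.zeros(len(density), dtype=int)
        for i in range(len(density)):
            lo, hi = max(0, i - half), min(len(density), i + half + 1)
            binary[i] = int(np.var(density[lo:hi]) > threshold)
        return binary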

[Figure: Normed Probability Space]

Finally, the binary space is transformed into a sparse array that represents the boundary points between classes. The derivative is used to assign a given sample to a class, with respect to the boundaries that separate it from the rest of the space. An instance of a given class is detected when a sample point falls between the boundaries that define that class.
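Differentiating the binary space yields those boundary points directly; a short sketch, assuming NumPy:

    import numpy as np

    def find_boundaries(binary):
        # Each nonzero entry of the derivative marks a step up (+1) or
        # down (-1) in the binary space, i.e. a boundary point.
        return np.nonzero(np.diff(binary))[0]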

[Figure: Feature Space]

Boundaries

A feature exists if it satisfies a set of boundary constraints with respect to the spatial relations between the boundaries. In one dimension, the constraints require two boundaries, one on either side of a given point, for the feature to be considered valid. Any space between two boundaries is considered a class, and the location of each boundary is chosen with respect to the binary space, specifically the points where the space steps either up or down.
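A sketch of classifying a sample against those boundaries; treating every interval between consecutive boundary points as a candidate class is an assumption of this illustration:

    def classify(sample, boundaries):
        # Assign the sample to the interval (candidate class) whose pair of
        # boundaries encloses it; return None if no pair does.
        boundaries = sorted(boundaries)
        for k in range(len(boundaries) - 1):
            if boundaries[k] <= sample < boundaries[k + 1]:
                return k
        return None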

[Figure: Combined]
