
xxxnell / Flex

License: MIT
Probabilistic deep learning for data streams.

Programming Languages

scala

Projects that are alternatives of or similar to Flex

Shendusuipian
To know stats by heart
Stars: ✭ 275 (+116.54%)
Mutual labels:  statistics, probability
Yalinqo
Yet Another LINQ to Objects for PHP [Simplified BSD]
Stars: ✭ 400 (+214.96%)
Mutual labels:  statistics, functional-programming
Stats
A C++ header-only library of statistical distribution functions.
Stars: ✭ 292 (+129.92%)
Mutual labels:  statistics, probability
data-science-notes
Open-source project hosted at https://makeuseofdata.com to crowdsource a robust collection of notes related to data science (math, visualization, modeling, etc)
Stars: ✭ 52 (-59.06%)
Mutual labels:  statistics, probability
Uc Davis Cs Exams Analysis
📈 Regression and Classification with UC Davis student quiz data and exam data
Stars: ✭ 33 (-74.02%)
Mutual labels:  statistics, probability
PyImpetus
PyImpetus is a Markov Blanket based feature subset selection algorithm that considers features both separately and together as a group in order to provide not just the best set of features but also the best combination of features
Stars: ✭ 83 (-34.65%)
Mutual labels:  statistics, probability
Stats Maths With Python
General statistics, mathematical programming, and numerical/scientific computing scripts and notebooks in Python
Stars: ✭ 381 (+200%)
Mutual labels:  statistics, probability
Stat Cookbook
📙 The probability and statistics cookbook
Stars: ✭ 1,990 (+1466.93%)
Mutual labels:  statistics, probability
Facsimile
Facsimile Simulation Library
Stars: ✭ 20 (-84.25%)
Mutual labels:  statistics, functional-programming
Python For Probability Statistics And Machine Learning
Jupyter Notebooks for Springer book "Python for Probability, Statistics, and Machine Learning"
Stars: ✭ 481 (+278.74%)
Mutual labels:  statistics, probability
Data-Science-and-Machine-Learning-Resources
List of Data Science and Machine Learning Resource that I frequently use
Stars: ✭ 19 (-85.04%)
Mutual labels:  statistics, probability
Tyche
Statistics utilities for the JVM - in Scala!
Stars: ✭ 93 (-26.77%)
Mutual labels:  statistics, functional-programming
Stanford Cme 106 Probability And Statistics
VIP cheatsheets for Stanford's CME 106 Probability and Statistics for Engineers
Stars: ✭ 242 (+90.55%)
Mutual labels:  statistics, probability
Probability Theory
A quick introduction to all most important concepts of Probability Theory, only freshman level of mathematics needed as prerequisite.
Stars: ✭ 25 (-80.31%)
Mutual labels:  statistics, probability
Quant Notes
Quantitative Interview Preparation Guide, updated version here ==>
Stars: ✭ 180 (+41.73%)
Mutual labels:  statistics, probability
Basic Mathematics For Machine Learning
The motive behind Creating this repo is to feel the fear of mathematics and do what ever you want to do in Machine Learning , Deep Learning and other fields of AI
Stars: ✭ 300 (+136.22%)
Mutual labels:  statistics, probability
Math Php
Powerful modern math library for PHP: Features descriptive statistics and regressions; Continuous and discrete probability distributions; Linear algebra with matrices and vectors, Numerical analysis; special mathematical functions; Algebra
Stars: ✭ 2,009 (+1481.89%)
Mutual labels:  statistics, probability
Teaching
Teaching Materials for Dr. Waleed A. Yousef
Stars: ✭ 435 (+242.52%)
Mutual labels:  statistics, probability
Ethzcheatsheets
Stars: ✭ 92 (-27.56%)
Mutual labels:  statistics, probability
Ptstat
Probabilistic Programming and Statistical Inference in PyTorch
Stars: ✭ 108 (-14.96%)
Mutual labels:  statistics, probability

Flex


Flex is a probabilistic deep learning library for data streams. It has the following features:

  • Fast. Flex provides probabilistic deep learning that is fast enough to solve real-world problems.
  • Typesafe and Functional. Types and pure functions make the code easy to understand and maintain.
  • Easy. You can program with minimal knowledge of probability theory.

Today, neural networks are widely used to solve problems in many areas. However, classical neural networks have limitations when you want to include uncertainty in the model. For example, suppose that the input data and training data contain a lot of noise. If you need to detect whether the data contains false positives or false negatives, the model should represent how reliable the input and the output are. Probabilistic deep learning, also known as the Bayesian neural network, addresses this issue: it treats both the input and the output as probability distributions, and it is one of the best approaches for representing uncertainty. However, Bayesian neural networks are usually so computationally expensive that they cannot readily be applied to real-world problems. Flex is fast enough to make applying them to real-world problems practical.
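As background (a standard formulation, not specific to Flex's API), a Bayesian neural network keeps a posterior distribution over the weights w instead of a single point estimate, and a prediction for an input x marginalizes over that posterior:

p(y \mid x, \mathcal{D}) = \int p(y \mid x, w) \, p(w \mid \mathcal{D}) \, dw

Approximating this integral efficiently is what makes Bayesian neural networks expensive in practice.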

Getting Started

WIP. Flex is published to Maven Central and built for Scala 2.12, so you can add the following to your build.sbt:

libraryDependencies ++= Seq(
  "com.xxxnell" %% "flex-core",
  "com.xxxnell" %% "flex-chain"
).map(_ % "0.0.5")

Then, you need to import the implicit contexts of Flex.

import flex.implicits._
import flex.chain.implicits._

Building a Model

We will build a three-layer network: 784 inputs, two hidden layers of 10 neurons each, and a single output neuron.

val (kin, kout) = (20, 10)
val (l0, l1, l2, l3) = (784, 10, 10, 1)
val (k0, k1, k2, k3) = (20, 20, 20, 20)
val model0 = Complex
  .empty(kin, kout)
  .addStd(l0 -> k0, l0 * l1 -> k1, l1 * l2 -> k2, l2 * l3 -> k3)
  .map { case x1 :: z1 :: rem => z1.reshape(l1, l0).mmul(x1).tanh :: rem }
  .map { case h1 :: z2 :: rem => z2.reshape(l2, l1).mmul(h1).tanh :: rem }
  .map { case h2 :: z3 :: rem => z3.reshape(l3, l2).mmul(h2) :: rem }

First, construct an empty model using Complex.empty. Second, add the variables to be used in this neural network; here, the prior probability distribution of each variable is a standard normal distribution with a mean of zero and a variance of one. Third, define the transformation of each layer using the map operation. In this example, tanh is used as the activation function.
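In equation form, reading the dimensions from the code above, the three map steps compute

h_1 = \tanh(W_1 x), \quad h_2 = \tanh(W_2 h_1), \quad y = W_3 h_2

where W_1 is the l1 × l0 matrix obtained from z1.reshape(l1, l0), and W_2 and W_3 are obtained analogously from z2 and z3; the output layer applies no activation.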

Contributing

Contributions are always welcome. Any kind of contribution, such as writing a unit test, documentation, a bug fix, or implementing the algorithms of Flex in another language, is helpful. Academic collaboration is also possible. If you need some help, please contact me via email or Twitter.

The master branch of this repository contains the latest stable release of Flex. In general, pull requests should be submitted from a separate feature branch starting from the develop branch.

For more details, see the contributing documentation.

License

All code of Flex is available to you under the MIT license.

Copyright the maintainers.
