
JuliaML / LossFunctions.jl

License: MIT
Julia package of loss functions for machine learning.


Projects that are alternatives to or similar to LossFunctions.jl

Yannl
Yet another neural network library
Stars: ✭ 37 (-58.43%)
Mutual labels:  classification, regression
PHP-ML
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+8776.4%)
Mutual labels:  classification, regression
ChemometricsTools.jl
A collection of tools for chemometrics and machine learning written in Julia.
Stars: ✭ 39 (-56.18%)
Mutual labels:  classification, regression
The Deep Learning With Keras Workshop
An Interactive Approach to Understanding Deep Learning with Keras
Stars: ✭ 34 (-61.8%)
Mutual labels:  classification, regression
MLBox
MLBox is a powerful Automated Machine Learning python library.
Stars: ✭ 1,199 (+1247.19%)
Mutual labels:  classification, regression
MLJ.jl
A Julia machine learning framework
Stars: ✭ 982 (+1003.37%)
Mutual labels:  classification, regression
openml-r
R package to interface with OpenML
Stars: ✭ 81 (-8.99%)
Mutual labels:  classification, regression
Smile
Statistical Machine Intelligence & Learning Engine
Stars: ✭ 5,412 (+5980.9%)
Mutual labels:  classification, regression
Metriculous
Measure and visualize machine learning model performance without the usual boilerplate.
Stars: ✭ 71 (-20.22%)
Mutual labels:  classification, regression
Sru Deeplearning Workshop
A 12-hour deep learning course using the Keras framework
Stars: ✭ 66 (-25.84%)
Mutual labels:  classification, regression
Tribuo
Tribuo - A Java machine learning library
Stars: ✭ 882 (+891.01%)
Mutual labels:  classification, regression
ThunderSVM
ThunderSVM: A Fast SVM Library on GPUs and CPUs
Stars: ✭ 1,282 (+1340.45%)
Mutual labels:  classification, regression
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+911.24%)
Mutual labels:  classification, regression
Dlcv for beginners
Companion code for the book 《深度学习与计算机视觉》 (Deep Learning and Computer Vision)
Stars: ✭ 1,244 (+1297.75%)
Mutual labels:  classification, regression
Tensorflow cookbook
Code for Tensorflow Machine Learning Cookbook
Stars: ✭ 5,984 (+6623.6%)
Mutual labels:  classification, regression
Machine Learning From Scratch
Succinct Machine Learning algorithm implementations from scratch in Python, solving real-world problems (Notebooks and Book). Examples of Logistic Regression, Linear Regression, Decision Trees, K-means clustering, Sentiment Analysis, Recommender Systems, Neural Networks and Reinforcement Learning.
Stars: ✭ 42 (-52.81%)
Mutual labels:  classification, regression
Php Ml Examples
Examples use case of PHP-ML library.
Stars: ✭ 526 (+491.01%)
Mutual labels:  classification, regression
AlphaPy
Automated Machine Learning [AutoML] with Python, scikit-learn, Keras, XGBoost, LightGBM, and CatBoost
Stars: ✭ 564 (+533.71%)
Mutual labels:  classification, regression
NeuralNetPlayground
A MATLAB implementation of the TensorFlow Neural Networks Playground seen on http://playground.tensorflow.org/
Stars: ✭ 60 (-32.58%)
Mutual labels:  classification, regression
pyTsetlinMachine
Implements the Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, Weighted Tsetlin Machine, and Embedding Tsetlin Machine, with support for continuous features, multigranularity, and clause indexing
Stars: ✭ 80 (-10.11%)
Mutual labels:  classification, regression

LossFunctions

LossFunctions.jl is a Julia package that provides efficient and well-tested implementations for a diverse set of loss functions that are commonly used in Machine Learning.


Available Losses

Distance-based (Regression): [plot of the distance-based losses]
Margin-based (Classification): [plot of the margin-based losses]

Please consult the documentation for other losses.
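
To make the two families concrete: a distance-based loss is a function of the difference ŷ - y, while a margin-based loss is a function of the agreement y ⋅ ŷ, where the true targets are y ∈ {-1, 1}. A minimal sketch constructing one loss of each kind (constructor names as exported by the package):

using LossFunctions

# Distance-based (regression): penalizes the residual ŷ - y.
l2 = L2DistLoss()      # L(y, ŷ) = (ŷ - y)^2

# Margin-based (classification): penalizes low agreement y ⋅ ŷ,
# assuming targets y ∈ {-1, 1}.
hinge = HingeLoss()    # L(y, ŷ) = max(0, 1 - y ⋅ ŷ)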

Introduction

Typically, the loss functions we work with in Machine Learning fall into the category of supervised losses. These are functions of two variables: the true target y, which represents the "ground truth" (i.e. the correct answer), and the predicted output ŷ, which is what our model thinks the truth is. A supervised loss function takes these two variables as input and returns a value that quantifies how "bad" the prediction is compared to the truth. In other words: the lower the loss, the better the prediction.

This package provides a considerable number of carefully implemented loss functions, as well as an API to query their properties (e.g. convexity). Furthermore, it exposes methods to compute their values, derivatives, and second derivatives for single observations as well as arbitrarily sized arrays of observations. For arrays, the user can additionally specify whether and how the element-wise results are averaged or summed.
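
A hedged sketch of that API (the value / deriv / deriv2 functions and the AggMode aggregation modes follow the package documentation at the time of this README; names and argument order may differ in newer releases, so check the docs for your installed version):

using LossFunctions

loss = L2DistLoss()

# Single observation: true target y = 1.0, predicted output ŷ = 0.7.
value(loss, 1.0, 0.7)     # (0.7 - 1.0)^2 = 0.09
deriv(loss, 1.0, 0.7)     # derivative w.r.t. ŷ: 2(ŷ - y) = -0.6
deriv2(loss, 1.0, 0.7)    # second derivative w.r.t. ŷ: 2.0

# Arrays of observations: element-wise results by default ...
y = [1.0, -0.5, 2.0]
ŷ = [0.9, -0.3, 1.5]
value(loss, y, ŷ)                    # 3-element vector of losses

# ... or aggregated with an optional aggregation mode.
value(loss, y, ŷ, AggMode.Sum())     # summed over observations
value(loss, y, ŷ, AggMode.Mean())    # averaged over observations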

Documentation

Check out the latest documentation

Additionally, you can make use of Julia's native docsystem. The following example shows how to get additional information on HingeLoss within Julia's REPL:

?HingeLoss
search: HingeLoss L2HingeLoss L1HingeLoss SmoothedL1HingeLoss

  L1HingeLoss <: MarginLoss

  The hinge loss linearly penalizes every prediction where the
  resulting agreement a = y⋅ŷ < 1 . It is Lipschitz continuous
  and convex, but not strictly convex.

  L(a) = \max \{ 0, 1 - a \}

  --------------------------------------------------------------------

                Lossfunction                     Derivative
        ┌────────────┬────────────┐      ┌────────────┬────────────┐
      3 │'\.                      │    0 │                  ┌------│
        │  ''_                    │      │                  |      │
        │     \.                  │      │                  |      │
        │       '.                │      │                  |      │
      L │         ''_             │   L' │                  |      │
        │            \.           │      │                  |      │
        │              '.         │      │                  |      │
      0 │                ''_______│   -1 │------------------┘      │
        └────────────┴────────────┘      └────────────┴────────────┘
        -2                        2      -2                        2
                   y ⋅ ŷ                            y ⋅ ŷ
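
The properties stated in that docstring can also be checked programmatically via the property-query API mentioned in the introduction (a sketch; isconvex, isstrictlyconvex, and islipschitzcont are among the trait functions described in the documentation):

julia> isconvex(HingeLoss())
true

julia> isstrictlyconvex(HingeLoss())
false

julia> islipschitzcont(HingeLoss())
true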

Installation

Get the latest stable release with Julia's package manager:

] add LossFunctions

License

This code is free to use under the terms of the MIT license.
