
lalvim / PartialLeastSquaresRegressor.jl

Licence: MIT License
Implementation of a Partial Least Squares Regressor

Programming Languages

julia
2034 projects

Projects that are alternatives of or similar to PartialLeastSquaresRegressor.jl

regression-python
In this repository you can find many different, small, projects which demonstrate regression techniques using python programming language
Stars: ✭ 15 (-51.61%)
Mutual labels:  machine-learning-algorithms, regression, regression-algorithms
Kickstarter-Anticipator
The main aim of this project is to predict whether a Kickstarter project will succeed or fail by applying a machine learning algorithm. LOGISTIC REGRESSION is used to determine a project's success by splitting the data into training and testing sets and predicting the successful ones.
Stars: ✭ 13 (-58.06%)
Mutual labels:  regression, regression-algorithms, regression-analysis
Machine learning
Study and implementation of the main Machine Learning algorithms in Jupyter Notebooks.
Stars: ✭ 161 (+419.35%)
Mutual labels:  machine-learning-algorithms, regression
Dynaml
Scala Library/REPL for Machine Learning Research
Stars: ✭ 195 (+529.03%)
Mutual labels:  machine-learning-algorithms, regression
lolo
A random forest
Stars: ✭ 37 (+19.35%)
Mutual labels:  machine-learning-algorithms, regression
Machine Learning Algorithms
A curated list of almost all machine learning algorithms and deep learning algorithms grouped by category.
Stars: ✭ 92 (+196.77%)
Mutual labels:  machine-learning-algorithms, regression
Machine Learning Concepts
Machine Learning Concepts with Concepts
Stars: ✭ 134 (+332.26%)
Mutual labels:  machine-learning-algorithms, regression
Machine-Learning-Algorithms
All Machine Learning Algorithms
Stars: ✭ 24 (-22.58%)
Mutual labels:  machine-learning-algorithms, regression
Machine Learning From Scratch
Succinct Machine Learning algorithm implementations from scratch in Python, solving real-world problems (Notebooks and Book). Examples of Logistic Regression, Linear Regression, Decision Trees, K-means clustering, Sentiment Analysis, Recommender Systems, Neural Networks and Reinforcement Learning.
Stars: ✭ 42 (+35.48%)
Mutual labels:  machine-learning-algorithms, regression
cheapml
Machine Learning algorithms coded from scratch
Stars: ✭ 17 (-45.16%)
Mutual labels:  machine-learning-algorithms, regression
pycobra
Python library implementing ensemble methods for regression, classification and visualisation tools, including Voronoi tessellations.
Stars: ✭ 111 (+258.06%)
Mutual labels:  machine-learning-algorithms, regression
brglm2
Estimation and inference from generalized linear models using explicit and implicit methods for bias reduction
Stars: ✭ 18 (-41.94%)
Mutual labels:  regression, regression-algorithms
Openml R
R package to interface with OpenML
Stars: ✭ 81 (+161.29%)
Mutual labels:  machine-learning-algorithms, regression
Mlkit
A simple machine learning framework written in Swift 🤖
Stars: ✭ 144 (+364.52%)
Mutual labels:  machine-learning-algorithms, regression
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+25383.87%)
Mutual labels:  machine-learning-algorithms, regression
ARCHModels.jl
A Julia package for estimating ARMA-GARCH models.
Stars: ✭ 63 (+103.23%)
Mutual labels:  julia-language, regression
ml course
"Learning Machine Learning" Course, Bogotá, Colombia 2019 #LML2019
Stars: ✭ 22 (-29.03%)
Mutual labels:  machine-learning-algorithms, regression
ugtm
ugtm: a Python package for Generative Topographic Mapping
Stars: ✭ 34 (+9.68%)
Mutual labels:  machine-learning-algorithms, regression
interactive-simple-linear-regression
A PureScript, browser-based implementation of simple linear regression.
Stars: ✭ 15 (-51.61%)
Mutual labels:  machine-learning-algorithms, regression
Seating Chart
Optimizing a Wedding Reception Seating Chart Using a Genetic Algorithm
Stars: ✭ 25 (-19.35%)
Mutual labels:  machine-learning-algorithms

PartialLeastSquaresRegressor.jl

The PartialLeastSquaresRegressor.jl package provides Partial Least Squares regression methods, containing the PLS1, PLS2, and Kernel PLS2 NIPALS algorithms. It is intended mainly for regression; however, for classification tasks you can binarize the targets (obtaining multiple targets, one per class) and then apply KPLS.
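
As a sketch of that classification idea (illustrative only: the labels below are hypothetical, and the KPLS fit itself is left as a comment since it follows the MLJ workflow shown in the examples further down):

# illustrative labels for a 3-class problem
labels = ["cat", "dog", "cat", "bird"]
classes = sort(unique(labels))

# binarize: one indicator column per class, giving multiple targets
Y = hcat((Float64.(labels .== c) for c in classes)...)

# fit KPLS on (X, Y) as in the examples below, predict Yhat, then
# decode each row back to the class with the highest score:
# predicted = [classes[argmax(row)] for row in eachrow(Yhat)]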

Install

The package can be installed with the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and run:

pkg> add PartialLeastSquaresRegressor

Or, equivalently, via the Pkg API:

julia> import Pkg; Pkg.add("PartialLeastSquaresRegressor")

Using

PartialLeastSquaresRegressor is used through the MLJ machine learning framework. Here are a few examples showing the package's functionality:

Example 1

using MLJBase, RDatasets, MLJModels
@load PLSRegressor pkg=PartialLeastSquaresRegressor

# loading data and selecting some features
data = dataset("datasets", "longley")[:, 2:5]

# unpacking the target
y, X = unpack(data, ==(:GNP), colname -> true)

# loading the model
regressor = PLSRegressor(n_factors=2)

# building a pipeline with scaling on data
pls_model = @pipeline Standardizer regressor target=Standardizer

# a simple hold-out
train, test = partition(eachindex(y), 0.7, shuffle=true)

pls_machine = machine(pls_model, X, y)

fit!(pls_machine, rows=train)

yhat = predict(pls_machine, rows=test)

mae(yhat, y[test]) |> mean
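
As an aside, the same holdout evaluation can be expressed in a single call with MLJ's evaluate! (a sketch assuming the machine above; Holdout and mae are provided by MLJBase):

evaluate!(pls_machine,
          resampling = Holdout(fraction_train = 0.7, shuffle = true),
          measure = mae)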

Example 2

using MLJBase, RDatasets, MLJTuning, MLJModels
@load KPLSRegressor pkg=PartialLeastSquaresRegressor

# loading data and selecting some features
data = dataset("datasets", "longley")[:, 2:5]

# unpacking the target
y, X = unpack(data, ==(:GNP), colname -> true)

# loading the model
pls_model = KPLSRegressor()

# defining hyperparameters for tuning
r1 = range(pls_model, :width, lower=0.001, upper=100.0, scale=:log)

# attaching tune
self_tuning_pls_model = TunedModel(model = pls_model,
                                   resampling = CV(nfolds = 10),
                                   tuning = Grid(resolution = 100),
                                   range = [r1],
                                   measure = mae)

# putting into the machine
self_tuning_pls = machine(self_tuning_pls_model, X, y)

# fitting with tuning
fit!(self_tuning_pls, verbosity=0)

# getting the report
report(self_tuning_pls)
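
The report includes the search history and the best model found; as a sketch (field names as exposed by MLJTuning, assuming a recent version):

# best model found by the grid search, and its tuned kernel width
best = report(self_tuning_pls).best_model
best.width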

What is Implemented

  • A fast linear algorithm for single targets (PLS1 - NIPALS; a sketch follows this list)
  • A linear algorithm for multiple targets (PLS2 - NIPALS)
  • A non-linear algorithm for multiple targets (Kernel PLS2 - NIPALS)
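
For intuition, here is a minimal, self-contained sketch of the PLS1 NIPALS loop, an illustration of the algorithm rather than the package's internal code; X and y are assumed to be centered:

using LinearAlgebra

# For each latent variable: take the weight vector w from the covariance
# of X with y, form the score t = X*w, then deflate X and y and repeat.
function pls1_nipals(X::Matrix{Float64}, y::Vector{Float64}, n_factors::Int)
    W = zeros(size(X, 2), n_factors)   # X weights
    T = zeros(size(X, 1), n_factors)   # scores
    for k in 1:n_factors
        w = X' * y; w ./= norm(w)      # direction of maximal covariance with y
        t = X * w                      # score for this latent variable
        p = X' * t / dot(t, t)         # X loading
        q = dot(y, t) / dot(t, t)      # y loading
        X -= t * p'                    # deflate X ...
        y -= q .* t                    # ... and y
        W[:, k] = w
        T[:, k] = t
    end
    return W, T
end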

Model Description

  • PLS - PLS MLJ model (PLS1 or PLS2)

    • n_factors::Int = 10 - The number of latent variables to explain the data.
  • KPLS - Kernel PLS MLJ model

    • n_factors::Int = 10 - The number of latent variables to explain the data.
    • kernel::AbstractString = "rbf" - the non-linear kernel to use.
    • width::AbstractFloat = 1.0 - the width of the RBF kernel (the hyperparameter tuned in Example 2).
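
A small instantiation sketch matching the descriptions above (assuming both model types have been brought into scope with @load as in the examples; the values are illustrative):

# linear PLS with 3 latent variables
pls = PLSRegressor(n_factors = 3)

# kernel PLS with an RBF kernel of width 0.1
kpls = KPLSRegressor(n_factors = 3, kernel = "rbf", width = 0.1)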

References

  • PLS1 and PLS2 based on

  • A Kernel PLS2 based on

  • NIPALS: Nonlinear Iterative Partial Least Squares

    • Wold, H. (1966). Estimation of principal components and related models by iterative least squares. In P. R. Krishnaiah (Ed.), Multivariate Analysis (pp. 391-420). New York: Academic Press.
  • SIMPLS: more efficient, optimal result

    • Supports multivariate Y
    • De Jong, S. (1993). SIMPLS: an alternative approach to partial least squares regression. Chemometrics and Intelligent Laboratory Systems, 18: 251-263.

License

PartialLeastSquaresRegressor.jl is free software: you can redistribute it and/or modify it under the terms of the MIT "Expat" License. A copy of this license is provided in the LICENSE file.
