

The Scalable Highly Adaptive Lasso

Authors: Jeremy Coyle, Nima Hejazi, and Mark van der Laan


What’s hal9001?

hal9001 is an R package implementing the scalable highly adaptive lasso (HAL), a nonparametric regression estimator that applies L1-penalized (lasso) regression to a design matrix of indicator basis functions generated from the covariates and their interactions. HAL regression can estimate arbitrarily complex functional forms at fast (near-parametric) convergence rates under only global smoothness assumptions (van der Laan 2017a; Bibaut and van der Laan 2019). For detailed theoretical discussion of the highly adaptive lasso estimator, see, for example, van der Laan (2017a), van der Laan (2017b), and van der Laan and Bibaut (2017); for a computational demonstration of the versatility of HAL regression, see Benkeser and van der Laan (2016). Recent theoretical work has shown that efficient estimators of complex parameters can be constructed when particular variants of HAL regression are used to estimate nuisance parameters (e.g., van der Laan, Benkeser, and Cai 2019; Ertefaie, Hejazi, and van der Laan 2020).
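To make the design-matrix construction concrete, here is a minimal base-R sketch of HAL's zero-order basis for a single covariate (hal9001's actual implementation handles higher-order bases and interaction terms, and is written in C++ via Rcpp, so this is illustrative only): each observed value x_j defines an indicator basis function 1(x >= x_j), and the design matrix evaluates every indicator at every observation.

```r
# toy covariate values (sorted for readability)
x <- c(0.2, 0.5, 0.9)

# zero-order HAL basis: column j is the indicator 1(x >= x[j]),
# evaluated at every observation, so entry [i, j] = 1(x[i] >= x[j])
design <- outer(x, x, FUN = ">=") * 1L
design
#>      [,1] [,2] [,3]
#> [1,]    1    0    0
#> [2,]    1    1    0
#> [3,]    1    1    1
```

The lasso is then fit over such columns (and, with multiple covariates, over products of indicators), yielding a piecewise-constant fit whose knots are selected data-adaptively by the L1 penalty.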


Installation

For standard use, we recommend installing the package from CRAN via

install.packages("hal9001")

To contribute, install the development version of hal9001 from GitHub via remotes:

remotes::install_github("tlverse/hal9001")

Issues

If you encounter any bugs or have any specific feature requests, please file an issue.


Example

Consider the following minimal example of using hal9001 to generate predictions via Highly Adaptive Lasso regression:

# load the package and set a seed
library(hal9001)
#> Loading required package: Rcpp
#> hal9001 v0.2.8: The Scalable Highly Adaptive Lasso
set.seed(385971)

# simulate data
n <- 100
p <- 3
x <- matrix(rnorm(n * p), n, p)
y <- x[, 1] * sin(x[, 2]) + rnorm(n, mean = 0, sd = 0.2)

# fit the HAL regression
hal_fit <- fit_hal(X = x, Y = y)
#> [1] "I'm sorry, Dave. I'm afraid I can't do that."
hal_fit$times
#>                   user.self sys.self elapsed user.child sys.child
#> enumerate_basis       0.003    0.000   0.003          0         0
#> design_matrix         0.003    0.000   0.002          0         0
#> reduce_basis          0.000    0.000   0.000          0         0
#> remove_duplicates     0.010    0.000   0.011          0         0
#> lasso                 0.289    0.012   0.304          0         0
#> total                 0.306    0.012   0.321          0         0

# in-sample predictions and mean squared error
preds <- predict(hal_fit, new_data = x)
hal_mse <- mean((preds - y)^2)
hal_mse
#> [1] 0.006991539
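As a quick sanity check on generalization (not part of the package's documented workflow, just a continuation of the session above), one might evaluate the fit on an independent draw from the same data-generating process, reusing only the `predict()` call shown above:

```r
# simulate an independent test set from the same data-generating
# process used for training above
x_new <- matrix(rnorm(n * p), n, p)
y_new <- x_new[, 1] * sin(x_new[, 2]) + rnorm(n, mean = 0, sd = 0.2)

# out-of-sample predictions and mean squared error
preds_new <- predict(hal_fit, new_data = x_new)
mean((preds_new - y_new)^2)
```

The out-of-sample MSE will typically exceed the in-sample value reported above, since the training-sample fit is optimistic.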

Contributions

Contributions are very welcome. Interested contributors should consult our contribution guidelines prior to submitting a pull request.


Citation

After using the hal9001 R package, please cite both of the following:

    @software{coyle2020hal9001-rpkg,
      author = {Coyle, Jeremy R and Hejazi, Nima S and {van der Laan}, Mark
        J},
      title = {{hal9001}: The scalable highly adaptive lasso},
      year  = {2020},
      url = {https://doi.org/10.5281/zenodo.3558313},
  doi = {10.5281/zenodo.3558313},
  note = {{R} package version 0.2.7}
    }

    @article{hejazi2020hal9001-joss,
      author = {Hejazi, Nima S and Coyle, Jeremy R and {van der Laan}, Mark
        J},
      title = {{hal9001}: Scalable highly adaptive lasso regression in
        {R}},
      year  = {2020},
      url = {https://doi.org/10.21105/joss.02526},
      doi = {10.21105/joss.02526},
      journal = {Journal of Open Source Software},
      publisher = {The Open Journal}
    }

License

© 2017-2021 Jeremy R. Coyle & Nima S. Hejazi

The contents of this repository are distributed under the GPL-3 license. See file LICENSE for details.


References

Benkeser, David, and Mark J van der Laan. 2016. “The Highly Adaptive Lasso Estimator.” In 2016 IEEE International Conference on Data Science and Advanced Analytics (DSAA). IEEE. https://doi.org/10.1109/dsaa.2016.93.

Bibaut, Aurélien F, and Mark J van der Laan. 2019. “Fast Rates for Empirical Risk Minimization over Càdlàg Functions with Bounded Sectional Variation Norm.” https://arxiv.org/abs/1907.09244.

Ertefaie, Ashkan, Nima S Hejazi, and Mark J van der Laan. 2020. “Nonparametric Inverse Probability Weighted Estimators Based on the Highly Adaptive Lasso.” https://arxiv.org/abs/2005.11303.

van der Laan, Mark J. 2017a. “A Generally Efficient Targeted Minimum Loss Based Estimator Based on the Highly Adaptive Lasso.” The International Journal of Biostatistics. https://doi.org/10.1515/ijb-2015-0097.

———. 2017b. “Finite Sample Inference for Targeted Learning.” https://arxiv.org/abs/1708.09502.

van der Laan, Mark J, David Benkeser, and Weixin Cai. 2019. “Efficient Estimation of Pathwise Differentiable Target Parameters with the Undersmoothed Highly Adaptive Lasso.” https://arxiv.org/abs/1908.05607.

van der Laan, Mark J, and Aurélien F Bibaut. 2017. “Uniform Consistency of the Highly Adaptive Lasso Estimator of Infinite-Dimensional Parameters.” https://arxiv.org/abs/1709.06256.
