lolo - A random forest
Stars: ✭ 37 (+54.17%)
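Since random forests are the common thread of this list, here is a minimal sketch of the technique using scikit-learn (illustrative only; lolo's own API differs):

```python
# A random forest is an ensemble of decision trees, each trained on a
# bootstrap sample of the data with a random feature subset considered
# at every split; predictions are made by majority vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Increasing `n_estimators` reduces variance at the cost of training time; the trees themselves are left deep and unpruned.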
Orange3 - 🍊 📊 💡 Orange: Interactive data analysis
Stars: ✭ 3,152 (+13033.33%)
Machine Learning Is All You Need - 🔥🌟《Machine Learning 格物志》: ML + DL + RL basic code and notes with sklearn, PyTorch, TensorFlow, Keras and, most importantly, from scratch! 💪 This repository is all you need!
Stars: ✭ 173 (+620.83%)
KramersMoyal - kramersmoyal: Kramers-Moyal coefficients for stochastic data of any dimension, to any desired order
Stars: ✭ 53 (+120.83%)
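The first Kramers-Moyal coefficient (the drift) can be estimated from a time series as the conditional mean increment per unit time. A from-scratch sketch on a simulated Ornstein-Uhlenbeck process (this does not use the kramersmoyal package's API; the binned estimator here is just the textbook definition):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, dt, n = 1.0, 0.5, 1e-3, 500_000

# Euler-Maruyama simulation of dX = -theta*X dt + sigma dW, whose first
# Kramers-Moyal coefficient (drift) is D1(x) = -theta * x.
x = np.empty(n)
x[0] = 0.0
noise = rng.normal(0.0, np.sqrt(dt), n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * noise[i]

# Estimate D1(x) per state-space bin as the conditional mean increment / dt.
increments = np.diff(x)
bins = np.linspace(-0.6, 0.6, 13)
idx = np.digitize(x[:-1], bins)
centers, d1 = [], []
for b in range(1, len(bins)):
    mask = idx == b
    if mask.sum() > 1000:
        centers.append(0.5 * (bins[b - 1] + bins[b]))
        d1.append(increments[mask].mean() / dt)

slope = np.polyfit(centers, d1, 1)[0]  # should be close to -theta
```

Higher-order coefficients follow the same recipe with higher powers of the increment (divided by n! and dt).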
Machine-Learning-Models - In this repository I implement machine learning methods ranging from simple to complex, aiming for template-style code.
Stars: ✭ 30 (+25%)
Infiniteboost - InfiniteBoost: building infinite ensembles with gradient descent
Stars: ✭ 180 (+650%)
NanoFlow - PyTorch implementation of the paper "NanoFlow: Scalable Normalizing Flows with Sublinear Parameter Complexity" (NeurIPS 2020)
Stars: ✭ 63 (+162.5%)
Benchm Ml - A minimal benchmark for scalability, speed and accuracy of commonly used open-source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib, etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks, etc.)
Stars: ✭ 1,835 (+7545.83%)
MultiKDE.jl - Multivariate kernel density estimation
Stars: ✭ 30 (+25%)
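MultiKDE.jl is a Julia package; as a language-consistent illustration of the underlying technique, here is a one-dimensional Gaussian KDE sketched from scratch in Python (the function name and bandwidth rule are illustrative, not the package's API):

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    """Kernel density estimate: average a Gaussian bump centred on each sample."""
    z = (grid[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=1) / bandwidth

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 2000)
grid = np.linspace(-4.0, 4.0, 81)

# Silverman's rule-of-thumb bandwidth, reasonable for roughly Gaussian data.
h = 1.06 * samples.std() * len(samples) ** (-1 / 5)
density = gaussian_kde(samples, grid, h)
```

The multivariate case replaces the scalar kernel with a product of per-dimension kernels, each with its own bandwidth.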
receiptdID - Receipt.ID is a multi-label, multi-class, hierarchical classification system implemented with a two-layer feed-forward network.
Stars: ✭ 22 (-8.33%)
Shifu - An end-to-end machine learning and data mining framework on Hadoop
Stars: ✭ 207 (+762.5%)
Chefboost - A lightweight decision tree framework for Python, supporting standard algorithms (ID3, C4.5, CART, CHAID and regression trees) and advanced techniques (gradient boosting: GBDT, GBRT, GBM; random forest; AdaBoost), with categorical feature support
Stars: ✭ 176 (+633.33%)
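ID3, the simplest algorithm in Chefboost's list, grows a tree by repeatedly splitting on the attribute with the highest information gain. A minimal from-scratch sketch of that criterion (helper names are hypothetical, not Chefboost's API):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(rows, labels, attr):
    """Entropy reduction from partitioning the rows on a categorical attribute."""
    split = {}
    for row, label in zip(rows, labels):
        split.setdefault(row[attr], []).append(label)
    weighted = sum(len(subset) / len(labels) * entropy(subset)
                   for subset in split.values())
    return entropy(labels) - weighted

# Tiny "play tennis"-style example: splitting on 'outlook' separates the
# classes perfectly, so the gain equals the full label entropy.
rows = [{"outlook": "sunny"}, {"outlook": "sunny"},
        {"outlook": "overcast"}, {"outlook": "rain"}]
labels = ["no", "no", "yes", "yes"]
gain = information_gain(rows, labels, "outlook")
```

C4.5 refines this by normalizing the gain by the split's own entropy (gain ratio), which penalizes attributes with many distinct values.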
Emlearn - Machine learning inference engine for microcontrollers and embedded devices
Stars: ✭ 154 (+541.67%)
normalizing-flows - PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+1029.17%)
Awesome Decision Tree Papers - A collection of research papers on decision, classification and regression trees, with implementations
Stars: ✭ 1,908 (+7850%)
Github-Stars-Predictor - A GitHub star predictor that tries to predict the star count of any repository with more than 100 stars
Stars: ✭ 34 (+41.67%)
naru - Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (+216.67%)
Ml Projects - ML-based projects in Python, such as spam classification, time series analysis, and text classification using random forests, deep learning, Bayesian methods and XGBoost
Stars: ✭ 127 (+429.17%)
soft-intro-vae-pytorch - [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+608.33%)
cheapml - Machine learning algorithms coded from scratch
Stars: ✭ 17 (-29.17%)
Shapley regressions - Statistical inference on machine learning or general non-parametric models
Stars: ✭ 37 (+54.17%)
AIML-Projects - Projects completed as part of Great Learning's PGP in Artificial Intelligence and Machine Learning
Stars: ✭ 85 (+254.17%)
wetlandmapR - Scripts, tools and example data for mapping wetland ecosystems using data-driven statistical methods in R, such as random forests, and open-source GIS
Stars: ✭ 16 (-33.33%)
Decision Tree Js - Small JavaScript implementation of the ID3 decision tree
Stars: ✭ 253 (+954.17%)
scoruby - Ruby scoring API for PMML
Stars: ✭ 69 (+187.5%)
Quickml - A fast and easy-to-use decision tree learner in Java
Stars: ✭ 230 (+858.33%)
Loan-Web - ML-powered loan-marketer customer filtering engine
Stars: ✭ 13 (-45.83%)
Gumbel-CRF - Implementation of the NeurIPS 2020 paper "Latent Template Induction with Gumbel-CRFs"
Stars: ✭ 51 (+112.5%)
Tensorflow Ml Nlp - Natural language processing with TensorFlow and machine learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (+633.33%)
eForest - Official implementation of the paper "AutoEncoder by Forest"
Stars: ✭ 71 (+195.83%)
Randomforestexplainer - A set of tools to understand what is happening inside a random forest
Stars: ✭ 175 (+629.17%)
Machine Learning Models - Decision Trees, Random Forest, Dynamic Time Warping, Naive Bayes, KNN, Linear Regression, Logistic Regression, Mixture of Gaussians, Neural Network, PCA, SVD, Gaussian Naive Bayes, Fitting Data to a Gaussian, K-Means
Stars: ✭ 160 (+566.67%)
pykitml - Machine learning library written in Python and NumPy
Stars: ✭ 26 (+8.33%)
Machine Learning With Python - Practice and tutorial-style notebooks covering a wide variety of machine learning techniques
Stars: ✭ 2,197 (+9054.17%)
goscore - Go scoring API for PMML
Stars: ✭ 85 (+254.17%)
Machine Learning In R - Workshop (6 hours): preprocessing, cross-validation, lasso, decision trees, random forest, xgboost, SuperLearner ensembles
Stars: ✭ 144 (+500%)
rfvis - A tool for visualizing the structure and performance of random forests 🌳
Stars: ✭ 20 (-16.67%)
Trajectory-Analysis-and-Classification-in-Python-Pandas-and-Scikit-Learn - Formed trajectories from sets of points. Experimented with finding similarities between trajectories using the DTW (Dynamic Time Warping) and LCSS (Longest Common SubSequence) algorithms. Modeled trajectories as strings based on a grid representation. Benchmarked KNN, random forest and logistic regression classification algorithms to classify efficiently t…
Stars: ✭ 41 (+70.83%)
xforest - A super-fast and scalable random forest library based on a fast histogram decision tree algorithm and a distributed bagging framework. It can be used for binary classification, multi-label classification and regression tasks, and provides both a Python and a command-line interface.
Stars: ✭ 20 (-16.67%)
Isl Python - Solutions to labs and exercises from An Introduction to Statistical Learning, as Jupyter notebooks
Stars: ✭ 108 (+350%)
handson-ml - Jupyter notebooks containing the examples and exercises from the book "Hands-On Machine Learning"
Stars: ✭ 285 (+1087.5%)
KernelEstimator.jl - A Julia package for nonparametric density estimation and regression
Stars: ✭ 25 (+4.17%)
dlime experiments - In this work, we propose a deterministic version of Local Interpretable Model-Agnostic Explanations (LIME); experimental results on three different medical datasets show the superiority of Deterministic Local Interpretable Model-Agnostic Explanations (DLIME)
Stars: ✭ 21 (-12.5%)