Kaggle Competitions: There are plenty of courses and tutorials that can help you learn machine learning from scratch, but here on GitHub I want to solve some Kaggle competitions as a comprehensive workflow with Python packages. After reading, you can use this workflow to solve other real problems and use it as a template.
Stars: ★ 86 (+160.61%)
Machine Learning Workflow With Python: A comprehensive tour of ML techniques with Python: Define the Problem - Specify Inputs & Outputs - Data Collection - Exploratory Data Analysis - Data Preprocessing - Model Design - Training - Evaluation
Stars: ★ 157 (+375.76%)
fastknn: Fast k-Nearest Neighbors Classifier for Large Datasets
Stars: ★ 64 (+93.94%)
Chefboost: A lightweight decision tree framework for Python supporting regular algorithms (ID3, C4.5, CART, CHAID and regression trees) as well as some advanced techniques (gradient boosting (GBDT, GBRT, GBM), random forest and AdaBoost), with categorical feature support.
Stars: ★ 176 (+433.33%)
Data-Science: Using Kaggle data and real-world data for data science and prediction in Python, R, Excel, Power BI, and Tableau.
Stars: ★ 15 (-54.55%)
50-days-of-Statistics-for-Data-Science: This repository consists of a 50-day program. All the statistics required for a complete understanding of data science will be uploaded to this repository.
Stars: ★ 19 (-42.42%)
supervised-machine-learning: This repo contains regression and classification projects. Examples: development of predictive models for comments on social media websites; building classifiers to predict outcomes in sports competitions; churn analysis; prediction of clicks on online ads; analysis of the opioids crisis; and an analysis of retail store expansion strategies using…
Stars: ★ 34 (+3.03%)
linear-tree: A Python library to build model trees with linear models at the leaves.
Stars: ★ 128 (+287.88%)
Segmentation: TensorFlow implementation: U-Net and FCN with global convolution
Stars: ★ 101 (+206.06%)
Nyaggle: Code for Kaggle and offline competitions
Stars: ★ 209 (+533.33%)
MLDay18: Material from "Random Forests and Gradient Boosting Machines in R", presented at Machine Learning Day '18
Stars: ★ 15 (-54.55%)
AdaptiveRandomForest: Repository for the AdaptiveRandomForest algorithm implemented in MOA 2016-04
Stars: ★ 28 (-15.15%)
Sharplearning: Machine learning for C# .NET
Stars: ★ 294 (+790.91%)
Dtreeviz: A Python library for decision tree visualization and model interpretation.
Stars: ★ 1,857 (+5527.27%)
Machinelearnjs: Machine learning library for the web and Node.
Stars: ★ 498 (+1409.09%)
Machine Learning In R: Workshop (6 hours): preprocessing, cross-validation, lasso, decision trees, random forest, xgboost, SuperLearner ensembles
Stars: ★ 144 (+336.36%)
Machine Learning With Python: Practice and tutorial-style notebooks covering a wide variety of machine learning techniques
Stars: ★ 2,197 (+6557.58%)
fer: Facial Expression Recognition
Stars: ★ 32 (-3.03%)
Machine Learning Models: Decision Trees, Random Forest, Dynamic Time Warping, Naive Bayes, KNN, Linear Regression, Logistic Regression, Mixture of Gaussians, Neural Network, PCA, SVD, Gaussian Naive Bayes, Fitting Data to a Gaussian, K-Means
Stars: ★ 160 (+384.85%)
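Most of the models listed above also ship with scikit-learn; as a point of comparison (these are the sklearn implementations, not this repository's own), a few of them can be cross-validated on one dataset like so:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "naive bayes": GaussianNB(),
    "knn": KNeighborsClassifier(n_neighbors=5),
    "logistic regression": LogisticRegression(max_iter=1000),
}

# 5-fold cross-validated accuracy for each model on the same data.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```

Comparing everything through one `cross_val_score` loop keeps the evaluation protocol identical across models.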
Orange3: 🍊 📊 💡 Orange: Interactive data analysis
Stars: ★ 3,152 (+9451.52%)
tsflex: Flexible time series feature extraction & processing
Stars: ★ 252 (+663.64%)
Kaggle Notebooks: Sample notebooks for Kaggle competitions
Stars: ★ 77 (+133.33%)
Lightautoml: LAMA - automatic model creation framework
Stars: ★ 196 (+493.94%)
Lightgbm: A fast, distributed, high-performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks.
Stars: ★ 13,293 (+40181.82%)
rfvis: A tool for visualizing the structure and performance of Random Forests 🌳
Stars: ★ 20 (-39.39%)
Amazon-Fine-Food-Review: Machine learning algorithms such as KNN, Naive Bayes, logistic regression, SVM, decision trees, random forest, k-means and truncated SVD on the Amazon Fine Food Reviews dataset
Stars: ★ 28 (-15.15%)
yggdrasil-decision-forests: A collection of state-of-the-art algorithms for the training, serving and interpretation of decision forest models.
Stars: ★ 156 (+372.73%)
Mljar Supervised: Automated machine learning pipeline with feature engineering and hyper-parameter tuning
Stars: ★ 961 (+2812.12%)
Home Credit Default Risk: Default risk prediction for the Home Credit competition - a fast, scalable and maintainable SQL-based feature engineering pipeline
Stars: ★ 68 (+106.06%)
Statistical-Learning-using-R: A statistical learning application consisting of various machine learning algorithms, their implementation in R, and their in-depth interpretation. Documents and reports related to the techniques mentioned below can be found on my RPubs profile.
Stars: ★ 27 (-18.18%)
Predicting real estate prices using scikit-learn: Predicting Amsterdam house / real estate prices using Ordinary Least Squares, XGBoost, KNN, Lasso, Ridge, Polynomial, Random Forest, and Neural Network (MLP) regression (via scikit-learn)
Stars: ★ 78 (+136.36%)
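The same family of regressors can be compared on any tabular dataset through scikit-learn; a sketch using the bundled diabetes data as a stand-in for the Amsterdam prices (which are not included here), with the MLP and XGBoost models omitted for brevity:

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Illustrative line-up mirroring the repo's comparison of regressors.
models = {
    "OLS": LinearRegression(),
    "Lasso": Lasso(alpha=0.1),
    "Ridge": Ridge(alpha=1.0),
    "KNN": KNeighborsRegressor(n_neighbors=10),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=42),
}

# Fit each model on the same split and report held-out R^2.
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: R^2 = {model.score(X_test, y_test):.3f}")
```

Holding the train/test split fixed across models is what makes the R^2 numbers comparable.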
Machine Learning Is All You Need: 🔥《Machine Learning 格物志》: ML + DL + RL basic codes and notes with sklearn, PyTorch, TensorFlow, Keras and, most importantly, from scratch! 💪 This repository is all you need!
Stars: ★ 173 (+424.24%)
kaggle-berlin: Material of the Kaggle Berlin meetup group!
Stars: ★ 36 (+9.09%)
digit recognizer: CNN digit recognizer implemented in a Keras notebook, Kaggle/MNIST (0.995).
Stars: ★ 27 (-18.18%)
Kaggle: Kaggle kernels (Python, R, Jupyter notebooks)
Stars: ★ 26 (-21.21%)
Sporf: Implementation of Sparse Projection Oblique Randomer Forests
Stars: ★ 70 (+112.12%)
dku-kaggle-class: Schedule and lecture materials for the 2020 open-source software design ("Kaggle cracking") course at Dankook University's SW-Centered University program
Stars: ★ 48 (+45.45%)
StoreItemDemand: (117th place - top 26%) Deep learning using Keras and Spark for the "Store Item Demand Forecasting" Kaggle competition.
Stars: ★ 24 (-27.27%)
Tpot: A Python automated machine learning tool that optimizes machine learning pipelines using genetic programming.
Stars: ★ 8,378 (+25287.88%)
goscore: Go scoring API for PMML
Stars: ★ 85 (+157.58%)
argus-tgs-salt: Kaggle | 14th place solution for the TGS Salt Identification Challenge
Stars: ★ 73 (+121.21%)