RainingComputers / pykitml

License: MIT
Machine Learning library written in Python and NumPy.

Programming Languages

Python: 139,335 projects (#7 most used programming language)
Makefile: 30,231 projects

Projects that are alternatives of or similar to pykitml

Machine Learning With Python
Practice and tutorial-style notebooks covering a wide variety of machine learning techniques
Stars: ✭ 2,197 (+8350%)
Mutual labels:  random-forest
Textclassification
several methods for text classification
Stars: ✭ 180 (+592.31%)
Mutual labels:  random-forest
AIML-Projects
Projects I completed as a part of Great Learning's PGP - Artificial Intelligence and Machine Learning
Stars: ✭ 85 (+226.92%)
Mutual labels:  random-forest
Machine Learning Models
Decision Trees, Random Forest, Dynamic Time Warping, Naive Bayes, KNN, Linear Regression, Logistic Regression, Mixture Of Gaussian, Neural Network, PCA, SVD, Gaussian Naive Bayes, Fitting Data to Gaussian, K-Means
Stars: ✭ 160 (+515.38%)
Mutual labels:  random-forest
Tensorflow Ml Nlp
Natural language processing with TensorFlow and machine learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (+576.92%)
Mutual labels:  random-forest
Quickml
A fast and easy-to-use decision tree learner in Java
Stars: ✭ 230 (+784.62%)
Mutual labels:  random-forest
Machine Learning In R
Workshop (6 hours): preprocessing, cross-validation, lasso, decision trees, random forest, xgboost, superlearner ensembles
Stars: ✭ 144 (+453.85%)
Mutual labels:  random-forest
Trajectory-Analysis-and-Classification-in-Python-Pandas-and-Scikit-Learn
Formed trajectories from sets of points. Experimented with finding similarities between trajectories based on the DTW (Dynamic Time Warping) and LCSS (Longest Common SubSequence) algorithms. Modeled trajectories as strings based on a grid representation. Benchmarked KNN, Random Forest, and Logistic Regression classification algorithms to classify efficiently t…
Stars: ✭ 41 (+57.69%)
Mutual labels:  random-forest
Infiniteboost
InfiniteBoost: building infinite ensembles with gradient descent
Stars: ✭ 180 (+592.31%)
Mutual labels:  random-forest
decision-trees-for-ml
Building Decision Trees From Scratch In Python
Stars: ✭ 61 (+134.62%)
Mutual labels:  random-forest
Machine Learning Is All You Need
🔥🌟《Machine Learning 格物志》: ML + DL + RL basic code and notes using sklearn, PyTorch, TensorFlow, Keras and, most importantly, from scratch! 💪 This repository is all you need!
Stars: ✭ 173 (+565.38%)
Mutual labels:  random-forest
Chefboost
A lightweight decision tree framework for Python supporting regular algorithms (ID3, C4.5, CART, CHAID, and regression trees) and some advanced techniques (gradient boosting: GBDT, GBRT, GBM; random forest; AdaBoost), with categorical feature support
Stars: ✭ 176 (+576.92%)
Mutual labels:  random-forest
Orange3
🍊 📊 💡 Orange: Interactive data analysis
Stars: ✭ 3,152 (+12023.08%)
Mutual labels:  random-forest
Emlearn
Machine Learning inference engine for Microcontrollers and Embedded devices
Stars: ✭ 154 (+492.31%)
Mutual labels:  random-forest
cqr
Conformalized Quantile Regression
Stars: ✭ 152 (+484.62%)
Mutual labels:  random-forest
Benchm Ml
A minimal benchmark for scalability, speed and accuracy of commonly used open source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks etc.).
Stars: ✭ 1,835 (+6957.69%)
Mutual labels:  random-forest
Shifu
An end-to-end machine learning and data mining framework on Hadoop
Stars: ✭ 207 (+696.15%)
Mutual labels:  random-forest
Heart disease prediction
Heart Disease prediction using 5 algorithms
Stars: ✭ 43 (+65.38%)
Mutual labels:  random-forest
STOCK-RETURN-PREDICTION-USING-KNN-SVM-GUASSIAN-PROCESS-ADABOOST-TREE-REGRESSION-AND-QDA
Forecast stock prices using a machine learning approach (a time series analysis). Employs predictive modeling in machine learning to forecast stock returns, an approach used by hedge funds to select tradeable stocks.
Stars: ✭ 94 (+261.54%)
Mutual labels:  random-forest
Decision Tree Js
A small JavaScript implementation of the ID3 decision tree
Stars: ✭ 253 (+873.08%)
Mutual labels:  random-forest

pykitml logo

pykitml (Python Kit for Machine Learning)

Machine Learning library written in Python and NumPy.

Installation

python3 -m pip install pykitml
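
To confirm the installation, a quick sanity check (a minimal sketch, assuming Python 3.8+ for importlib.metadata; nothing here is pykitml-specific):

import importlib.metadata

import pykitml  # should import without errors once installed

# Print the version recorded in the installed package metadata
print(importlib.metadata.version('pykitml'))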

Documentation

https://pykitml.readthedocs.io/en/latest/

Demo (MNIST)

Training

import os.path

import numpy as np
import pykitml as pk
from pykitml.datasets import mnist
    
# Download dataset
if not os.path.exists('mnist.pkl'):
    mnist.get()

# Load dataset
training_data, training_targets, testing_data, testing_targets = mnist.load()
    
# Create a new neural network
digit_classifier = pk.NeuralNetwork([784, 100, 10])
    
# Train it
digit_classifier.train(
    training_data=training_data,
    targets=training_targets, 
    batch_size=50, 
    epochs=1200, 
    optimizer=pk.Adam(learning_rate=0.012, decay_rate=0.95), 
    testing_data=testing_data, 
    testing_targets=testing_targets,
    testing_freq=30,
    decay_freq=15
)
    
# Save it
pk.save(digit_classifier, 'digit_classifier_network.pkl')

# Show performance
accuracy = digit_classifier.accuracy(training_data, training_targets)
print('Train Accuracy:', accuracy)        
accuracy = digit_classifier.accuracy(testing_data, testing_targets)
print('Test Accuracy:', accuracy)
    
# Plot performance graph
digit_classifier.plot_performance()

# Show confusion matrix
digit_classifier.confusion_matrix(training_data, training_targets)

Trying the model

import random

import numpy as np
import matplotlib.pyplot as plt
import pykitml as pk
from pykitml.datasets import mnist

# Load dataset
training_data, training_targets, testing_data, testing_targets = mnist.load()

# Load the trained network
digit_classifier = pk.load('digit_classifier_network.pkl')

# Pick a random example from testing data
index = random.randint(0, 9999)

# Show the test data and the label
plt.imshow(testing_data[index].reshape(28, 28))
plt.show()
print('Label: ', testing_targets[index])

# Show prediction
digit_classifier.feed(testing_data[index])
model_output = digit_classifier.get_output_onehot()
print('Predicted: ', model_output)
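
The printed label and prediction are one-hot style vectors. As a small follow-up (an illustrative sketch, assuming both are plain NumPy arrays with one entry per digit class, which the README does not state explicitly), numpy.argmax converts them back to digit labels:

import numpy as np

# Assumption: model_output and testing_targets[index] are one-hot NumPy vectors,
# so the index of the largest entry is the digit class.
predicted_digit = int(np.argmax(model_output))
true_digit = int(np.argmax(testing_targets[index]))
print('Predicted digit:', predicted_digit)
print('True digit:', true_digit)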

Performance Graph

(performance graph image)

Confusion Matrix

(confusion matrix image)
