
karanchahal / buildTensorflow

Licence: other
A lightweight deep learning framework made with ❤️

Programming Languages

C++
36643 projects - #6 most used programming language
Cuda
1817 projects

Projects that are alternatives to or similar to buildTensorflow

Deeplearning Cfn
Distributed Deep Learning on AWS Using CloudFormation (CFN), MXNet and TensorFlow
Stars: ✭ 252 (+800%)
Mutual labels:  deeplearning
gan deeplearning4j
Automatic feature engineering using Generative Adversarial Networks using Deeplearning4j and Apache Spark.
Stars: ✭ 19 (-32.14%)
Mutual labels:  deeplearning
Simplified SqueezeNet
An improved version of SqueezeNet networks https://github.com/DeepScale/SqueezeNet
Stars: ✭ 38 (+35.71%)
Mutual labels:  deeplearning
Dawn Bench Entries
DAWNBench: An End-to-End Deep Learning Benchmark and Competition
Stars: ✭ 254 (+807.14%)
Mutual labels:  deeplearning
Savior
(WIP) A deployment framework that aims to provide a simple, lightweight, fast, integrated, pipelined way to deploy algorithm services while ensuring reliability, high concurrency, and scalability.
Stars: ✭ 124 (+342.86%)
Mutual labels:  deeplearning
Baidu-Dog2017
http://js.baidu.com/
Stars: ✭ 37 (+32.14%)
Mutual labels:  deeplearning
Scan2cad
[CVPR'19] Dataset and code used in the research project Scan2CAD: Learning CAD Model Alignment in RGB-D Scans
Stars: ✭ 249 (+789.29%)
Mutual labels:  deeplearning
knime-tensorflow
KNIME Deep Learning - Tensorflow Integration
Stars: ✭ 18 (-35.71%)
Mutual labels:  deeplearning
awesome-conformal-prediction
A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers, PhD and MSc theses, articles and open-source libraries.
Stars: ✭ 998 (+3464.29%)
Mutual labels:  deeplearning
66Days NaturalLanguageProcessing
I am sharing my journey of 66DaysofData in Natural Language Processing.
Stars: ✭ 127 (+353.57%)
Mutual labels:  deeplearning
easytorch
A simple deep learning framework implemented in Python with numpy, including automatic differentiation, optimizers, layers, and more.
Stars: ✭ 76 (+171.43%)
Mutual labels:  autodiff
autoptim
Automatic differentiation + optimization
Stars: ✭ 102 (+264.29%)
Mutual labels:  autodiff
Malware-Detection
Deep Learning Based Android Malware Detection Framework
Stars: ✭ 29 (+3.57%)
Mutual labels:  deeplearning
Ai papers
AI Papers
Stars: ✭ 253 (+803.57%)
Mutual labels:  deeplearning
TensorFlow20-Notes
Various TensorFlow 2.0 notes and tutorials
Stars: ✭ 14 (-50%)
Mutual labels:  deeplearning
C3d Pytorch
PyTorch port of the C3D network, with Sports1M weights
Stars: ✭ 251 (+796.43%)
Mutual labels:  deeplearning
genetic deep learning
No description or website provided.
Stars: ✭ 13 (-53.57%)
Mutual labels:  deeplearning
googlecodelabs
Train your artificial neural networks much faster with TPUs
Stars: ✭ 116 (+314.29%)
Mutual labels:  deeplearning
smd
Simple mmdetection CPU inference
Stars: ✭ 27 (-3.57%)
Mutual labels:  deeplearning
sunode
Solve ODEs fast, with support for PyMC
Stars: ✭ 67 (+139.29%)
Mutual labels:  autodiff

buildTensorflow

A lightweight deep learning framework made with ❤️

Introduction

Ever interacted with code that looks like this?

#include <iostream>
// Framework headers providing Tensor, Dense, SGD, tensorOps and the
// Celsius2Fahrenheit dataset are assumed to be included here; the exact
// paths depend on how the repo is laid out.
using namespace std;

int main() {
    // Load Dataset
    Celsius2Fahrenheit<float,float> dataset;
    dataset.create(5);

    // Create Model
    Dense<float> fc1(1,1,NO_ACTIVATION);

    // Initialise Optimiser
    SGD<float> sgd(0.01);
    
    // Train
    cout<<"Training started"<<endl;
    for(int j = 0;j<2000;j++) {
        for(auto i: dataset.data) {
            // Get data
            auto inp = new Tensor<float>({i.first}, {1,1});
            auto tar = new Tensor<float>({i.second}, {1,1});

            // Forward Prop
            auto out = fc1.forward(inp);

            // Get Loss
            auto finalLoss = tensorOps::mse(tar,out);

            // Compute backProp
            finalLoss->backward();

            // Perform Gradient Descent
            sgd.minimise(finalLoss);
        
        }
    }

    cout<<"Training completed"<<endl;

    // Inference
    float cel = 4;
    auto test = new Tensor<float>({cel}, {1,1});
    auto out1 = fc1.forward(test);

    cout<<"The conversion of "<<cel<<" degrees celsius to fahrenheit is "<<out1->val<<endl; // For 4 Celcius: it's ~39.2
}

No, this isn't some C++ API from PyTorch; this is our very own lightweight deep learning framework learning how to become a Celsius-to-Fahrenheit converter! The whole codebase is fewer than 1,000 lines of code and has no external dependencies.
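
Under the hood there is nothing mysterious to learn here: Fahrenheit is a linear function of Celsius, F = 1.8 × C + 32, so a single Dense layer with one weight and one bias can represent the mapping exactly. Once training has nudged the weight towards 1.8 and the bias towards 32, an input of 4 gives 1.8 × 4 + 32 = 39.2, which is the value the comment in the snippet above refers to.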

Also, did we mention that our neural network can also be run on the GPU? (You'll need CUDA support for this, though.)

Ugh, why do we need another deep learning framework?

Have you ever wanted to know how loss.backward() works in your PyTorch code? Or what sgd.minimise(loss) even does?

Sure, you've read the theory and you know how it works, but you don't really know how it works.

Have you ever wanted to dig through the PyTorch codebase to find out how automatic differentiation works? I've tried to, and found the experience really stressful. Diving into that many lines of code is mentally draining and leaves you with more questions than answers.

And that is why this project exists.

We want to give readers interested in deep learning frameworks an idea of how everything works under the hood by providing a clear, concise codebase that is fewer than 1,000 lines of code, expertly documented and rigorously tested.

Goals

In this learning odyssey, we hope to give readers an understanding of the following:

  1. How production code is written and structured in C++. We have written unit tests for each feature and commented every part of the codebase so that the reader can understand the code with minimal fuss (a small test sketch follows this list).

  2. Lightweight implementations of popular concepts such as Stochastic Gradient Descent and various loss functions, and how they interact with automatic differentiation (an autodiff sketch follows this list).

  3. How to speed up your neural networks by running matrix multiplications on the GPU. This framework is both CPU and GPU compatible and gives users some insight into how exactly code is parallelised on the GPU (a CUDA sketch follows this list).
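
For a flavour of goal 1, each feature gets small, focused tests that exercise one behaviour at a time. The snippet below is only a self-contained sketch of that pattern using plain asserts and a stand-in mse helper; the actual tests in this repo target its own Tensor, Dense and SGD classes and may be organised differently.

#include <cassert>
#include <cmath>
#include <iostream>

// Stand-in for the kind of small function a single test would target.
float mse(float prediction, float target) {
    float diff = prediction - target;
    return diff * diff;
}

// Each test checks exactly one property and fails loudly if it does not hold.
void testMseIsZeroForPerfectPrediction() {
    assert(mse(3.0f, 3.0f) == 0.0f);
}

void testMsePenalisesError() {
    assert(std::fabs(mse(5.0f, 3.0f) - 4.0f) < 1e-6f);
}

int main() {
    testMseIsZeroForPerfectPrediction();
    testMsePenalisesError();
    std::cout << "All tests passed" << std::endl;
}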

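A minimal sketch of goal 2, assuming nothing from this repo: the Value struct, mul/add/mse helpers, backward and sgd_step below are illustrative stand-ins, not buildTensorflow's actual API. The idea is the same, though: every operation records its inputs plus a small rule for pushing gradients back to them, backward() walks the graph from the loss applying those rules, and the optimiser then moves each parameter against its accumulated gradient.

#include <functional>
#include <iostream>
#include <memory>
#include <vector>

// One node in the computation graph: a scalar value, its gradient, the nodes
// it was computed from, and a closure that pushes this node's gradient back
// to those parents.
struct Value {
    float val = 0.0f;
    float grad = 0.0f;
    std::function<void()> backprop = [] {};
    std::vector<std::shared_ptr<Value>> parents;
};
using V = std::shared_ptr<Value>;

V make(float x) { auto v = std::make_shared<Value>(); v->val = x; return v; }

// out = a * b, so d(out)/da = b and d(out)/db = a.
V mul(V a, V b) {
    auto out = make(a->val * b->val);
    out->parents = {a, b};
    out->backprop = [a, b, o = out.get()] {
        a->grad += b->val * o->grad;
        b->grad += a->val * o->grad;
    };
    return out;
}

// out = a + b; gradients pass through unchanged.
V add(V a, V b) {
    auto out = make(a->val + b->val);
    out->parents = {a, b};
    out->backprop = [a, b, o = out.get()] {
        a->grad += o->grad;
        b->grad += o->grad;
    };
    return out;
}

// Squared error; only the prediction needs a gradient.
V mse(V pred, V target) {
    float diff = pred->val - target->val;
    auto out = make(diff * diff);
    out->parents = {pred, target};
    out->backprop = [pred, diff, o = out.get()] {
        pred->grad += 2.0f * diff * o->grad;
    };
    return out;
}

// Walk the graph from the loss, applying each node's local rule. A naive
// recursion is enough here because the graph built in main() is a small tree.
void backward(V loss) {
    loss->grad = 1.0f;
    std::function<void(V)> visit = [&](V node) {
        node->backprop();
        for (auto& p : node->parents) visit(p);
    };
    visit(loss);
}

// Vanilla SGD: p <- p - lr * dL/dp, then clear the gradients for the next step.
void sgd_step(const std::vector<V>& params, float lr) {
    for (auto& p : params) { p->val -= lr * p->grad; p->grad = 0.0f; }
}

int main() {
    // A one-parameter "Dense" layer y = w*x + b, fitted to F = 1.8*C + 32.
    auto w = make(0.0f), b = make(0.0f);
    for (int epoch = 0; epoch < 2000; ++epoch) {
        for (int c = 0; c < 5; ++c) {
            auto x = make(static_cast<float>(c));
            auto t = make(1.8f * c + 32.0f);
            auto loss = mse(add(mul(w, x), b), t);
            backward(loss);          // fills w->grad and b->grad
            sgd_step({w, b}, 0.01f); // gradient descent on both parameters
        }
    }
    std::cout << "w ~ " << w->val << ", b ~ " << b->val << "\n"; // should approach 1.8 and 32
}

These are the same three phases the Celsius-to-Fahrenheit example above goes through on every sample: a forward pass that builds the graph, backward() to fill in gradients, and an optimiser step that updates the parameters.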
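
For goal 3, the core idea of moving matrix multiplication to the GPU is to give every element of the output matrix its own thread. The kernel below is a deliberately naive, self-contained illustration of that mapping and is not the kernel used in this repo (production kernels, and libraries like cuBLAS, add tiling, shared memory and other optimisations). It assumes a CUDA-capable GPU and compiles with nvcc.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// C = A * B where A is M x K, B is K x N and C is M x N.
// Each thread computes exactly one element of C.
__global__ void matmul(const float* A, const float* B, float* C,
                       int M, int K, int N) {
    int row = blockIdx.y * blockDim.y + threadIdx.y; // row of C owned by this thread
    int col = blockIdx.x * blockDim.x + threadIdx.x; // column of C owned by this thread
    if (row < M && col < N) {
        float acc = 0.0f;
        for (int k = 0; k < K; ++k)
            acc += A[row * K + k] * B[k * N + col];
        C[row * N + col] = acc;
    }
}

int main() {
    const int M = 64, K = 64, N = 64;
    std::vector<float> hA(M * K, 1.0f), hB(K * N, 2.0f), hC(M * N, 0.0f);

    // Copy the inputs to device memory.
    float *dA, *dB, *dC;
    cudaMalloc(&dA, hA.size() * sizeof(float));
    cudaMalloc(&dB, hB.size() * sizeof(float));
    cudaMalloc(&dC, hC.size() * sizeof(float));
    cudaMemcpy(dA, hA.data(), hA.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), hB.size() * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 16x16 blocks of threads to cover the whole output matrix.
    dim3 block(16, 16);
    dim3 grid((N + block.x - 1) / block.x, (M + block.y - 1) / block.y);
    matmul<<<grid, block>>>(dA, dB, dC, M, K, N);

    // Copy the result back and spot-check one element.
    cudaMemcpy(hC.data(), dC, hC.size() * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0][0] = %f (expected %f)\n", hC[0], 2.0f * K);

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}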