l2 • 🤖

A Pytorch-style Tensor+Autograd library written in Rust


Installation • Contributing • Authors • License • Acknowledgements

Made by Bilal Khan • https://bilal.software

What is l2?

l2 is named after the l2 or Euclidean distance, a popular distance function in deep learning

l2 is a Pytorch-style Tensor+Autograd library written in Rust. It contains a multidimensional array class, Tensor, with support for strided arrays, numpy-style array slicing, broadcasting, and most major math operations (including fast, BLAS-accelerated matrix multiplication!). On top of this, l2 has a built-in efficient graph-based autograd engine that keeps track of all operations performed on a tensor and topologically sorts and traverses the graph to compute the gradients.
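To make the graph-based autograd idea concrete, here is a minimal, self-contained sketch of a reverse-mode autograd engine on scalars. This is purely illustrative and is not l2's actual implementation: it uses a hypothetical `Tape` that records each operation, then walks the graph in reverse topological order to accumulate gradients, which is the same general scheme the paragraph above describes.

```rust
// Illustrative sketch of tape-based reverse-mode autograd on scalars.
// Not l2's real code; names like `Tape`, `leaf`, `add`, `mul` are invented here.

#[derive(Clone, Copy)]
enum Op {
    Leaf,
    Add(usize, usize),
    Mul(usize, usize),
}

struct Tape {
    vals: Vec<f64>,
    ops: Vec<Op>,
}

impl Tape {
    fn new() -> Self {
        Tape { vals: vec![], ops: vec![] }
    }

    fn leaf(&mut self, v: f64) -> usize {
        self.vals.push(v);
        self.ops.push(Op::Leaf);
        self.vals.len() - 1
    }

    fn add(&mut self, a: usize, b: usize) -> usize {
        self.vals.push(self.vals[a] + self.vals[b]);
        self.ops.push(Op::Add(a, b));
        self.vals.len() - 1
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        self.vals.push(self.vals[a] * self.vals[b]);
        self.ops.push(Op::Mul(a, b));
        self.vals.len() - 1
    }

    // Nodes are appended in creation order, so visiting them in reverse index
    // order is a valid reverse topological traversal of the graph.
    fn backward(&self, out: usize) -> Vec<f64> {
        let mut grads = vec![0.0; self.vals.len()];
        grads[out] = 1.0;
        for i in (0..=out).rev() {
            match self.ops[i] {
                Op::Leaf => {}
                Op::Add(a, b) => {
                    grads[a] += grads[i];
                    grads[b] += grads[i];
                }
                Op::Mul(a, b) => {
                    grads[a] += grads[i] * self.vals[b];
                    grads[b] += grads[i] * self.vals[a];
                }
            }
        }
        grads
    }
}

fn main() {
    // z = x * y + x, so dz/dx = y + 1 and dz/dy = x
    let mut t = Tape::new();
    let x = t.leaf(2.0);
    let y = t.leaf(3.0);
    let xy = t.mul(x, y);
    let z = t.add(xy, x);
    let grads = t.backward(z);
    println!("z = {}, dz/dx = {}, dz/dy = {}", t.vals[z], grads[x], grads[y]);
}
```

A real tensor library does the same bookkeeping, but with each node holding a tensor and each operation contributing a vector-Jacobian product instead of a scalar derivative.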

I also made a simplified C++ version of l2 last year, which you can take a look at here

Quick start

Add l2 = "1.0.3" to your Cargo.toml file and add the following to main.rs

Note: l2 will use Apple's Accelerate BLAS library by default on macOS. You can also choose a different BLAS implementation yourself; take a look at the blas-src crate for more information
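For example, selecting a BLAS backend is done through cargo features on the blas-src crate. The snippet below is a sketch of what that might look like; the exact version number and feature names should be checked against the blas-src documentation for your setup:

```toml
[dependencies]
l2 = "1.0.3"
# blas-src chooses the BLAS implementation at build time via features;
# "openblas" is one option, "accelerate" (macOS) and "intel-mkl" are others.
blas-src = { version = "0.8", features = ["openblas"] }
```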

```rust
use l2::tensor::*;

// The `?` operator only works inside a function that returns a Result, so the
// snippet is wrapped in main; TensorError is assumed to be l2's error type
// here (adjust the path if your version of the crate differs).
fn main() -> Result<(), l2::errors::TensorError> {
    let x: Tensor = Tensor::normal(&[2, 4], 0.0, 1.0)?;
    let y: Tensor = Tensor::normal(&[4, 1], 0.0, 1.0)?;

    let z: Tensor = l2::matmul(&x, &y)?;

    z.backward();

    println!("{}", z);

    Ok(())
}
```

Design choices

I made l2 to get better at using Rust and to learn more about how libraries like Pytorch and Tensorflow work behind the scenes, so don't expect this library to be production-ready :)

l2 is surprisingly fast, especially since I didn't try very hard to optimize all the operators: it's usually less than one order of magnitude slower than Pytorch in most of the benchmarks that I ran. l2 only supports a CPU backend at the moment, since I'm not familiar enough with Rust to start working with CUDA and cuDNN. So far, l2 doesn't have any Pytorch-style abstractions like the Parameter, Layer, or Module classes. There might still be some bugs in the transpose operators and in calling .backward() on higher-dimensional tensors. I was interested in using Rust's const generics to run compile-time shape checks, but I decided to leave that for another time.
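For the curious, here is a small sketch of what compile-time shape checking with const generics could look like. This is not part of l2; the `Matrix` type and `matmul` function below are hypothetical, and real tensors with runtime-determined shapes wouldn't fit this pattern directly. The point is that the inner dimension `K` must match on both operands, so a shape mismatch becomes a type error instead of a runtime panic:

```rust
// Hypothetical example: shapes encoded in the type via const generics.
struct Matrix<const R: usize, const C: usize>([[f64; C]; R]);

// Multiplying an (M x K) matrix by a (K x N) matrix yields (M x N);
// passing mismatched inner dimensions fails to compile.
fn matmul<const M: usize, const K: usize, const N: usize>(
    a: &Matrix<M, K>,
    b: &Matrix<K, N>,
) -> Matrix<M, N> {
    let mut out = [[0.0; N]; M];
    for i in 0..M {
        for j in 0..N {
            for k in 0..K {
                out[i][j] += a.0[i][k] * b.0[k][j];
            }
        }
    }
    Matrix(out)
}

fn main() {
    let a = Matrix::<2, 3>([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]);
    let b = Matrix::<3, 1>([[1.0], [1.0], [1.0]]);
    let c = matmul(&a, &b); // row sums of `a`
    println!("{} {}", c.0[0][0], c.0[1][0]);
}
```

The trade-off is that shapes must be known at compile time, which is why dynamic tensor libraries generally check shapes at runtime instead.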

Contributing

This repository is still a work in progress, so if you find a bug, think there is something missing, or have any suggestions for new features, feel free to open an issue or a pull request. Feel free to use the library or code from it in your own projects, and if you feel that some code used in this project hasn't been properly credited, please open an issue.

Authors

  • Bilal Khan

License

This project is licensed under the MIT License - see the license file for details

Acknowledgements

The fast.ai Deep Learning from the Foundations course (https://course.fast.ai/part2) teaches a lot about how to build your own deep learning library

Some of the resources that I found useful when working on this library include:

This README is based on:

I used carbon.now.sh with the "Shades of Purple" theme for the screenshot at the beginning of this README

