
alejandro-isaza / Braincore

License: MIT
The iOS and OS X neural network framework

Programming Languages

swift

Projects that are alternatives of or similar to Braincore

YALCT
Yet Another Live Coding Tool - Powered by Veldrid and elbow grease
Stars: ✭ 25 (-93.33%)
Mutual labels:  metal
parm
minimal macOS Metal application
Stars: ✭ 41 (-89.07%)
Mutual labels:  metal
Viry3d
Cross platform 2D and 3D game engine in C++.
Stars: ✭ 307 (-18.13%)
Mutual labels:  metal
ThickRedLine
Thick Red Line - drawing thick lines for SceneKit with metal shaders
Stars: ✭ 40 (-89.33%)
Mutual labels:  metal
mosaix
An iOS photo mosaic application.
Stars: ✭ 42 (-88.8%)
Mutual labels:  metal
Urde
Data interchange and engine re-implementation for games by Retro Studios | Mirror
Stars: ✭ 253 (-32.53%)
Mutual labels:  metal
Flocking
An example showing how to use SwiftUI, Satin, Forge and Youi to simulate birds flocking via a compute particle system (n-body).
Stars: ✭ 63 (-83.2%)
Mutual labels:  metal
Ios 10 Sampler
Code examples for new APIs of iOS 10.
Stars: ✭ 3,341 (+790.93%)
Mutual labels:  metal
SuperShapes
A tiny macOS app showing how to use Satin, Forge, Youi and SwiftUI to visualize super shapes in 3D.
Stars: ✭ 42 (-88.8%)
Mutual labels:  metal
Metalscope
Metal-backed 360° panorama view for iOS
Stars: ✭ 293 (-21.87%)
Mutual labels:  metal
CrossWindow-Graphics
A header only library to simplify creating 🌋 Vulkan / ⚪ OpenGL / 🌐 WebGL / ❎DirectX / 🤖 Metal data structures with CrossWindow.
Stars: ✭ 48 (-87.2%)
Mutual labels:  metal
metal dataset
metal lyrics and band names dataset (raw)
Stars: ✭ 19 (-94.93%)
Mutual labels:  metal
Waifu2x Mac
Waifu2x-ios port to macOS, still in Core ML and Metal
Stars: ✭ 258 (-31.2%)
Mutual labels:  metal
pygfx
Like ThreeJS but for Python and based on wgpu
Stars: ✭ 72 (-80.8%)
Mutual labels:  metal
Ultralight
Next-generation HTML renderer for apps and games
Stars: ✭ 3,585 (+856%)
Mutual labels:  metal
Apple-Silicon-Guide
Apple Silicon Guide. Learn all about the M1, M1 Pro, M1 Max, and M1 Ultra chips.
Stars: ✭ 240 (-36%)
Mutual labels:  metal
LuisaRender
High-Performance Multiple-Backend Renderer Based on LuisaCompute
Stars: ✭ 47 (-87.47%)
Mutual labels:  metal
Water
Simple calculation to render cheap water effects.
Stars: ✭ 372 (-0.8%)
Mutual labels:  metal
Colormap Shaders
A collection of shaders to draw color maps.
Stars: ✭ 315 (-16%)
Mutual labels:  metal
Metalfilters
Instagram filters implemented in Metal
Stars: ✭ 272 (-27.47%)
Mutual labels:  metal

BrainCore


BrainCore is a simple but fast neural network framework written in Swift. It uses Metal, which makes it screamin' fast. If you want to see it in action, check out InfiniteMonkeys, an app that uses a recurrent neural network to generate poems.

Features

  • [x] Inner product layers
  • [x] Linear rectifier (ReLU) layers
  • [x] Sigmoid layers
  • [x] LSTM layers
  • [x] L2 Loss layers

Requirements

  • iOS 8.0+ / Mac OS X 10.11+
  • Xcode 7.2+
  • A device that supports Metal (doesn't work on the iOS simulator)

Usage

Network Definition

Before you build your network, start by building all the layers. This is as simple as calling each constructor:

let dataLayer = MyDataLayer()
let lstmLayer = LSTMLayer(weights: lstmWeights, biases: lstmBiases)
let ipLayer = InnerProductLayer(weights: ipWeights, biases: ipBiases)
let reluLayer = ReLULayer(size: ipBiases.count)
let sinkLayer = MySinkLayer()

BrainCore uses overloaded operators to make network definitions more concise. To connect layers together, simply use the => operator inside a Net.build {} closure:

let net = Net.build {
    dataLayer => lstmLayer => ipLayer => reluLayer => sinkLayer
}

If you need to concatenate the output of two layers put them inside square brackets:

let net = Net.build {
    [dataLayer1, dataLayer2] => lstmLayer => ipLayer => reluLayer => sinkLayer
}

Similarly, if you need to split the output of one layer put its target layers in square brackets:

let net = Net.build {
    dataLayer => lstmLayer => ipLayer => reluLayer => [sinkLayer1, sinkLayer2]
}

When splitting, the inputSize of each target layer determines where to split. If the sum of the target layers' inputSizes doesn't match the source layer's outputSize, an error will be thrown.
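As a concrete sketch of that rule: suppose reluLayer outputs 10 values. The inputSize: initializer on MySinkLayer below is hypothetical (sink layers are user-defined), but it shows how the sizes drive the split.

```swift
// Hypothetical: give the custom sink layers explicit input sizes.
// sinkLayer1 receives the first 6 outputs, sinkLayer2 the last 4.
let sinkLayer1 = MySinkLayer(inputSize: 6)
let sinkLayer2 = MySinkLayer(inputSize: 4)

// 6 + 4 must equal reluLayer's outputSize (10 here),
// otherwise building the net raises an error.
let net = Net.build {
    dataLayer => lstmLayer => ipLayer => reluLayer => [sinkLayer1, sinkLayer2]
}
```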

If you want to continue on separate branches after a split, you have to break the definition into separate lines:

let net = Net.build {
    dataLayer => lstmLayer => [ipLayer1, ipLayer2]
    ipLayer1 => reluLayer1 => sinkLayer1
    ipLayer2 => reluLayer2 => sinkLayer2
}

Finally, if you want to send copies of the output of a layer to different layers, use the =>> operator:

let net = Net.build {
    dataLayer => lstmLayer
    lstmLayer =>> ipLayer1 => reluLayer1 => sinkLayer1
    lstmLayer =>> ipLayer2 => reluLayer2 => sinkLayer2
}

Evaluating

Currently BrainCore only supports executing pre-trained networks. Ideally you would train your network on a server using one of the well-established neural network frameworks and import the trained weights into BrainCore. We are working on implementing solvers so that you can do everything inside BrainCore, stay posted.
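One way to bring trained parameters across is to export them from the training framework as raw Float32 values and read them back on the device. The helper below is a sketch, not part of BrainCore's API, and is written against current Swift/Foundation; the file URL and the flat little-endian layout are assumptions about how you exported the weights.

```swift
import Foundation

// Hypothetical helper: read a flat binary file of Float32 values
// (as exported from your training framework) into a Swift array.
func loadWeights(from url: URL) throws -> [Float] {
    let data = try Data(contentsOf: url)
    let count = data.count / MemoryLayout<Float>.size
    var values = [Float](repeating: 0, count: count)
    _ = values.withUnsafeMutableBytes { data.copyBytes(to: $0) }
    return values
}
```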

Let's start by creating the layers.

// Load weights and biases from a pre-trained network
let lstmWeights = ...
let lstmBiases = ...
let ipWeights = ...
let ipBiases = ...

// Create layers
let dataLayer = MyDataLayer()
let lstmLayer = LSTMLayer(weights: lstmWeights, biases: lstmBiases)
let ipLayer = InnerProductLayer(weights: ipWeights, biases: ipBiases)
let reluLayer = ReLULayer(size: ipBiases.count)
let sinkLayer = MySinkLayer()

Next we'll build the net.

let net = Net.build {
    dataLayer => lstmLayer => ipLayer => reluLayer => sinkLayer
}

And finally execute! You need to provide a Metal device to the runner which is usually just the default device.

guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("Failed to create a Metal device.")
}

let evaluator: Evaluator
do {
    evaluator = try Evaluator(net: net, device: device)
} catch let e {
    fatalError("Failed to create an Evaluator: \(e)")
}

evaluator.evaluate { snapshot in
    print("Feed-forward pass complete!")
}

The evaluator may fail to build if there is a problem creating the buffers or initializing the Metal code; that's why the initializer is marked with try.

Calling evaluate() will execute a single forward pass, and you can call it as often as you want. In fact, you will want to queue multiple evaluate() calls before you get any results back so that you maximise GPU bandwidth. You can also increase the batch size to execute multiple passes in parallel.

Your data layer will most likely want to provide new data every time you call evaluate(), so your code may look something like this:

while !shouldStop {
    dataLayer.gather()
    evaluator.evaluate(completion)
}
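Because the completion runs asynchronously, a tight loop like the one above can queue passes without bound. One way to keep several passes in flight while capping the backlog is a counting semaphore; this is a sketch, and the limit of 3 in-flight passes is a tuning assumption, not a BrainCore requirement.

```swift
import Dispatch

// Allow up to 3 forward passes in flight at once (an assumed tuning value).
let inFlight = DispatchSemaphore(value: 3)

while !shouldStop {
    inFlight.wait()          // block when 3 passes are already queued
    dataLayer.gather()       // fetch the next input, as in the loop above
    evaluator.evaluate { snapshot in
        // Process results here, then free a slot for the next pass.
        inFlight.signal()
    }
}
```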

Note: both the sink layer's consume() function and the completion closure will be called from a background thread. Make sure you synchronize access to the data as needed and try not to block on either of those calls for too long.
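A minimal way to handle that synchronization is to copy the data under a lock inside consume() and return immediately. The class below is illustrative only; the exact shape of BrainCore's sink-layer protocol and the consume() signature are assumptions.

```swift
import Foundation

// Illustrative sink that makes its results safe to read from other threads.
final class SafeSink {
    private let lock = NSLock()
    private var latest = [Float]()

    // Called from BrainCore's background thread: copy and return quickly.
    func consume(_ output: [Float]) {
        lock.lock()
        latest = output
        lock.unlock()
    }

    // Called from your UI or processing code.
    func read() -> [Float] {
        lock.lock()
        defer { lock.unlock() }
        return latest
    }
}
```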


License

BrainCore is available under the MIT license. See the LICENSE file for more info.
