awslabs / Djl

License: Apache-2.0
An Engine-Agnostic Deep Learning Framework in Java

Programming Languages

Java
68154 projects - #9 most used programming language
C++
36643 projects - #6 most used programming language
Jupyter Notebook
11667 projects
C
50402 projects - #5 most used programming language
HTML
75241 projects
Shell
77523 projects

Projects that are alternatives of or similar to Djl

djl
An Engine-Agnostic Deep Learning Framework in Java
Stars: ✭ 3,080 (+36.16%)
Mutual labels:  mxnet, ml, autograd, onnxruntime, djl
Onnx
Open standard for machine learning interoperability
Stars: ✭ 11,829 (+422.94%)
Mutual labels:  deep-neural-networks, ml, mxnet
Polyaxon
Machine Learning Platform for Kubernetes (MLOps tools for experimentation and automation)
Stars: ✭ 2,966 (+31.12%)
Mutual labels:  ai, ml, mxnet
Ffdl
Fabric for Deep Learning (FfDL, pronounced fiddle) is a Deep Learning Platform offering TensorFlow, Caffe, PyTorch etc. as a Service on Kubernetes
Stars: ✭ 640 (-71.71%)
Mutual labels:  ai, deep-neural-networks, ml
Netron
Visualizer for neural network, deep learning, and machine learning models
Stars: ✭ 17,193 (+660.08%)
Mutual labels:  ai, ml, mxnet
Caffe2
Caffe2 is a lightweight, modular, and scalable deep learning framework.
Stars: ✭ 8,409 (+271.75%)
Mutual labels:  ai, deep-neural-networks, ml
Mxnet Finetuner
An all-in-one Deep Learning toolkit for image classification to fine-tuning pretrained models using MXNet.
Stars: ✭ 100 (-95.58%)
Mutual labels:  deep-neural-networks, mxnet
Intro To Deep Learning
A collection of materials to help you learn about deep learning
Stars: ✭ 103 (-95.45%)
Mutual labels:  ai, deep-neural-networks
Mac Graph
The MacGraph network. An attempt to get MACnets running on graph knowledge
Stars: ✭ 113 (-95%)
Mutual labels:  ai, ml
Tenginekit
TengineKit - Free, Fast, Easy, Real-Time Face Detection & Face Landmarks & Face Attributes & Hand Detection & Hand Landmarks & Body Detection & Body Landmarks & Iris Landmarks & Yolov5 SDK On Mobile.
Stars: ✭ 2,103 (-7.03%)
Mutual labels:  ai, deep-neural-networks
Deep Dream In Pytorch
Pytorch implementation of the DeepDream computer vision algorithm
Stars: ✭ 90 (-96.02%)
Mutual labels:  ai, deep-neural-networks
Nlp
By Dou Ge (兜哥): <An open-source introductory NLP book>
Stars: ✭ 1,677 (-25.86%)
Mutual labels:  ai, fasttext
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-94.83%)
Mutual labels:  deep-neural-networks, ml
Dopamine
Dopamine is a research framework for fast prototyping of reinforcement learning algorithms.
Stars: ✭ 9,681 (+327.98%)
Mutual labels:  ai, ml
Ngraph
nGraph has moved to OpenVINO
Stars: ✭ 1,322 (-41.56%)
Mutual labels:  deep-neural-networks, mxnet
Flashlight
A C++ standalone library for machine learning
Stars: ✭ 4,038 (+78.51%)
Mutual labels:  ml, autograd
Bert As Service
Mapping a variable-length sentence to a fixed-length vector using BERT model
Stars: ✭ 9,779 (+332.32%)
Mutual labels:  ai, deep-neural-networks
Trainer Mac
Trains a model, then generates a complete Xcode project that uses it - no code necessary
Stars: ✭ 122 (-94.61%)
Mutual labels:  ai, deep-neural-networks
Nlp Pretrained Model
A collection of Natural language processing pre-trained models.
Stars: ✭ 122 (-94.61%)
Mutual labels:  deep-neural-networks, mxnet
Aognet
Code for the CVPR 2019 paper "Learning Deep Compositional Grammatical Architectures for Visual Recognition"
Stars: ✭ 132 (-94.16%)
Mutual labels:  deep-neural-networks, mxnet

DeepJavaLibrary

[CI badges: Continuous, Continuous PyTorch, Continuous TensorFlow, Docs, Nightly Publish]

Deep Java Library (DJL)

Overview

Deep Java Library (DJL) is an open-source, high-level, engine-agnostic Java framework for deep learning. DJL is designed to be easy to get started with and simple to use for Java developers. DJL provides a native Java development experience and functions like any other regular Java library.

You don't have to be a machine learning/deep learning expert to get started. You can use your existing Java expertise as an on-ramp to learn and use machine learning and deep learning. You can use your favorite IDE to build, train, and deploy your models, and DJL makes it easy to integrate these models with your Java applications.

Because DJL is deep learning engine agnostic, you don't have to choose an engine when creating your project, and you can switch engines at any point. To ensure the best performance, DJL also chooses between CPU and GPU automatically based on your hardware configuration.
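
For illustration, here is a minimal sketch of pinning a specific engine and device through the Criteria builder used in the example below. The engine name "PyTorch" and the explicit device are illustrative assumptions, not requirements; omitting both lines lets DJL choose automatically:

    // A minimal sketch: explicitly selecting an engine and device.
    // optEngine and optDevice are optional; leave them out and DJL picks
    // a default engine and falls back from GPU to CPU based on your hardware.
    Criteria<Image, Classifications> criteria =
            Criteria.builder()
                    .setTypes(Image.class, Classifications.class)
                    .optEngine("PyTorch")    // e.g. "PyTorch", "MXNet", "TensorFlow"
                    .optDevice(Device.gpu()) // or Device.cpu()
                    .build();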

DJL's ergonomic API is designed to guide you toward best practices for accomplishing deep learning tasks. The following pseudocode demonstrates running inference:

    // Assume the user loads a pre-trained model from the model zoo
    Criteria<Image, Classifications> criteria =
            Criteria.builder()
                    .optApplication(Application.CV.OBJECT_DETECTION) // find an object detection model
                    .setTypes(Image.class, Classifications.class) // define input and output
                    .optFilter("backbone", "resnet50") // choose network architecture
                    .build();

    try (ZooModel<Image, Classifications> model = criteria.loadModel()) {
        try (Predictor<Image, Classifications> predictor = model.newPredictor()) {
            Image img = ImageFactory.getInstance().fromUrl("http://..."); // read image
            Classifications result = predictor.predict(img);

            // get the classification and probability
            ...
        }
    }
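
As a follow-up, here is a minimal sketch of the elided classification step above, assuming you only need the top result (Classifications.best() returns the highest-probability entry):

    // A minimal sketch of reading the top result from predictor.predict(img)
    Classifications.Classification best = result.best();
    System.out.printf("class: %s, probability: %.3f%n",
            best.getClassName(), best.getProbability());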

The following pseudocode demonstrates running training:

    // Construct your neural network with built-in blocks
    Block block = new Mlp(28 * 28, 10, new int[] {128, 64});

    try (Model model = Model.newInstance("mlp")) { // Create an empty model
        model.setBlock(block); // set neural network to model

        // Get training and validation dataset (MNIST dataset)
        Dataset trainingSet = new Mnist.Builder().setUsage(Usage.TRAIN) ... .build();
        Dataset validateSet = new Mnist.Builder().setUsage(Usage.TEST) ... .build();

        // Setup training configurations, such as Initializer, Optimizer, Loss ...
        TrainingConfig config = setupTrainingConfig();
        try (Trainer trainer = model.newTrainer(config)) {
            /*
             * Configure input shape based on dataset to initialize the trainer.
             * The first axis is the batch axis; we can use 1 for initialization.
             * MNIST images are 28x28 grayscale, preprocessed into a 28 * 28 NDArray.
             */
            Shape inputShape = new Shape(1, 28 * 28);
            trainer.initialize(new Shape[] {inputShape});

            EasyTrain.fit(trainer, epoch, trainingSet, validateSet);
        }

        // Save the model
        model.save(modelDir, "mlp");
    }
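
The setupTrainingConfig() call above is a placeholder. A minimal sketch of one possible implementation, built on DJL's DefaultTrainingConfig; the softmax cross-entropy loss, accuracy evaluator, and logging listeners are illustrative choices, not the only ones:

    // One possible implementation of the setupTrainingConfig() placeholder.
    // DefaultTrainingConfig, Loss, Accuracy, and TrainingListener come from
    // the ai.djl.training packages.
    private static TrainingConfig setupTrainingConfig() {
        return new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss())
                .addEvaluator(new Accuracy()) // track accuracy during training
                .addTrainingListeners(TrainingListener.Defaults.logging()); // log progress
    }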

Getting Started

Resources

Release Notes

Building From Source

To build from source, begin by checking out the code. Once you have the code locally, you can build it with Gradle:

# for Linux/macOS:
./gradlew build

# for Windows:
gradlew build

To increase build speed, you can use the following command to skip unit tests:

# for Linux/macOS:
./gradlew build -x test

# for Windows:
gradlew build -x test

Importing into Eclipse

To import the source project into Eclipse, first generate the Eclipse project files:

# for Linux/macOS:
./gradlew eclipse

# for Windows:
gradlew eclipse

Then, in Eclipse, choose File -> Import -> Gradle -> Existing Gradle Project.

Note: please set your workspace text encoding to UTF-8.

Community

Read our guide to community forums, following DJL, issues, discussions, and RFCs to find the best way to share and discover content from the DJL community.

Join our Slack channel to get in touch with the development team for questions and discussions.

Follow us on Twitter to see updates about new content, features, and releases.

Follow our Zhihu column (知乎专栏) for the latest DJL content!

Useful Links

License

This project is licensed under the Apache-2.0 License.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].