
JONGGON / Deephumanprediction

JG Master Thesis

Projects that are alternatives to or similar to Deephumanprediction

Adaptis
[ICCV19] AdaptIS: Adaptive Instance Selection Network, https://arxiv.org/abs/1909.07829
Stars: ✭ 314 (+1644.44%)
Mutual labels:  mxnet
Deeplearningzerotoall
TensorFlow Basic Tutorial Labs
Stars: ✭ 4,239 (+23450%)
Mutual labels:  mxnet
Deepo
Set up and customize a deep learning environment in seconds.
Stars: ✭ 6,145 (+34038.89%)
Mutual labels:  mxnet
Tensorboard
Standalone TensorBoard for visualizing in deep learning
Stars: ✭ 364 (+1922.22%)
Mutual labels:  mxnet
Capsnet
CapsNet (Capsule Network) from Geoffrey E. Hinton's paper "Dynamic Routing Between Capsules" - state of the art
Stars: ✭ 423 (+2250%)
Mutual labels:  mxnet
Gluon Cv
Gluon CV Toolkit
Stars: ✭ 5,001 (+27683.33%)
Mutual labels:  mxnet
Autogluon
AutoGluon: AutoML for Text, Image, and Tabular Data
Stars: ✭ 3,920 (+21677.78%)
Mutual labels:  mxnet
Machine Learning Curriculum
💻 Make machines learn so that you don't have to struggle to program them; The ultimate list
Stars: ✭ 761 (+4127.78%)
Mutual labels:  mxnet
Bert Embedding
🔡 Token level embeddings from BERT model on mxnet and gluonnlp
Stars: ✭ 424 (+2255.56%)
Mutual labels:  mxnet
Mmdnn
MMdnn is a set of tools to help users interoperate among different deep learning frameworks, e.g. model conversion and visualization. Convert models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch, ONNX, and CoreML.
Stars: ✭ 5,472 (+30300%)
Mutual labels:  mxnet
D2l Pytorch
This project reproduces the book Dive Into Deep Learning (https://d2l.ai/), adapting the code from MXNet into PyTorch.
Stars: ✭ 3,810 (+21066.67%)
Mutual labels:  mxnet
D2l Vn
An interactive deep learning book with source code, math, and discussion. It covers several popular frameworks (TensorFlow, PyTorch & MXNet) and is used at 175 universities.
Stars: ✭ 402 (+2133.33%)
Mutual labels:  mxnet
Deeplearning
Introductory deep learning tutorials and selected articles; Deep Learning Tutorial
Stars: ✭ 6,783 (+37583.33%)
Mutual labels:  mxnet
Bmxnet
(New version is out: https://github.com/hpi-xnor/BMXNet-v2) BMXNet: An Open-Source Binary Neural Network Implementation Based on MXNet
Stars: ✭ 347 (+1827.78%)
Mutual labels:  mxnet
Aws Machine Learning University Accelerated Tab
Machine Learning University: Accelerated Tabular Data Class
Stars: ✭ 718 (+3888.89%)
Mutual labels:  mxnet
Dali
A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep learning training and inference applications.
Stars: ✭ 3,624 (+20033.33%)
Mutual labels:  mxnet
Stn Ocr
Code for the paper STN-OCR: A single Neural Network for Text Detection and Text Recognition
Stars: ✭ 473 (+2527.78%)
Mutual labels:  mxnet
Multi Model Server
Multi Model Server is a tool for serving neural net models for inference
Stars: ✭ 770 (+4177.78%)
Mutual labels:  mxnet
Deepcamera
Open source face recognition on Raspberry Pi. SharpAI is an open-source stack for machine learning engineering with private deployment and AutoML for edge computing; DeepCamera is a SharpAI application for connecting computer vision models to surveillance cameras. Developers can run the same code on Raspberry Pi, Android, PC, and AWS to speed up AI production development.
Stars: ✭ 757 (+4105.56%)
Mutual labels:  mxnet
Tusimple Duc
Understanding Convolution for Semantic Segmentation
Stars: ✭ 567 (+3050%)
Mutual labels:  mxnet

Introduction

  • A place to post the progress of my master's thesis.

Progress (related studies needed for the master's thesis)

Master's Thesis

Development environment

  • Windows 10 (64-bit) and Ubuntu Linux 16.04.2 LTS
  • Python version: 3.6.1, Anaconda3 version: 4.4.0 (a version-check sketch follows this list)
  • PyCharm Community Edition 2017.2.2
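
As a convenience, the sketch below (not part of the original repository) shows one way to check that a local interpreter matches the environment listed above; the expected version strings are simply copied from this README.

```python
# Minimal environment check - a sketch only, not part of the thesis code.
# The expected versions (Python 3.6.1 on Windows 10 / Ubuntu 16.04.2) are
# assumptions taken from the list above; adjust them for your own setup.
import sys
import platform

print("Python :", platform.python_version())             # expected: 3.6.1
print("OS     :", platform.system(), platform.release())

if sys.version_info[:2] != (3, 6):
    print("Warning: this project was developed against Python 3.6.x")
```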

Dependencies

  • mxnet-0.12.1 (Windows), mxnet-0.12.1 (Linux)
  • tqdm (progress bars), graphviz (visualization); a short usage sketch follows this list
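
The sketch below, which is not taken from the thesis code, checks that the listed dependencies import and illustrates their roles here: tqdm for progress bars during training loops and graphviz (through mx.viz.plot_network) for drawing a network graph. The small symbol graph is a hypothetical example rather than the thesis model, and rendering the image assumes the system Graphviz binaries are installed.

```python
# Dependency check and usage sketch - illustrative only, not the thesis model.
import mxnet as mx            # mxnet-0.12.1 per the list above
from tqdm import tqdm         # progress bars

print("MXNet version:", mx.__version__)   # expected: 0.12.1

# tqdm: wrap any iterable to report progress, e.g. over training batches.
for _ in tqdm(range(100), desc="dummy step"):
    pass

# graphviz: MXNet's mx.viz.plot_network turns a symbol graph into a
# graphviz Digraph. The layers below are a toy example, not the thesis net.
data = mx.sym.Variable("data")
net = mx.sym.FullyConnected(data=data, num_hidden=128, name="fc1")
net = mx.sym.Activation(data=net, act_type="relu", name="relu1")
net = mx.sym.FullyConnected(data=net, num_hidden=10, name="fc2")
net = mx.sym.SoftmaxOutput(data=net, name="softmax")

graph = mx.viz.plot_network(net)   # returns a graphviz.Digraph
graph.format = "png"
graph.render("network")            # writes network / network.png (needs system Graphviz)
```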