
titu1994 / Densenet

License: MIT
DenseNet implementation in Keras

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Densenet

Imitation
Code for the paper "Generative Adversarial Imitation Learning"
Stars: ✭ 555 (-19.91%)
Mutual labels:  paper
Recommendersystem Paper
This repository includes some papers that I have read or which I think may be very interesting.
Stars: ✭ 619 (-10.68%)
Mutual labels:  paper
Multiagent Competition
Code for the paper "Emergent Complexity via Multi-agent Competition"
Stars: ✭ 663 (-4.33%)
Mutual labels:  paper
Bert paper chinese translation
Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (-18.61%)
Mutual labels:  paper
Cifar Zoo
PyTorch implementation of CNNs for CIFAR benchmark
Stars: ✭ 584 (-15.73%)
Mutual labels:  densenet
All About The Gan
All About the GANs(Generative Adversarial Networks) - Summarized lists for GAN
Stars: ✭ 630 (-9.09%)
Mutual labels:  paper
Mlsh
Code for the paper "Meta-Learning Shared Hierarchies"
Stars: ✭ 548 (-20.92%)
Mutual labels:  paper
Densenet.pytorch
A PyTorch implementation of DenseNet.
Stars: ✭ 684 (-1.3%)
Mutual labels:  densenet
Dnc Tensorflow
A TensorFlow implementation of DeepMind's Differentiable Neural Computer (DNC)
Stars: ✭ 587 (-15.3%)
Mutual labels:  paper
Dl Nlp Readings
My Reading Lists of Deep Learning and Natural Language Processing
Stars: ✭ 656 (-5.34%)
Mutual labels:  paper
Deeptype
Code for the paper "DeepType: Multilingual Entity Linking by Neural Type System Evolution"
Stars: ✭ 571 (-17.6%)
Mutual labels:  paper
Densenet Tensorflow
DenseNet Implementation in Tensorflow
Stars: ✭ 580 (-16.31%)
Mutual labels:  densenet
Minecraftdev
Plugin for IntelliJ IDEA that gives special support for Minecraft modding projects.
Stars: ✭ 645 (-6.93%)
Mutual labels:  paper
Cv paperdaily
CV paper notes
Stars: ✭ 555 (-19.91%)
Mutual labels:  paper
Pytorch2keras
PyTorch to Keras model converter
Stars: ✭ 676 (-2.45%)
Mutual labels:  densenet
Medicalzoopytorch
A pytorch-based deep learning framework for multi-modal 2D/3D medical image segmentation
Stars: ✭ 546 (-21.21%)
Mutual labels:  densenet
Awesome Interaction Aware Trajectory Prediction
A selection of state-of-the-art research materials on trajectory prediction
Stars: ✭ 625 (-9.81%)
Mutual labels:  paper
Awesome Economics
A curated collection of links for economists
Stars: ✭ 688 (-0.72%)
Mutual labels:  paper
Senet Tensorflow
Simple TensorFlow implementation of "Squeeze and Excitation Networks" using CIFAR-10 (ResNeXt, Inception-v4, Inception-ResNet-v2)
Stars: ✭ 682 (-1.59%)
Mutual labels:  densenet
Awesome Relation Extraction
📖 A curated list of awesome resources dedicated to Relation Extraction, one of the most important tasks in Natural Language Processing (NLP).
Stars: ✭ 656 (-5.34%)
Mutual labels:  paper

DenseNet in Keras

DenseNet implementation of the paper "Densely Connected Convolutional Networks" in Keras.

Now supports the more efficient DenseNet-BC (DenseNet-Bottleneck-Compressed) networks. Using the DenseNet-BC-190-40 model, it obtains state-of-the-art performance on CIFAR-10 and CIFAR-100.

Architecture

DenseNet is an extension of Wide Residual Networks. According to the paper:

The lth layer has l inputs, consisting of the feature maps of all preceding convolutional blocks. 
Its own feature maps are passed on to all L − l subsequent layers. This introduces L(L+1) / 2 connections 
in an L-layer network, instead of just L, as in traditional feed-forward architectures. 
Because of its dense connectivity pattern, we refer to our approach as Dense Convolutional Network (DenseNet).
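As a sanity check on the quoted figure, the connection count is simple arithmetic; the function below is plain Python bookkeeping, not part of the library:

```python
# Number of direct connections in an L-layer densely connected network:
# L(L+1)/2, versus just L in a traditional feed-forward stack.
def dense_connections(num_layers):
    # layer l receives inputs from all l preceding feature maps
    return num_layers * (num_layers + 1) // 2

print(dense_connections(40))  # 820 connections for a 40-layer network
```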

It features several improvements, such as:

  1. Dense connectivity: connecting any layer to all subsequent layers.
  2. Growth rate: a parameter which dictates how fast the number of feature maps increases as the network becomes deeper.
  3. Composite function: the BatchNorm - ReLU - Conv ordering, taken from the Wide ResNet paper as an improvement over the ordering in the original ResNet paper.
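To illustrate the growth rate: the number of feature maps entering a layer is the block's input channels plus one contribution of the growth rate per preceding layer. A minimal sketch of that bookkeeping (plain Python; the names `k0` and `growth_rate` are mine, following the paper's notation):

```python
# Channels seen by layer `layer_index` (1-based) of a dense block:
# the initial k0 channels plus `growth_rate` new channels contributed
# by each of the (layer_index - 1) preceding layers.
def input_channels(k0, growth_rate, layer_index):
    return k0 + growth_rate * (layer_index - 1)

# e.g. layer 12 of a block with 16 input channels and growth rate k = 12
print(input_channels(16, 12, 12))  # 148
```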

The Bottleneck-Compressed DenseNets (DenseNet-BC) offer further benefits, such as a reduced number of parameters, with similar or better performance.

  • Consider the DenseNet-100-12 model, with nearly 7 million parameters, versus the DenseNet-BC-100-12 model, with just 0.8 million parameters. The BC model achieves 4.51 % error, compared to the original model's 4.10 % error.

  • The best original model, DenseNet-100-24 (27.2 million parameters), achieves 3.74 % error, whereas DenseNet-BC-190-40 (25.6 million parameters) achieves 3.46 % error, a new state-of-the-art performance on CIFAR-10.
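The "Compressed" part of DenseNet-BC comes from its transition layers between dense blocks: per the paper, a transition layer with compression factor θ outputs ⌊θ·m⌋ feature maps when given m. A quick sketch of that arithmetic (plain Python, not library code):

```python
import math

# Transition-layer compression: a dense block ending with m feature maps
# is reduced to floor(theta * m) channels before the next block.
def channels_after_transition(m, theta=0.5):
    return int(math.floor(theta * m))

print(channels_after_transition(160))  # 80
```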

DenseNets have the layered architecture illustrated in the paper's architecture figure.

Performance

The accuracy of DenseNet is reported in the paper; it beat all previous benchmarks on CIFAR-10, CIFAR-100 and SVHN.

Usage

Import the densenet.py script and use the DenseNet(...) function to create a custom DenseNet model with a variety of parameters.

Examples:

import densenet

# 'th' dim-ordering (channels first) or 'tf' dim-ordering (channels last)
image_dim = (3, 32, 32)    # 'th'
# image_dim = (32, 32, 3)  # 'tf'

model = densenet.DenseNet(classes=10, input_shape=image_dim, depth=40, growth_rate=12,
                          bottleneck=True, reduction=0.5)

Or, import a pre-built DenseNet model for ImageNet; pre-trained weights are available for some of these models (121, 161 and 169).

Example:

import densenet

# 'th' dim-ordering (channels first) or 'tf' dim-ordering (channels last)
image_dim = (3, 224, 224)    # 'th'
# image_dim = (224, 224, 3)  # 'tf'

model = densenet.DenseNetImageNet121(input_shape=image_dim)

Weights for the DenseNetImageNet121, DenseNetImageNet161 and DenseNetImageNet169 models are provided (in the release tab) and will be automatically downloaded when first called. They have been trained on ImageNet. The weights were ported from the repository https://github.com/flyyufelix/DenseNet-Keras.

Requirements

  • Keras
  • Theano (weights not tested) / TensorFlow (tested) / CNTK (weights not tested)
  • h5py