
popcornell / keras-triplet-center-loss

License: GPL-3.0
Simple Keras implementation of Triplet-Center Loss on the MNIST dataset

Programming Languages

Python

Projects that are alternatives of or similar to keras-triplet-center-loss

pytorch-siamese-triplet
One-Shot Learning with Triplet CNNs in Pytorch
Stars: ✭ 74 (+117.65%)
Mutual labels:  mnist, triplet-loss
Open-Set-Recognition
Open Set Recognition
Stars: ✭ 49 (+44.12%)
Mutual labels:  mnist, center-loss
tensorflow-mnist-convnets
Neural nets for MNIST classification, simple single layer NN, 5 layer FC NN and convolutional neural networks with different architectures
Stars: ✭ 22 (-35.29%)
Mutual labels:  mnist
ELM-pytorch
Extreme Learning Machine implemented in Pytorch
Stars: ✭ 68 (+100%)
Mutual labels:  mnist
VAE-Gumbel-Softmax
An implementation of a Variational-Autoencoder using the Gumbel-Softmax reparametrization trick in TensorFlow (tested on r1.5 CPU and GPU) in ICLR 2017.
Stars: ✭ 66 (+94.12%)
Mutual labels:  mnist
chainer-ADDA
Adversarial Discriminative Domain Adaptation in Chainer
Stars: ✭ 24 (-29.41%)
Mutual labels:  mnist
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-47.06%)
Mutual labels:  mnist
deeplearning-mpo
Replace FC2, LeNet-5, VGG, Resnet, Densenet's full-connected layers with MPO
Stars: ✭ 26 (-23.53%)
Mutual labels:  mnist
MNIST-Keras
Using various CNN techniques on the MNIST dataset
Stars: ✭ 39 (+14.71%)
Mutual labels:  mnist
MNIST-multitask
6️⃣6️⃣6️⃣ Reproduce ICLR '18 under-reviewed paper "MULTI-TASK LEARNING ON MNIST IMAGE DATASETS"
Stars: ✭ 34 (+0%)
Mutual labels:  mnist
image triplet loss
Image similarity using Triplet Loss
Stars: ✭ 76 (+123.53%)
Mutual labels:  triplet-loss
mnist-challenge
My solution to TUM's Machine Learning MNIST challenge 2016-2017 [winner]
Stars: ✭ 68 (+100%)
Mutual labels:  mnist
CNN Own Dataset
CNN example for training your own datasets.
Stars: ✭ 25 (-26.47%)
Mutual labels:  mnist
AdaBound-tensorflow
An optimizer that trains as fast as Adam and as good as SGD in Tensorflow
Stars: ✭ 44 (+29.41%)
Mutual labels:  mnist
mnist test
mnist with Tensorflow
Stars: ✭ 30 (-11.76%)
Mutual labels:  mnist
mnist-flask
A Flask web app for handwritten digit recognition using machine learning
Stars: ✭ 34 (+0%)
Mutual labels:  mnist
rust-simple-nn
Simple neural network implementation in Rust
Stars: ✭ 24 (-29.41%)
Mutual labels:  mnist
Python-TensorFlow-WebApp
Emerging Technologies Project - 4th Year 2017
Stars: ✭ 16 (-52.94%)
Mutual labels:  mnist
minimal wgan
A minimal implementation of Wasserstein GAN
Stars: ✭ 44 (+29.41%)
Mutual labels:  mnist
tf-triplet-demo
This project implements triplet loss and semi-hard mining in tensorflow.
Stars: ✭ 13 (-61.76%)
Mutual labels:  triplet-loss

keras-triplet-center-loss

A simple Keras implementation of Triplet-Center Loss on the MNIST dataset. For reference, this repository also includes implementations of two related losses: Center Loss and Triplet Loss.

The Center-Loss implementation is from shamangary: https://github.com/shamangary/Keras-MNIST-center-loss-with-visualization

The Triplet-Loss implementation is from KinWaiCheuk: https://github.com/KinWaiCheuk/Triplet-net-keras


Triplet-Center Loss

Triplet-Center Loss was introduced by He et al. in https://arxiv.org/abs/1803.06189. It is a "hybrid" of Center Loss and Triplet Loss that maximises inter-class distance while minimising intra-class distance.
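The idea can be sketched numerically: for each sample, the distance to its own class center should be smaller, by a margin, than the distance to the nearest other class center. The NumPy version below is an illustrative sketch (squared Euclidean distances, a fixed set of centers), not the code from this repository:

```python
import numpy as np

def triplet_center_loss(embeddings, labels, centers, margin=1.0):
    """Sketch of triplet-center loss (He et al., 2018).

    Penalises samples whose distance to their own class center is not
    at least `margin` smaller than the distance to the nearest other
    class center.
    """
    # dists[i, j] = squared Euclidean distance from embedding i to center j
    dists = ((embeddings[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)

    n = len(labels)
    pos = dists[np.arange(n), labels]      # distance to own class center
    masked = dists.copy()
    masked[np.arange(n), labels] = np.inf  # exclude the own center
    neg = masked.min(axis=1)               # nearest other class center

    # Hinge: only penalise when the margin is violated
    return np.maximum(pos + margin - neg, 0.0).mean()
```

Because the "negative" is always the nearest center rather than a mined sample, no triplet-selection step is needed.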

Details

This repository shows a simple implementation on MNIST or, alternatively, Fashion-MNIST.

Running main.py sequentially starts four training routines with four different losses:

  • Categorical Crossentropy only
  • Center-loss + Categorical Crossentropy
  • Triplet-loss + Categorical Crossentropy
  • Triplet-Center loss + Categorical Crossentropy
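A combined objective of this kind is typically built in Keras as a model with two outputs: categorical crossentropy on the softmax head and the metric loss on the embedding head. The architecture, layer names, and placeholder auxiliary loss below are illustrative assumptions, not the code from this repository:

```python
import tensorflow as tf
from tensorflow import keras

def aux_metric_loss(y_true, y_pred):
    # Placeholder for the auxiliary loss on the embedding output;
    # in the real project this would be center, triplet, or
    # triplet-center loss.
    return tf.reduce_mean(tf.reduce_sum(tf.square(y_pred), axis=1))

inputs = keras.Input(shape=(28, 28, 1))
x = keras.layers.Flatten()(inputs)
x = keras.layers.Dense(128, activation="relu")(x)
# 2-D embedding, convenient for direct visualization
embedding = keras.layers.Dense(2, name="embedding")(x)
logits = keras.layers.Dense(10, activation="softmax", name="softmax")(embedding)

model = keras.Model(inputs, [logits, embedding])
model.compile(
    optimizer="adam",
    loss={"softmax": "categorical_crossentropy",
          "embedding": aux_metric_loss},
    # Weight of the metric loss relative to crossentropy (assumed value)
    loss_weights={"softmax": 1.0, "embedding": 0.1},
)
```

The per-output loss dictionary keyed by layer name is standard Keras; swapping the auxiliary loss function is enough to move between the four training routines above.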

The results of these models, including TensorBoard summaries, are saved in the runs folder. t-SNE is also run on the embeddings to visualize how the network's internal representation changes with each loss.
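A t-SNE plot of this kind can be reproduced with scikit-learn. The sketch below uses random stand-in embeddings where the real pipeline would use the network's embedding-layer activations and the true digit labels:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend for saving figures
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Stand-ins for the learned embeddings and their digit labels
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((500, 64))
labels = rng.integers(0, 10, size=500)

# Project to 2-D; colouring by class shows how well classes separate
coords = TSNE(n_components=2, init="pca", random_state=0).fit_transform(embeddings)
plt.scatter(coords[:, 0], coords[:, 1], c=labels, cmap="tab10", s=5)
plt.savefig("tsne_embeddings.png")
```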


triplet-center loss, T-SNE on internal representation (Train Data):

Image of Triplet_Center_Loss


Center loss, T-SNE on internal representation (Train Data):

Image of Center_Loss


Triplet loss, T-SNE on internal representation (Train Data):

Image of Triplet_Loss

As can be seen, the triplet-center loss maximises inter-class distance, like the Triplet Loss, while keeping the Center Loss characteristic of minimising intra-class distance. Another advantage of triplet-center loss is that it does not require the advanced batching and triplet-mining techniques that Triplet Loss does.
