
jxgu1016 / Mnist_center_loss_pytorch

License: MIT
A PyTorch implementation of center loss on MNIST

Programming Languages

Python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Mnist_center_loss_pytorch

Multi-Face-Comparison
This repo provides a backend API for face comparison and computer vision. It is built on the Python Flask framework.
Stars: ✭ 20 (-92.81%)
Mutual labels:  deeplearning
deep sort
Deep Sort algorithm C++ version
Stars: ✭ 60 (-78.42%)
Mutual labels:  deeplearning
Deeplearning.ai Assignments
Stars: ✭ 268 (-3.6%)
Mutual labels:  deeplearning
Similarity-Adaptive-Deep-Hashing
Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization (TPAMI2018)
Stars: ✭ 18 (-93.53%)
Mutual labels:  deeplearning
dilation-keras
Multi-Scale Context Aggregation by Dilated Convolutions in Keras.
Stars: ✭ 72 (-74.1%)
Mutual labels:  deeplearning
Jejunet
Real-Time Video Segmentation on Mobile Devices with DeepLab V3+, MobileNet V2. Worked on the project in 🏝 Jeju island
Stars: ✭ 258 (-7.19%)
Mutual labels:  deeplearning
Ensemble-Pytorch
A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
Stars: ✭ 407 (+46.4%)
Mutual labels:  deeplearning
Shendusuipian
To know stats by heart
Stars: ✭ 275 (-1.08%)
Mutual labels:  deeplearning
Yolov5-deepsort-driverDistracted-driving-behavior-detection
A deep-learning-based early-warning system for distracted driving (fatigue + dangerous behaviors), using YOLOv5 + DeepSort to monitor and warn about dangerous driver behavior.
Stars: ✭ 107 (-61.51%)
Mutual labels:  deeplearning
In Prestissimo
A very fast neural network computing framework optimized for mobile platforms. QQ group: 676883532 (verification message: 绝影)
Stars: ✭ 268 (-3.6%)
Mutual labels:  deeplearning
recurrent-defocus-deblurring-synth-dual-pixel
Reference github repository for the paper "Learning to Reduce Defocus Blur by Realistically Modeling Dual-Pixel Data". We propose a procedure to generate realistic DP data synthetically. Our synthesis approach mimics the optical image formation found on DP sensors and can be applied to virtual scenes rendered with standard computer software. Lev…
Stars: ✭ 30 (-89.21%)
Mutual labels:  deeplearning
HistoGAN
Reference code for the paper HistoGAN: Controlling Colors of GAN-Generated and Real Images via Color Histograms (CVPR 2021).
Stars: ✭ 158 (-43.17%)
Mutual labels:  deeplearning
Fixmatch Pytorch
Unofficial PyTorch implementation of "FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence"
Stars: ✭ 259 (-6.83%)
Mutual labels:  deeplearning
Deep-Learning
It contains the coursework and the practice I have done while learning Deep Learning.🚀 👨‍💻💥 🚩🌈
Stars: ✭ 21 (-92.45%)
Mutual labels:  deeplearning
Randwirenn
PyTorch implementation of "Exploring Randomly Wired Neural Networks for Image Recognition"
Stars: ✭ 270 (-2.88%)
Mutual labels:  deeplearning
battery-rul-estimation
Remaining Useful Life (RUL) estimation of Lithium-ion batteries using deep LSTMs
Stars: ✭ 25 (-91.01%)
Mutual labels:  deeplearning
Data-Analysis
Different types of data analytics projects: EDA, PDA, DDA, TSA, and much more.
Stars: ✭ 22 (-92.09%)
Mutual labels:  deeplearning
Bert Ch Ner
Chinese named entity recognition based on BERT
Stars: ✭ 274 (-1.44%)
Mutual labels:  deeplearning
Python web crawler da ml dl
Python from the most basic syntax, through networking, front-end, back-end, and web-crawling and data fundamentals, toward machine learning
Stars: ✭ 272 (-2.16%)
Mutual labels:  deeplearning
Pytorch Correlation Extension
Custom implementation of the Correlation Module
Stars: ✭ 268 (-3.6%)
Mutual labels:  deeplearning

UPDATE(Oct. 2018)

After dropping the bias of the last fully connected layer, as suggested in the linked issue, the centers tend to distribute around a circle, as reported in the original paper.
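
In the model definition this amounts to something along the lines of the snippet below (an illustrative sketch assuming the 2-D feature / 10-class setup used for the visualizations; the variable name is hypothetical):

```python
import torch.nn as nn

# Last fully connected layer mapping the 2-D features to the 10 class logits.
# Without a bias term, every decision boundary passes through the origin,
# which is why the learned centers tend to spread out around a circle.
fc_out = nn.Linear(2, 10, bias=False)
```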

UPDATE(May. 2018)

Migration to PyTorch 0.4 done!

UPDATE(Apr. 2018)

Thanks to @wenfahu for optimizing backward().

UPDATE(Mar. 2018)

The problems reported in the NOTIFICATION have now been SOLVED! Functionally, this repo is now exactly the same as the official one. The new result is shown below and looks similar to the former one. If you want to try the former version, please go back to the commits of Feb 12, 2018.

Some of the code that calculates Eq. 4 in backward() can and should be optimized to replace the for-loop; feel free to open a pull request.

NOTIFICATION(Feb. 2018)

In the beginning, this was just a practice project to get familiar with PyTorch. Surprisingly, I didn't expect that so many researchers would follow my center loss repo. Given that, I'd like to point out that this implementation is not exactly the same as the official one.

If you read the equations in the paper carefully, the definition of center loss in Eq. 2 can only lead you to Eq. 3, but the update equation for the centers in Eq. 4 cannot be derived from the ordinary differentiation rules. Unless they are specified explicitly, the derivatives of a module are determined by its forward operation, following PyTorch's autograd strategy. Given that Eq. 3 and Eq. 4 are incompatible, only one of them can be implemented exactly, and I chose the latter. If you remove the centers_count in my code, you fall back to Eq. 3.
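
For reference, the three equations in question are reproduced below (m is the mini-batch size, c_{y_i} the center of class y_i, and δ(·) equals 1 when the condition holds and 0 otherwise):

```latex
% Eq. 2: the center loss itself
\mathcal{L}_C = \frac{1}{2} \sum_{i=1}^{m} \lVert x_i - c_{y_i} \rVert_2^2

% Eq. 3: its gradient with respect to the features
\frac{\partial \mathcal{L}_C}{\partial x_i} = x_i - c_{y_i}

% Eq. 4: the prescribed center update, averaged over the class counts
\Delta c_j = \frac{\sum_{i=1}^{m} \delta(y_i = j)\,(c_j - x_i)}{1 + \sum_{i=1}^{m} \delta(y_i = j)}
```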

This problem also exists in other implementations; its impact remains unknown but looks harmless.

TO DO: Specify the derivatives explicitly, just as the original Caffe repo does, instead of having them computed by the autograd system.
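
A minimal sketch of what such a hand-specified backward() could look like is shown below. This is illustrative only, not the exact code of this repo; the class name and tensor layout are assumptions (features of shape (batch, feat_dim), integer class labels, centers of shape (num_classes, feat_dim)):

```python
import torch
from torch.autograd import Function


class CenterLossFunc(Function):
    """Center loss with a hand-written backward (illustrative sketch)."""

    @staticmethod
    def forward(ctx, features, labels, centers):
        # Eq. 2: L_C = 1/2 * sum_i ||x_i - c_{y_i}||^2
        ctx.save_for_backward(features, labels, centers)
        centers_batch = centers.index_select(0, labels)
        return (features - centers_batch).pow(2).sum() / 2.0

    @staticmethod
    def backward(ctx, grad_output):
        features, labels, centers = ctx.saved_tensors
        centers_batch = centers.index_select(0, labels)
        diff = centers_batch - features  # c_{y_i} - x_i

        # Eq. 3: dL_C/dx_i = x_i - c_{y_i}
        grad_features = -grad_output * diff

        # Eq. 4: sum the differences per class and divide by (1 + class count),
        # instead of letting autograd derive the centers' gradient from Eq. 2.
        counts = torch.ones(centers.size(0), device=features.device)  # the "+1" term
        counts.scatter_add_(0, labels, torch.ones(labels.size(0), device=features.device))
        grad_centers = torch.zeros_like(centers)
        grad_centers.scatter_add_(0, labels.unsqueeze(1).expand_as(features), diff)
        grad_centers = grad_centers / counts.unsqueeze(1)
        # Dropping the division above falls back to the plain Eq. 3 behaviour.

        return grad_features, None, grad_output * grad_centers
```

Used as `CenterLossFunc.apply(features, labels, centers)` inside an `nn.Module` that stores `centers` as an `nn.Parameter`, an SGD step on `centers` then reproduces the paper's update rule for the centers.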

MNIST_center_loss_pytorch

A PyTorch implementation of center loss on MNIST, as a toy example of the ECCV 2016 paper A Discriminative Feature Learning Approach for Deep Face Recognition.

To ease classification, center loss is designed to make the samples within each class flock together around a learned class center.
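
As a rough illustration of how the two losses work together, here is a self-contained sketch with hypothetical names, not this repo's code; the center-loss module below takes the plain autograd route of Eq. 2/3, without the Eq. 4 averaging discussed in the NOTIFICATION above:

```python
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """Small MNIST net that exposes a 2-D feature layer plus class logits."""

    def __init__(self, num_classes=10, feat_dim=2):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, feat_dim)
        )
        self.classifier = nn.Linear(feat_dim, num_classes, bias=False)

    def forward(self, x):
        feat = self.backbone(x)
        return feat, self.classifier(feat)


class CenterLoss(nn.Module):
    """One trainable center per class; penalizes distance of features to their center."""

    def __init__(self, num_classes=10, feat_dim=2):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, feat, labels):
        return (feat - self.centers[labels]).pow(2).sum(dim=1).mean() / 2.0


model, center_loss = TinyNet(), CenterLoss()
ce_loss = nn.CrossEntropyLoss()
lam = 1.0  # weight of the center-loss term
optimizer = torch.optim.SGD(
    list(model.parameters()) + list(center_loss.parameters()), lr=0.01, momentum=0.9
)

# One training step on a stand-in batch; use a real MNIST DataLoader in practice.
images, labels = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
feat, logits = model(images)
loss = ce_loss(logits, labels) + lam * center_loss(feat, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```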

Results are shown below:

- softmax loss and center loss (new)
- softmax loss and center loss (old)
- softmax loss only

The code also includes a visualization of the training process; please wait for the GIFs below to load (a sketch of how such a plot can be produced follows the list).

- softmax loss and center loss (new)
- softmax loss and center loss (old)
- softmax loss only
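
The GIFs are scatter plots of the 2-D deep features colored by digit class. A plot of that kind can be produced along these lines (hypothetical variable names; `feats` and `labels` would normally be collected by running the trained model over the test set):

```python
import matplotlib.pyplot as plt
import torch

# Stand-in data for illustration: (N, 2) features and (N,) digit labels.
feats = torch.randn(1000, 2)
labels = torch.randint(0, 10, (1000,))

for digit in range(10):
    mask = labels == digit
    plt.scatter(feats[mask, 0], feats[mask, 1], s=4, label=str(digit))
plt.legend(loc="upper right", markerscale=3)
plt.title("2-D deep features on MNIST")
plt.savefig("features.png")
```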