

The PyTorch Implementation of L-Softmax

This repository contains a new, clean, and enhanced PyTorch implementation of L-Softmax, proposed in the following paper:

Large-Margin Softmax Loss for Convolutional Neural Networks by Weiyang Liu, Yandong Wen, Zhiding Yu, and Meng Yang [PDF on arXiv] [original Caffe code by the authors]

L-Softmax modifies the standard softmax classification loss to increase inter-class separability and intra-class compactness.
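For reference, the paper writes the logit for class j as ||W_j|| ||x_i|| cos(θ_j) and replaces the target-class cosine with a margin function ψ; a transcription in the paper's notation (its Eqs. 5-6) looks like:

```latex
L_i = -\log
  \frac{e^{\|W_{y_i}\|\,\|x_i\|\,\psi(\theta_{y_i})}}
       {e^{\|W_{y_i}\|\,\|x_i\|\,\psi(\theta_{y_i})}
        + \sum_{j \neq y_i} e^{\|W_j\|\,\|x_i\|\,\cos(\theta_j)}},
\qquad
\psi(\theta) = (-1)^k \cos(m\theta) - 2k,
\quad \theta \in \left[\tfrac{k\pi}{m}, \tfrac{(k+1)\pi}{m}\right],\; k \in \{0, \dots, m-1\},
```

where m is the integer margin: larger m demands a larger angular gap between the target class and the others.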

This re-implementation is based on the earlier PyTorch implementation here by jihunchoi and borrows some ideas from the TensorFlow implementation here by auroua. The main improvements are as follows:

  • [x] Feature visualization, as depicted in the original paper, via the vis argument in the code
  • [x] Cleaner and more readable code
  • [x] More comments in the lsoftmax.py file for future readers
  • [x] Variable names that correspond more closely to the original paper
  • [x] Updated PyTorch 0.4.1 syntax and API
  • [x] Two models are provided: the smaller network that reproduces the visualization in the paper's Fig. 2, and the original MNIST model
  • [x] The lambda optimization (the beta variable in the code), missing from the earlier PyTorch code, has been added (refer to Section 5.1 in the original paper and the sketch after this list)
  • [x] The numerical error of torch.acos (NaN for inputs that drift outside [-1, 1]) has been addressed (also shown in the sketch after this list)
  • [x] Training logs are provided in the Logs folder
  • [x] Some other minor performance improvements
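The last two fixes above can be summarized in a short sketch. This is an illustrative re-derivation under stated assumptions, not the repo's exact lsoftmax.py; the function name and arguments are hypothetical, but the clamping and the beta-weighted logit follow the paper's Section 5.1:

```python
import math
import torch

def lsoftmax_target_logit(w_norm, x_norm, cos_theta, margin, beta):
    """Illustrative sketch of the target-class logit (names are hypothetical).

    Computes f = (beta * |W||x| cos(theta) + |W||x| psi(theta)) / (1 + beta),
    where psi(theta) = (-1)^k cos(m*theta) - 2k on [k*pi/m, (k+1)*pi/m].
    """
    # Clamp before acos: floating-point error can push cos_theta slightly
    # outside [-1, 1], where torch.acos would return NaN.
    eps = 1e-7
    theta = torch.acos(cos_theta.clamp(-1.0 + eps, 1.0 - eps))

    # k indexes the angular interval [k*pi/m, (k+1)*pi/m] containing theta.
    k = torch.floor(theta * margin / math.pi)

    # psi(theta) = (-1)^k * cos(m * theta) - 2k (Eq. 6 in the paper);
    # 1 - 2 * (k mod 2) evaluates (-1)^k without a negative-base power.
    sign = 1.0 - 2.0 * (k % 2)
    psi = sign * torch.cos(margin * theta) - 2.0 * k

    # beta (the paper's lambda, Section 5.1) starts large and anneals
    # towards 0, easing the loss from plain softmax into full L-Softmax.
    scale = w_norm * x_norm
    return (beta * scale * cos_theta + scale * psi) / (1.0 + beta)
```

In the paper's scheme, lambda is typically decayed multiplicatively during training and floored at a small minimum, so optimization begins close to plain softmax and gradually tightens toward the full margin.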

Version compatibility

This code has been tested on Ubuntu 18.04 LTS using the PyCharm IDE and an NVIDIA 1080 Ti GPU. Here is a list of libraries and their corresponding versions:

python = 3.6
pytorch = 0.4.1
torchvision = 0.2.1
matplotlib = 2.2.2
numpy = 1.14.3
scipy = 1.1.0

Network parameters

  • batch_size = 256
  • max epochs = 100
  • learning rate = 0.1 (0.01 at epoch 50 and 0.001 at epoch 65)
  • SGD with momentum = 0.9
  • weight_decay = 0.0005
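As a sketch, these settings map onto PyTorch's SGD and MultiStepLR as follows (the net below is a stand-in module, not the repo's actual model):

```python
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(2, 10)  # stand-in; the repo's MNIST model would go here
optimizer = optim.SGD(net.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=0.0005)
# lr = 0.1, dropped to 0.01 at epoch 50 and to 0.001 at epoch 65:
scheduler = optim.lr_scheduler.MultiStepLR(optimizer,
                                           milestones=[50, 65], gamma=0.1)

for epoch in range(100):  # max epochs = 100
    # ... one full training pass over the batch_size = 256 loader goes here ...
    scheduler.step()
```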

Results

Here are the test-set visualization results of training on MNIST with different margins:

  • This plot was generated using the smaller network proposed in the paper, for visualization purposes only, with batch size = 64, a constant learning rate of 0.01 for 10 epochs, and no weight-decay regularization; a minimal plotting sketch follows.
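A minimal sketch of how such a 2-D feature plot can be drawn, assuming the features and labels have already been collected from a test pass (the arrays below are random stand-ins, and this is not the repo's actual vis code):

```python
import matplotlib.pyplot as plt
import numpy as np

# Stand-in data: in practice these would be the (N, 2) feature embeddings
# and (N,) digit labels collected from the test set.
features = np.random.randn(1000, 2)
labels = np.random.randint(0, 10, size=1000)

# One scatter per class, so each digit gets its own color and legend entry.
for digit in range(10):
    mask = labels == digit
    plt.scatter(features[mask, 0], features[mask, 1], s=2, label=str(digit))
plt.legend(markerscale=4, title='digit')
plt.title('MNIST test-set features')
plt.show()
```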

And here are the tabulated results of training MNIST with the network proposed in the paper:

| margin | test accuracy | paper  |
|:------:|:-------------:|:------:|
| m = 1  | 99.37%        | 99.60% |
| m = 2  | 99.60%        | 99.68% |
| m = 3  | 99.56%        | 99.69% |
| m = 4  | 99.61%        | 99.69% |
  • The test-accuracy values are the maximum test accuracy from running the code only once with the network parameters above.