
NegatioN / Onlineminingtripletloss

License: MIT
PyTorch conversion of https://omoindrot.github.io/triplet-loss

Projects that are alternatives of or similar to Onlineminingtripletloss

Flask Rest Setup
Notes on Flask REST API and tutorial
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Carnd Lenet Lab
Implement the LeNet deep neural network model with TensorFlow.
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Dash Sample Apps
Open-source demos hosted on Dash Gallery
Stars: ✭ 2,090 (+1572%)
Mutual labels:  jupyter-notebook
Code2pix
code2pix: Generating Graphical User Interfaces from Code (A Differentiable Compiler)
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Software Training
RoboJackets Software Training
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Error Detection
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Data Science
Every week, new material will be made available to guide your study of data science =)
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Deepinsight
A general framework for interpreting wide-band neural activity
Stars: ✭ 125 (+0%)
Mutual labels:  jupyter-notebook
Predictive Maintenance
Data Wrangling, EDA, Feature Engineering, Model Selection, Regression, Binary and Multi-class Classification (Python, scikit-learn)
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Ipyvolume
3d plotting for Python in the Jupyter notebook based on IPython widgets using WebGL
Stars: ✭ 1,696 (+1256.8%)
Mutual labels:  jupyter-notebook
Pygru4rec
PyTorch Implementation of Session-based Recommendations with Recurrent Neural Networks(ICLR 2016, Hidasi et al.)
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Simplegesturerecognition
A very simple gesture recognition technique using OpenCV and Python
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Gdeltpyr
Python-based framework to retrieve Global Database of Events, Language, and Tone (GDELT) version 1.0 and version 2.0 data.
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Off Nutrition Table Extractor
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
100 Days Of Nlp
Stars: ✭ 125 (+0%)
Mutual labels:  jupyter-notebook
Oc Nn
Repository for the One class neural networks paper
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Jupyterlab Demo
Demonstrations of JupyterLab
Stars: ✭ 122 (-2.4%)
Mutual labels:  jupyter-notebook
Pandaset Devkit
Stars: ✭ 121 (-3.2%)
Mutual labels:  jupyter-notebook
Huggingtweets
Tweet Generation with Huggingface
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Nb2mail
Send a notebook as an email
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook

online_triplet_loss

A PyTorch conversion of the excellent TensorFlow post on the same topic. It is simply an implementation of triplet loss with online mining of candidate triplets, as used in semi-supervised learning.
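
For intuition, here is a minimal, self-contained sketch (not this library's exact code) of what "batch hard" online mining computes: for every anchor in the batch, take the farthest embedding with the same label as the hardest positive and the closest embedding with a different label as the hardest negative, then apply a hinge at the margin.

import torch

def batch_hard_sketch(labels, embeddings, margin):
    # Pairwise Euclidean distances between all embeddings in the batch.
    dist = torch.cdist(embeddings, embeddings)

    # Same-label pairs (excluding self-pairs) are candidate positives,
    # different-label pairs are candidate negatives.
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos_mask = same & ~torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    neg_mask = ~same

    # Hardest positive: farthest embedding with the same label.
    hardest_pos = (dist * pos_mask).max(dim=1).values
    # Hardest negative: closest embedding with a different label.
    hardest_neg = dist.masked_fill(~neg_mask, float('inf')).min(dim=1).values

    # Hinge: only triplets that violate the margin contribute to the loss.
    return torch.relu(hardest_pos - hardest_neg + margin).mean()

The library's batch_hard_triplet_loss follows this general idea; see the examples below for actual usage.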

Install

pip install online_triplet_loss

Then import with: from online_triplet_loss.losses import *

PS: Requires PyTorch version 1.1.0 or above.

How to use

In these examples I use a really large margin, since the embedding space is so small. A more realistic margin seems to be between 0.1 and 2.0 (the training-loop sketch after the examples uses one in that range).

from torch import nn
import torch

from online_triplet_loss.losses import *

model = nn.Embedding(10, 10)
labels = torch.randint(high=10, size=(5,)) # our five labels

embeddings = model(labels)
print('Labels:', labels)
print('Embeddings:', embeddings)
loss = batch_hard_triplet_loss(labels, embeddings, margin=100)
print('Loss:', loss)
loss.backward()
Labels: tensor([6, 1, 3, 6, 6])
Embeddings: tensor([[-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500],
        [ 1.0757, -0.8420, -0.7630, -0.0746,  1.1545,  0.4017,  0.5587,  1.7947,
          0.1992, -2.2288],
        [ 0.2646,  1.2383,  0.1949,  0.5743, -0.8460, -0.9929, -2.0350,  0.2095,
          0.2129, -0.4855],
        [-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500],
        [-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500]], grad_fn=<EmbeddingBackward>)
Loss: tensor(95.1271, grad_fn=<MeanBackward0>)
# model, labels, and the import from the previous example are reused here
embeddings = model(labels)
print('Labels:', labels)
print('Embeddings:', embeddings)
loss, fraction_pos = batch_all_triplet_loss(labels, embeddings, squared=False, margin=100)
print('Loss:', loss)
loss.backward()
Labels: tensor([6, 1, 3, 6, 6])
Embeddings: tensor([[-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500],
        [ 1.0757, -0.8420, -0.7630, -0.0746,  1.1545,  0.4017,  0.5587,  1.7947,
          0.1992, -2.2288],
        [ 0.2646,  1.2383,  0.1949,  0.5743, -0.8460, -0.9929, -2.0350,  0.2095,
          0.2129, -0.4855],
        [-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500],
        [-1.1335,  0.3364, -3.0174, -0.8732, -0.9301,  1.3619,  0.3746,  0.0457,
          0.0180, -0.4500]], grad_fn=<EmbeddingBackward>)
tensor(94.9947, grad_fn=<DivBackward0>) tensor(1.)
Loss: tensor(94.9947, grad_fn=<DivBackward0>)
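
As a rough illustration of how this could fit into a training loop with a more realistic margin, here is a sketch; the model, batch construction, and hyperparameters are placeholders, not from this repository.

import torch
from torch import nn
from online_triplet_loss.losses import batch_hard_triplet_loss

model = nn.Embedding(10, 10)                      # toy encoder; swap in your own model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(100):
    labels = torch.randint(high=10, size=(32,))   # one batch of labels
    embeddings = model(labels)                    # one batch of embeddings

    # A margin in the more realistic 0.1-2.0 range mentioned above.
    loss = batch_hard_triplet_loss(labels, embeddings, margin=0.5)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()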

References

Original TensorFlow implementation and blog post: https://omoindrot.github.io/triplet-loss