
eggie5 / NCE-loss

Licence: other
TensorFlow NCE loss in Keras

Programming Languages

python

Projects that are alternatives of or similar to NCE-loss

text-classification-cn
Practical Chinese text classification based on the Sogou news corpus, using traditional machine-learning methods as well as pre-trained models.
Stars: ✭ 81 (+170%)
Mutual labels:  word2vec
word2vec
Use word2vec to improve search results
Stars: ✭ 63 (+110%)
Mutual labels:  word2vec
gonnp
📉 Deep learning from scratch using Go; specializes in natural language processing.
Stars: ✭ 26 (-13.33%)
Mutual labels:  word2vec
test word2vec uyghur
An example that tries out the Uyghur script with the word2vec algorithm in Python's gensim library.
Stars: ✭ 15 (-50%)
Mutual labels:  word2vec
word2vec-from-scratch-with-python
A very simple, bare-bones, inefficient implementation of skip-gram word2vec from scratch in Python
Stars: ✭ 85 (+183.33%)
Mutual labels:  word2vec
Deep-Learning-Mahjong---
Reinforcement learning (RL) implementation of the imperfect-information game Mahjong, using Markov decision processes to predict future game states
Stars: ✭ 45 (+50%)
Mutual labels:  softmax
RecSys PyTorch
PyTorch implementations of Top-N recommendation and collaborative-filtering recommenders.
Stars: ✭ 125 (+316.67%)
Mutual labels:  recsys
DeepLearning-Lab
Code lab for deep learning, including RNN, seq2seq, word2vec, cross entropy, bidirectional RNN, convolution and pooling operations, InceptionV3, and transfer learning.
Stars: ✭ 83 (+176.67%)
Mutual labels:  word2vec
Word2VecJava
Word2Vec in Java (based on Google's 2013 open-source word2vec)
Stars: ✭ 13 (-56.67%)
Mutual labels:  word2vec
adversarial-recommender-systems-survey
The goal of this survey is two-fold: (i) to present recent advances on adversarial machine learning (AML) for the security of RS (i.e., attacking and defense recommendation models), (ii) to show another successful application of AML in generative adversarial networks (GANs) for generative applications, thanks to their ability for learning (high-…
Stars: ✭ 110 (+266.67%)
Mutual labels:  recsys
wmd4j
wmd4j is a Java library for calculating Word Mover's Distance (WMD)
Stars: ✭ 31 (+3.33%)
Mutual labels:  word2vec
stylegan-encoder
StyleGAN Encoder - converts real images to latent space
Stars: ✭ 694 (+2213.33%)
Mutual labels:  loss-functions
doc2vec-golang
doc2vec and word2vec implemented in Go; word-embedding representations
Stars: ✭ 33 (+10%)
Mutual labels:  word2vec
word2vec
Rust interface to word2vec.
Stars: ✭ 22 (-26.67%)
Mutual labels:  word2vec
dnn-lstm-word-segment
Chinese word segmentation based on deep learning and an LSTM neural network
Stars: ✭ 24 (-20%)
Mutual labels:  word2vec
consistency
Implementation of models in our EMNLP 2019 paper: A Logic-Driven Framework for Consistency of Neural Models
Stars: ✭ 26 (-13.33%)
Mutual labels:  loss-functions
pycsou
Pycsou is a Python 3 package for solving linear inverse problems with state-of-the-art proximal algorithms. The software implements, in a highly modular way, the main building blocks (cost functionals, penalty terms, and linear operators) of generic penalised convex optimisation problems.
Stars: ✭ 37 (+23.33%)
Mutual labels:  loss-functions
hierarchical-categories-loss-tensorflow
A loss function for categories with a hierarchical structure.
Stars: ✭ 26 (-13.33%)
Mutual labels:  loss-functions
chainer-notebooks
Jupyter notebooks for Chainer hands-on
Stars: ✭ 23 (-23.33%)
Mutual labels:  word2vec
navec
Compact, high-quality word embeddings for the Russian language
Stars: ✭ 118 (+293.33%)
Mutual labels:  word2vec

Keras NCE-loss

Keras implementation of the candidate-sampling technique called Noise Contrastive Estimation (NCE). This is a Keras layer that wraps the TensorFlow implementation of NCE loss.

Gutmann, M., & Hyvärinen, A. Noise-contrastive estimation: A new estimation principle for unnormalized statistical models. AISTATS 2010.
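
In brief, NCE turns the intractable softmax over all classes into a binary classification task: distinguish the observed target from k samples drawn from a noise distribution q. A sketch of the per-example objective (notation adapted from the paper; not code from this repo):

```latex
% Maximized per example; s_\theta(x, y) is the model's unnormalized logit
% for target y given input x, q is the noise distribution, and k is the
% number of noise samples.
J(\theta) = \log \sigma\big(\Delta_\theta(x, y)\big)
          + \sum_{i=1}^{k} \log\big(1 - \sigma(\Delta_\theta(x, \bar{y}_i))\big),
\qquad \bar{y}_i \sim q,
\qquad \Delta_\theta(x, y) = s_\theta(x, y) - \log\big(k\, q(y)\big)
```

TensorFlow's `tf.nn.nce_loss` computes (approximately) the negative of this objective from sampled logits, which is the implementation this layer delegates to.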

NCE Background Document: http://www.eggie5.com/134-nce-Noise-contrastive-Estimation-Loss
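
For orientation, here is a minimal sketch of what a Keras layer wrapping `tf.nn.nce_loss` can look like. The constructor arguments (`num_classes`, `neg_samples`) and the `add_loss` wiring are illustrative assumptions; the actual `nce.py` in this repo may differ in detail. The usage example below imports the real layer via `from nce import NCE`.

```python
import tensorflow as tf
from keras.layers import Layer


class NCE(Layer):
    """Sketch of an NCE layer wrapping tf.nn.nce_loss (assumed API)."""

    def __init__(self, num_classes, neg_samples=100, **kwargs):
        self.num_classes = num_classes  # size of the full softmax
        self.neg_samples = neg_samples  # noise samples drawn per batch
        super().__init__(**kwargs)

    def build(self, input_shape):
        # input_shape is [(batch, dim), (batch, 1)]: hidden vectors + target ids
        dim = input_shape[0][-1]
        self.W = self.add_weight(
            name="nce_W", shape=(self.num_classes, dim), initializer="glorot_uniform"
        )
        self.b = self.add_weight(
            name="nce_b", shape=(self.num_classes,), initializer="zeros"
        )
        super().build(input_shape)

    def call(self, inputs):
        hidden, targets = inputs
        loss = tf.reduce_mean(
            tf.nn.nce_loss(
                weights=self.W,
                biases=self.b,
                labels=tf.cast(targets, tf.int64),  # shape (batch, 1)
                inputs=hidden,
                num_sampled=self.neg_samples,
                num_classes=self.num_classes,
            )
        )
        # Register the sampled loss with the model so compile(loss=None) works.
        self.add_loss(loss)
        # Return the full logits for inspection / inference-time ranking.
        return tf.matmul(hidden, self.W, transpose_b=True) + self.b
```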

```python
from keras.layers import (
    Input,
    Dense,
    Embedding,
    Flatten,
)
from keras.models import Model
import numpy as np

from nce import NCE


def build(num_items, num_users, k):
    """Embed an item id, pass it through a hidden layer, then score it
    against the user softmax via the NCE layer."""

    iid = Input(shape=(1,), dtype="int32", name="iids")
    targets = Input(shape=(1,), dtype="int32", name="target_ids")

    item_embedding = Embedding(
        input_dim=num_items, output_dim=k, input_length=1, name="item_embedding"
    )
    selected_items = Flatten()(item_embedding(iid))

    h1 = Dense(k // 2, activation="relu", name="hidden")(selected_items)

    # The NCE layer takes the hidden representation plus the true target
    # ids, registers the NCE loss internally, and outputs the logits.
    sm_logits = NCE(num_users, name="nce")([h1, targets])

    model = Model(inputs=[iid, targets], outputs=[sm_logits])
    return model


K = 10  # item embedding dimension
SAMPLE_SIZE = 10000
num_items = 10000
NUM_USERS = 1000000  # size of the output softmax

model = build(num_items, NUM_USERS, K)
# loss=None: the NCE layer contributes the training loss itself.
model.compile(optimizer="adam", loss=None)
model.summary()

# Toy data: random item ids and dummy target user ids.
# np.random.randint replaces the deprecated np.random.random_integers.
x = np.random.randint(num_items, size=SAMPLE_SIZE)
y = np.ones(SAMPLE_SIZE, dtype="int32")  # int32 to match the Input dtype
X = [x, y]
print(x.shape, y.shape)

model.fit(x=X, batch_size=100, epochs=1)
```
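
Assuming the layer registers its own loss (as in the sketch above), `loss=None` in `compile` is sufficient: Keras then minimizes only the layer-added NCE loss. At inference time the sampled loss is irrelevant, and the full logits returned by the layer can be ranked directly.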