
thomasahle / codenames

License: GPL-3.0
Codenames AI using Word Vectors

Programming Languages

python
139,335 projects - #7 most used programming language
shell
77,523 projects

Projects that are alternatives of or similar to codenames

Shallowlearn
An experiment about re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText) with some additional exclusive features and nice API. Written in Python and fully compatible with Scikit-learn.
Stars: ✭ 196 (+378.05%)
Mutual labels:  word2vec, word-embeddings
Simple-Sentence-Similarity
Exploring the simple sentence similarity measurements using word embeddings
Stars: ✭ 99 (+141.46%)
Mutual labels:  word2vec, word-embeddings
Chameleon recsys
Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (+392.68%)
Mutual labels:  word2vec, word-embeddings
Debiaswe
Remove problematic gender bias from word embeddings.
Stars: ✭ 175 (+326.83%)
Mutual labels:  word2vec, word-embeddings
Arabic-Word-Embeddings-Word2vec
Arabic Word Embeddings Word2vec
Stars: ✭ 26 (-36.59%)
Mutual labels:  word2vec, word-embeddings
Germanwordembeddings
Toolkit to obtain and preprocess german corpora, train models using word2vec (gensim) and evaluate them with generated testsets
Stars: ✭ 189 (+360.98%)
Mutual labels:  word2vec, word-embeddings
lda2vec
Mixing Dirichlet Topic Models and Word Embeddings to Make lda2vec from this paper https://arxiv.org/abs/1605.02019
Stars: ✭ 27 (-34.15%)
Mutual labels:  word2vec, word-embeddings
Dna2vec
dna2vec: Consistent vector representations of variable-length k-mers
Stars: ✭ 117 (+185.37%)
Mutual labels:  word2vec, word-embeddings
word-benchmarks
Benchmarks for intrinsic word embeddings evaluation.
Stars: ✭ 45 (+9.76%)
Mutual labels:  word2vec, word-embeddings
word2vec-on-wikipedia
A pipeline for training word embeddings using word2vec on wikipedia corpus.
Stars: ✭ 68 (+65.85%)
Mutual labels:  word2vec, word-embeddings
Gensim
Topic Modelling for Humans
Stars: ✭ 12,763 (+31029.27%)
Mutual labels:  word2vec, word-embeddings
sentiment-analysis-of-tweets-in-russian
Sentiment analysis of tweets in Russian using Convolutional Neural Networks (CNN) with Word2Vec embeddings.
Stars: ✭ 51 (+24.39%)
Mutual labels:  word2vec, word-embeddings
Fasttext.js
FastText for Node.js
Stars: ✭ 127 (+209.76%)
Mutual labels:  word2vec, word-embeddings
wikidata-corpus
Train Wikidata with word2vec for word embedding tasks
Stars: ✭ 109 (+165.85%)
Mutual labels:  word2vec, word-embeddings
Scattertext
Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+4100%)
Mutual labels:  word2vec, word-embeddings
Koan
A word2vec negative sampling implementation with correct CBOW update.
Stars: ✭ 232 (+465.85%)
Mutual labels:  word2vec, word-embeddings
Text Summarizer
Python Framework for Extractive Text Summarization
Stars: ✭ 96 (+134.15%)
Mutual labels:  word2vec, word-embeddings
Magnitude
A fast, efficient universal vector embedding utility package.
Stars: ✭ 1,394 (+3300%)
Mutual labels:  word2vec, word-embeddings
two-stream-cnn
A two-stream convolutional neural network for learning arbitrary similarity functions over two sets of training data
Stars: ✭ 24 (-41.46%)
Mutual labels:  word2vec, word-embeddings
word2vec-tsne
Google News and Leo Tolstoy: Visualizing Word2Vec Word Embeddings using t-SNE.
Stars: ✭ 59 (+43.9%)
Mutual labels:  word2vec, word-embeddings

Play Codenames with GloVe

This repository implements a simple single-player version of the Codenames board game by Vlaada Chvátil. You can play as the agent or the spymaster, and the GloVe word vectors will take the role of your partner as you try to find the 8 marked words in as few rounds as possible.

$ git clone git@github.com:thomasahle/codenames.git
...

$ sh get_glove.sh
...

$ python3 codenames.py
...Loading vectors
...Loading words
...Making word to index dict
...Loading codenames
Ready!

Will you be agent or spymaster?: agent

     buck       bat   pumpkin    charge      iron
     well      boot     chick superhero     glove
   stream   germany      sock    dragon scientist
     duck     bugle    school       ham   mammoth
   bridge      fair  triangle   capital      horn

Thinking....................

Clue: "golden 6" (certainty 7.78, remaining words 8)

Your guess: bridge
Correct!

How it works

The bot decides which words go well together by comparing their vectors in the GloVe embedding trained on Wikipedia text. This means that words that often occur in the same articles and sentences are judged to be similar. In the example above, golden is of course similar to bridge by association with the Golden Gate Bridge. Other words that were found to be similar were 'dragon', 'triangle', 'duck', 'iron' and 'horn'.
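The similarity comparison above can be sketched as cosine similarity between word vectors. The tiny 3-dimensional vectors below are illustrative values only; real GloVe vectors have hundreds of dimensions and are loaded from the files that get_glove.sh downloads.

```python
import math

# Toy stand-ins for GloVe embeddings (made-up values for illustration).
vectors = {
    "golden":  [0.9, 0.1, 0.3],
    "bridge":  [0.8, 0.2, 0.4],
    "pumpkin": [0.1, 0.9, 0.2],
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors: 1.0 means same direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words that co-occur in similar contexts end up with similar vectors,
# so "golden" scores much higher against "bridge" than against "pumpkin".
print(cosine_similarity(vectors["golden"], vectors["bridge"]))   # close to 1
print(cosine_similarity(vectors["golden"], vectors["pumpkin"]))  # much lower
```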

However, in Codenames the task is not merely to find words that describe other words well. You also need to make sure that 'bad words' are as different as possible from your clue. To achieve this, the bot tries to find a word that maximizes the similarity gap between the marked words and the bad words.
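One way to picture that gap, using hypothetical helper names rather than the actual code in codenames.py: a clue is only safe if its weakest link to the marked words is still stronger than its strongest link to any bad word.

```python
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def clue_gap(clue_vec, marked_vecs, bad_vecs):
    # Compare the weakest similarity to the words we want guessed
    # against the strongest similarity to the words we must avoid.
    weakest_marked = min(cosine_similarity(clue_vec, m) for m in marked_vecs)
    strongest_bad = max(cosine_similarity(clue_vec, b) for b in bad_vecs)
    return weakest_marked - strongest_bad

# A clue aligned with the marked words but orthogonal to the bad word
# gets a positive gap; the spymaster would pick the candidate clue
# whose gap is largest.
print(clue_gap([1.0, 0.0], [[0.9, 0.1], [0.8, 0.2]], [[0.0, 1.0]]))
```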

If you want the bot to be more aggressive in its clues (choosing larger groups), try changing the agg = .5 value near the top of codenames.py to a larger value, such as .8 or 1.5.
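One plausible way such a weight could enter the scoring, shown with a hypothetical formula (the actual expression in codenames.py may differ): the group-size bonus scales with agg, so a higher value lets a looser multi-word clue beat a tighter but smaller one.

```python
def adjusted_score(gap, group_size, agg=0.5):
    # Hypothetical illustration: reward larger clued groups in proportion
    # to agg, trading similarity-gap safety for ambition.
    return gap + agg * group_size

# At agg = 0.5, a tight two-word clue (gap 0.8) beats a loose
# three-word clue (gap 0.1)...
print(adjusted_score(0.8, 2, agg=0.5) > adjusted_score(0.1, 3, agg=0.5))
# ...but at agg = 1.5 the larger group wins despite its smaller gap.
print(adjusted_score(0.8, 2, agg=1.5) < adjusted_score(0.1, 3, agg=1.5))
```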
