
danaugrs / Go Tsne

License: BSD-3-Clause
t-Distributed Stochastic Neighbor Embedding (t-SNE) in Go

Programming Languages

go
31211 projects - #10 most used programming language

Projects that are alternatives to or similar to Go Tsne

Homlr
Supplementary material for Hands-On Machine Learning with R, an applied book covering the fundamentals of machine learning with R.
Stars: ✭ 185 (+20.92%)
Mutual labels:  data-science, unsupervised-learning
Pyod
A Python Toolbox for Scalable Outlier Detection (Anomaly Detection)
Stars: ✭ 5,083 (+3222.22%)
Mutual labels:  data-science, unsupervised-learning
Free Ai Resources
🚀 FREE AI Resources - 🎓 Courses, 👷 Jobs, 📝 Blogs, 🔬 AI Research, and many more - for everyone!
Stars: ✭ 192 (+25.49%)
Mutual labels:  data-science, unsupervised-learning
Csm
Code release for "Canonical Surface Mapping via Geometric Cycle Consistency"
Stars: ✭ 156 (+1.96%)
Mutual labels:  3d, unsupervised-learning
Php Ml
PHP-ML - Machine Learning library for PHP
Stars: ✭ 7,900 (+5063.4%)
Mutual labels:  data-science, unsupervised-learning
Danmf
A sparsity aware implementation of "Deep Autoencoder-like Nonnegative Matrix Factorization for Community Detection" (CIKM 2018).
Stars: ✭ 161 (+5.23%)
Mutual labels:  data-science, unsupervised-learning
Mlxtend
A library of extension and helper modules for Python's data analysis and machine learning libraries.
Stars: ✭ 3,729 (+2337.25%)
Mutual labels:  data-science, unsupervised-learning
Stanford Cs 229 Machine Learning
VIP cheatsheets for Stanford's CS 229 Machine Learning
Stars: ✭ 12,827 (+8283.66%)
Mutual labels:  data-science, unsupervised-learning
Tadw
An implementation of "Network Representation Learning with Rich Text Information" (IJCAI '15).
Stars: ✭ 43 (-71.9%)
Mutual labels:  data-science, unsupervised-learning
Susi
SuSi: Python package for unsupervised, supervised and semi-supervised self-organizing maps (SOM)
Stars: ✭ 42 (-72.55%)
Mutual labels:  data-science, unsupervised-learning
Sealion
The first machine learning framework that encourages learning ML concepts instead of memorizing class functions.
Stars: ✭ 278 (+81.7%)
Mutual labels:  data-science, unsupervised-learning
Awesome Community Detection
A curated list of community detection research papers with implementations.
Stars: ✭ 1,874 (+1124.84%)
Mutual labels:  data-science, unsupervised-learning
Unsup3d
(CVPR'20 Oral) Unsupervised Learning of Probably Symmetric Deformable 3D Objects from Images in the Wild
Stars: ✭ 905 (+491.5%)
Mutual labels:  3d, unsupervised-learning
Vizuka
Explore high-dimensional datasets and how your algo handles specific regions.
Stars: ✭ 100 (-34.64%)
Mutual labels:  data-science, unsupervised-learning
Complete Life Cycle Of A Data Science Project
Complete-Life-Cycle-of-a-Data-Science-Project
Stars: ✭ 140 (-8.5%)
Mutual labels:  data-science, unsupervised-learning
Labs
Labs for the Foundations of Applied Mathematics curriculum.
Stars: ✭ 150 (-1.96%)
Mutual labels:  data-science
Artificial Intelligence Projects
Collection of Artificial Intelligence projects.
Stars: ✭ 152 (-0.65%)
Mutual labels:  data-science
Ml Workspace
🛠 All-in-one web-based IDE specialized for machine learning and data science.
Stars: ✭ 2,337 (+1427.45%)
Mutual labels:  data-science
Awesome Sentence Embedding
A curated list of pretrained sentence and word embedding models
Stars: ✭ 1,973 (+1189.54%)
Mutual labels:  unsupervised-learning
Flame pytorch
This is an implementation of the 3D FLAME model in PyTorch
Stars: ✭ 153 (+0%)
Mutual labels:  3d

go-tsne

A Go implementation of t-Distributed Stochastic Neighbor Embedding (t-SNE), a prize-winning technique for dimensionality reduction particularly well suited for visualizing high-dimensional datasets.

[mnist2d and mnist3d example visualizations]

Usage

Import this library:

    import "github.com/danaugrs/go-tsne/tsne"

Create the TSNE object:

    t := tsne.NewTSNE(2, 300, 100, 300, true)

The parameters are, in order:

  • Number of output dimensions (2 above)
  • Perplexity (300)
  • Learning rate (100)
  • Maximum number of iterations (300)
  • Verbosity (true)

There are two ways to start the t-SNE embedding optimization. The regular way is to provide an n-by-d matrix, where each row is a data point and each column is a dimension:

    Y := t.EmbedData(X, nil)

The alternative is to provide a distance matrix directly:

    Y := t.EmbedDistances(D, nil)

In either case, the returned matrix Y will contain the final embedding.
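go-tsne's matrices come from gonum (the step-function callback shown further below receives a `mat.Matrix`), so a distance matrix would typically be loaded into a `mat.Dense`. As a dependency-free sketch, here is one way pairwise squared Euclidean distances could be computed from raw data; whether `EmbedDistances` expects squared or plain Euclidean distances is an assumption to verify against the source:

```go
package main

import "fmt"

// squaredDistances computes the n-by-n matrix of pairwise squared
// Euclidean distances between the rows of data. The resulting values
// could then be copied into a gonum mat.Dense for EmbedDistances.
func squaredDistances(data [][]float64) [][]float64 {
	n := len(data)
	D := make([][]float64, n)
	for i := range D {
		D[i] = make([]float64, n)
		for j := 0; j < n; j++ {
			var s float64
			for k := range data[i] {
				d := data[i][k] - data[j][k]
				s += d * d
			}
			D[i][j] = s
		}
	}
	return D
}

func main() {
	// Hypothetical toy dataset: 3 points in 2 dimensions.
	X := [][]float64{{0, 0}, {3, 4}, {0, 1}}
	D := squaredDistances(X)
	fmt.Println(D[0][1]) // 25
}
```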

For more fine-grained control, a step function can be provided in either case:

    Y := t.EmbedData(X, func(iter int, divergence float64, embedding mat.Matrix) bool {
    	fmt.Printf("Iteration %d: divergence is %v\n", iter, divergence)
    	return false
    })

The step function has access to the iteration, the current divergence, and the embedding optimized so far. You can return true to halt the optimization.
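One common pattern is to halt early once the divergence falls below a tolerance or an iteration budget is exhausted. Below is a minimal, dependency-free sketch of such a halting predicate; the threshold and cap are hypothetical values, not go-tsne defaults, and in real use the function would also take the `mat.Matrix` embedding argument:

```go
package main

import "fmt"

// shouldStop mirrors the decision a go-tsne step function returns:
// true halts the optimization. The embedding argument is omitted
// here to keep the sketch free of the gonum dependency.
func shouldStop(iter int, divergence float64) bool {
	const tol = 0.5      // hypothetical divergence threshold
	const maxIter = 1000 // hypothetical iteration cap
	return divergence < tol || iter >= maxIter
}

func main() {
	fmt.Println(shouldStop(10, 1.2))  // false: keep optimizing
	fmt.Println(shouldStop(250, 0.3)) // true: divergence below tolerance
}
```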

Examples

Two examples are provided: mnist2d and mnist3d. Both use the same data, a subset of MNIST with 2,500 handwritten digits. mnist2d generates plots throughout the optimization process, while mnist3d shows the optimization happening in real time, in 3D. mnist3d depends on G3N. To run an example, cd to the example's directory, install the dependencies, and run it, e.g.:

    cd examples/mnist2d
    go get ./...
    go run mnist2d.go

Support

I hope you enjoy using and learning from go-tsne as much as I enjoyed writing it.

If you come across any issues, please report them.
