
Rookfighter / knn-cpp

License: MIT License
A header-only C++ library for k nearest neighbor search with Eigen3.

Programming Languages

C++
36643 projects - #6 most used programming language
CMake
9771 projects
C
50402 projects - #5 most used programming language

Projects that are alternatives to or similar to knn-cpp

awesome-vector-search
Collections of vector search related libraries, services and research papers
Stars: ✭ 460 (+1740%)
Mutual labels:  nearest-neighbor-search, knn-search
pynanoflann
Unofficial Python wrapper for the nanoflann k-d tree
Stars: ✭ 24 (-4%)
Mutual labels:  kd-tree, nearest-neighbor-search
spot price machine learning
Machine Learning for Spot Prices
Stars: ✭ 25 (+0%)
Mutual labels:  knn
ikd-Tree
This repository provides an implementation of an incremental k-d tree for robotic applications.
Stars: ✭ 286 (+1044%)
Mutual labels:  kd-tree
Clustering-Python
Python Clustering Algorithms
Stars: ✭ 23 (-8%)
Mutual labels:  knn
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (+140%)
Mutual labels:  knn
annoy.rb
annoy-rb provides Ruby bindings for the Annoy (Approximate Nearest Neighbors Oh Yeah).
Stars: ✭ 23 (-8%)
Mutual labels:  nearest-neighbor-search
lshensemble
LSH index for approximate set containment search
Stars: ✭ 48 (+92%)
Mutual labels:  nearest-neighbor-search
Dolphinn
High Dimensional Approximate Near(est) Neighbor
Stars: ✭ 32 (+28%)
Mutual labels:  nearest-neighbor-search
MSF
Official code for "Mean Shift for Self-Supervised Learning"
Stars: ✭ 42 (+68%)
Mutual labels:  knn
deforestation
A machine learning exercise, using KNN to classify deforested areas
Stars: ✭ 26 (+4%)
Mutual labels:  knn
Machine learning trading algorithm
Master's degree project: Development of a trading algorithm which uses supervised machine learning classification techniques to generate buy/sell signals
Stars: ✭ 20 (-20%)
Mutual labels:  knn
pointu
✏️ Pointillisme tool based on Weighted Voronoi Stippling
Stars: ✭ 32 (+28%)
Mutual labels:  kd-tree
fastknn
Fast k-Nearest Neighbors Classifier for Large Datasets
Stars: ✭ 64 (+156%)
Mutual labels:  knn
kdtree-rs
K-dimensional tree in Rust for fast geospatial indexing and lookup
Stars: ✭ 137 (+448%)
Mutual labels:  nearest-neighbor-search
MachineLearning
A machine learning tutorial covering machine learning with numpy, sklearn and tensorflow, as well as using spark and flink to speed up model training, aiming to give readers a fairly complete introduction to machine learning.
Stars: ✭ 23 (-8%)
Mutual labels:  knn
Fall-Detection-Dataset
FUKinect-Fall dataset was created using Kinect V1. The dataset includes walking, bending, sitting, squatting, lying and falling actions performed by 21 subjects between 19 and 72 years of age.
Stars: ✭ 16 (-36%)
Mutual labels:  knn
Rayuela.jl
Code for my PhD thesis. Library of quantization-based methods for fast similarity search in high dimensions. Presented at ECCV 18.
Stars: ✭ 54 (+116%)
Mutual labels:  nearest-neighbor-search
Handwritten-Digits-Classification-Using-KNN-Multiclass Perceptron-SVM
🏆 A Comparative Study on Handwritten Digits Recognition using Classifiers like K-Nearest Neighbours (K-NN), Multiclass Perceptron/Artificial Neural Network (ANN) and Support Vector Machine (SVM), discussing the pros and cons of each algorithm and providing comparison results in terms of accuracy and efficiency.
Stars: ✭ 42 (+68%)
Mutual labels:  knn
KernelKnn
Kernel k Nearest Neighbors in R
Stars: ✭ 14 (-44%)
Mutual labels:  knn

knn-cpp

C++11 · MIT License · CMake

knn-cpp is a header-only C++ library for k nearest neighbor search using the Eigen3 library.

It implements various interfaces for KNN search:

  • pure Eigen3 parallelized brute force search (a minimal sketch follows this list)
  • pure Eigen3 kd-tree for efficient search with Manhattan, Euclidean, Chebyshev and Minkowski distances
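
The brute force searcher is not covered by the walkthrough under Usage, so here is a minimal sketch of it. The class name knncpp::BruteForce and its setters are assumptions based on the shared interface of all searchers; check the files in the examples/ directory for the exact names.

#include <knncpp.h>

int main()
{
    // Each column is one data point, as in the Usage example below.
    Eigen::MatrixXd dataPoints(3, 9);
    dataPoints << 1, 2, 3, 1, 2, 3, 1, 2, 3,
                  2, 1, 0, 3, 2, 1, 0, 3, 4,
                  3, 1, 3, 1, 3, 4, 4, 2, 1;

    // Assumed class name; the interface is expected to mirror the kd-tree:
    // construct with the data, optionally configure, build(), then query().
    knncpp::BruteForce<double, knncpp::EuclideanDistance<double>> bruteforce(dataPoints);
    bruteforce.setThreads(0); // 0 = autodetect the number of threads (OpenMP only)
    bruteforce.build();

    // A single query point; its 3 nearest neighbors are searched.
    Eigen::MatrixXd queryPoints(3, 1);
    queryPoints << 0, 1, 0;

    knncpp::Matrixi indices;
    Eigen::MatrixXd distances;
    bruteforce.query(queryPoints, 3, indices, distances);

    return 0;
}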

Install

Simply copy the header files into your project or install them using the CMake build system by typing

cd path/to/repo
mkdir build
cd build
cmake ..
make install

The library requires Eigen3 to be installed on your system.

On Debian-based systems you can simply install this dependency using apt-get.

apt-get install libeigen3-dev

Make sure Eigen3 can be found by your build system.

You can use the CMake Find module in cmake/ to find the installed headers.

Usage

All search algorithms share a similar interface. Have a look at the files in the examples/ directory.

Here is a basic example of how to build a kd-tree and query it.

#include <iostream>
#include <knncpp.h>

typedef Eigen::MatrixXd Matrix;
typedef knncpp::Matrixi Matrixi;

int main()
{
    // Define the data points that will be searched.
    // Each column defines one data point.
    Matrix dataPoints(3, 9);
    dataPoints << 1, 2, 3, 1, 2, 3, 1, 2, 3,
                  2, 1, 0, 3, 2, 1, 0, 3, 4,
                  3, 1, 3, 1, 3, 4, 4, 2, 1;

    // Create a KDTreeMinkowski object and set the data points.
    // Data is not copied by default. You can also pass an additional bool flag
    // to create a data copy. The tree is not built yet.
    // You can also use the setData() method to set the data at a later point.
    // The distance type is defined by the second template parameter.
    // Currently ManhattenDistance, EuclideanDistance, ChebyshevDistance and
    // MinkowskiDistance are available.
    knncpp::KDTreeMinkowskiX<double, knncpp::EuclideanDistance<double>> kdtree(dataPoints);

    // Set the bucket size for each leaf node in the tree. The higher the value,
    // the fewer leaves have to be visited to find the nearest neighbors. The
    // lower the value, the fewer distance evaluations have to be computed.
    // Default is 16.
    kdtree.setBucketSize(16);
    // Set whether the resulting neighbors should be sorted in ascending order
    // after a successful search.
    // This consumes some time during the query.
    // Default is true.
    kdtree.setSorted(true);
    // Set whether the root of the distances should be taken after a successful search.
    // This consumes some time during the query.
    // Default is false.
    kdtree.setTakeRoot(true);
    // Set the maximum inclusive distance for the query. Set to 0 or negative
    // to disable maximum distances.
    // Default is 0.
    kdtree.setMaxDistance(2.5 * 2.5);
    // Set how many threads should be used during the query. Set to 0 or
    // negative to autodetect the optimal number of threads (OpenMP only).
    // Default is 1.
    kdtree.setThreads(2);

    // Build the tree. This consumes some time.
    kdtree.build();

    // Create a query point. We will search for this point's nearest neighbors.
    Matrix queryPoints(3, 1);
    queryPoints << 0, 1, 0;

    Matrixi indices;
    Matrix distances;
    // Search for 3 nearest neighbors.
    // The matrices indices and distances hold the index and distance of the
    // respective nearest neighbors.
    // Their value is set to -1 if no further neighbor was found.
    kdtree.query(queryPoints, 3, indices, distances);

    // Do something with the results.
    std::cout
        << "Data points:" << std::endl
        << dataPoints << std::endl
        << "Query points:" << std::endl
        << queryPoints << std::endl
        << "Neighbor indices:" << std::endl
        << indices << std::endl
        << "Neighbor distances:" << std::endl
        << distances << std::endl;

    return 0;
}
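
query() also accepts several query points in one call. The lines below could be appended at the end of main() in the example above, before the return statement; they assume, as suggested by the column layout of dataPoints, that each column of the query matrix is one query point, so indices and distances then hold k rows and one column per query point.

    // Search the 3 nearest neighbors of two query points in one call.
    // Assumption: each column of the query matrix is one query point,
    // mirroring the column layout of the data matrix.
    Matrix batchQueryPoints(3, 2);
    batchQueryPoints << 0, 3,
                        1, 3,
                        0, 4;

    Matrixi batchIndices;
    Matrix batchDistances;
    kdtree.query(batchQueryPoints, 3, batchIndices, batchDistances);

    // batchIndices and batchDistances now have 3 rows (k) and 2 columns
    // (one per query point); entries are set to -1 where fewer neighbors
    // were found, e.g. due to the maximum distance set above.
    std::cout << batchIndices << std::endl << batchDistances << std::endl;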

References

  1. Songrit Maneewongvatana and David M. Mount, Analysis of Approximate Nearest Neighbor Searching with Clustered Point Sets, DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 2002

  2. Mohammad Norouzi, Ali Punjani and David J. Fleet, Fast Search in Hamming Space with Multi-Index Hashing, In Proceedings of 2012 IEEE Conference on Computer Vision and Pattern Recognition
