lvsn / 6DOF_tracking_evaluation

Licence: other
Code to visualize and evaluate the dataset from "A Framework for Evaluating 6-DOF Object Trackers".

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to 6DOF tracking evaluation

BOVText-Benchmark
BOVText: A Large-Scale, Multidimensional Multilingual Dataset for Video Text Spotting
Stars: ✭ 44 (+29.41%)
Mutual labels:  tracking
sdks
Polyaxon Clients & Language SDKs
Stars: ✭ 12 (-64.71%)
Mutual labels:  tracking
epictracker
A demo of how I can track you using fingerprinting, automated lookups, and modern JavaScript APIs
Stars: ✭ 17 (-50%)
Mutual labels:  tracking
untrace
🐳 Minimal event tracking on the client in 300 bytes.
Stars: ✭ 26 (-23.53%)
Mutual labels:  tracking
trackanimation
Track Animation is a Python 2 and 3 library that provides an easy and user-adjustable way of creating visualizations from GPS data.
Stars: ✭ 74 (+117.65%)
Mutual labels:  tracking
ARFaceFilter
JavaScript/WebGL lightweight face tracking library designed for augmented reality webcam filters. Features: multiple face detection, rotation, mouth opening. Various integration examples are provided (Three.js, Babylon.js, FaceSwap, Canvas2D, CSS3D...).
Stars: ✭ 72 (+111.76%)
Mutual labels:  tracking
pyMHT
Track oriented, multi target, multi hypothesis tracker
Stars: ✭ 66 (+94.12%)
Mutual labels:  tracking
Object-Detection-And-Tracking
Target detection in the first frame and Tracking target by SiamRPN.
Stars: ✭ 33 (-2.94%)
Mutual labels:  tracking
Face-Detection-and-Tracking
Face Detection and tracking using CamShift, Kalman Filter, Optical Flow
Stars: ✭ 30 (-11.76%)
Mutual labels:  tracking
mixpanel-react-native
Official React Native Tracking Library for Mixpanel Analytics
Stars: ✭ 69 (+102.94%)
Mutual labels:  tracking
TobiiGlassesPyController
Tobii Pro Glasses 2 Python controller
Stars: ✭ 42 (+23.53%)
Mutual labels:  tracking
Awesome-Vision-Transformer-Collection
Variants of Vision Transformer and its downstream tasks
Stars: ✭ 124 (+264.71%)
Mutual labels:  tracking
edusense
EduSense: Practical Classroom Sensing at Scale
Stars: ✭ 44 (+29.41%)
Mutual labels:  tracking
IMU-VR-Full-Body-Tracker
Inertial Measurement Unit (IMU) based full body tracker for Steam VR.
Stars: ✭ 46 (+35.29%)
Mutual labels:  tracking
siamfc-pytorch-gpu-benchmark
No description or website provided.
Stars: ✭ 17 (-50%)
Mutual labels:  tracking
top-view-multi-person-tracking
This repo contains links to multi-person re-identification and tracking dataset in top view multi-camera environment.
Stars: ✭ 59 (+73.53%)
Mutual labels:  tracking
BirdsEye
Applying Perspective transformations to 2d images.
Stars: ✭ 22 (-35.29%)
Mutual labels:  tracking
spotifytrack
A personal homepage showing users' top songs and artists, providing a shareable link that they can use to show it off to friends.
Stars: ✭ 48 (+41.18%)
Mutual labels:  tracking
cpp-iout
C++ Implementation of IOU Tracker presented in AVSS17
Stars: ✭ 33 (-2.94%)
Mutual labels:  tracking
whotracks.me
Data from the largest and longest measurement of online tracking.
Stars: ✭ 288 (+747.06%)
Mutual labels:  tracking

6DOF_tracking_evaluation

Code to visualize and evaluate the dataset from "A Framework for Evaluating 6-DOF Object Trackers" [arxiv paper].

The dataset can be downloaded at this website.

Dependencies

To train the network, version 0.1 of pytorch_toolbox is required.

Citation

If you use this dataset in your research, please cite:

@inproceedings{garon2018framework,
  title={A framework for evaluating 6-dof object trackers},
  author={Garon, Mathieu and Laurendeau, Denis and Lalonde, Jean-Fran{\c{c}}ois},
  booktitle={Proceedings of the European Conference on Computer Vision (ECCV)},
  pages={582--597},
  year={2018}
}

Visualize the dataset

Download the sample dataset here (583 MB).

python visualize_sequence.py -r /path/to/sample -s interaction_hard -o clock

Evaluating

Evaluating a single sequence: two CSV files must be provided, one with the ground-truth poses and one with the predictions. Note that each row contains the 16 values of a 4x4 transform matrix.

python evaluate_sequence.py -g /path/to/ground_truth.csv -p /path/to/predictions.csv
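The CSV layout above (one flattened 4x4 matrix per row) can be loaded and compared with a few lines of NumPy. This is a minimal sketch, not the repository's own evaluation code: the function names `load_poses` and `pose_errors` are illustrative, and the error metrics (Euclidean translation distance and relative rotation angle) are common choices for 6-DOF evaluation rather than necessarily the exact ones the paper reports.

```python
import numpy as np


def load_poses(path):
    """Load an Nx4x4 array of poses from a CSV file where each row
    holds the 16 values of a 4x4 transform matrix."""
    rows = np.loadtxt(path, delimiter=",")
    return rows.reshape(-1, 4, 4)


def pose_errors(gt, pred):
    """Per-frame translation error (same unit as the data) and
    rotation error (degrees) between two Nx4x4 pose arrays."""
    # Translation error: distance between the two translation vectors.
    t_err = np.linalg.norm(gt[:, :3, 3] - pred[:, :3, 3], axis=1)
    # Rotation error: angle of the relative rotation R_gt^T @ R_pred,
    # recovered from its trace via cos(theta) = (trace - 1) / 2.
    rel = np.matmul(np.transpose(gt[:, :3, :3], (0, 2, 1)), pred[:, :3, :3])
    traces = np.trace(rel, axis1=1, axis2=2)
    r_err = np.degrees(np.arccos(np.clip((traces - 1.0) / 2.0, -1.0, 1.0)))
    return t_err, r_err
```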

Evaluating a batch of sequences: the following folder structure is required:

  • root
    • modelName
      • object_sequence (ex: dragon_interaction_hard)
        • ground_truth.csv
        • prediction_pose.csv

python evaluate_batch.py -r /path/to/root
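The folder layout above can be traversed with a short helper. This is a hypothetical sketch (the function name `collect_sequences` is not part of the repository); it assumes exactly the root/modelName/object_sequence nesting and the two file names given above, skipping any sequence folder missing either file.

```python
import os


def collect_sequences(root):
    """Yield (model, sequence, gt_path, pred_path) tuples from a
    root/modelName/object_sequence directory tree."""
    for model in sorted(os.listdir(root)):
        model_dir = os.path.join(root, model)
        if not os.path.isdir(model_dir):
            continue
        for sequence in sorted(os.listdir(model_dir)):
            seq_dir = os.path.join(model_dir, sequence)
            gt = os.path.join(seq_dir, "ground_truth.csv")
            pred = os.path.join(seq_dir, "prediction_pose.csv")
            # Only report sequences that have both required files.
            if os.path.isfile(gt) and os.path.isfile(pred):
                yield model, sequence, gt, pred
```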

Tracker

Generate the dataset

Change the parameters in generate_dataset.sh, then run it to generate the training and validation datasets.

Train the network

Change the parameters in train_deeptrack.sh, then run it to train the network.

License

License for Non-Commercial Use

If this software is redistributed, this license must be included.
The term software includes any source files, documentation, executables,
models, and data.

This software is available for general use by academic or non-profit,
or government-sponsored researchers. This license does not grant the
right to use this software or any derivation of it for commercial activities.
For commercial use, please contact Jean-Francois Lalonde at Université Laval
at [email protected].

This software comes with no warranty or guarantee of any kind. By using this
software, the user accepts full liability.