
ashtuchkin / Vive Diy Position Sensor

License: MIT
Code & schematics for position tracking sensor using HTC Vive's Lighthouse system and a Teensy board.

Projects that are alternatives of or similar to Vive Diy Position Sensor

Blender
Mirror of the official Blender Git repository. Updated every hour.
Stars: ✭ 609 (-17.92%)
Mutual labels:  3d
Ol Cesium
OpenLayers - Cesium integration
Stars: ✭ 660 (-11.05%)
Mutual labels:  3d
Xviz
A protocol for real-time transfer and visualization of autonomy data
Stars: ✭ 691 (-6.87%)
Mutual labels:  3d
Oce
OpenCASCADE Community Edition (OCE): a community driven fork of the Open CASCADE library.
Stars: ✭ 623 (-16.04%)
Mutual labels:  3d
Extreme 3d faces
Extreme 3D Face Reconstruction: Looking Past Occlusions
Stars: ✭ 653 (-11.99%)
Mutual labels:  3d
Trois
✨ ThreeJS + VueJS 3 + ViteJS ⚡
Stars: ✭ 648 (-12.67%)
Mutual labels:  3d
Spector.js
Explore and Troubleshoot your WebGL scenes with ease.
Stars: ✭ 599 (-19.27%)
Mutual labels:  3d
Pythonocc Core
Python package for 3D CAD/BIM/PLM/CAM
Stars: ✭ 697 (-6.06%)
Mutual labels:  3d
Fauxgl
Software-only 3D renderer written in Go.
Stars: ✭ 658 (-11.32%)
Mutual labels:  3d
Wheelpicker
A smooth, highly customizable wheel view and picker view, with support for iOS-like 3D effects.
Stars: ✭ 684 (-7.82%)
Mutual labels:  3d
Webworldwind
The NASA WorldWind Javascript SDK (WebWW) includes the library and examples for creating geo-browser web applications and for embedding a 3D globe in HTML5 web pages.
Stars: ✭ 628 (-15.36%)
Mutual labels:  3d
Osmbuildings
3d building geometry viewer based on OpenStreetMap data
Stars: ✭ 652 (-12.13%)
Mutual labels:  3d
Picogl.js
A minimal WebGL 2 rendering library
Stars: ✭ 671 (-9.57%)
Mutual labels:  3d
Awesome Blender
🪐 A curated list of awesome Blender addons, tools, tutorials; and 3D resources for everyone.
Stars: ✭ 608 (-18.06%)
Mutual labels:  3d
Anki 3d Engine
AnKi 3D Engine - Vulkan backend, modern renderer, scripting, physics and more
Stars: ✭ 688 (-7.28%)
Mutual labels:  3d
Jsmodeler
A JavaScript framework to create and visualize 3D models.
Stars: ✭ 608 (-18.06%)
Mutual labels:  3d
Lighthouse Ci Action
Audit URLs using Lighthouse and test performance with Lighthouse CI.
Stars: ✭ 666 (-10.24%)
Mutual labels:  lighthouse
Pyvista
3D plotting and mesh analysis through a streamlined interface for the Visualization Toolkit (VTK)
Stars: ✭ 734 (-1.08%)
Mutual labels:  3d
3d Convolutional Speaker Recognition
🔈 Deep Learning & 3D Convolutional Neural Networks for Speaker Verification
Stars: ✭ 697 (-6.06%)
Mutual labels:  3d
Sdf
Simple SDF mesh generation in Python
Stars: ✭ 683 (-7.95%)
Mutual labels:  3d

DIY Position Tracking using HTC Vive's Lighthouse

  • General purpose indoor positioning sensor, good for robots, drones, etc.
  • 3d position accuracy: currently ~10mm; less than 2mm possible with additional work.
  • Update frequency: 30 Hz
  • Output formats: Text; Mavlink ATT_POS_MOCAP via serial; Ublox GPS emulation (in the works)
  • HTC Vive Station visibility requirements: full top hemisphere from sensor. Both stations need to be visible.
  • Positioning volume: same as HTC Vive, approx up to 4x4x3 meters.
  • Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations (2x $135))
  • Skills to build: low-complexity soldering; embedded C++ recommended for integrating it into your project.
  • License: MIT
Demo showing raw XYZ position (click for video). Indoor position hold for a drone (click for video).

How it works

Lighthouse position tracking system consists of:
  – two stationary infrared-emitting base stations (we'll use existing HTC Vive setup),
  – IR receiving sensor and processing module (this is what we'll create).

The base stations are usually mounted high in the room's corners and "overlook" the room. Each station has an IR LED array and two rotating laser planes, one horizontal and one vertical. Each cycle, after the LED-array flash (the sync pulse), a laser plane sweeps the room horizontally or vertically at a constant rotation speed. This means the time between the sync pulse and the laser plane "touching" the sensor is proportional to the horizontal/vertical angle from the base station's center direction. Using this timing information, we can calculate a 3d line from each base station to the sensor; the intersection of these lines yields the sensor's 3d coordinates (see calculation details). A great thing about this approach is that it doesn't depend on light intensity and can be made very precise with cheap hardware.

Visualization of one base station (by rvdm88, click for full video):

See also:
This Is How Valve’s Amazing Lighthouse Tracking Technology Works – Gizmodo
Lighthouse tracking examined – Oliver Kreylos' blog
Reddit thread on Lighthouse

The sensor we're building is the receiving side of the Lighthouse system. It will receive and recognize the IR pulses, calculate the angles, and produce 3d coordinates.

How it works – details

Base stations are synchronized and work in tandem (each sees the other's pulses). In each cycle only one laser plane sweeps the room, so a full 3d position update takes 4 cycles (2 stations × horizontal/vertical sweeps). Cycles are 8.333ms long, i.e. exactly 120Hz, and the laser plane rotation speed is exactly 180deg per cycle.

Each cycle, as received by sensor, has the following pulse structure:

Pulse start, µs   Pulse length, µs   Source station   Meaning
0                 65–135             A                Sync pulse (LED array, omnidirectional)
400               65–135             B                Sync pulse (LED array, omnidirectional)
1222–6777         ~10                A or B           Laser plane sweep pulse (center = 4000µs)
8333              –                  –                End of cycle

You can see all three pulse types in the IR photodiode output (click for video).

The sync pulse lengths encode which of the 4 cycles we're receiving and station id/calibration data (see description).

Hardware

The complete tracking module consists of two parts:

  • IR Sensor and amplifier (custom board)
  • Timing & processing module (we use Teensy 3.2)

IR Sensor

To detect the infrared pulses, we need an IR sensor. After a couple of attempts, I ended up using BPV22NF photodiodes. The main reasons are:

  • Optical IR filter (790–1050nm), which excludes most sunlight but passes the 850nm the stations use.
  • High sensitivity and speed
  • Wide 120 degree field of view

To get the whole top-hemisphere FOV, we need to place 3 photodiodes in a 120deg formation in the horizontal plane, then tilt each one 30deg in the vertical plane. I used a small 3d-printed part to hold them, but it's not required.


Sensor board

IR photodiodes produce a very small current, so we need to amplify it before feeding it to the processing module. I use the TLV2462IP opamp: a modern, general-purpose rail-to-rail opamp with good bandwidth; conveniently, each chip contains two of them.

We also add a simple high-pass filter to reject slow changes in the background illumination level.

Full schematics:

Board photos: top view and bottom view.

Part list (add to cart from here using 1-click BOM):

Part         Model        Count   Cost (Digikey)
D1, D2, D3   BPV22NF      3       3 × $1.11
U1, U2       TLV2462IP    1       $2.80
Board        Perma-proto  1       $2.95
C1           5pF          1       $0.28
C2           10nF         1       $0.40
R1           100k         1       $0.10
R2, R4       47k          2       2 × $0.10
R3           3k           1       $0.10
Total                             $10.16

Sample oscilloscope captures (click for videos):

  • After the transimpedance amplifier: we've got a good signal, but notice how the base level changes with background illumination.
  • After the high-pass filter: no more base-level changes, but some signal deformation appears.
  • Sensor board output: a saturated 0–5V signal.

Teensy connection

Photos: Teensy connections; the full position tracker.

Note: Teensy's RX1/TX1 UART interface can be used to output position instead of USB.

Software (Teensy)

We use a hardware comparator interrupt, with the ISR (cmp0_isr) called on both the rising and falling edges of the signal. The ISR gets the timing in microseconds and classifies pulses by their lengths. We track the sync pulse lengths to determine which cycle corresponds to which base station and sweep. Once tracking is established, we convert the time delays to angles and calculate the 3d lines and the 3d position (see geometry.cpp). Once the position is determined, we report it as text on the USB console and as a Mavlink ATT_POS_MOCAP message on the UART port (see mavlink.cpp).

NOTE: Currently, base station positions and direction matrices are hardcoded in geometry.cpp (lightsources). You'll need to adjust it for your setup. See #2.

Installation on macOS, Linux

Prerequisites:

  • GNU ARM Embedded toolchain. Can be installed on Mac with brew cask install gcc-arm-embedded. I'm developing with version 5_4-2016q3, but other versions should work too.
  • CMake 3.5+ brew install cmake
  • Command line uploader/monitor: ty. See build instructions in the repo.
  • I recommend CLion as the IDE - it made my life a lot easier and can compile/upload right there.

Getting the code:

$ git clone https://github.com/ashtuchkin/vive-diy-position-sensor.git
$ cd vive-diy-position-sensor
$ git submodule update --init

Compilation/upload command line (example, using CMake out-of-source build in build/ directory):

$ cd build
$ cmake .. -DPLATFORM=Teensy
$ make  # Build firmware
$ make vive-diy-position-sensor-upload  # Upload to Teensy
$ tyc monitor  # Serial console to Teensy

Installation on Windows

I haven't been able to make it work in Visual Studio, so I'm providing a command-line build solution instead.

Prerequisites:

Getting the code is the same as above. GitHub client for Windows will make it even easier.

Building firmware:

cd build
cmake -G Ninja .. -DPLATFORM=Teensy
ninja  # Build firmware. Will generate "vive-diy-position-sensor.hex" in current directory.