bkornel / Reproject Image To 3d
Comparing OpenCV's reprojectImageTo3D() to my own implementation
Stars: ✭ 13
Projects that are alternatives of or similar to Reproject Image To 3d
Stereo Vision
This program has been developed as part of a project at the University of Karlsruhe in Germany. The final purpose of the algorithm is to measure the distance to an object by combining two webcams and using them as a stereo camera.
Stars: ✭ 160 (+1130.77%)
Mutual labels: calibration, opencv
Arucounity
Bring augmented reality to Unity by tracking Aruco markers in real time.
Stars: ✭ 144 (+1007.69%)
Mutual labels: calibration, opencv
Facerec Lock
Face recognition to control servo lock using Raspberry Pi and OpenCV
Stars: ✭ 7 (-46.15%)
Mutual labels: opencv
Skydetector
A Python implementation of Sky Region Detection in a Single Image for Autonomous Ground Robot Navigation (Shen and Wang, 2013)
Stars: ✭ 23 (+76.92%)
Mutual labels: opencv
Computer Vision
Computer vision exercise with Python and OpenCV.
Stars: ✭ 17 (+30.77%)
Mutual labels: opencv
Brfv4 mac examples
macOS C++ examples utilizing OpenCV for camera access and drawing the face tracking results.
Stars: ✭ 25 (+92.31%)
Mutual labels: opencv
Ncappzoo
Contains examples for the Movidius Neural Compute Stick.
Stars: ✭ 902 (+6838.46%)
Mutual labels: opencv
Camodet
Lightweight Simple CAmera MOtion DETection application.
Stars: ✭ 26 (+100%)
Mutual labels: opencv
Mocr
Meaningful Optical Character Recognition from identity cards with Deep Learning.
Stars: ✭ 19 (+46.15%)
Mutual labels: opencv
Multi Threading Camera Stream
Multi-threading camera stream to improve video processing performance
Stars: ✭ 18 (+38.46%)
Mutual labels: opencv
Jpegrtspcamera
Sample RTSP server streaming MJPEG video from PC camera
Stars: ✭ 25 (+92.31%)
Mutual labels: opencv
Road Detection And Tracking
An OpenCV-based C++ implementation to detect and track roads at near-realtime performance
Stars: ✭ 17 (+30.77%)
Mutual labels: opencv
Hadoop Pot
A scalable Apache Hadoop-based implementation of the Pooled Time Series video similarity algorithm, based on the CVPR 2015 paper by M. Ryoo et al.
Stars: ✭ 8 (-38.46%)
Mutual labels: opencv
Vivalasvenus
@viva_las_venus -- This project aims to teach and raise awareness about privacy and security in digital life, to build a better, more open and more inclusive world together!
Stars: ✭ 16 (+23.08%)
Mutual labels: opencv
Prlib
Pre-Recognition Library - library with algorithms for improving OCR quality.
Stars: ✭ 18 (+38.46%)
Mutual labels: opencv
Surf Opencv Inclination
Identify an object with the SURF algorithm using OpenCV and calculate its rotational angle
Stars: ✭ 12 (-7.69%)
Mutual labels: opencv
Reproject-Image-To-3D
Comparing OpenCV's reprojectImageTo3D() to my own implementation
// Reproject a disparity image to 3D world coordinates using the 4x4 matrix Q
void customReproject(const cv::Mat& disparity, const cv::Mat& Q, cv::Mat& out3D)
{
    CV_Assert(disparity.type() == CV_32F && !disparity.empty());
    CV_Assert(Q.type() == CV_32F && Q.cols == 4 && Q.rows == 4);

    // 3-channel matrix for the reprojected 3D world coordinates
    out3D = cv::Mat::zeros(disparity.size(), CV_32FC3);

    // Get the interesting parameters from Q; everything else is zero or one
    float Q03 = Q.at<float>(0, 3);
    float Q13 = Q.at<float>(1, 3);
    float Q23 = Q.at<float>(2, 3);
    float Q32 = Q.at<float>(3, 2);
    float Q33 = Q.at<float>(3, 3);

    // Transform the single-channel disparity map into a 3-channel image
    // representing a 3D surface
    for (int i = 0; i < disparity.rows; i++)
    {
        const float* disp_ptr = disparity.ptr<float>(i);
        cv::Vec3f* out3D_ptr = out3D.ptr<cv::Vec3f>(i);

        for (int j = 0; j < disparity.cols; j++)
        {
            const float pw = 1.0f / (disp_ptr[j] * Q32 + Q33);

            cv::Vec3f& point = out3D_ptr[j];
            point[0] = (static_cast<float>(j) + Q03) * pw;
            point[1] = (static_cast<float>(i) + Q13) * pw;
            point[2] = Q23 * pw;
        }
    }
}
The disparity map and Q matrix are from Martin Peris' blog.
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].