Fusion of Stereo Vision and Time-of-Flight Information for 3D Imaging

Youval Nehmadi

Author: Youval Nehmadi
Format: Paperback
Pages: 176
Publication Date: 2014-08-13
Edition: 1
Language: English

Description:
The acquisition of three-dimensional (3D) information is important for many types of applications: autonomous navigation of vehicles and robots, user interaction with computers, and more. This work investigates novel ways to fuse passive 3D sensors (i.e. stereo vision) and active 3D sensors (i.e. Time-of-Flight (TOF)) to provide information about the surrounding environment. Several algorithms were developed and tested: linear and curved surface estimation using RANdom SAmple Consensus (RANSAC), a fast stereo algorithm that uses TOF depth information in the calculation of the disparity map, and an enhanced pattern recognition method. This technology produces a high-resolution depth map of 768x1024 pixels from a depth input image of 120x160 pixels, and the accuracy of the resulting image is higher than that of the input image. The method enables a real-time implementation of 3D imaging. An autonomous robot was built for real-time Simultaneous Localization and Mapping (SLAM) to maneuver in unknown terrain, using a novel Scale-Invariant Feature Transform (SIFT) point filter based on TOF information and a grayscale image. The mapping and localization methods were tested successfully.
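
The fast stereo algorithm mentioned in the description, which uses TOF depth to steer the disparity computation, can be illustrated with a minimal NumPy sketch. The code below is only an assumed block-matching formulation (SAD matching cost, nearest-neighbour upsampling of the TOF map, and hypothetical names such as tof_guided_disparity, focal_px, baseline_m and margin); it is not the book's actual implementation.

```python
import numpy as np

def tof_guided_disparity(left, right, tof_depth, focal_px, baseline_m,
                         block=7, margin=3):
    """Block-matching stereo whose per-pixel disparity search is narrowed
    around the disparity predicted by a coarse TOF depth map.

    Hypothetical sketch: the SAD cost, the nearest-neighbour upsampling and
    all parameter names are assumptions, not the book's implementation.
    """
    h, w = left.shape
    # Upsample the low-resolution TOF depth map (e.g. 120x160) to the
    # stereo resolution (e.g. 768x1024) with nearest-neighbour indexing.
    ys = np.linspace(0, tof_depth.shape[0] - 1, h).round().astype(int)
    xs = np.linspace(0, tof_depth.shape[1] - 1, w).round().astype(int)
    depth_up = tof_depth[ys][:, xs]

    # Predicted disparity from the standard stereo relation d = f * B / Z.
    d_pred = np.where(depth_up > 0,
                      focal_px * baseline_m / np.maximum(depth_up, 1e-6),
                      0.0)

    half = block // 2
    disparity = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left[y - half:y + half + 1,
                       x - half:x + half + 1].astype(np.float32)
            d0 = int(round(d_pred[y, x]))
            best_d, best_cost = 0, np.inf
            # Search only a small window around the TOF prediction instead
            # of the full disparity range.
            for d in range(max(0, d0 - margin), min(x - half, d0 + margin) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.float32)
                cost = np.abs(ref - cand).sum()  # sum of absolute differences
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity
```

Restricting the search to a handful of candidates around the TOF prediction, rather than the full disparity range, is what makes a real-time, high-resolution disparity map plausible even when the depth input is only 120x160 pixels.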