Omnidirectional texturing based on robust 3D registration through Euclidean reconstruction from two spherical images

Abstract

We propose a semi-automatic omnidirectional texturing method that maps a spherical image onto a dense 3D model obtained by a range sensor. Accurate texturing requires accurate estimation of the extrinsic parameters. To estimate these parameters, we propose a robust method that registers a dense range data set against a sparse stereo data set derived from spherical images. To measure distances between the two data sets, we introduce generalized distances that account for the 3D error distributions of the stereo data. To reconstruct 3D models from images, we use two spherical images taken at arbitrary positions and in arbitrary poses. We then propose a novel rectification method for spherical images that is derived from the essential (E) matrix and facilitates disparity estimation. Experimental results show that the proposed method maps spherical images onto dense 3D models effectively and accurately.
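The "generalized distances taking account of 3D error distributions" mentioned above are commonly realized as Mahalanobis distances under the stereo point's error covariance. The following is a minimal sketch of that idea, not the paper's actual implementation; all point values and the covariance matrix are hypothetical, chosen so that depth (along-ray) uncertainty dominates, as is typical for triangulated stereo points.

```python
import numpy as np

def mahalanobis_sq(p, q, cov):
    """Squared generalized (Mahalanobis) distance between a range point p
    and a stereo point q whose 3x3 error covariance is cov."""
    d = p - q
    return float(d @ np.linalg.inv(cov) @ d)

def nearest_generalized(range_pts, q, cov):
    """Index of the range point closest to q under the generalized distance."""
    d2 = [mahalanobis_sq(p, q, cov) for p in range_pts]
    return int(np.argmin(d2))

# Hypothetical data: three range points and one stereo point whose
# depth (z) uncertainty is large compared with its lateral uncertainty.
range_pts = np.array([[0.0, 0.0, 1.0],
                      [0.2, 0.0, 0.0],
                      [1.0, 1.0, 1.0]])
q = np.array([0.0, 0.0, 0.0])
cov = np.diag([0.01, 0.01, 1.0])  # large variance along the viewing ray (z)

idx = nearest_generalized(range_pts, q, cov)
```

Under this covariance the point offset purely in depth is preferred over the laterally offset point, even though the latter is closer in plain Euclidean distance; this is exactly why a generalized distance yields more robust correspondences for stereo data than a Euclidean one.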

Article history: Received 24 July 2009, Accepted 17 December 2009, Available online 23 December 2009.

DOI: https://doi.org/10.1016/j.cviu.2009.12.005