Real-time planar segmentation of depth images: from three-dimensional edges to segmented planes

H. Javan Hemmat, E. Bondarau, P.H.N. de With

Research output: Contribution to journal › Article › Academic › peer-review

5 Citations (Scopus)
396 Downloads (Pure)

Abstract

Real-time execution of processing algorithms for handling depth images in a three-dimensional (3-D) data framework is a major challenge. More specifically, considering depth images as point clouds and performing planar segmentation requires heavy computation, because available planar segmentation algorithms are mostly based on surface normals and/or curvatures and, consequently, do not provide real-time performance. Since indoor environments mainly consist of planar surfaces, a 3-D application aimed at reconstructing such environments would strongly benefit from a real-time algorithm. We introduce a real-time planar segmentation method for depth images that avoids any surface-normal calculation. First, we detect 3-D edges in a depth image and generate line segments between the identified edges. Second, we fuse all the points on each pair of intersecting line segments into a plane candidate. Third and finally, we apply a validation phase to select planes from the candidates. Furthermore, various enhancements are applied to improve the segmentation quality. The GPU implementation of the proposed algorithm segments depth images into planes at a rate of 58 fps. Our pipeline-interleaving technique increases this rate up to 100 fps. This throughput improvement allows the benefits of our algorithm to be further exploited, for example to enhance segmentation quality and localization.
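
To illustrate the three stages described above (3-D edge detection, line segments between edges, and candidate fusion with validation), the following is a minimal single-threaded sketch, assuming a NumPy/CPU setting. The function names, thresholds (`jump`, `dist`, `min_inliers`), the subsampling step of 8, and the use of raw pixel coordinates instead of back-projected 3-D points are illustrative assumptions for readability; they do not reproduce the paper's GPU implementation or its exact candidate-generation and validation criteria.

```python
import numpy as np

def detect_3d_edges(depth, jump=0.05):
    # Mark jump edges: pixels whose depth differs sharply from a neighbour.
    dzx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    dzy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    return (dzx > jump) | (dzy > jump)

def runs(edge_1d):
    # Maximal runs of non-edge pixels along one scanline -> (start, end) pairs.
    out, i, n = [], 0, edge_1d.size
    while i < n:
        if not edge_1d[i]:
            j = i
            while j + 1 < n and not edge_1d[j + 1]:
                j += 1
            out.append((i, j))
            i = j + 1
        else:
            i += 1
    return out

def fit_plane(points):
    # Least-squares plane through a point set: unit normal n and offset d (n . p = d).
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

def segment_planes(depth, jump=0.05, dist=0.02, min_inliers=0.8):
    """Sketch of the pipeline: 3-D edges -> intersecting line segments -> validated planes."""
    h, w = depth.shape
    edges = detect_3d_edges(depth, jump)
    # For brevity the sketch uses (col, row, depth) as 3-D coordinates;
    # a real implementation would back-project pixels with the camera intrinsics.
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    cloud = np.dstack((xs, ys, depth)).astype(float)

    planes = []
    for y in range(0, h, 8):                          # horizontal segments on subsampled rows
        for x0, x1 in runs(edges[y]):
            for x in range(x0, x1 + 1, 8):            # vertical segments crossing this row
                col_runs = [r for r in runs(edges[:, x]) if r[0] <= y <= r[1]]
                if not col_runs:
                    continue
                y0, y1 = col_runs[0]
                # Fuse the points of the two intersecting segments into a plane candidate.
                pts = np.vstack((cloud[y, x0:x1 + 1], cloud[y0:y1 + 1, x]))
                if len(pts) < 3:
                    continue
                normal, d = fit_plane(pts)
                # Validation: keep the candidate only if most fused points lie on the plane.
                if np.mean(np.abs(pts @ normal - d) < dist) >= min_inliers:
                    planes.append((normal, d))
    return planes
```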
Original language: English
Article number: 051008
Pages (from-to): 1-11
Journal: Journal of Electronic Imaging
Volume: 24
Issue number: 5
DOIs
Publication status: Published - 2015
