Spatial modelling of multi-layered LiDAR images using reversible jump MCMC

Sergio Hernandez-Marin, Andrew Michael Wallace, Gavin Jarvis Gibson

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)


3D imaging LiDAR systems have the potential to acquire multi-layered 3D image data; that is, rather than storing a single depth value at each pixel, they can store the range to more than one surface along the pixel's view direction. Multiple returns are possible at a single pixel when imaging through transparent surfaces, for example when acquiring depth images of cars or buildings that have windows, in which case both external and internal structure can be recorded. Multiple returns are also possible when the pixel field of view encompasses more than one opaque surface. However, to build such multi-layered 3D images, we need new ways of processing the LiDAR data. In this paper, we present a unified theory of pixel processing for such data. It is based on a reversible jump Markov chain Monte Carlo (RJMCMC) methodology extended to include spatial constraints through a Markov random field with a Potts prior model. We consider two distinct proposal distributions, based on spatial mode jumping and spatial birth/death processes respectively. We also include a delayed-rejection step in the RJMCMC algorithm to improve the estimates of the range and reflectance of each surface element. Our methodology is demonstrated on both photon-count and burst-illumination LiDAR data.
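The core machinery described in the abstract — a reversible jump sampler whose birth/death moves add or delete surface returns within a single pixel — can be sketched for a toy photon-count histogram as below. This is an illustrative assumption-laden sketch, not the authors' implementation: the Gaussian pulse shape, flat background, uniform priors, fixed bin range, and simplified move bookkeeping are all choices made here for brevity, and the paper's spatial Potts prior across pixels and its delayed-rejection step are omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic single-pixel photon-count histogram (illustrative only) ---
# Two "surface returns" modelled as Gaussian-shaped pulses on a flat background.
bins = np.arange(200, dtype=float)
WIDTH, BG = 3.0, 1.0  # assumed instrumental pulse width and background rate

def rate(positions, amps):
    """Poisson rate per bin for a set of returns plus background."""
    lam = np.full(bins.shape, BG)
    for p, a in zip(positions, amps):
        lam += a * np.exp(-0.5 * ((bins - p) / WIDTH) ** 2)
    return lam

counts = rng.poisson(rate([60.0, 120.0], [20.0, 12.0]))

def log_lik(positions, amps):
    lam = rate(positions, amps)
    return float(np.sum(counts * np.log(lam) - lam))

# --- RJMCMC: birth/death moves change k; random-walk moves refine params ---
K_MAX = 5
pos, amp = [50.0], [10.0]   # start with one poorly placed return
ll = log_lik(pos, amp)
k_trace = []
for it in range(20000):
    u = rng.random()
    if u < 0.25 and len(pos) < K_MAX:
        # Birth: draw a new return from the (uniform) prior. With exchangeable
        # components, proposal-from-prior, and equal birth/death probabilities,
        # the dimension-matching terms largely cancel, leaving the likelihood
        # ratio (a common simplification; boundary cases are glossed over here).
        p_new, a_new = rng.uniform(0, 200), rng.uniform(1, 30)
        new_ll = log_lik(pos + [p_new], amp + [a_new])
        if np.log(rng.random()) < new_ll - ll:
            pos, amp, ll = pos + [p_new], amp + [a_new], new_ll
    elif u < 0.5 and len(pos) > 1:
        # Death: remove a uniformly chosen return.
        j = int(rng.integers(len(pos)))
        p2, a2 = pos[:j] + pos[j + 1:], amp[:j] + amp[j + 1:]
        new_ll = log_lik(p2, a2)
        if np.log(rng.random()) < new_ll - ll:
            pos, amp, ll = p2, a2, new_ll
    else:
        # Within-model Metropolis random walk on one return's parameters.
        j = int(rng.integers(len(pos)))
        p2, a2 = list(pos), list(amp)
        p2[j] += rng.normal(0, 2.0)
        a2[j] = max(0.1, a2[j] + rng.normal(0, 1.0))
        new_ll = log_lik(p2, a2)
        if np.log(rng.random()) < new_ll - ll:
            pos, amp, ll = p2, a2, new_ll
    if it >= 5000:  # discard burn-in
        k_trace.append(len(pos))

k_mode = int(np.bincount(k_trace).argmax())
print("posterior mode of k:", k_mode)   # typically 2 for this synthetic pixel
print("estimated positions:", sorted(round(p, 1) for p in pos))
```

The sampler usually converges on two returns near bins 60 and 120; in the paper, such per-pixel estimates are further regularised across neighbouring pixels by the Potts prior, and a delayed-rejection stage gives rejected proposals a second, more local attempt.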

Original language: English
Publication status: Published - 2007
Event: 18th British Machine Vision Conference 2007 - Warwick, United Kingdom
Duration: 10 Sept 2007 - 13 Sept 2007


Conference: 18th British Machine Vision Conference 2007
Abbreviated title: BMVC 2007
Country/Territory: United Kingdom

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


