Contextual smoothing of image segmentation

Jonathan Letham, Neil M. Robertson, Barry Connor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

This paper presents a new method for improving region segmentation in sequences of images when temporal and spatial prior context is available. The proposed technique uses elementary classifiers on infra-red, polarimetric and video data to obtain a coarse per-pixel segmentation. Contextual information is then exploited in a Bayesian formulation to smooth the segmentation between frames. The framework is general and significantly enhances segmentation relative to the classifiers alone. The method is demonstrated by classifying images of a rural scene into three positive classes (sky, vegetation and road) and one class for all other unlabelled data. Priors for the probabilistic smoothing in this scene are learned from ground-truth images. An overall improvement of around 10% is achieved, with individual classes improved by up to 30%. © 2010 IEEE.
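The abstract's Bayesian smoothing idea can be illustrated in a minimal sketch: per-pixel class likelihoods from an elementary classifier are multiplied by a temporal prior given the previous frame's labels, and the MAP class is taken per pixel. This is not the authors' code; the function name, the transition-matrix representation of the learned prior, and the toy data are all assumptions for illustration.

```python
import numpy as np

def smooth_segmentation(likelihood, prev_labels, transition):
    """Hypothetical sketch of per-frame Bayesian label smoothing.

    likelihood : (H, W, C) array, P(observation | class) per pixel
                 from an elementary classifier
    prev_labels: (H, W) int array, segmentation of the previous frame
    transition : (C, C) array, P(class_t | class_{t-1}); in the paper
                 such priors are learned from ground-truth images
    Returns the smoothed (H, W) label map (per-pixel MAP estimate).
    """
    prior = transition[prev_labels]      # (H, W, C) temporal prior per pixel
    posterior = likelihood * prior       # Bayes' rule up to normalisation
    return posterior.argmax(axis=-1)

# Toy example: 2x2 image, 4 classes (e.g. sky, vegetation, road, other),
# with a "sticky" prior that favours label persistence between frames.
C = 4
transition = np.full((C, C), 0.1) + 0.6 * np.eye(C)   # rows already sum to 1
prev = np.array([[0, 0], [2, 2]])
like = np.full((2, 2, C), 0.25)                        # uninformative classifier
like[0, 0] = [0.3, 0.28, 0.22, 0.2]                    # weak evidence at one pixel
labels = smooth_segmentation(like, prev, transition)
print(labels)   # prior resolves the ambiguous pixels toward the previous labels
```

Where the classifier evidence is flat, the temporal prior dominates and the previous frame's labels carry over; spatial context could be added analogously with a prior over neighbouring labels.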

Original language: English
Title of host publication: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010
Pages: 7-12
Number of pages: 6
DOI: 10.1109/CVPRW.2010.5543910
Publication status: Published - 2010
Event: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - San Francisco, CA, United States
Duration: 13 Jun 2010 - 18 Jun 2010

Conference

Conference: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Abbreviated title: CVPR 2010
Country: United States
City: San Francisco, CA
Period: 13/06/10 - 18/06/10


Cite this

Letham, J., Robertson, N. M., & Connor, B. (2010). Contextual smoothing of image segmentation. In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010 (pp. 7-12). https://doi.org/10.1109/CVPRW.2010.5543910