Accurate eye-like segmentation in a heavily untextured contrasted scene

Ludovico Carozza, Alessandro Gherardi, Alessandro Bevilacqua

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Automatic pattern recognition is a hard task when shapes or textures have to be detected, and it becomes even harder when accurate photometric measurements are required. Nevertheless, in most cases the availability of a measurable ground truth (possibly obtained using other sensors) gives the researcher a reference point that eases the task. In this paper, we present a method to automatically segment a light-shadow line in a very high-contrast scene, in the context of an industrial automotive application where the ground truth is the line as perceived by eye by an experienced operator. After segmenting the line, measurements of the accuracy and precision of a reference parameter of the line (the "elbow") were obtained, using an industrial prototype rigidly mounted on the headlamp under test. The experiments show that the method we developed can detect perturbations of the headlamp beam in pitch and yaw smaller than 1/10 deg, which represents an excellent outcome.
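The abstract only sketches the approach. As a purely illustrative example (not the authors' method), a light-shadow cutoff line in a high-contrast frame can be located column by column with a simple global threshold, and a line fit to the recovered boundary then exposes small tilts of the beam. The sketch below uses plain NumPy on a synthetic frame; all names and the synthetic geometry are assumptions for illustration:

```python
import numpy as np

def cutoff_boundary(img, thresh=None):
    """For each column, return the count of bright pixels, i.e. the row
    where the bright beam region ends and the shadow begins, assuming the
    beam occupies the top of the frame (illustrative sketch only)."""
    if thresh is None:
        # Global mid-level threshold: adequate for a high-contrast scene
        thresh = 0.5 * (img.max() + img.min())
    bright = img >= thresh
    return bright.sum(axis=0)

# Synthetic 100x200 frame: bright above the line y = 40 + 0.1*x
h, w = 100, 200
x = np.arange(w)
y_true = 40 + 0.1 * x
rows = np.arange(h)[:, None]
img = (rows < y_true[None, :]).astype(float)

y_est = cutoff_boundary(img)
# Least-squares line fit: the slope reflects the beam's tilt, so small
# pitch/yaw perturbations show up as changes in slope and intercept
slope, intercept = np.polyfit(x, y_est, 1)
```

On this synthetic frame the fitted slope recovers the 0.1 tilt of the injected cutoff line; a real system would of course have to cope with noise, glare, and the "elbow" discontinuity of an actual headlamp beam.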
Original language: English
Title of host publication: First Workshops on Image Processing Theory, Tools and Applications, 2008. IPTA 2008
Publisher: IEEE
Pages: 414-420
Number of pages: 7
ISBN (Electronic): 978-1-4244-3322-3
ISBN (Print): 978-1-4244-3321-6
DOIs
Publication status: Published - 2008
Event: First International Workshops on Image Processing Theory, Tools and Applications - Sousse, Tunisia
Duration: 23 Nov 2008 - 26 Nov 2008

Conference

Conference: First International Workshops on Image Processing Theory, Tools and Applications
Abbreviated title: IPTA 2008
Country/Territory: Tunisia
City: Sousse
Period: 23/11/08 - 26/11/08
