Efficient motion tracking using gait analysis

Huiyu Zhou, Patrick R. Green, Andrew M. Wallace

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

For navigation and obstacle detection, it is necessary to develop robust and efficient algorithms to compute ego-motion and model the changing scene. These algorithms must cope with the high video data rate from the input sensor. In this paper, we present an approach to achieve improved motion tracking from a monocular image sequence acquired by a camera attached to a pedestrian. The human gait is modelled from the motion history of the camera, and used to predict the feature positions in successive frames. This is encoded within a maximum a posteriori (MAP) framework to seek fast and robust motion estimation. Experimental results show how use of the gait model can reduce the computational load by allowing longer gaps between successive frames, while retaining the robust ability to track features.
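The core idea in the abstract — a periodic gait model predicts where a tracked feature will appear, and a MAP estimate fuses that prediction with a noisy image measurement — can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the 1-D state, the sinusoidal gait prior, and all numeric parameters (frequency, amplitude, noise scales) are simplifying assumptions.

```python
import numpy as np

def gait_prior(t, freq=2.0, amp=5.0, drift=1.5):
    """Predicted vertical feature offset (pixels) at time t seconds:
    a sinusoidal bounce from the walking cycle plus steady forward drift.
    (Hypothetical model; parameters are illustrative only.)"""
    return amp * np.sin(2 * np.pi * freq * t) + drift * t

def map_estimate(prediction, measurement, sigma_prior, sigma_meas):
    """MAP estimate for a Gaussian prior and Gaussian likelihood:
    the precision-weighted average of the gait prediction and the
    image measurement."""
    w_p = 1.0 / sigma_prior**2   # precision of the gait-model prior
    w_m = 1.0 / sigma_meas**2    # precision of the image measurement
    return (w_p * prediction + w_m * measurement) / (w_p + w_m)

# Predict across a longer inter-frame gap, then refine with a measurement.
t = 0.25                  # seconds since the last processed frame
pred = gait_prior(t)      # where the gait model expects the feature
meas = pred + 0.8         # simulated noisy detection near the prediction
est = map_estimate(pred, meas, sigma_prior=2.0, sigma_meas=1.0)
```

Because the prior constrains the search, the tracker can trust the prediction over longer gaps and only needs the measurement to correct residual error, which is how the paper's reported reduction in computational load arises.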

Original language: English
Title of host publication: IEEE International Conference on Acoustics, Speech, and Signal Processing, 2004. Proceedings
Pages: III601–III604
Volume: 3
DOI: 10.1109/ICASSP.2004.1326616
Publication status: Published - 2004
Event: 29th IEEE International Conference on Acoustics, Speech, and Signal Processing 2004 - Montreal, Quebec, Canada
Duration: 17 May 2004 – 21 May 2004

Conference

Conference: 29th IEEE International Conference on Acoustics, Speech, and Signal Processing 2004
Country: Canada
City: Montreal, Quebec
Period: 17/05/04 – 21/05/04


Cite this

Zhou, H., Green, P. R., & Wallace, A. M. (2004). Efficient motion tracking using gait analysis. In IEEE International Conference on Acoustics, Speech, and Signal Processing, 2004. Proceedings (Vol. 3, pp. III601–III604). https://doi.org/10.1109/ICASSP.2004.1326616