Abstract
Illumination- and view-dependent texture information is currently the most accurate way to capture the appearance of real-world materials; one example is the Bidirectional Texture Function (BTF). The main disadvantage of these data is their massive size. In this article, we employ perceptually based methods to allow more efficient handling of such data. In the first step, we analyze different uniform resamplings by means of a psychophysical study with 11 subjects, comparing renderings of the original data with renderings of versions uniformly resampled over the hemisphere of illumination- and view-dependent textural measurements. We found that down-sampling in the view and illumination azimuthal angles is less apparent than in the elevation angles, and that illumination directions can be down-sampled more than view directions without loss of visual accuracy. In the second step, we analyzed the subjects' gaze fixations during the experiment. The gaze analysis confirmed the results of the experiment and revealed that subjects fixated on locations aligned with the direction of the main gradient in the rendered stimuli. As this gradient was mostly aligned with the illumination gradient, we conclude that subjects observed the materials mainly in the direction of the illumination gradient. Our results provide interesting insights into human perception of real materials and have promising consequences for the development of more efficient compression and rendering algorithms for this kind of massive data. © 2009 ACM.
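To make the angular down-sampling concrete, the sketch below subsamples a BTF stored as a dense 5-D NumPy array indexed by quantized illumination/view elevations and azimuths. This is a minimal illustration under stated assumptions, not the paper's pipeline: the function name, default strides, and sampling grid are hypothetical, chosen so that azimuths are thinned more aggressively than elevations and illumination more than view, in line with the findings above.

```python
import numpy as np

def downsample_btf(btf, illum_el_step=2, illum_az_step=3,
                   view_el_step=1, view_az_step=2):
    """Uniformly subsample a BTF along its angular axes.

    `btf` is assumed to be a 5-D array indexed as
    [illum_elevation, illum_azimuth, view_elevation, view_azimuth, texel],
    with each angular axis uniformly sampled over the hemisphere.
    The default strides reflect the study's findings: azimuths tolerate
    coarser sampling than elevations, and illumination coarser than view.
    """
    return btf[::illum_el_step, ::illum_az_step,
               ::view_el_step, ::view_az_step, :]

# Hypothetical grid: 7 elevations x 12 azimuths per hemisphere,
# 32x32 grayscale texture flattened to 1024 texels.
btf = np.random.rand(7, 12, 7, 12, 32 * 32).astype(np.float32)
reduced = downsample_btf(btf)
print(btf.shape, '->', reduced.shape)
print('compression factor:', btf.size / reduced.size)
```

The printed compression factor shows how quickly the angular grid shrinks even with modest strides (here roughly 10x), which is the motivation for identifying the perceptually least harmful axes to thin.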
| Original language | English |
| --- | --- |
| Article number | 18 |
| Journal | ACM Transactions on Applied Perception |
| Volume | 6 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 1 Aug 2009 |
Keywords
- BTF
- Eye tracking
- Psychophysical experiment
- Texture compression
- Uniform resampling
- Visual degradation