Perceptual Calibration for Immersive Display Environments


Kevin Ponto, Michael Gleicher, Robert G. Radwin, and Hyun Joon Shin

Abstract—The perception of objects, depth, and distance has been repeatedly shown to be divergent between virtual and physical environments. We hypothesize that many of these discrepancies stem from incorrect geometric viewing parameters, specifically that physical measurements of eye position are insufficiently precise to provide proper viewing parameters. In this paper, we introduce a perceptual calibration procedure derived from geometric models. While most research has used geometric models to predict perceptual errors, we instead use these models inversely to determine perceptually correct viewing parameters. We study the advantages of these new psychophysically determined viewing parameters compared to the commonly used measured viewing parameters in an experiment with 20 subjects. The perceptually calibrated viewing parameters for the subjects generally produced new virtual eye positions that were wider and deeper than standard practices would estimate. Our study shows that perceptually calibrated viewing parameters can significantly improve depth acuity, distance estimation, and the perception of shape.
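The abstract does not spell out the calibration procedure, but the core idea of using a geometric viewing model inversely can be sketched. The example below is an illustrative assumption, not the authors' published method: it supposes each calibration trial records an on-screen point and the physical target the subject reports as visually aligned with it, and then solves for the virtual eye position whose viewing rays best pass through those targets. The function names (point_to_ray_distances, calibrate_eye_position) and the least-squares formulation are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def point_to_ray_distances(eye, screen_pts, target_pts):
    # Residuals: distance from each physical target point to the ray
    # cast from the candidate eye position through its screen point.
    d = screen_pts - eye
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    v = target_pts - eye
    along = np.sum(v * d, axis=1, keepdims=True) * d
    return np.linalg.norm(v - along, axis=1)

def calibrate_eye_position(screen_pts, target_pts, eye_guess):
    # Least-squares fit of the virtual eye position over all trials.
    res = least_squares(point_to_ray_distances, eye_guess,
                        args=(screen_pts, target_pts))
    return res.x

if __name__ == "__main__":
    # Synthetic check with a hypothetical eye position (metres) and a
    # display plane at z = 0: the fit should recover true_eye.
    true_eye = np.array([0.032, 1.65, 0.41])
    targets = np.random.default_rng(0).uniform(-1, 1, (12, 3)) + [0, 1.6, -2]
    t = -true_eye[2] / (targets[:, 2] - true_eye[2])
    screens = true_eye + t[:, None] * (targets - true_eye)
    print(calibrate_eye_position(screens, targets, np.array([0.0, 1.7, 0.5])))
```

In this framing, the measured eye position would serve only as the initial guess, and the perceptual responses determine the final viewing parameters, which is consistent with the paper's finding that the calibrated eye positions differ from physically measured ones.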

Preprint: https://graphics.cs.wisc.edu/Papers/2013/PGRS13/perCal-preprint.pdf

Published in: IEEE Transactions on Visualization and Computer Graphics (Volume 19, Issue 4)

Date of Publication: April 2013

PubMed ID: 23428454