Reference Overview: HMD Calibration and Its Effects on Distance Judgments

Initial paper:

Kuhl, S. A., Thompson, W. B., & Creem-Regehr, S. H. (2009). HMD calibration and its effects on distance judgments. ACM Transactions on Applied Perception (TAP), 6(3), 19.

Experiments testing distance estimation under three potential HMD miscalibrations: pitch, pincushion distortion, and minification/magnification via field of view (FOV). Only the FOV manipulation was found to change distance judgments. Calibration procedures are suggested; the gist is to match rendered imagery against real-world objects, popping the HMD on and off.
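The FOV manipulation reduces to a bit of trigonometry: rendering with a geometric FOV wider than the display's physical FOV minifies the scene, and a narrower one magnifies it. A minimal sketch of that relation (my own illustration under a simple pinhole model, not code from the paper):

```python
import math

def minification_at_center(display_fov_deg, geometric_fov_deg):
    """Approximate scale factor at the image center when the rendering
    (geometric) FOV differs from the display's physical FOV.
    < 1 means the scene appears minified; > 1 magnified."""
    d = math.radians(display_fov_deg) / 2.0
    g = math.radians(geometric_fov_deg) / 2.0
    return math.tan(d) / math.tan(g)

# Rendering with a geometric FOV wider than the display's minifies:
print(round(minification_at_center(40.0, 48.0), 3))
```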

List of references, grouped by topic and ordered (loosely) by novelty vs related papers, usefulness, and whim:

— horizon / tilt different in VR / Real?
OOI, T. L., WU, B., AND HE, Z. J. 2001. Distance determination by the angular declination below the horizon. Nature 414, 197–200.
ANDRE, J. AND ROGERS, S. 2006. Using verbal and blind-walking distance estimates to investigate the two visual systems hypothesis. Percept. Psychophys. 68, 3, 353–361.
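The angular-declination account in Ooi et al. is simple trigonometry: for a target on the ground plane, distance follows from eye height and the target's declination below the horizon, so anything that shifts the apparent horizon (e.g. a pitched display) shifts implied distance. A hypothetical sketch of the relation (my illustration, not from the paper):

```python
import math

def distance_from_declination(eye_height_m, declination_deg):
    """Ground-plane distance implied by a target's angular declination
    below the horizon: d = h / tan(theta)."""
    return eye_height_m / math.tan(math.radians(declination_deg))

# Smaller declinations (targets closer to the horizon) imply
# greater distances for the same eye height:
print(round(distance_from_declination(1.6, 45.0), 2))
```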

— support for effect of horizon position / tilt
MESSING, R. AND DURGIN, F. 2005. Distance perception and the visual horizon in head-mounted displays. ACM Trans. Appl. Percept. 2, 3, 234–250.
RICHARDSON, A. R. AND WALLER, D. 2005. The effect of feedback training on distance estimation in virtual environments. Appl. Cognitive Psych. 19, 1089–1108.
GARDNER, P. L. AND MON-WILLIAMS, M. 2001. Vertical gaze angle: Absolute height-in-scene information for the programming of prehension. Exper. Brain Res. 136, 3, 379–385.

— depth in photographs (2D?)
SMITH, O. W. 1958a. Comparison of apparent depth in a photograph viewed from two distances. Perceptual Motor Skills 8, 79–81.
SMITH, O. W. 1958b. Judgments of size and distance in photographs. Amer. J. Psych. 71, 3, 529–538.
KRAFT, R. N. AND GREEN, J. S. 1989. Distance perception as a function of photographic area of view. Percept. Psychophys. 45, 4, 459–466.

— AR calibration (vs real world objects)
MCGARRITY, E. AND TUCERYAN, M. 1999. A method for calibrating see-through head-mounted displays for AR. In Proceedings of the IEEE and ACM International Workshop on Augmented Reality. IEEE, Los Alamitos, CA, 75–84.
GILSON, S. J., FITZGIBBON, A. W., AND GLENNERSTER, A. 2008. Spatial calibration of an optical see-through head mounted display. J. Neurosci. Methods 173, 1, 140–146.
GENC, Y., TUCERYAN, M., AND NAVAB, N. 2002. Practical solutions for calibration of optical see-through devices. In Proceedings of the 1st IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR’02). IEEE, Los Alamitos, CA.
AZUMA, R. AND BISHOP, G. 1994. Improving static and dynamic registration in an optical see-through HMD. In Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH’94). ACM, New York, 197–204.

— effects of miscalibration / display properties
KUHL, S. A., CREEM-REGEHR, S. H., AND THOMPSON, W. B. 2008. Recalibration of rotational locomotion in immersive virtual environments. ACM Trans. Appl. Percept. 5, 3.
KUHL, S. A., THOMPSON, W. B., AND CREEM-REGEHR, S. H. 2006. Minification influences spatial judgments in virtual environments. In Proceedings of the Symposium on Applied Perception in Graphics and Visualization. ACM, New York, 15–19.
KUHL, S. A., THOMPSON, W. B., AND CREEM-REGEHR, S. H. 2008. HMD calibration and its effects on distance judgments. In Proceedings of the Symposium on Applied Perception in Graphics and Visualization. ACM, New York.
WILLEMSEN, P., COLTON, M. B., CREEM-REGEHR, S. H., AND THOMPSON, W. B. 2009. The effects of head-mounted display mechanical properties and field-of-view on distance judgments in virtual environments. ACM Trans. Appl. Percept. 6, 2, 8:1–8:14.
WILLEMSEN, P., GOOCH, A. A., THOMPSON, W. B., AND CREEM-REGEHR, S. H. 2008. Effects of stereo viewing conditions on distance perception in virtual environments. Presence: Teleoperat. Virtual Environ. 17, 1, 91–101.
LUMSDEN, E. A. 1983. Perception of radial distance as a function of magnification and truncation of depicted spatial layout. Percept. Psychophys. 33, 2, 177–182.

— effects of feedback (lasts for a week?)
MOHLER, B. J., CREEM-REGEHR, S. H., AND THOMPSON, W. B. 2006. The influence of feedback on egocentric distance judgments in real and virtual environments. In Proceedings of the Symposium on Applied Perception in Graphics and Visualization. ACM, New York, 9–14.

— visual quality
THOMPSON, W. B., WILLEMSEN, P., GOOCH, A. A., CREEM-REGEHR, S. H., LOOMIS, J. M., AND BEALL, A. C. 2004. Does the quality of the computer graphics matter when judging distances in visually immersive environments? Presence: Teleoperat. Virtual Environ. 13, 5, 560–571.

— distortion correction
WATSON, B. A. AND HODGES, L. F. 1995. Using texture maps to correct for optical distortion in head-mounted displays. In Proceedings of the IEEE Conference on Virtual Reality. IEEE, Los Alamitos, CA, 172–178.
BAX, M. R. 2004. Real-time lens distortion correction: 3D video graphics cards are good for more than games. Stanford Electr. Eng. Comput. Sci. Res. J.
ROBINETT, W. AND ROLLAND, J. P. 1992. A computational model for the stereoscopic optics of a head-mounted display. Presence: Teleoperat. Virtual Environ. 1, 1, 45–62.
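The distortion these correction papers target is commonly modeled with a one-term radial polynomial. A minimal sketch of that model (my own illustration with a made-up coefficient, not code from any of the papers):

```python
def radial_distort(x, y, k1):
    """One-term radial distortion of a normalized image point:
    k1 > 0 pushes points outward (pincushion), k1 < 0 pulls them
    inward (barrel).  Correction inverts this mapping, e.g. via a
    precomputed texture-map lookup."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# A point halfway out from the center, with pincushion distortion:
print(radial_distort(0.5, 0.0, 0.2))
```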

— camera calibration (spherical distortion, maybe some vision stuff)
TSAI, R. Y. 1987. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J. Rob. Autom. 3, 4, 323–344.
WENG, J., COHEN, P., AND HERNIOU, M. 1992. Camera calibration with distortion models and accuracy evaluation. IEEE Trans. Patt. Anal. Mach. Intell. 14, 10, 965–980.

— “distance underestimation exists”
WITMER, B. G. AND KLINE, P. B. 1998. Judging perceived and traversed distance in virtual environments. Presence: Teleoperat. Virtual Environ. 7, 2, 144–167.
KNAPP, J. 1999. The visual perception of egocentric distance in virtual environments. Ph.D. thesis, University of California at Santa Barbara.

— measures of perceived distance
SAHM, C. S., CREEM-REGEHR, S. H., THOMPSON, W. B., AND WILLEMSEN, P. 2005. Throwing versus walking as indicators of distance perception in real and virtual environments. ACM Trans. Appl. Percept. 1, 3, 35–45.

— NOT FOUND —

CAMPOS, J., FREITAS, P., TURNER, E., WONG, M., AND SUN, H.-J. 2007. The effect of optical magnification/minimization on distance estimation by stationary and walking observers. J. Vision 7, 9, 1028a.

ELLIS, S. R. AND NEMIRE, K. 1993. A subjective technique for calibration of lines of sight in closed virtual environment viewing systems. In Proceedings of the Society for Information Display. Society for Information Display, Campbell, CA.

SEDGWICK, H. A. 1983. Environment-centered representation of spatial layout: Available information from texture and perspective. In Human and Machine Vision, J. Beck, B. Hope, and A. Rosenfeld, Eds. Academic Press, San Diego, CA, 425–458.

(also of note: Sedgwick seems attached to work on distance judgements vs spatial relations / disruptions)

GRUTZMACHER, R. P., ANDRE, J. T., AND OWENS, D. A. 1997. Gaze inclination: A source of oculomotor information for distance perception. In Proceedings of the 9th International Conference on Perception and Action (Studies in Perception and Action IV).

STOPER, A. E. 1999. Height and extent: Two kinds of perception. In Ecological Approaches to Cognition: Essays in Honor of Ulric Neisser, E. Winograd, R. Fivush, and W. Hirst, Eds. Erlbaum, Hillsdale, NJ.

(book)
LOOMIS, J. M. AND KNAPP, J. 2003. Visual perception of egocentric distance in real and virtual environments. In Virtual and Adaptive Environments, L. J. Hettinger and M. W. Haas, Eds. Erlbaum, Mahwah, NJ, 21–46.

(book)
ROGERS, S. 1995. Perceiving pictorial space. In Perception of Space and Motion, W. Epstein and S. Rogers, Eds. Academic Press, San Diego, CA, 119–163.

(requested)
RINALDUCCI, E. J., MAPES, D., CINQ-MARS, S. G., AND HIGGINS, K. E. 1996. Determining the field of view in HMDs: A psychophysical method. Presence: Teleoperat. Virtual Environ. 5, 3, 353–356.

(misc find, not in refs)
Hendrix, C., & Barfield, W. (1994). Perceptual biases in spatial judgements as a function of eyepoint elevation angle and geometric field of view (No. 941441). SAE Technical Paper.

(misc find, not in refs)
Blackwell Handbook of Sensation and Perception
http://onlinelibrary.wiley.com.ezproxy.library.wisc.edu/book/10.1002/9780470753477

Kent State Fashion/Tech Hackathon

This past weekend I drove to Kent State in order to attend the TechStyle Symposium and the Fashion/Tech Hackathon. 

The symposium, held on Friday, was a gathering of apparel professors and graduate students from schools around the world, including Kent State, Iowa State, Loughborough University and others.  The talks in general revolved around various applications of technology in the apparel field.  They touched on topics such as 3D garment simulation, laser cutting, digital fabric printing and building technology into garments in order to assist disabled children.  In addition to the talks, there was a brief poster session featuring presentations from Iowa State graduate students.  Overall, the symposium was interesting and an excellent networking opportunity.

The Hackathon was a much different event, although equally interesting.  Over 150 students (undergrads and grads) from across the country were brought together and tasked with creating some sort of wearable technology prototype in 36 hours.  Assorted supplies were provided by the organizers (Arduinos, LEDs, Intel Edisons, Myo armbands, Oculus Rifts, etc.) for the hackers to use.  We also had access to the Kent State TextileLab facilities, including a 3D body scanner, 3D printers, a laser cutter and digital fabric printers.  Teams could either be formed beforehand or at the event.

Although I had been told in advance that graduate students were welcome, there seemed to be very few actually attending the event.  That made trying to find a team a little awkward.  In the end, I decided to just work by myself.  That meant that, without any additional tech help, I scaled back some of my experimentation and chose a project I knew I could complete within the allotted time.

The final outcome is what I called the LightPrint Dress (a terrible name, I know; in my defense I had only had 3 hours of sleep).

I modeled the neckpiece, printed it on a MakerBot Replicator 2, then embedded it with UV LEDs harvested from several small UV flashlights.  I designed the fabric and had it printed at the Kent State facilities.  I then hand-stencilled UV-reactive liquid (aka Tide) onto sections of the pattern before draping the dress.  The intended outcome was that the UV lights would activate the reactive portions of the pattern, thereby changing the appearance of the textile in an interactive way.  Unfortunately, due to time and material limitations, the final effect was not what I had hoped.

The project as a whole, however, was well-received by the judges.  I was awarded the prize for “Most Technically Challenging Hack” by one of the event sponsors.  The judges seemed most impressed by the fact that I had completed all aspects of the project by myself, thus showing a broad range of skills.  The prize was a Moto360 Smartwatch, which I am still trying to figure out how to use, lol.

While the project was not hugely challenging by my personal criteria, it is a good proof of concept that I would like to pursue further.  In a future iteration I would like to have all of the electronics fully integrated/encased in the neckpiece, with a recharging port and a wireless connection to some sort of app to enable user programming.  Rather than UV LEDs, I would like to install high-powered RGB LEDs and use white fabric for the actual garment.  Theoretically, this would allow me to create user-controlled, color-changing garments.

Overall, the entire event was an excellent experience.  I would definitely participate in another fashion hack in the future.

Phenomenal Regression: First Look

A participant views a circle placed on a table in front of them, and is asked to describe what they see.  Their answer lies somewhere between what geometry tells us the retinal image should be (or, what we might render in a virtual world), and the “real” version of the circle, undistorted by perspective.  Back in the ’30s, Thouless observed this, and dubbed it “phenomenal regression” — that the observed, “phenomenal” shape is not the expected retinal image, but rather “regresses” to the “real” shape.

[Figure: phenomenal regression example, from Elner & Wright, 2014]

This makes some sense for shapes (and orientations) simple enough that the perspective transformation amounts to compression along one axis; that is, when the “real” form is unambiguously just a circle, because other orientations are significantly less interesting.  Or perhaps the real shape is the one aligned with the plane the object rests on; a mental estimation of an overhead view of the table?
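Thouless quantified the effect with a ratio (often called the Thouless ratio): 0 means the observer's match equals the projected (retinal) form, 1 means full regression to the “real” form.  A small illustration with made-up numbers, assuming the standard log form of the index:

```python
import math

def thouless_ratio(perceived, projected, real):
    """Index of phenomenal regression: 0 = match to the projected
    (retinal) form, 1 = full regression to the 'real' form."""
    return (math.log(perceived) - math.log(projected)) / (
        math.log(real) - math.log(projected))

# A circle 10 cm across viewed at a 30-degree elevation projects to
# an ellipse with a 5 cm minor axis; an observer matching it to a
# 7 cm ellipse shows partial regression toward the real circle:
print(round(thouless_ratio(7.0, 5.0, 10.0), 3))
```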

Thouless claims it’s not simply a familiar form, though that experiment bears another read to convince me.  There’s also a bit on properties like brightness/color; Thouless seems to imply shape is not the only property for which we exhibit this regression, and that seems to further confuse how one constructs the “real” form.

Elner and Wright have recently (2014) explored using the concept as a measure of “spatial quality” in virtual environments; they introduce regression as “an involuntary reaction that cannot be defeated even when pointed out”, which could make for a compelling measure.  Their experiment is inconclusive (virtual cues were possibly influenced by a physical tripod), and I’ll need to become more familiar with the size-constancy literature to understand why they claim so strongly that it’s not what they (nor Thouless) are measuring.  But theirs is a thorough paper, particularly the related work and analysis; I suspect they do know what they’re doing, and I should probably revisit this sometime to better understand the implications.

 


  1. Elner, K. W., & Wright, H. (2014). Phenomenal regression to the real object in physical and virtual worlds. Virtual Reality, 1-11.

  2. Thouless, R. H. (1931). Phenomenal regression to the real object. I. British Journal of Psychology. General Section, 21(4), 339-359.

  3. Thouless, R. H. (1931). Phenomenal regression to the “real” object. II. British Journal of Psychology. General Section, 22(1), 1-30.

DSCVR and Unity

Hello everyone,

My name is Ted. I am a senior in the Applied Math, Engineering and Physics (AMEP) program with a penchant for architectural design and computer graphics. This semester I’ll be working with Prof. Ponto and the amazing piece of hardware at SoHE known as the DSCVR (Design Studies Commodity Virtual Reality). My goal is to cover the basics of the game engine called Unity, get acquainted with C#, and expand my modeling and rendering knowledge of 3DS MAX to develop an application that will be able to use the virtual reality features of the DSCVR to visualize building designs in real time.

This Week

On Tuesday last week, I took a tour of the DSCVR and got a hands-on demo of the system. The coolness factor is definitely overwhelming. The amazing thing about it is how accessible it would be to deploy such a system in a variety of settings. Any game developed in Unity can easily be made to run on and take advantage of the DSCVR’s features by simply adding a couple of script assets on top of your game files.

The following day I downloaded the Unity software to my home computer. Unfortunately, its grey interface can’t be changed to black, and it makes text very hard to read unless you change its size. Though it is possible to model within Unity, I am choosing to use my preferred 3D package for the modeling and simply import the geometry into Unity. This week I spent a good ten hours watching introductory tutorials on Unity and others on modeling game sets in 3DS MAX.

Unity UI

Next Week

I expect to continue watching tutorials and begin making a simple game which will consist of a single room and a playable character.

See you next week!

Ted

2/1/2015 TEB Update

What I accomplished this week

  • Finished moving into the new office
  • Ordered the rest of the equipment and tools needed to complete my research
  • Populated the PCB with all the surface-mount components. The process is as follows:
    • Apply solder paste to component pads
    • Using tweezers, place components onto pads
    • Preheat oven to 400 degrees Fahrenheit
    • Place PCB in oven and wait a few minutes until you see the components ‘pop’ into place
    • Remove PCB and inspect joints. Use solder wick to remove solder if there is excess amounts and/or solder bridges
    • Clean PCB of remaining flux using toothbrush and isopropyl alcohol

Problems

  • Accidentally snapped a pin off the slide switch, which rendered it useless, so I had to purchase more

Next week’s work

  • Solder the slide switch, push buttons and headers to PCB
  • Test PCB for functionality (Any fried components or smoke coming from PCB?)
  • Conduct electrical measurements using multimeter

Below is an image of the board for prototype 1.2

IMG_20150131_132557529