Teleporting in virtual reality (funded by National Science Foundation grant 1816029):
Virtual reality often allows the user to physically walk to explore the virtual environment. However, virtual environments are typically larger than the physical space available for walking, so the user must use a locomotion interface to fully explore the virtual environment. One common locomotion interface is teleporting, whereby the user points a virtual laser pointer at the intended location and is then transported to that location without any motion cues. This convenience, however, comes at a spatial cognitive cost. Our research investigates the pros and cons of locomotion interfaces, such as teleporting, and seeks practical guidelines to mitigate the spatial cognitive costs.
- Cherep, L.A., Lim, A.F., Kelly, J.W., Acharya, D., Velasco, A., Bustamante, E., Ostrander, A., & Gilbert, S.B. (2020). Spatial cognitive implications of teleporting through virtual environments. Journal of Experimental Psychology: Applied, 26(3), 480-492. *2020 Nickerson Award for best paper in the journal
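As a rough illustration of why teleporting provides no motion cues, here is a minimal, engine-agnostic sketch contrasting teleporting with continuous steering. The function names and parameters are our own illustrative choices, not drawn from any particular VR toolkit:

```python
import numpy as np

def teleport(user_pos, target_pos):
    """Teleporting: the viewpoint jumps directly to the target with no
    intermediate positions, so no visual self-motion cues are generated."""
    return np.array(target_pos, dtype=float)

def steer(user_pos, target_pos, speed, dt):
    """Continuous steering: the viewpoint moves a small step toward the
    target each frame, producing visual self-motion cues along the way."""
    user_pos = np.array(user_pos, dtype=float)
    direction = np.array(target_pos, dtype=float) - user_pos
    dist = np.linalg.norm(direction)
    if dist == 0.0:
        return user_pos
    step = min(speed * dt, dist)  # do not overshoot the target
    return user_pos + direction / dist * step
```

With teleporting, the position update is a single discontinuous jump; with steering, repeated calls to `steer` trace out the full path, which is exactly the optic-flow information that teleporting removes.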
Cybersickness in virtual reality:
Virtual reality can cause discomfort called cybersickness, which is somewhat similar to the experience of motion sickness. In this line of research, we are studying the causes of cybersickness and potential interventions to reduce it.
Connecting human navigation with animal neuroscience:
Neuroscience research involving rats and humans has identified several types of neurons that serve unique and important roles in navigation. For example, place cells respond when the animal occupies a specific location in a known environment, head direction cells respond when the animal faces a specific direction, and grid cells respond at multiple locations arranged in a regular hexagonal lattice across the environment. John O'Keefe, May-Britt Moser, and Edvard Moser won the 2014 Nobel Prize in Physiology or Medicine for their research on these topics. We are conducting human behavioral experiments based on predictions from known properties of place, grid, and head direction cells recorded from rats. These experiments have the potential to reveal whether and how specific cell types shape human navigational behaviors, as in this paper:
- Chen, X., He, Q., Kelly, J.W., Fiete, I.R., & McNamara, T.P. (2015). Bias in human path integration is predicted by properties of grid cells. Current Biology, 25(13), 1771-1776.
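The hexagonal firing pattern of grid cells is often idealized as a sum of three planar cosine waves oriented 60° apart. The sketch below shows this standard textbook model (not the analysis from the paper above); the function name, normalization, and `spacing` parameter are our own illustrative choices:

```python
import numpy as np

def grid_cell_rate(x, y, spacing=1.0):
    """Idealized grid-cell firing rate at location (x, y): the sum of
    three plane waves 60 degrees apart, whose peaks form a hexagonal
    lattice with the given spacing. Rescaled to [0, 1], since the raw
    sum of the three cosines ranges from -1.5 to 3."""
    k = 4 * np.pi / (np.sqrt(3) * spacing)  # wave number for this lattice spacing
    angles = np.deg2rad([0.0, 60.0, 120.0])
    rate = sum(np.cos(k * (x * np.cos(a) + y * np.sin(a))) for a in angles)
    return (rate + 1.5) / 4.5
```

The rate is maximal (1.0) at the origin and at every other vertex of the hexagonal lattice, mimicking the multi-peaked firing fields recorded from rat entorhinal cortex.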
Spatial Memory:
When we move through the environment, we are retrieving an existing spatial memory and/or creating a new one. Even a seemingly straightforward spatial behavior, like planning a detour to avoid traffic congestion, is a rather complex spatial task. It requires an accurate spatial memory of the surrounding neighborhood, as well as the ability to retrieve that memory in order to locate goals and plan routes.
Our research indicates that spatial memories are typically organized with respect to reference frames, which provide organizational structures for remembering locations. One effect of reference frames is that inter-object relationships aligned with the reference frame are more easily retrieved than misaligned inter-object relationships. For example, the rectangular walls of a room provide salient axes that influence the structure of memories for objects within the room. As a result, it is easier to imagine perspectives parallel to room axes. Here are just a few of the spatial memory research questions we are currently pursuing:
- What are the cues that influence reference frame selection?
- What roles do reference frames play during the development of spatial memories?
- Are locations learned through different sensory modalities incorporated into a common reference frame or independent reference frames?
- How does spatial memory differ from other types of memory?
The papers listed below illustrate some of the basic tools and techniques we use to answer these questions. Many other examples can be found by following the “Publications” link on the left.
- Kelly, J.W., Costabile, K.A., & Cherep, L.A. (2018). Social effects on reference frame selection. Psychonomic Bulletin & Review, 25(6), 2339-2345.
- Kelly, J.W., Carpenter, S.K., & Sjolund, L.A. (2015). Retrieval enhances route knowledge acquisition, but only when movement errors are prevented. Journal of Experimental Psychology: Learning, Memory, & Cognition, 41(5), 1540-1547.
- Kelly, J.W., & Avraamides, M.N. (2011). Cross-sensory transfer of reference frames in spatial memory. Cognition, 118, 444-450.
Navigation and Spatial Orientation:
Accurate spatial memories are just one component of successful navigation. We also need to know where we are within the remembered space, and which direction to move in order to achieve our navigational goal.
Our research on spatial orientation focuses on the cues used to stay oriented during navigation and to reorient after becoming lost. Some cues are provided by the environment, like the shape of the room or the direction of a landmark. Other cues are internal, or body-based, like signals from the vestibular system. Here are just a few related research questions we are currently pursuing:
- How do we use body-based cues (like our vestibular system) to keep track of our movements?
- How are multiple cues integrated in order to stay oriented?
- What roles do reference frames play during movement through the environment?
The papers listed below illustrate a couple of our approaches to studying navigation and spatial orientation. Other examples can be found by following the “Publications” link.
- Sjolund, L.A., Kelly, J.W., & McNamara, T.P. (2018). Optimal combination of environmental cues and path integration during navigation. Memory & Cognition, 46, 89-99.
- Chen, X., McNamara, T.P., Kelly, J.W., & Wolbers, T. (2017). Cue combination in human spatial navigation. Cognitive Psychology, 95, 105-144.
- Kelly, J.W., McNamara, T.P., Bodenheimer, B., Carr, T.H., & Rieser, J.J. (2008). The shape of human navigation: How environmental geometry is used in the maintenance of spatial orientation. Cognition, 109, 281-286.
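The optimal cue-combination framework studied in work like the papers above is typically the maximum-likelihood model, in which independent cues are averaged with weights proportional to their reliability (inverse variance). Here is a minimal sketch; the function name and the example numbers are purely illustrative:

```python
import numpy as np

def combine_cues(estimates, sigmas):
    """Maximum-likelihood combination of independent Gaussian cue
    estimates: each cue is weighted by its inverse variance, and the
    combined estimate is more precise than any single cue."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(sigmas, dtype=float) ** 2
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined_estimate = np.sum(weights * estimates)
    combined_variance = 1.0 / np.sum(1.0 / variances)
    return combined_estimate, np.sqrt(combined_variance)

# Example: a landmark cue places home at 10 m (sd = 1 m), while
# path integration says 14 m (sd = 2 m). The combined estimate
# falls closer to the more reliable landmark cue.
est, sd = combine_cues([10.0, 14.0], [1.0, 2.0])
```

In this example the combined estimate is 10.8 m with a standard deviation of about 0.89 m, smaller than either single-cue standard deviation, which is the signature prediction tested in cue-combination experiments.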
Perception of Distance in Virtual Reality:
Experiments in our lab often use immersive virtual reality to study spatial cognition. In immersive virtual reality, people wear a head-mounted display (HMD) to view the virtual environment in stereo. They can physically walk and turn within our lab in order to move through the virtual environment.
Virtual reality is a great tool for basic scientific research. However, one shortcoming of VR is that distances are underperceived, often by as much as 50% of the intended distance. This is a practical problem that we are actively researching by applying perceptual and cognitive theories. Our research attempts to identify deficient distance cues in virtual reality and to develop training methods for improving distance perception.
The papers listed below illustrate some projects in our lab related to space perception in virtual reality. Other examples can be found by following the “Publications” link.
- Kelly, J.W., Cherep, L.A., Klesel, B., Siegel, Z.D., & George, S. (2018). Comparison of two methods for improving distance perception in virtual reality. ACM Transactions on Applied Perception, 15(2), 11:1-11.
- Kelly, J.W., Cherep, L.A., & Siegel, Z.D. (2017). Perceived space in the HTC Vive. ACM Transactions on Applied Perception, 15(1), 2:1-16.
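Underperception of distance is commonly quantified as a "gain": the slope of a no-intercept linear fit of judged distance (e.g., from blind walking to a previewed target) against actual target distance, where a gain of 1.0 means accurate perception and values below 1.0 mean compression. The sketch below uses made-up numbers purely to illustrate the computation:

```python
import numpy as np

# Hypothetical blind-walking data: target distances (m) and the
# distances participants actually walked while wearing an HMD (m).
targets = np.array([2.0, 4.0, 6.0, 8.0])
walked = np.array([1.4, 2.9, 4.1, 5.8])

# Least-squares slope of the no-intercept fit walked = gain * target.
gain = np.sum(targets * walked) / np.sum(targets ** 2)
print(f"perceived-distance gain: {gain:.2f}")  # prints 0.71 for these made-up data
```

A gain near 0.5 would correspond to the roughly 50% underperception described above; training interventions are then evaluated by how much they move the gain toward 1.0.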