• An important step toward understanding the role of body-based cues on human spatial memory for large-scale environments

      Huffman, D.J.; Ekstrom, A.D.; University of Arizona (MIT Press Journals, 2020)
      Moving our body through space is fundamental to human navigation; however, technical and physical limitations have hindered our ability to study the role of these body-based cues experimentally. We recently designed an experiment using novel immersive virtual-reality technology, which allowed us to tightly control the availability of body-based cues to determine how these cues influence human spatial memory [Huffman, D. J., & Ekstrom, A. D. A modality-independent network underlies the retrieval of large-scale spatial environments in the human brain. Neuron, 104, 611–622, 2019]. Our analysis of behavior and fMRI data revealed a similar pattern of results across a range of body-based cue conditions, thus suggesting that participants likely relied primarily on vision to form and retrieve abstract, holistic representations of the large-scale environments in our experiment. We ended our paper by discussing a number of caveats and future directions for research on the role of body-based cues in human spatial memory. Here, we reiterate and expand on this discussion, and we use a commentary in this issue by A. Steel, C. E. Robertson, and J. S. Taube (Current promises and limitations of combined virtual reality and functional magnetic resonance imaging research in humans: A commentary on Huffman and Ekstrom (2019). Journal of Cognitive Neuroscience, 2020) as a helpful discussion point regarding some of the questions that we think will be the most interesting in the coming years. We highlight the exciting possibility of taking a more naturalistic approach to study the behavior, cognition, and neuroscience of navigation. Moreover, we share the hope that researchers who study navigation in humans and nonhuman animals will synergize to provide more rapid advancements in our understanding of cognition and the brain. © 2020 Massachusetts Institute of Technology.
    • Common and distinct roles of frontal midline theta and occipital alpha oscillations in coding temporal intervals and spatial distances

      Liang, M.; Zheng, J.; Isham, E.; Ekstrom, A.; University of Arizona (MIT Press Journals, 2021)
      Judging how far away something is and how long it takes to get there is critical to memory and navigation. Yet, the neural codes for spatial and temporal information remain unclear, particularly the involvement of neural oscillations in maintaining such codes. To address these issues, we designed an immersive virtual reality environment containing teleporters that displace participants to a different location after entry. Upon exiting the teleporters, participants made judgments from two given options regarding either the distance they had traveled (spatial distance condition) or the duration they had spent inside the teleporters (temporal duration condition). We wirelessly recorded scalp EEG while participants navigated in the virtual environment by physically walking on an omnidirectional treadmill and traveling through teleporters. An exploratory analysis revealed significantly higher alpha and beta power for short-distance versus long-distance traversals, whereas the same contrast revealed significantly higher frontal midline delta–theta–alpha power and global increases in beta power for short versus long temporal duration teleportation. Analyses of occipital alpha instantaneous frequencies revealed their sensitivity to both spatial distances and temporal durations, suggesting a novel and common mechanism for both spatial and temporal coding. We further examined the resolution of distance and temporal coding by classifying discretized distance bins and 250-msec time bins based on multivariate patterns of 2- to 30-Hz power spectra, finding evidence that oscillations code fine-scale time and distance information. Together, these findings support partially independent coding schemes for spatial and temporal information, suggesting that low-frequency oscillations play important roles in coding both space and time. © 2021 Massachusetts Institute of Technology.
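      The instantaneous-frequency analysis mentioned above is conventionally computed by band-pass filtering to the alpha range and differentiating the unwrapped phase of the analytic (Hilbert-transformed) signal. The sketch below illustrates that standard pipeline on a synthetic oscillation; it is not the authors' analysis code, and the band edges and filter order are illustrative assumptions.

      ```python
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def instantaneous_frequency(signal, fs, band=(8.0, 12.0)):
          """Estimate instantaneous frequency within a frequency band.

          Band-pass filter the signal (here, the alpha range by default),
          take the analytic signal via the Hilbert transform, then convert
          the derivative of the unwrapped phase to Hz.
          """
          nyq = fs / 2.0
          b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
          filtered = filtfilt(b, a, signal)
          phase = np.unwrap(np.angle(hilbert(filtered)))
          # Phase change per sample (rad) -> cycles per second (Hz)
          return np.diff(phase) * fs / (2.0 * np.pi)

      # Sanity check: a pure 10-Hz oscillation should yield ~10 Hz throughout.
      fs = 250
      t = np.arange(0, 2, 1 / fs)
      freqs = instantaneous_frequency(np.sin(2 * np.pi * 10 * t), fs)
      print(round(float(np.median(freqs)), 1))
      ```

      Using the median across samples makes the summary robust to the filter edge effects at the start and end of the recording.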
    • Greater early disambiguating information for less-probable words: The lexicon is shaped by incremental processing

      King, A.; Wedel, A.; Department of Linguistics, University of Arizona (MIT Press Journals, 2020)
      There has been much work over the last century on optimization of the lexicon for efficient communication, with a particular focus on the form of words as an evolving balance between production ease and communicative accuracy. Zipf’s law of abbreviation, the cross-linguistic trend for less-probable words to be longer, represents some of the strongest evidence that the lexicon is shaped by a pressure for communicative efficiency. However, the various sounds that make up words do not all contribute the same amount of disambiguating information to a listener. Rather, the information a sound contributes depends in part on what specific lexical competitors exist in the lexicon. In addition, because the speech stream is perceived incrementally, early sounds in a word contribute on average more information than later sounds. Using a dataset of diverse languages, we demonstrate that, above and beyond containing more sounds, less-probable words contain sounds that convey more disambiguating information overall. We show further that this pattern tends to be strongest at word beginnings, where sounds can contribute the most information. © 2020 Massachusetts Institute of Technology.
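      The notion of incremental, competitor-dependent information can be made concrete with a cohort-style calculation: each sound's information is its surprisal given the prefix heard so far, with probabilities taken over the lexical competitors sharing that prefix. The sketch below uses a tiny hypothetical lexicon with made-up frequency counts, purely for illustration; it is not the paper's measure or data.

      ```python
      import math

      # Toy lexicon with hypothetical token-frequency counts (for illustration only).
      lexicon = {"cat": 50, "cap": 30, "can": 40, "dog": 60, "dot": 10}

      def segment_information(word, lexicon):
          """Per-segment surprisal, -log2 P(segment | prefix), over the lexicon.

          At each position, the candidate set (cohort) is every word sharing
          the prefix heard so far; the segment's probability is the frequency
          share of cohort members consistent with the next sound.
          """
          infos = []
          for i in range(len(word)):
              cohort = {w: f for w, f in lexicon.items() if w.startswith(word[:i])}
              cohort_total = sum(cohort.values())
              consistent = sum(
                  f for w, f in cohort.items() if w.startswith(word[: i + 1])
              )
              infos.append(-math.log2(consistent / cohort_total))
          return infos

      infos = segment_information("cat", lexicon)
      print([round(x, 3) for x in infos])
      ```

      In this toy lexicon the second sound of "cat" carries zero information (every "c"-initial competitor continues with "a"), which illustrates the abstract's point that a sound's contribution depends on the specific competitors in the lexicon rather than on the sound itself.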