How vision and movement combine in the hippocampal place code
Authors: Guifen Chen, John A. King, Neil Burgess, John O'Keefe
Institution: Department of Cell and Developmental Biology and Sainsbury Wellcome Centre for Neural Circuits and Behaviour, University College London, London WC1E 6BT, United Kingdom; and Research Department of Clinical, Educational, and Health Psychology, Institute of Cognitive Neuroscience, and Institute of Neurology, University College London, London WC1N 3AR, United Kingdom
Abstract: How do external environmental and internal movement-related information combine to tell us where we are? We examined the neural representation of environmental location provided by hippocampal place cells while mice navigated a virtual reality environment in which both types of information could be manipulated. Extracellular recordings were made from region CA1 of head-fixed mice navigating a virtual linear track and running in a similar real environment. Despite the absence of vestibular motion signals, normal place cell firing and theta rhythmicity were found. Visual information alone was sufficient for localized firing in 25% of place cells and to maintain a local field potential theta rhythm (but with significantly reduced power). Additional movement-related information was required for normally localized firing by the remaining 75% of place cells. Trials in which movement and visual information were put into conflict showed that they combined nonlinearly to control firing location, and that the relative influence of movement versus visual information varied widely across place cells. However, within this heterogeneity, the behavior of fully half of the place cells conformed to a model of path integration in which the presence of visual cues at the start of each run, together with subsequent movement-related updating of position, was sufficient to maintain normal fields.

Hippocampal place cells fire when the animal visits a specific area in a familiar environment (1), providing a population representation of self-location (2–4). However, it is still unclear what information determines their firing location ("place field"). Existing models suggest that movement-related information updates the representation of self-location from moment to moment (i.e., performing "path integration"), whereas environmental information provides initial localization and allows the accumulating error inherent in path integration to be corrected sporadically (5–13).
Previous experimental work addressing this question has found it difficult to dissociate the different types of information available in the real world. Both external sensory cues (3, 14–16) and internal self-motion information (17–19) can influence place cell firing, but these have usually been tightly coupled in previous experiments. To date, a range of computational models predicting place fields has been proposed, based on the assumption that either environmental sensory information (20–22) or a self-motion metric is fundamental (7, 23). However, there is no agreement on which is more important, nor on how these signals combine to generate spatially localized place cell firing and its temporal organization with respect to the theta rhythm (24). Recent studies showed that mice can navigate in a virtual environment (VE), and a small sample of place cells has been recorded in mice running on a virtual linear track (25–27). A VE affords the opportunity to isolate the visual environment and internal movement-related information from other sensory inputs, and to study their contributions to place cell firing. Here we use manipulations of these inputs in a VE to dissociate the relative contributions to place cell firing and theta rhythmicity of external sensory information relating to the (virtual) visual environment and internal movement-related (motoric and proprioceptive) information.
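The path-integration model described above can be made concrete with a minimal sketch: visual cues anchor the position estimate at the start of each run, and movement-related information then updates it step by step. This is an illustrative toy, not the authors' code; the function name, the discrete-time formulation, and the `gain` parameter (a hypothetical scaling of the movement signal relative to the visual environment, standing in for the conflict trials) are assumptions for illustration.

```python
# Illustrative sketch (not the authors' model code) of path integration:
# the position estimate is set by visual cues at run onset and thereafter
# updated only from movement-related information.

def estimate_positions(visual_start, movement_steps, gain=1.0):
    """Integrate per-timestep movement from a visually anchored start.

    visual_start   -- position given by visual cues at the start of the run
    movement_steps -- per-timestep displacements from movement signals
    gain           -- hypothetical movement/visual scaling (1.0 = no conflict)
    """
    positions = []
    x = visual_start
    for step in movement_steps:
        x += gain * step          # movement-related updating of position
        positions.append(x)
    return positions

# With matched information the estimate tracks true position exactly.
print(estimate_positions(0.0, [1.0] * 5))
# A gain mismatch (movement and vision in conflict) makes the estimated
# location diverge progressively along the run, with no correction until
# visual cues re-anchor the estimate at the next run onset.
print(estimate_positions(0.0, [1.0] * 5, gain=0.8))
```

The sketch shows why error accumulates in pure path integration and why a visual reset at the start of each run suffices to keep fields stable under the model.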