Similar Literature
20 similar documents retrieved (search time: 328 ms).
1.
2.
The complex relationship between structural and functional connectivity, as measured by noninvasive imaging of the human brain, poses many unresolved challenges and open questions. Here, we apply analytic measures of network communication to the structural connectivity of the human brain and explore the capacity of these measures to predict resting-state functional connectivity across three independently acquired datasets. We focus on the layout of shortest paths across the network and on two communication measures—search information and path transitivity—which account for how these paths are embedded in the rest of the network. Search information is an existing measure of information needed to access or trace shortest paths; we introduce path transitivity to measure the density of local detours along the shortest path. We find that both search information and path transitivity predict the strength of functional connectivity among both connected and unconnected node pairs. They do so at levels that match or significantly exceed path length measures, Euclidean distance, as well as computational models of neural dynamics. This capacity suggests that dynamic couplings due to interactions among neural elements in brain networks are substantially influenced by the broader network context adjacent to the shortest communication pathways.The topology and dynamics of brain networks are a central focus of the emerging field of connectomics (1). A growing number of studies of human brain networks carried out with modern noninvasive neuroimaging methods have begun to characterize the architecture of structural networks (24), as well as spatially distributed components (57) and time-varying dynamics (8) of functional networks. Although structural connectivity (SC) is inferred from diffusion imaging and tractography, functional connectivity (FC) is generally derived from pairwise correlations of time series recorded during “resting” brain activity, measured with functional magnetic resonance imaging (fMRI). Both networks define a multiplex system (9) in which the SC level shapes or imposes constraints on the FC level. Indeed, mounting evidence indicates that SC and FC are robustly related. Numerous studies have documented strong and significant correlations between the strengths of structural and functional connections at whole-brain (2, 1013) and mesoscopic scales (14), as well as acute changes in FC after perturbation of SC (15).Although there is ample evidence documenting statistical relationships between SC and FC, the causal role of SC in shaping whole-brain patterns of FC is still only incompletely understood. There are numerous reports of strong FC among brain regions that are not directly structurally connected, an effect that has been ascribed to signal propagation along one or more indirect structural paths (11), or to network-wide contextual influence (16). The present paper builds on two interrelated premises. First, if SC plays a major causal role in shaping resting-state FC, then appropriately configured generative models that incorporate SC topology should be able to predict, at least to some extent, FC patterns. To this end, a number of models have been proposed, including large-scale neural mass models generating synthetic fMRI time series (11, 17, 18) as well as analytic models based on distance and topological measures (19) or attractor dynamics (20, 21). 
Second, the extent to which the resting-state time courses of two brain regions become temporally aligned (i.e., highly functionally correlated) should be at least partially related to the ease with which mutual dynamic influences or perturbations can spread within the underlying structural brain network.Both premises imply that the strength of FC is related to measures of network communication. The principal communication measure applied previously in studies of brain networks is the efficiency (22), computed as the averaged inverse of the lengths of the shortest paths between node pairs. The use of this measure is based on the assumption that short paths are dynamically favored, as they allow more direct (faster, less noisy) transmission of neural signals. However, relying on path length as the sole measure of communication does not take into account how these paths are embedded in the rest of the network, which may further modulate the dynamic interactions of neuronal populations. For example, along a given path, branch points may lead to signal dispersion and hence attenuate FC, whereas local detours may offer alternative routes that amplify FC.Here, we present an approach toward predicting FC from SC based on several analytic measures of network communication. We used sets of high-resolution SC and FC maps of the cerebral cortex, obtained from three separate cohorts of participants and acquired using different scanners and imaging protocols. First, the relationship of FC to spatial embedding and path length was explored. Next, we attempted to predict FC from SC by implementing both linear and nonlinear computational models. We then examined the capacity of several analytic measures of network communication along shortest paths to predict FC from SC, singly and in the simple form of a joint multilinear model. Our results demonstrate that analytic measures that take into account the structural embedding of short paths are indeed capable of predicting a large portion of the variance observed in long-time averages of resting-brain FC.  相似文献   
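As a rough, illustrative companion to the communication measures described above, the sketch below computes weighted shortest paths on a toy structural connectivity (SC) matrix and a search-information-style quantity along each path (the bits needed to follow the path when each step is taken with probability proportional to connection weight). This is a minimal Python sketch using networkx; the random SC matrix, the 1/weight length transform, and the normalization are assumptions, and the paper's path-transitivity measure and multilinear FC model are not reproduced here.

```python
import numpy as np
import networkx as nx

def search_information(sc, path):
    """Bits needed to 'follow' a path when, at each node, the next step is
    chosen with probability proportional to the outgoing connection weight."""
    strengths = sc.sum(axis=1)
    p = 1.0
    for a, b in zip(path[:-1], path[1:]):
        p *= sc[a, b] / strengths[a]
    return -np.log2(p)

# Toy symmetric SC matrix (stand-in for streamline-based connectivity)
rng = np.random.default_rng(0)
n = 20
sc = rng.random((n, n))
sc = (sc + sc.T) / 2
np.fill_diagonal(sc, 0)

# Map connection weights to lengths (stronger connection = shorter edge)
length = np.zeros_like(sc)
length[sc > 0] = 1.0 / sc[sc > 0]
G = nx.from_numpy_array(length)
paths = dict(nx.all_pairs_dijkstra_path(G, weight="weight"))

si = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            si[i, j] = search_information(sc, paths[i][j])

print("mean search information (bits):", si[si > 0].mean())
```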

3.
Schizophrenia may involve an elevated excitation/inhibition (E/I) ratio in cortical microcircuits. It remains unknown how this regulatory disturbance maps onto neuroimaging findings. To address this issue, we implemented E/I perturbations within a neural model of large-scale functional connectivity, which predicted hyperconnectivity following E/I elevation. To test predictions, we examined resting-state functional MRI in 161 schizophrenia patients and 164 healthy subjects. As predicted, patients exhibited elevated functional connectivity that correlated with symptom levels, and was most prominent in association cortices, such as the fronto-parietal control network. This pattern was absent in patients with bipolar disorder (n = 73). To account for the pattern observed in schizophrenia, we integrated neurobiologically plausible, hierarchical differences in association vs. sensory recurrent neuronal dynamics into our model. This in silico architecture revealed preferential vulnerability of association networks to E/I imbalance, which we verified empirically. Reported effects implicate widespread microcircuit E/I imbalance as a parsimonious mechanism for emergent inhomogeneous dysconnectivity in schizophrenia.Schizophrenia (SCZ) is a disabling psychiatric disease associated with widespread neural disturbances. These involve abnormal neurodevelopment (13), neurochemistry (47), neuronal gene expression (811), and altered microscale neural architecture (2). Such deficits are hypothesized to impact excitation-inhibition (E/I) balance in cortical microcircuits (12). Clinically, SCZ patients display a wide range of symptoms, including delusions, hallucinations (13, 14), higher-level cognitive deficits (15, 16), and lower-level sensory alterations (17). This display is consistent with a widespread neuropathology (18), such as the E/I imbalance suggested by the NMDA receptor (NMDAR) hypofunction model (1921). However, emerging resting-state functional magnetic resonance imaging (rs-fMRI) studies implicate more network-specific abnormalities in SCZ. Typically, these alterations are localized to higher-order association regions, such as the fronto-parietal control network (FPCN) (18, 22) and the default mode network (DMN) (23, 24), with corresponding disturbances in thalamo-cortical circuits connecting to association regions (25, 26). It remains unknown how to reconcile widespread cellular-level neuropathology in SCZ (20, 21, 27, 28) with preferential association network disruptions (29, 30).Currently a tension exists between two competing frameworks: global versus localized neural dysfunction in SCZ. Association network alterations in SCZ, identified via neuroimaging, may arise from a localized dysfunction (3, 9, 31, 32). Alternatively, they may represent preferential abnormalities arising emergently from a nonspecific global microcircuit disruption (20, 33). Mechanistically, an emergent preferential effect could occur because of intrinsic differences between cortical areas in the healthy brain, leading to differential vulnerability toward a widespread homogenous neuropathology. For example, histological studies of healthy primate brains show interregional variation in cortical cytoarchitectonics (3438). Additional studies reveal differences in microscale organization and activity timescales for neuronal populations in higher-order association cortex compared with lower-order sensory regions (3840). 
However, these well-established neuroanatomical and neurophysiological hierarchies have yet to be systematically applied to inform network-level neuroimaging disturbances in SCZ. In this study, we examined the neuroimaging consequences of cortical hierarchy as defined by neurophysiological criteria (i.e., functional) rather than anatomical or structural criteria.One way to link cellular-level neuropathology hypotheses with neuroimaging is via biophysically based computational models (18, 41). Although these models have been applied to SCZ, none have integrated cortical hierarchy into their architecture. Here we initially implemented elevated E/I ratio within our well-validated computational model of resting-state neural activity (18, 42, 43) without assuming physiological differences between brain regions, but maintaining anatomical differences. The model predicted widespread elevated functional connectivity as a consequence of elevated E/I ratio. In turn, we tested this connectivity prediction across 161 SCZ patients and 164 matched healthy comparison subjects (HCS). However, we discovered an inhomogeneous spatial pattern of elevated connectivity in SCZ generally centered on association cortices.To capture the observed inhomogeneity, we hypothesized that pre-existing intrinsic regional differences between association and lower-order cortical regions may give rise to preferential network-level vulnerability to elevated E/I. Guided by primate studies examining activity timescale differences across the cortical hierarchy (39, 44), we incorporated physiological differentiation across cortical regions in the model. Specifically, we tested whether pre-existing stronger recurrent excitation in “association” networks (39, 40) would preferentially increase their functional connectivity in response to globally elevated E/I. Indeed, modeling simulations predicted preferential effects of E/I elevation in association networks, which could not be explained by structural connectivity differences alone.Finally, we empirically tested all model-derived predictions by examining network-specific disruptions in SCZ. To investigate diagnostic specificity of SCZ effects, we examined an independent sample of bipolar disorder (BD) patients (n = 73) that did not follow model-derived predictions. These results collectively support a parsimonious theoretical framework whereby emergent preferential association network disruptions in SCZ can arise from widespread and nonspecific E/I elevations at the microcircuit level. This computational psychiatry study (45) illustrates the productive interplay between biologically grounded modeling and clinical effects, which may inform refinement of neuroimaging markers and ultimately rational development of treatments for SCZ.  相似文献   
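For intuition about why a globally elevated E/I ratio can surface as elevated functional connectivity, here is a deliberately simplified linear stochastic network, not the biophysical model used in the study: each node has a local recurrent gain (a crude stand-in for the E/I ratio) plus weak coupling through a random "structural" matrix, and raising the gain inflates the pairwise correlations of the simulated time series. All parameter values and the coupling scheme are illustrative assumptions.

```python
import numpy as np

def mean_fc(gain, n_nodes=50, n_steps=20000, coupling=0.1, seed=0):
    """Toy linear stochastic network: local recurrent gain per node plus weak
    coupling through a random matrix. Returns the mean pairwise correlation."""
    rng = np.random.default_rng(seed)
    sc = rng.random((n_nodes, n_nodes))
    sc = (sc + sc.T) / 2
    np.fill_diagonal(sc, 0)
    sc /= sc.sum(axis=1, keepdims=True)          # normalize coupling strengths
    x = np.zeros(n_nodes)
    ts = np.empty((n_steps, n_nodes))
    for t in range(n_steps):
        x = gain * x + coupling * sc @ x + rng.normal(0, 1, n_nodes)
        ts[t] = x
    fc = np.corrcoef(ts.T)
    return fc[np.triu_indices(n_nodes, k=1)].mean()

print("mean FC, baseline gain:", round(mean_fc(gain=0.50), 3))
print("mean FC, elevated gain:", round(mean_fc(gain=0.65), 3))
```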

4.
The increasing use of mouse models for human brain disease studies presents an emerging need for a new functional imaging modality. Using optical excitation and acoustic detection, we developed a functional connectivity photoacoustic tomography system, which allows noninvasive imaging of resting-state functional connectivity in the mouse brain, with a large field of view and a high spatial resolution. Bilateral correlations were observed in eight functional regions, including the olfactory bulb, limbic, parietal, somatosensory, retrosplenial, visual, motor, and temporal regions, as well as in several subregions. The borders and locations of these regions agreed well with the Paxinos mouse brain atlas. By subjecting the mouse to alternating hyperoxic and hypoxic conditions, strong and weak functional connectivities were observed, respectively. In addition to connectivity images, vascular images were simultaneously acquired. These studies show that functional connectivity photoacoustic tomography is a promising, noninvasive technique for functional imaging of the mouse brain.Resting-state functional connectivity (RSFC) is an emerging neuroimaging approach that aims to identify low-frequency, spontaneous cerebral hemodynamic fluctuations and their associated functional connections (1, 2). Recent research suggests that these fluctuations are highly correlated with local neuronal activity (3, 4). The spontaneous fluctuations relate to activity that is intrinsically generated by the brain, instead of activity attributable to specific tasks or stimuli (2). A hallmark of functional organization in the cortex is the striking bilateral symmetry of corresponding functional regions in the left and right hemispheres (5). This symmetry also exists in spontaneous resting-state hemodynamics, where strong correlations are found interhemispherically between bilaterally homologous regions as well as intrahemispherically within the same functional regions (3). Clinical studies have demonstrated that RSFC is altered in brain disorders such as stroke, Alzheimer’s disease, schizophrenia, multiple sclerosis, autism, and epilepsy (612). These diseases disrupt the healthy functional network patterns, most often reducing correlations between functional regions. Due to its task-free nature, RSFC imaging requires neither stimulation of the subject nor performance of a task during imaging (13). Thus, it can be performed on patients under anesthesia (14), on patients unable to perform cognitive tasks (15, 16), and even on patients with brain injury (17, 18).RSFC imaging is also an appealing technique for studying brain diseases in animal models, in particular the mouse, a species that holds the largest variety of neurological disease models (3, 13, 19, 20). Compared with clinical studies, imaging genetically modified mice allows exploration of molecular pathways underlying the pathogenesis of neurological disorders (21). The connection between RSFC maps and neurological disorders permits testing and validation of new therapeutic approaches. However, conventional neuroimaging modalities cannot easily be applied to mice. For instance, in functional connectivity magnetic resonance imaging (fcMRI) (22), the resting-state brain activity is determined via the blood-oxygen-level–dependent (BOLD) signal contrast, which originates mainly from deoxy-hemoglobin (23). The correlation analysis central to functional connectivity requires a high signal-to-noise ratio (SNR). 
However, achieving a sufficient SNR is made challenging by the high magnetic fields and small voxel size needed for imaging the mouse brain, as well as the complexity of compensating for field inhomogeneities caused by tissue–bone or tissue–air boundaries (24). Functional connectivity mapping with optical intrinsic signal imaging (fcOIS) was recently introduced as an alternative method to image functional connectivity in mice (3, 20). In fcOIS, changes in hemoglobin concentrations are determined based on changes in the reflected light intensity from the surface of the brain (3, 25). Therefore, neuronal activity can be measured through the neurovascular response, similar to the method used in fcMRI. However, due to the diffusion of light in tissue, the spatial resolution of fcOIS is limited, and experiments have thus far been performed using an exposed skull preparation, which increases the complexity for longitudinal imaging.Photoacoustic imaging of the brain is based on the acoustic detection of optical absorption from tissue chromophores, such as oxy-hemoglobin (HbO2) and deoxy-hemoglobin (Hb) (26, 27). This imaging modality can simultaneously provide high-resolution images of the brain vasculature and hemodynamics with intact scalp (28, 29). In this article, we perform functional connectivity photoacoustic tomography (fcPAT) to study RSFC in live mice under either hyperoxic or hypoxic conditions, as well as in dead mice. Our experiments show that fcPAT is able to detect connectivities between different functional regions and even between subregions, promising a powerful functional imaging modality for future brain research.  相似文献   
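The core computation in this kind of RSFC analysis is a seed-based correlation map over band-pass-filtered hemodynamic time courses. Below is a minimal Python sketch of that step; the sampling rate, frequency band, seed indices, and synthetic data layout are assumptions for illustration, not the acquisition or processing parameters of the fcPAT system.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def seed_correlation_map(series, seed_idx, fs=20.0, band=(0.009, 0.08)):
    """Seed-based RSFC map.
    series: (n_timepoints, n_pixels) hemodynamic signals; seed_idx: seed pixels.
    Band-pass filter each pixel, then correlate every pixel with the mean seed course."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filt = filtfilt(b, a, series, axis=0)
    seed = filt[:, seed_idx].mean(axis=1)
    z = (filt - filt.mean(axis=0)) / filt.std(axis=0)
    zs = (seed - seed.mean()) / seed.std()
    return (z * zs[:, None]).mean(axis=0)          # Pearson r per pixel

# Synthetic stand-in: 5 min at 20 Hz, 400 "pixels", first 20 pixels as the seed
rng = np.random.default_rng(1)
common = rng.normal(size=(6000, 1))                # shared slow fluctuation
data = 0.5 * common + rng.normal(size=(6000, 400))
r_map = seed_correlation_map(data, seed_idx=np.arange(20))
print("correlation map range:", r_map.min().round(2), "to", r_map.max().round(2))
```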

5.
The brain is not idle during rest. Functional MRI (fMRI) studies have identified several resting-state networks, including the default mode network (DMN), which contains a set of cortical regions that interact with a hippocampus (HC) subsystem. Age-related alterations in the functional architecture of the DMN and HC may influence memory functions and possibly constitute a sensitive biomarker of forthcoming memory deficits. However, the exact form of DMN–HC alterations in aging and concomitant memory deficits is largely unknown. Here, using both task and resting data from 339 participants (25–80 y old), we have demonstrated age-related decrements in resting-state functional connectivity across most parts of the DMN, except for the HC network for which age-related elevation of connectivity between left and right HC was found along with attenuated HC–cortical connectivity. Elevated HC connectivity at rest, which was partly accounted for by age-related decline in white matter integrity of the fornix, was associated with lower cross-sectional episodic memory performance and declining longitudinal memory performance over 20 y. Additionally, elevated HC connectivity at rest was associated with reduced HC neural recruitment and HC–cortical connectivity during active memory encoding, which suggests that strong HC connectivity restricts the degree to which the HC interacts with other brain regions during active memory processing revealed by task fMRI. Collectively, our findings suggest a model in which age-related disruption in cortico–hippocampal functional connectivity leads to a more functionally isolated HC at rest, which translates into aberrant hippocampal decoupling and deficits during mnemonic processing.The brain is not idle at rest (1). Rather, intrinsic neuronal signaling, which manifests as spontaneous fluctuations in the blood oxygen level-dependent (BOLD) functional MRI (fMRI) signal, is ubiquitous in the human brain and consumes a substantial portion of the brain’s energy (2). Coherent spontaneous activity has been revealed in a hierarchy of networks that span large-scale functional circuits in the brain (36). These resting-state networks (RSNs) show moderate-to-high test–retest reliability (7) and replicability (8), and some have been found in the monkey (9) and infant (10) brain. In the adult human brain, RSNs include sensory motor, visual, attention, and mnemonic networks, as well as the default mode network (DMN). There is evidence that the DMN entails interacting subsystems and hubs that are implicated in episodic memory (1113). One major hub encompasses the posterior cingulate cortex and the retrosplenial cortex. Other hubs include the lateral parietal cortex and the medial prefrontal cortex. In addition, a hippocampus (HC) subsystem is distinct from, yet interrelated with, the major cortical DMN hubs (12, 14).The functional architecture of the DMN and other RSNs is affected by different conditions, such as Alzheimer’s disease (AD), Parkinson’s disease, and head injury, suggesting that measurements of the brain’s intrinsic activity may be a sensitive biomarker and a putative diagnostic tool (for a review, see ref. 15). Alterations of the DMN have also been shown in age-comparative studies (16, 17), but the patterns of alterations are not homogeneous across different DMN components (18). 
Reduced functional connectivity among major cortical DMN nodes has been reported in aging (16, 17) and also in AD (19) and for asymptomatic APOE e4 carriers at increased risk of developing AD (20). Reduced cortical DMN connectivity has been linked to age-impaired performance on episodic memory (EM) tasks (21, 22). For instance, Wang and colleagues (21) showed that functional connectivity between cortical and HC hubs promoted performance on an EM task and was substantially weaker among low-performing elderly. This and other findings suggest that reductions in the DMN may be a basis for age-related EM impairment. However, elevated connectivity has been observed for the HC in individuals at genetic risk for AD (23, 24) and for elderly with memory complaints (25). Furthermore, a trend toward elevated functional connectivity for the medial temporal lobe (MTL) subsystem was observed in healthy older adults (26). Critically, higher subcortical RSN connectivity was found to correlate negatively with EM performance in an aging sample (27). Moreover, a recent combined fMRI/EEG study observed age increases in HC EEG beta power during rest (28).Thus, the association of aging with components of the DMN is complex, and it has been argued that age-related increases in functional connectivity need further examination (18). Such increases could reflect a multitude of processes, including age-related degenerative effects on the brain’s gray and white matter (18). Additionally, increases in HC functional connectivity may reflect alterations in proteolytic processes, such as amyloid deposition (29). Amyloid deposition is most prominent in posterior cortical regions of the DMN (29). It has been argued that there is a topological relationship between high neural activity over a lifetime within the DMN and amyloid deposition (30). Increased amyloid β protein burden within the posterior cortical DMN may cause cortico–hippocampal functional connectivity disruption (31), leading to a more functionally isolated HC network, which translates into aberrant hippocampal decoupling (30, 32, 33). Correspondingly, a recent model hypothesized that progressively less inhibitory cortical input would cause HC hyperactivity in aging (34).Elevated HC resting-state connectivity might thus be a sign of brain dysfunction, but the evidence remains inconclusive. Here, using data from a population-based sample covering the adult age span (n = 339, 25–80 y old), we tested the hypothesis that aging differentially affects distinct DMN components. A data-driven approach, independent component analysis (ICA), was used to identify DMN subsystems (4). We expected to observe age-related decreases in the connectivity of the cortical DMN. We also examined age-related alterations of HC RSN connectivity, and tested whether such alterations were related to HC volume and white matter integrity. We predicted that if increased HC connectivity was found, it would be accompanied by age-related decreases in internetwork connectivity of the HC RSN with cortical DMN regions. To constrain interpretations of age-related alterations, the DMN components were related to cognitive performance. Elevated HC RSN should negatively correlate with level and longitudinal change in EM performance. Such negative correlations could reflect an inability to flexibly recruit the HC and functionally associated areas during EM task performance due to aberrant hippocampal decoupling (23, 24). 
We tested this prediction by relating the HC RSN, within-person, to HC recruitment during an EM fMRI task (35, 36).
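The data-driven decomposition step mentioned above (ICA used to identify DMN subsystems) can be sketched with scikit-learn's FastICA, as below; the data shape, number of components, and the idea of relating a hippocampal component to age and memory are illustrative assumptions, and a study like this would typically use a dedicated group-ICA pipeline rather than this bare decomposition.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Assumed layout: temporally concatenated, preprocessed group fMRI data,
# reshaped to (total timepoints, voxels); a synthetic stand-in is used here.
rng = np.random.default_rng(9)
data = rng.normal(size=(1200, 3000))

ica = FastICA(n_components=20, random_state=0, max_iter=500)
time_courses = ica.fit_transform(data)     # (timepoints, components)
spatial_maps = ica.components_             # (components, voxels)
print(time_courses.shape, spatial_maps.shape)

# A component whose spatial map loads on the hippocampus would play the role of
# the "HC subsystem"; its per-subject expression (e.g., from dual regression)
# could then be related to age, memory scores, and fornix integrity.
```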

6.
The mechanism underlying temporal correlations among blood oxygen level-dependent signals is unclear. We used oxygen polarography to better characterize oxygen fluctuations and their correlation and to gain insight into the driving mechanism. The power spectrum of local oxygen fluctuations is inversely proportional to frequency raised to a power (1/f) raised to the beta, with an additional positive band-limited component centered at 0.06 Hz. In contrast, the power of the correlated oxygen signal is band limited from ∼0.01 Hz to 0.4 Hz with a peak at 0.06 Hz. These results suggest that there is a band-limited mechanism (or mechanisms) driving interregional oxygen correlation that is distinct from the mechanism(s) driving local (1/f) oxygen fluctuations. Candidates for driving interregional oxygen correlation include rhythmic or pseudo-oscillatory mechanisms.Resting-state functional connectivity MRI (rs-fcMRI) analyses provide insight into the functional architecture of the brain. The method is based on slow correlations (e.g., 0.01–0.1 Hz) in blood oxygen level-dependent (BOLD) signal across the brain. The pattern of these slow correlations has been used to trace out functional networks and to describe how these networks develop, change with experience, vary across individuals, and are disturbed in disease (18). Slow BOLD fluctuations and their correlations are thought to reflect neuronal processes, yet the underlying mechanisms remain unknown (9, 10). We used a high temporal resolution method, oxygen polarography, to characterize the dynamics of oxygen fluctuations and thereby gain insight into the underlying neuronal mechanisms.Two types of dynamics commonly observed in the brain may be associated with two distinct types of underlying mechanisms or processes. Dynamics with narrow band-limited power may reflect the influence of specific pacemaker units. For instance, the occipital alpha rhythm, which dominates the EEG during relaxed wakefulness, may originate from an alpha pacemaker unit, which consists of a specialized subset of gap-junction–coupled thalamocortical neurons that exhibit intrinsic rhythmic bursting at alpha frequencies (1113). Although much evidence supports this oscillatory model of resting-state activity (e.g., refs. 14 and 15), the dominant hypothesis in the field is that correlations arise from neural activity propagating within an anatomically constrained small world network (e.g., refs. 16 and 17). This model predicts scale-free dynamics, also known as 1/f dynamics (17, 18). With 1/f dynamics, event amplitude varies inversely with frequency, so that large events are rare whereas small events are common. More precisely, power may vary inversely with frequency raised by a (small) exponent: P ∝ 1/fβ, with typical exponents from 0 to 3 (19). The 1/f dynamics are a hallmark of a complex dynamic system operating at a critical point, at which the system is balanced between ordered and disordered phases (2022), although 1/f dynamics may also arise in other noncritical systems (23). The fact that various neural signals, such as local field potentials, show 1/f characteristics has inspired models of the brain as operating at a critical point through a process of self-organization (17, 2428).Local BOLD fluctuations have a 1/f power spectrum (2931). This has led to the suggestion that the slow correlations of resting-state connectivity may reflect a critical process (18, 19). 
This assumes that the dynamics of interregional oxygen correlation match the dynamics of local fluctuations. Indeed, three studies report that BOLD correlations vary inversely with frequency (1/f), much like local oxygen (3234). However, Sasai et al. (35), Achard et al. (36), and Cordes et al. (37) report instead that oxygen correlation peaks around 0.04–0.06 Hz, with less correlation at lower frequencies—a band-limited pattern that is distinctly different from 1/f (3840). Finally, other studies report that BOLD correlations reach a plateau at low frequencies, a result that is intermediate between 1/f and band-limited dynamics (41, 42).We used oxygen polarography to directly measure the spectrum of interregional oxygen correlation. Polarography is an invasive alternative to BOLD fMRI that allows robust recording of local oxygen fluctuations with higher temporal resolution, higher frequency specificity, and broader frequency range than can be achieved with standard fMRI techniques. We measured oxygen fluctuations in the default network [bilateral posterior cingulate cortex (PCC) area 23] and the visual/attention network (bilateral V3) in the awake, resting macaque. Here, we report that correlations between homotopic regions are band limited rather than 1/f. Further, we show that the variance of local oxygen fluctuations can be separated into a 1/f component and a band-limited component. Only the band-limited component relates to long-range correlation. This suggests that there is a band-limited mechanism (or mechanisms) driving interregional oxygen correlation that is distinct from the mechanism(s) driving local (1/f) oxygen fluctuations. The fact that correlation is band limited is suggestive of a rhythmic or pseudo-oscillatory mechanism.  相似文献   
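The spectral distinction drawn above, a 1/f^beta local spectrum versus band-limited interregional coupling, can be illustrated with standard tools: Welch power spectra, a log-log slope fit for beta, and magnitude-squared coherence between two signals that share only a narrow-band component. The sampling rate, recording length, and the 0.06-Hz shared sinusoid below are assumptions for the toy example, not properties of the polarographic recordings.

```python
import numpy as np
from scipy.signal import welch, coherence

fs = 10.0                                  # assumed sampling rate (Hz)
n = int(3600 * fs)                         # one hour of signal
t = np.arange(n) / fs
rng = np.random.default_rng(3)

def pink_noise(n):
    """Crude unit-variance 1/f noise via spectral shaping of white noise."""
    f = np.fft.rfftfreq(n, d=1 / fs)
    f[0] = f[1]                            # avoid division by zero at DC
    spec = (rng.normal(size=f.size) + 1j * rng.normal(size=f.size)) / np.sqrt(f)
    x = np.fft.irfft(spec, n)
    return x / x.std()

shared = np.sin(2 * np.pi * 0.06 * t)      # band-limited component shared by both sites
x = pink_noise(n) + 0.4 * shared
y = pink_noise(n) + 0.4 * shared

f, pxx = welch(x, fs=fs, nperseg=int(600 * fs))          # local power spectrum
fc, cxy = coherence(x, y, fs=fs, nperseg=int(600 * fs))  # interregional coupling

band = (f > 0.005) & (f < 1.0)
beta = -np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)[0]
print(f"fitted 1/f exponent beta ~ {beta:.2f}")
print(f"coherence peaks near {fc[np.argmax(cxy)]:.3f} Hz")
```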

7.
Although typically identified in early childhood, the social communication symptoms and adaptive behavior deficits that are characteristic of autism spectrum disorder (ASD) persist throughout the lifespan. Despite this persistence, even individuals without cooccurring intellectual disability show substantial heterogeneity in outcomes. Previous studies have found various behavioral assessments [such as intelligence quotient (IQ), early language ability, and baseline autistic traits and adaptive behavior scores] to be predictive of outcome, but most of the variance in functioning remains unexplained by such factors. In this study, we investigated to what extent functional brain connectivity measures obtained from resting-state functional connectivity MRI (rs-fcMRI) could predict the variance left unexplained by age and behavior (follow-up latency and baseline autistic traits and adaptive behavior scores) in two measures of outcome—adaptive behaviors and autistic traits at least 1 y postscan (mean follow-up latency = 2 y, 10 mo). We found that connectivity involving the so-called salience network (SN), default-mode network (DMN), and frontoparietal task control network (FPTCN) was highly predictive of future autistic traits and the change in autistic traits and adaptive behavior over the same time period. Furthermore, functional connectivity involving the SN, which is predominantly composed of the anterior insula and the dorsal anterior cingulate, predicted reliable improvement in adaptive behaviors with 100% sensitivity and 70.59% precision. From rs-fcMRI data, our study successfully predicted heterogeneity in outcomes for individuals with ASD that was unaccounted for by simple behavioral metrics and provides unique evidence for networks underlying long-term symptom abatement.Although typically identified in childhood, the social communication symptoms that are characteristic of autism spectrum disorder (ASD) persist throughout the lifespan (1, 2). On average, individuals with ASD show smaller age-related improvements in adaptive behaviors, including daily living skills critical for independent living, than do typically developing (TD) peers (24). The burden of prolonged clinical symptom expression, coupled with limited adaptive behaviors, leads to a relatively poor prognosis for a majority of adults with ASD. For example, only 12% of adults with ASD achieve “very good” outcomes, defined by a high level of independence (5). Adolescence and young adulthood are poorly understood in ASD. Although there seems to be a great deal of change during this time, the nature of this change varies across studies, with a handful of studies reporting a decline in functioning (2, 3, 6), others reporting general improvement (7, 8), and still others reporting a quadratic course of autistic symptoms and adaptive functioning where the trajectory peaks in late adolescence (9) or the late 20s (10) and begins to fall subsequently.Predictors of positive outcomes in ASD include higher intelligence quotient (IQ) (1113), language ability (9, 14), less severe ASD symptoms (15), and stronger adaptive behaviors (16, 17). However, there is substantial variability in outcome even among individuals with ASD without cooccurring intellectual disability (7, 13, 17, 18). Age, IQ, and language ability accounted for as much as 45% of the variance in outcome measures in a sample composed of predominantly individuals with both ASD and intellectual disability (11). 
Others reported more modest numbers for these predictors, with IQ predicting 3% of variance in outcome and language ability predicting 32% of variance in outcome (14). A study that included only individuals with ASD without cooccurring intellectual disability found age and IQ as weaker predictors, predicting 6–28% of various adaptive behavior subscales (6). Although these previous studies have been successful in predicting these outcomes using behavioral measures, in most cases, the majority of the variance in outcomes remains unexplained. Thus, it remains difficult to identify individuals with ASD who may struggle to achieve independence during adulthood and who may benefit from additional intervention.In the present study, we explored whether a functional neuroimaging-based measure of brain connectivity, termed resting-state functional connectivity MRI (rs-fcMRI), can predict variance in behavioral outcomes in young adults with ASD beyond that explained by cognitive or behavioral measures. Functional connectivity strength in individuals with ASD has been found to predict an ASD diagnosis (1921) and to correlate with many aspects of cognition and behavior that also predict outcome, including IQ (21, 22), and ASD symptomatology using the Autism Diagnostic Observation Schedule (2123), the Autism Diagnostic Interview-Revised (20), and the Social Responsiveness Scale (SRS) (19, 22, 24). As such, brain measures may explain additional variance in behavioral outcomes. One previous study has shown that combining functional MRI (fMRI) data with behavioral data increased predictive power for categorical language outcomes in early developing ASD (25). Brain-derived data has also added explanatory power to predictive models of depression (26), dyslexia (27), alcoholism (28), and reading and math ability (29, 30).We tested whether rs-fcMRI data acquired in late adolescence and early adulthood [time 1 (T1)] could predict behavioral outcomes at least 1 y after the imaging data were acquired [time 2 (T2)]. We defined behavioral outcomes with a measure of predominantly social autistic traits (SRS) and a measure of adaptive functioning [Adaptive Behavior Assessment System-Second Edition (ABAS-II)]. Using an approach that controlled for nuisance variables (e.g., variable duration between time 1 and time 2) and variables known to strongly predict outcome (e.g., age and baseline score on outcome measure), we performed regressions to investigate whether and to what extent the remaining variance in outcome could be predicted by baseline functional connectivity in networks known to be involved in ASD.  相似文献   
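The prediction logic described above (regress out age, follow-up latency, and baseline scores, then ask whether baseline connectivity explains the remaining variance) reduces to a two-step regression, sketched below on synthetic data. The sample size, number of connectivity features, and the ridge/leave-one-out choices are illustrative assumptions rather than the study's pipeline, and residualizing on the full sample before cross-validation is a simplification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, RidgeCV
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
n_sub, n_edges = 60, 30                        # assumed sample and feature counts
conn = rng.normal(size=(n_sub, n_edges))       # T1 SN/DMN/FPTCN edge strengths
covars = rng.normal(size=(n_sub, 3))           # age, follow-up latency, baseline score
outcome = (covars @ np.array([0.5, 0.2, 0.8]) + 0.7 * conn[:, 0]
           + rng.normal(0, 1, n_sub))          # synthetic T2 outcome (e.g., SRS)

# Step 1: remove variance explained by age, latency, and baseline behavior
resid = outcome - LinearRegression().fit(covars, outcome).predict(covars)

# Step 2: predict the residual outcome from baseline functional connectivity
pred = cross_val_predict(RidgeCV(), conn, resid, cv=LeaveOneOut())
print("r(predicted, observed residual) =", round(np.corrcoef(pred, resid)[0, 1], 2))
```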

8.
The prefrontal cortex continues to mature after puberty and into early adulthood, mirroring the time course of maturation of cognitive abilities. However, the way in which prefrontal activity changes during peri- and postpubertal cortical maturation is largely unknown. To address this question, we evaluated the developmental stage of peripubertal rhesus monkeys with a series of morphometric, hormonal, and radiographic measures, and conducted behavioral and neurophysiological tests as the monkeys performed working memory tasks. We compared firing rate and the strength of intrinsic functional connectivity between neurons in peripubertal vs. adult monkeys. Notably, analyses of spike train cross-correlations demonstrated that the average magnitude of functional connections measured between neurons was lower overall in the prefrontal cortex of peripubertal monkeys compared with adults. The difference resulted because negative functional connections (indicative of inhibitory interactions) were stronger and more prevalent in peripubertal compared with adult monkeys, whereas the positive connections showed similar distributions in the two groups. Our results identify changes in the intrinsic connectivity of prefrontal neurons, particularly that mediated by inhibition, as a possible substrate for peri- and postpubertal advances in cognitive capacity.The prefrontal cortex, the brain area associated with the highest-level cognitive operations, is known to undergo a protracted period of development (13). A virtually linear increase in performance with age has been observed in tasks that assess visuospatial working memory, executive control, and resistance to distraction, a process that continues well after puberty and into early adulthood (4, 5). The accrual of cognitive capacities during this period parallels structural changes of the prefrontal cortex in humans and nonhuman primates (610). Imaging studies in humans suggest that patterns of brain activation associated with working memory tasks undergo distinct changes between childhood and adulthood, supporting the idea of prolonged prefrontal maturation (1114). However, how the patterns of prefrontal activation change during cortical maturation remains unclear. A possible mechanism that could account for variations in prefrontal responses—and which could have a significant functional impact (15)—is an overall change in the distribution of intrinsic functional connections, i.e., those between neurons within the prefrontal cortex. The intrinsic connectivity of a network is directly related to the correlation structure of neuronal responses, and this determines in a fundamental way the information-coding properties of the network and its ability to sustain activity on its own (1618). In this study, we sought to determine if the strengths of functional connections inferred from multisite neurophysiological recordings differ between peripubertal and adult monkeys.  相似文献   
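Functional connections of the kind inferred here are typically read out from spike-train cross-correlograms: histograms of spike-time lags between two simultaneously recorded neurons, where a peak (or trough) relative to chance indicates a positive (or negative) interaction. The sketch below is a generic illustration on synthetic spike times, not the recording setup or the correction procedures of the study.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, bin_ms=1.0, window_ms=50.0):
    """Histogram of spike-time lags (b relative to a) within +/- window_ms."""
    lags = []
    for t in spikes_a:
        d = spikes_b - t
        lags.extend(d[np.abs(d) <= window_ms])
    edges = np.arange(-window_ms, window_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(lags, bins=edges)
    return edges[:-1] + bin_ms / 2, counts

# Synthetic example: two trains with a weak excitatory interaction at ~3 ms lag
rng = np.random.default_rng(5)
a = np.sort(rng.uniform(0, 60_000, 1200))                    # spike times (ms)
b = np.sort(np.concatenate([rng.uniform(0, 60_000, 1100),
                            a[rng.random(a.size) < 0.1] + 3.0]))
lag, cc = cross_correlogram(a, b)
print(f"correlogram peak at {lag[np.argmax(cc)]:.1f} ms "
      f"(chance level ~ {np.median(cc):.0f} counts per bin)")
```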

9.
10.
11.
In many patients with major depressive disorder, sleep deprivation, or wake therapy, induces an immediate but often transient antidepressant response. It is known from brain imaging studies that changes in anterior cingulate and dorsolateral prefrontal cortex activity correlate with a relief of depression symptoms. Recently, resting-state functional magnetic resonance imaging revealed that brain network connectivity via the dorsal nexus (DN), a cortical area in the dorsomedial prefrontal cortex, is dramatically increased in depressed patients. To investigate whether an alteration in DN connectivity could provide a biomarker of therapy response and to determine brain mechanisms of action underlying sleep deprivations antidepressant effects, we examined its influence on resting state default mode network and DN connectivity in healthy humans. Our findings show that sleep deprivation reduced functional connectivity between posterior cingulate cortex and bilateral anterior cingulate cortex (Brodmann area 32), and enhanced connectivity between DN and distinct areas in right dorsolateral prefrontal cortex (Brodmann area 10). These findings are consistent with resolution of dysfunctional brain network connectivity changes observed in depression and suggest changes in prefrontal connectivity with the DN as a brain mechanism of antidepressant therapy action.Sleep deprivation has been used for decades as a rapid-acting and effective treatment in patients with major depressive disorder (MDD) (1, 2). Although clinically well established, the mechanisms of action are largely unknown.Brain imaging studies have shown that sleep deprivation in depressed patients is associated with renormalized metabolic activity, mainly in limbic structures including anterior cingulate (ACC) as well as dorsolateral prefrontal cortex (DLPFC) (36), and that changes in limbic and DLPFC activity correlated with a relief of depression symptoms (79). Recent studies in patients with depression point to a critical importance of altered large-scale brain network connectivity during the resting state (10, 11). Among these networks, the default mode network (DMN), which mainly comprises cortical midline structures including precuneus and medial frontal cortex as well as the inferior parietal lobule (1215), is most consistently characterized. In functional magnetic resonance imaging (fMRI) studies, the DMN shows the strongest blood oxygenation level–dependent (BOLD) activity during rest and decreased BOLD reactivity during goal-directed task performance. The DMN is anticorrelated with the cognitive control network (CCN), a corresponding task-positive network, which encompasses bilateral fronto-cingulo-parietal structures including lateral prefrontal and superior parietal areas (16). A third system with high relevance for depression—the affective network (AN)—is based in the subgenual and pregenual parts of the ACC [Brodman area (BA) 32] (17). The AN is active during both resting and task-related emotional processing, and forms strong functional and structural connections to other limbic areas such as hypothalamus, amygdala, entorhinal cortex, and nucleus accumbens (18, 19).Increased connectivity of DMN, CCN, and AN with a distinct area in the bilateral dorsomedial prefrontal cortex (DMPFC) was recently found in patients with depression compared with healthy controls (20). 
This area within the DMPFC was termed dorsal nexus (DN) and was postulated to constitute a converging node of depressive “hot wiring,” which manifests itself in symptoms of emotional, cognitive, and vegetative dysregulation. This led to the hypothesis that a modification in connectivity via the DN would be a potential target for antidepressant treatments (20).

Recent studies in healthy subjects reported reduced functional connectivity within DMN and between DMN and CCN in the morning after total (21) and in the evening after partial sleep deprivation (22). However, brain network connectivity via the DN was not examined in these studies. Given the recently proposed role of the DN in mood regulation, here we specifically tested whether sleep deprivation as a well-known antidepressant treatment modality affects connectivity via the DN. Based on our previous findings on network changes by ketamine (23), we hypothesized that sleep deprivation leads to a reduction in connectivity via the DN.
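At its core, the comparison described here is a within-subject contrast of seed-pair connectivity (for example, PCC with ACC, or DN with DLPFC) between a rested and a sleep-deprived session. A minimal sketch of that contrast on synthetic time courses follows; the subject count, scan length, and the paired t-test are illustrative assumptions, not the study's statistical pipeline.

```python
import numpy as np
from scipy import stats

def seed_pair_fc(ts_a, ts_b):
    """Fisher-z Pearson correlation between two regional time courses."""
    return np.arctanh(np.corrcoef(ts_a, ts_b)[0, 1])

# Assumed layout: per-subject pairs of regional time courses for each session
rng = np.random.default_rng(6)
n_sub, n_vol = 20, 240
rested   = [(rng.normal(size=n_vol), rng.normal(size=n_vol)) for _ in range(n_sub)]
deprived = [(rng.normal(size=n_vol), rng.normal(size=n_vol)) for _ in range(n_sub)]

fc_rested   = np.array([seed_pair_fc(a, b) for a, b in rested])
fc_deprived = np.array([seed_pair_fc(a, b) for a, b in deprived])

t_stat, p_val = stats.ttest_rel(fc_deprived, fc_rested)   # within-subject change
print(f"paired t = {t_stat:.2f}, p = {p_val:.3f}")
```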

12.
Recent advances in blood oxygen level-dependent–functional MRI (BOLD-fMRI)-based neurofeedback reveal that participants can modulate neuronal properties. However, it is unknown whether such training effects can be introduced in the absence of participants'' awareness that they are being trained. Here, we show unconscious neurofeedback training, which consequently produced changes in functional connectivity, introduced in participants who received positive and negative rewards that were covertly coupled to activity in two category-selective visual cortex regions. The results indicate that brain networks can be modified even in the complete absence of intention and awareness of the learning situation, raising intriguing possibilities for clinical interventions.There has been a growing interest in the field of neuroscience in the use of neurofeedback (NF) as a tool to both study and treat various clinical conditions. The uses of NF are diverse, ranging across a variety of motor and sensory tasks (14), investigation of cortical plasticity and attention (59), to treatment of chronic pain, depression, and mood control (1013).Recent advances in functional MRI (fMRI) techniques and hardware have made real-time fMRI (rtfMRI) a viable method for NF (14). This enables more anatomically specific training compared with methods such as EEG. This enhanced localization additionally allows to provide feedback to differential activation patterns (6, 15, 16), beyond simple up/down-regulation of a specific region/frequency.Another advance in the field of NF is the finding by several recent studies that participants are able to learn to successfully perform the NF paradigm, even without being given an explicit strategy (8, 16). This form of implicit learning is intriguing, both because there have been reports indicating certain advantages to implicit over explicit learning (17, 18), but mostly because this opens up previously unidentified pathways for therapeutic intervention, for cases for which there are no specific explicit strategies available (for instance, control over complex networks, such as in epilepsy, or over brain regions whose function is not fully elucidated).However, an important common factor in all previous NF studies was the fact that participants were aware that they were being trained, and received specific goals for this training. A fundamental question that therefore remains unanswered is whether targeted brain networks can still be modulated even in the complete absence of participants'' awareness that a training process is taking place. Theories of closed-loop learning provide evidence that such implicit learning through reward cues is possible (19, 20). This is an important issue, because it may open the way for NF training even in severe clinical cases such as minimally conscious or vegetative state, where such awareness is absent.In the present study, we examined this question in fMRI experiments in which participants were informed that they were engaged in a task aimed at mapping reward networks. Unbeknownst to them, these rewards were coupled with fMRI activations in specific cortical networks. Participants received auditory feedback associated with positive and negative rewards, based on blood oxygen level-dependent (BOLD)–fMRI activity from two well-researched visual regions of interest (ROIs), the fusiform face area (FFA) and the parahippocampal place area (PPA) (2123). 
However, participants were not informed of this procedure and believed, as revealed also by postscan interviews and questionnaires, that the reward was given at random.

We have examined whether participants could learn implicitly to appropriately modulate their spontaneous cortical activity to increase reward. Previous work in our group (24) and others (25–27) has demonstrated that training effects, albeit with participants' explicit awareness of the training procedure, may leave a trace in the spontaneous patterns. Our question was whether such a trace could be similarly found following our covert training, with the crucial difference being that here participants had no explicit knowledge of the NF task, or even that it was possible to influence the reward.

Our results show that 10 of 16 participants (62.5%) were indeed able to modulate their brain activity to enhance the positive rewards. Importantly, participants were completely unaware that they were doing so. We further show that this ability was associated with changes in connectivity that were apparent in the posttraining rest sessions, indicating that the network changes resulting from the training carried over beyond the training period itself.
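The covert coupling can be thought of as a simple real-time rule: on each trial, compare baseline-normalized activity in the two category-selective ROIs and map the difference to a positive, negative, or neutral auditory reward. The function below is a toy version of that logic; the threshold, baseline window, and scoring scheme are assumptions, not the study's real-time fMRI implementation.

```python
import numpy as np

def covert_reward_schedule(roi_target, roi_control, baseline_n=20, thresh=0.5):
    """Toy feedback rule: z-score the target-minus-control ROI difference against
    an initial baseline window, then map it to positive/negative/neutral reward."""
    diff = roi_target - roi_control
    mu, sd = diff[:baseline_n].mean(), diff[:baseline_n].std()
    rewards = []
    for d in diff[baseline_n:]:
        z = (d - mu) / sd
        rewards.append("positive" if z > thresh else
                       "negative" if z < -thresh else "neutral")
    return rewards

rng = np.random.default_rng(7)
ffa = rng.normal(size=120)     # stand-in FFA ROI time course (arbitrary units)
ppa = rng.normal(size=120)     # stand-in PPA ROI time course
print(covert_reward_schedule(ffa, ppa)[:10])
```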

13.
Cognition presents evolutionary research with one of its greatest challenges. Cognitive evolution has been explained at the proximate level by shifts in absolute and relative brain volume and at the ultimate level by differences in social and dietary complexity. However, no study has integrated the experimental and phylogenetic approach at the scale required to rigorously test these explanations. Instead, previous research has largely relied on various measures of brain size as proxies for cognitive abilities. We experimentally evaluated these major evolutionary explanations by quantitatively comparing the cognitive performance of 567 individuals representing 36 species on two problem-solving tasks measuring self-control. Phylogenetic analysis revealed that absolute brain volume best predicted performance across species and accounted for considerably more variance than brain volume controlling for body mass. This result corroborates recent advances in evolutionary neurobiology and illustrates the cognitive consequences of cortical reorganization through increases in brain volume. Within primates, dietary breadth but not social group size was a strong predictor of species differences in self-control. Our results implicate robust evolutionary relationships between dietary breadth, absolute brain volume, and self-control. These findings provide a significant first step toward quantifying the primate cognitive phenome and explaining the process of cognitive evolution.Since Darwin, understanding the evolution of cognition has been widely regarded as one of the greatest challenges for evolutionary research (1). Although researchers have identified surprising cognitive flexibility in a range of species (240) and potentially derived features of human psychology (4161), we know much less about the major forces shaping cognitive evolution (6271). With the notable exception of Bitterman’s landmark studies conducted several decades ago (63, 7274), most research comparing cognition across species has been limited to small taxonomic samples (70, 75). With limited comparable experimental data on how cognition varies across species, previous research has largely relied on proxies for cognition (e.g., brain size) or metaanalyses when testing hypotheses about cognitive evolution (7692). The lack of cognitive data collected with similar methods across large samples of species precludes meaningful species comparisons that can reveal the major forces shaping cognitive evolution across species, including humans (48, 70, 89, 9398).To address these challenges we measured cognitive skills for self-control in 36 species of mammals and birds (Fig. 1 and Tables S1–S4) tested using the same experimental procedures, and evaluated the leading hypotheses for the neuroanatomical underpinnings and ecological drivers of variance in animal cognition. At the proximate level, both absolute (77, 99107) and relative brain size (108112) have been proposed as mechanisms supporting cognitive evolution. Evolutionary increases in brain size (both absolute and relative) and cortical reorganization are hallmarks of the human lineage and are believed to index commensurate changes in cognitive abilities (52, 105, 113115). Further, given the high metabolic costs of brain tissue (116121) and remarkable variance in brain size across species (108, 122), it is expected that the energetic costs of large brains are offset by the advantages of improved cognition. 
The cortical reorganization hypothesis suggests that selection for absolutely larger brains—and concomitant cortical reorganization—was the predominant mechanism supporting cognitive evolution (77, 91, 100–106, 120). In contrast, the encephalization hypothesis argues that an increase in brain volume relative to body size was of primary importance (108, 110, 111, 123). Both of these hypotheses have received support through analyses aggregating data from published studies of primate cognition and reports of “intelligent” behavior in nature—both of which correlate with measures of brain size (76, 77, 84, 92, 110, 124).

Fig. 1. A phylogeny of the species included in this study. Branch lengths are proportional to time except where long branches have been truncated by parallel diagonal lines (split between mammals and birds ∼292 Mya).

With respect to selective pressures, both social and dietary complexities have been proposed as ultimate causes of cognitive evolution. The social intelligence hypothesis proposes that increased social complexity (frequently indexed by social group size) was the major selective pressure in primate cognitive evolution (6, 44, 48, 50, 87, 115, 120, 125–141). This hypothesis is supported by studies showing a positive correlation between a species’ typical group size and the neocortex ratio (80, 81, 85–87, 129, 142–145), cognitive differences between closely related species with different group sizes (130, 137, 146, 147), and evidence for cognitive convergence between highly social species (26, 31, 148–150). The foraging hypothesis posits that dietary complexity, indexed by field reports of dietary breadth and reliance on fruit (a spatiotemporally distributed resource), was the primary driver of primate cognitive evolution (151–154). This hypothesis is supported by studies linking diet quality and brain size in primates (79, 81, 86, 142, 155), and experimental studies documenting species differences in cognition that relate to feeding ecology (94, 156–166).

Although each of these hypotheses has received empirical support, a comparison of the relative contributions of the different proximate and ultimate explanations requires (i) a cognitive dataset covering a large number of species tested using comparable experimental procedures; (ii) cognitive tasks that allow valid measurement across a range of species with differing morphology, perception, and temperament; (iii) a representative sample within each species to obtain accurate estimates of species-typical cognition; (iv) phylogenetic comparative methods appropriate for testing evolutionary hypotheses; and (v) unprecedented collaboration to collect these data from populations of animals around the world (70).

Here, we present, to our knowledge, the first large-scale collaborative dataset and comparative analysis of this kind, focusing on the evolution of self-control. We chose to measure self-control—the ability to inhibit a prepotent but ultimately counterproductive behavior—because it is a crucial and well-studied component of executive function and is involved in diverse decision-making processes (167–169). For example, animals require self-control when avoiding feeding or mating in view of a higher-ranking individual, sharing food with kin, or searching for food in a new area rather than a previously rewarding foraging site. In humans, self-control has been linked to health, economic, social, and academic achievement, and is known to be heritable (170–172). 
In song sparrows, a study using one of the tasks reported here found a correlation between self-control and song repertoire size, a predictor of fitness in this species (173). In primates, performance on a series of nonsocial self-control tasks was related to variability in social systems (174), illustrating the potential link between these skills and socioecology. Thus, tasks that quantify self-control are ideal for comparison across taxa given its robust behavioral correlates, heritable basis, and potential impact on reproductive success.

In this study we tested subjects on two previously implemented self-control tasks. In the A-not-B task (27 species, n = 344), subjects were first familiarized with finding food in one location (container A) for three consecutive trials. In the test trial, subjects initially saw the food hidden in the same location (container A), but then saw it moved to a new location (container B) before they were allowed to search (Movie S1). In the cylinder task (32 species, n = 439), subjects were first familiarized with finding a piece of food hidden inside an opaque cylinder. In the following 10 test trials, a transparent cylinder was substituted for the opaque cylinder. To successfully retrieve the food, subjects needed to inhibit the impulse to reach for the food directly (bumping into the cylinder) in favor of the detour response they had used during the familiarization phase (Movie S2).

Thus, the test trials in both tasks required subjects to inhibit a prepotent motor response (searching in the previously rewarded location or reaching directly for the visible food), but the nature of the correct response varied between tasks. Specifically, in the A-not-B task subjects were required to inhibit the response that was previously successful (searching in location A), whereas in the cylinder task subjects were required to perform the same response as in familiarization trials (detour response), but in the context of novel task demands (visible food directly in front of the subject).
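As a schematic of the brain-size comparison above (absolute volume versus volume controlled for body mass), the sketch below regresses a toy performance measure on log brain volume and on the residual of brain volume after removing body mass. The synthetic allometric data and ordinary least squares are illustrative assumptions; the actual analysis relied on phylogenetic comparative methods, which this sketch omits.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
n_species = 36
log_body  = rng.normal(1.0, 0.8, n_species)                   # log body mass (toy)
log_brain = 0.75 * log_body + rng.normal(0, 0.15, n_species)  # log brain volume (toy)
perf      = 0.6 * log_brain + rng.normal(0, 0.2, n_species)   # task performance (toy)

# "Relative" brain size: residual brain volume after regressing out body mass
fit_allometry = LinearRegression().fit(log_body[:, None], log_brain)
resid_brain = log_brain - fit_allometry.predict(log_body[:, None])

r2_abs = LinearRegression().fit(log_brain[:, None], perf).score(log_brain[:, None], perf)
r2_rel = LinearRegression().fit(resid_brain[:, None], perf).score(resid_brain[:, None], perf)
print(f"R^2, absolute brain volume: {r2_abs:.2f};  R^2, residual brain volume: {r2_rel:.2f}")
```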

14.
Topological motifs in synaptic connectivity—such as the cortical column—are fundamental to processing of information in cortical structures. However, the mesoscale topology of cortical networks beyond columns remains largely unknown. In the olfactory cortex, which lacks an obvious columnar structure, sensory-evoked patterns of activity have failed to reveal organizational principles of the network and its structure has been considered to be random. We probed the excitatory network in the mouse olfactory cortex using variance analysis of paired whole-cell recording in olfactory cortex slices. On a given trial, triggered network-wide bursts in disinhibited slices had remarkably similar time courses in widely separated and randomly selected cell pairs of pyramidal neurons despite significant trial-to-trial variability within each neuron. Simulated excitatory network models with random topologies only partially reproduced the experimental burst-variance patterns. Network models with local (columnar) or distributed subnetworks, which have been predicted as the basis of encoding odor objects, were also inconsistent with the experimental data, showing greater variability between cells than across trials. Rather, network models with power-law and especially hierarchical connectivity showed the best fit. Our results suggest that distributed subnetworks are weak or absent in the olfactory cortex, whereas a hierarchical excitatory topology may predominate. A hierarchical excitatory network organization likely underlies burst generation in this epileptogenic region, and may also shape processing of sensory information in the olfactory cortex.Structural and functional plasticity at excitatory synapses in cortical networks represents a fundamental mechanism for encoding sensory representations and memory. As a result, neuronal ensembles that are connected with high probability emerge as functional units to produce a population code of the environment. The topology of such excitatory circuits should contain signatures—as global topological motifs—that reflect the encoding strategy. The cortical column is a well-studied example of such a motif (1). Columnar cortices contain substantial distributed connectivity and some brain areas, such as association cortex, high-order cortices, and the piriform cortex, lack a pronounced columnar structure. In the piriform (olfactory) cortex, there exists only a rudimentary understanding of the relationship between network structure and cortical function. The axons of individual piriform pyramidal neurons ramify widely throughout the olfactory cortex, and only show patchiness on a very broad scale (24). Consistent with this architecture, neural activity in response to individual odorants is distributed broadly across the olfactory cortex as detected by 2-deoxyglucose, c-fos expression, multiunit recording, and population calcium imaging (58). Likewise, the receptive fields of individual neurons in piriform cortex and anterior olfactory cortex are broad (9, 10). Broad receptive fields in piriform cortex reflect convergence of input from many olfactory bulb glomeruli (11) and are strongly influenced by recurrent connectivity (12).These observations support a highly distributed population representation but reveal little about what processing function the piriform cortex performs. Physiological and anatomical studies have provided some clues. 
For example, neuronal responses in piriform cortex are specific for odorant category (13), and odor identity and similarity are separately encoded in anterior and posterior piriform cortex, respectively (14), suggesting hierarchical coding. The endopiriform (EN) and preendopiriform (pEN) nuclei, immediately subjacent to the piriform cortex, have dense recurrent connectivity and dense connectivity with overlying areas of piriform cortex (15, 16). The pEN, also called area tempestas, is a highly epileptogenic locus (16, 17). However, the physiological role of its dense connectivity is unknown (15, 18).

To probe excitatory connectivity in the olfactory cortex, we isolated excitatory synaptic activity in a tailored brain slice containing the ventral anterior piriform cortex (APCV), the pEN, and the anterior olfactory cortex (AOC; also called the anterior olfactory nucleus). Using weak stimulation of the lateral olfactory tract (LOT) input while blocking GABAergic inhibition and NMDA receptors, we evoked transient, all-or-none, network-wide bursts of excitation. Network-wide transient bursts are a dynamic circuit property shared by the hippocampus, neocortex, and piriform cortex under disinhibited recording conditions (19–21). We used the pairwise variance patterns detectable in the fine structure of these bursts as a probe of excitatory network topology. We compared whole-cell recordings from randomly selected pairs of principal neurons in olfactory cortex with patterns generated in simulated networks with a range of network topologies. Our findings suggest that excitatory connectivity in olfactory cortex is neither random nor organized into local or distributed subnetworks. Rather, it shows hierarchical connectivity.
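For readers who want to experiment with this kind of comparison, the sketch below is a minimal, illustrative stand-in (not the authors' published simulation): it builds random, power-law, and crude nested-modular ("hierarchical") excitatory graphs with networkx, drives each with a toy rate cascade, and reports trial-to-trial variability within cells versus time-course similarity between cells on the same trial. The graph generators, cascade dynamics, and parameter values are all assumptions chosen for brevity.

# Toy comparison of burst variability across candidate excitatory topologies.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
N, TRIALS, T = 128, 20, 40

def shared_depth(i, j, n, levels=4):
    """Number of successive binary halvings of [0, n) in which i and j stay together."""
    depth, size = 0, n
    while depth < levels and size > 1:
        size //= 2
        if i // size != j // size:
            break
        depth += 1
    return depth

def hierarchical_graph(n, p0=0.01, gamma=3.0):
    """Toy nested-modular ('hierarchical') topology: pairs sharing deeper modules connect more often."""
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(1.0, p0 * gamma ** shared_depth(i, j, n)):
                G.add_edge(i, j)
    return G

topologies = {
    "random": nx.gnp_random_graph(N, 0.05, seed=1),
    "power_law": nx.barabasi_albert_graph(N, 3, seed=1),  # heavy-tailed degree proxy
    "hierarchical": hierarchical_graph(N),
}

def simulate_bursts(G):
    """Crude rate cascade: a brief 'LOT-like' kick spreads through the weighted adjacency."""
    W = nx.to_numpy_array(G)
    W = W / max(np.linalg.eigvals(W).real.max(), 1e-9) * 0.95  # keep the cascade sub-critical
    traces = np.zeros((TRIALS, N, T))
    for tr in range(TRIALS):
        x = np.zeros(N)
        x[rng.choice(N, size=8, replace=False)] = 1.0            # random stimulated subset
        for t in range(T):
            x = np.tanh(W @ x + 0.05 * rng.standard_normal(N))   # propagation plus noise
            traces[tr, :, t] = x
    return traces

for name, G in topologies.items():
    tr = simulate_bursts(G)
    trial_var = tr.var(axis=0).mean()   # trial-to-trial variability per cell, averaged
    # within-trial similarity of burst time courses between cell pairs
    corrs = [np.corrcoef(tr[k])[np.triu_indices(N, 1)].mean() for k in range(TRIALS)]
    print(f"{name:12s} trial-to-trial var={trial_var:.4f}  within-trial cell-pair r={np.mean(corrs):.3f}")

The comparison of interest is qualitative: a topology consistent with the experimental data should yield high between-cell similarity within a trial while preserving trial-to-trial variability.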

15.
Brain stimulation, a therapy increasingly used for neurological and psychiatric disease, traditionally is divided into invasive approaches, such as deep brain stimulation (DBS), and noninvasive approaches, such as transcranial magnetic stimulation. The relationship between these approaches is unknown, therapeutic mechanisms remain unclear, and the ideal stimulation site for a given technique is often ambiguous, limiting optimization of the stimulation and its application in further disorders. In this article, we identify diseases treated with both types of stimulation, list the stimulation sites thought to be most effective in each disease, and test the hypothesis that these sites are different nodes within the same brain network as defined by resting-state functional-connectivity MRI. Sites where DBS was effective were functionally connected to sites where noninvasive brain stimulation was effective across diseases including depression, Parkinson’s disease, obsessive-compulsive disorder, essential tremor, addiction, pain, minimally conscious states, and Alzheimer’s disease. A lack of functional connectivity identified sites where stimulation was ineffective, and the sign of the correlation related to whether excitatory or inhibitory noninvasive stimulation was found clinically effective. These results suggest that resting-state functional connectivity may be useful for translating therapy between stimulation modalities, optimizing treatment, and identifying new stimulation targets. More broadly, this work supports a network perspective toward understanding and treating neuropsychiatric disease, highlighting the therapeutic potential of targeted brain network modulation.

A promising treatment approach for many psychiatric and neurological diseases is focal brain stimulation, traditionally divided into invasive approaches requiring neurosurgery and noninvasive approaches that stimulate the brain from outside the skull. The dominant invasive treatment is deep brain stimulation (DBS), in which an electrode is surgically implanted deep in the brain and used to deliver electrical pulses at high frequency (generally 120–160 Hz) (1, 2). In some instances, the therapeutic effects of DBS resemble those of structural lesions at the same site, but in other cases DBS appears to activate the stimulated region or adjacent white matter fibers (1, 2). DBS systems are approved by the US Food and Drug Administration (FDA) for treatment of essential tremor and Parkinson’s disease, have humanitarian device exemptions for dystonia and obsessive-compulsive disorder, and are being explored as a therapy for many other diseases including depression, Alzheimer’s disease, and even minimally conscious states (1, 3–6).

Although DBS can result in dramatic therapeutic benefit, the risk inherent in neurosurgery has motivated research into noninvasive alternatives (7–9). Transcranial magnetic stimulation (TMS) and transcranial direct current stimulation (tDCS) have received the most investigation (10–13). TMS uses a rapidly changing magnetic field to induce currents and action potentials in underlying brain tissue, whereas tDCS involves the application of weak (1–2 mA) electrical currents to modulate neuronal membrane potential. Depending on the stimulation parameters, both TMS and tDCS can be used to excite (>5 Hz TMS, anodal tDCS) or inhibit (<1 Hz TMS, cathodal tDCS) the underlying cortical tissue (10).
These neurophysiological effects are well validated only for the primary motor cortex (M1) and can vary across subjects; however, the terms “excitatory” and “inhibitory” stimulation are commonly used and are used here as shorthand to refer to TMS or tDCS at these parameters. The primary clinical application and FDA-approved indication is high-frequency (i.e., excitatory) TMS to the left dorsolateral prefrontal cortex (DLPFC) for treatment of medication-refractory depression (14–19). However, TMS and tDCS have shown evidence of efficacy in a number of other neurological and psychiatric disorders (10–13).

How invasive and noninvasive brain stimulation relate to one another has received relatively little attention. Because of the different FDA-approved indications, patient populations, sites of administration, and presumed mechanisms of action, they have remained largely separate clinical and scientific fields. However, these boundaries are beginning to erode. First, the patient populations treated with invasive or noninvasive brain stimulation are starting to converge. For example, the primary indication for TMS is depression, and the primary indication for DBS is Parkinson’s disease, but DBS is being investigated as a treatment for depression, and TMS is being investigated as a treatment for Parkinson’s disease (4, 20–25). Second, although therapeutic mechanisms remain unknown, invasive and noninvasive brain stimulation share important properties. In both cases, the effects of stimulation propagate beyond the stimulation site to impact a distributed set of connected brain regions (i.e., a brain network) (4, 10, 26–33). Given increasing evidence that these network effects are relevant to therapeutic response (4, 34–36), it is possible that invasive and noninvasive stimulation of different brain regions actually modifies the same brain network to provide therapeutic benefit.

Linking invasive and noninvasive brain stimulation and identifying the relevant brain networks is important for several reasons. First, findings could be used to improve treatments. For example, TMS treatment of depression is limited by the inability to identify the optimal stimulation site in the left DLPFC (15, 18, 37–39). Using resting-state functional-connectivity MRI (rs-fcMRI), a technique used to visualize brain networks based on correlated fluctuations in blood oxygenation (40–42), the efficacy of different DLPFC TMS sites has been related to their correlation with the subgenual cingulate, a DBS target for depression (43). rs-fcMRI maps with the subgenual cingulate thus might be used to select an optimal TMS site in the DLPFC, perhaps even individualized to specific patients (44). Because identification of the ideal stimulation site is a ubiquitous problem across diseases and brain-stimulation modalities (1, 15, 18, 37–39), such an approach could prove valuable across disorders. Second, although the primary goal of therapeutic brain stimulation is to help patients, it also can provide unique and fundamental insight into human brain function. Investigating how different types of stimulation applied to different brain regions could impart similar behavioral effects is relevant to understanding the functional role of brain networks.

Here we investigate all neurological and psychiatric diseases treated with both invasive and noninvasive brain stimulation.
We list the stimulation sites that have evidence of efficacy in each disease and test the hypothesis that these sites represent different nodes in the same brain network as visualized with rs-fcMRI. Further, we determine whether this approach can identify sites where stimulation is ineffective and which type of noninvasive brain stimulation (excitatory or inhibitory) will prove effective. To test these hypotheses, we take advantage of a unique rs-fcMRI dataset collected from 1,000 normal subjects, processed to allow precise subcortical and cortical alignment between subjects and with anatomical brain atlases (45–47).
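As an illustration of the core computation such an analysis relies on, the sketch below correlates pre-extracted BOLD time series from an invasive stimulation site and a candidate noninvasive cortical site across subjects, Fisher-transforms the per-subject correlations, and tests them at the group level. It is a minimal sketch, not the authors' pipeline; the input file and array names are hypothetical, and in practice the seed time series would come from a preprocessing stream with motion and nuisance regression.

import numpy as np
from scipy import stats

data = np.load("seed_timeseries.npz")          # hypothetical pre-extracted seed data
seed_ts_dbs = data["dbs_site"]                 # shape: (n_subjects, n_timepoints)
seed_ts_tms = data["tms_site"]                 # shape: (n_subjects, n_timepoints)

def seed_correlation(a, b):
    """Pearson r between two seed time series for one subject."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

r_per_subject = np.array([seed_correlation(x, y)
                          for x, y in zip(seed_ts_dbs, seed_ts_tms)])
z_per_subject = np.arctanh(r_per_subject)      # Fisher r-to-z transform

t, p = stats.ttest_1samp(z_per_subject, 0.0)   # is the group-level connectivity nonzero?
mean_r = np.tanh(z_per_subject.mean())
print(f"group mean r = {mean_r:.3f} (t = {t:.2f}, p = {p:.2g})")

Under the network hypothesis described above, a significantly nonzero group correlation would link the two stimulation sites, and its sign would bear on whether excitatory or inhibitory noninvasive stimulation is expected to help.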

16.
Brain development is largely shaped by early sensory experience. However, it is currently unknown whether, how early, and to what extent the newborn’s brain is shaped by exposure to maternal sounds when the brain is most sensitive to early life programming. The present study examined this question in 40 infants born extremely prematurely (between 25- and 32-wk gestation) in the first month of life. Newborns were randomized to receive auditory enrichment in the form of audio recordings of maternal sounds (including their mother’s voice and heartbeat) or routine exposure to hospital environmental noise. The groups were otherwise medically and demographically comparable. Cranial ultrasonography measurements were obtained at 30 ± 3 d of life. Results show that newborns exposed to maternal sounds had a significantly larger auditory cortex (AC) bilaterally compared with control newborns receiving standard care. The thickness of the right and left AC was significantly correlated with gestational age but not with the duration of sound exposure. Measurements of head circumference and the widths of the frontal horn (FH) and the corpus callosum (CC) were not significantly different between the two groups. This study provides evidence for experience-dependent plasticity in the primary AC before the brain has reached full-term maturation. Our results demonstrate that despite the immaturity of the auditory pathways, the AC is more adaptive to maternal sounds than environmental noise. Further studies are needed to better understand the neural processes underlying this early brain plasticity and its functional implications for future hearing and language development.

One of the first acoustic stimuli we are exposed to before birth is the voice of the mother and the sounds of her heartbeat. As fetuses, we have substantial capacity for auditory learning and memory already in utero (1–5), and we are particularly tuned to acoustic cues from our mother (6–9). Previous research suggests that the innate preference for mother’s voice shapes the developmental trajectory of the brain (10, 11). Prenatal exposure to mother’s voice may therefore provide the brain with the auditory fitness necessary to process and store speech information immediately after birth (12, 13).

There is evidence to suggest that prenatal exposure to the maternal voice and heartbeat sounds can pave the neural pathways in the brain for subsequent development of hearing and language skills (14). For example, the periodic perception of the low-frequency maternal heartbeat in the womb provides the fetus with an important rhythmic experience (15, 16) that likely establishes the neural basis for auditory entrainment and synchrony skills necessary for vocal, gestural, and gaze communication during mother–infant interactions (17, 18).

Studies examining the neural response to the maternal voice soon after birth have found activation in posterior temporal regions, preferentially on the left side, as well as brain areas involved in emotional processing including the amygdala and orbito-frontal cortex (19). Similarly, Beauchemin et al. have found activation in language-related cortical regions when newborns listened to their mother’s voice, whereas a stranger’s voice seemed to activate more generic regions of the brain (20). In addition, Partanen et al. have shown that the neural response to maternal sounds depends on experience, as full-term newborns react differentially to familiar vs. unfamiliar sounds they were exposed to as fetuses, suggesting a correlation between the amount of prenatal exposure and brain activity (21). Taken together, the above studies suggest that the mother’s voice plays a special role in the early shaping of auditory and language areas of the brain.

Numerous animal studies have shown that brain development relies on developmentally appropriate acoustic stimulation early in life (22–32). Auditory deprivation during critical periods can adversely affect brain maturation and lead to long-lasting neural despecialization in the auditory cortex (AC), whereas auditory enrichment in the early postnatal period can enhance neural sensitivity in the primary AC, as well as improve auditory recognition and discrimination abilities.

Preterm infants are born during a critical period for auditory brain development. However, the maternal auditory nursery provided by the womb vanishes after a premature birth as the preterm newborn enters the neonatal intensive care unit (NICU). The abrupt transition of the fetus from the protected environment of the womb to the exposed environment of the hospital imposes significant challenges on the developing brain (33). These challenges have been associated with neuropathologic consequences, including reduction in regional brain volumes, white matter microstructural abnormalities, and poor cognitive and language outcomes in preterm compared with full-term newborns (34–41).

Considering the acoustic gap between the NICU environment and the womb, it is not surprising that auditory brain development is compromised in preterm compared with full-term infants (42, 43). Numerous studies have suggested that the auditory environment available for preterm infants in the NICU may not be conducive to their neurodevelopment (44–47). These concerns are derived from the frequent reality that hospitalized preterm newborns are overexposed to loud, toxic, and unpredictable environmental noise generated by ventilators, infusion pumps, fans, telephones, pagers, monitors, and alarms (48–51), whereas at the same time they are also deprived of the low-frequency, patterned, and biologically familiar sounds of their mother’s voice and heartbeat, which they would otherwise be hearing in utero (33, 45). In addition, the hospital environment contains a significant amount of high-frequency electronic sounds (52, 53) that are less likely to be heard in the womb because of the sound attenuation provided by maternal tissues and fluid within the intrauterine cavity (54–56). Efforts to improve the hospital environment for preterm neonates have primarily focused on reducing hospital noise and maintaining a quiet environment. However, the necessity of exposing medically fragile preterm newborns to low-frequency audio recordings of their mothers on a daily basis has been less widely acknowledged, and the extent to which such maternal sound exposure can influence brain maturation after an extremely premature birth has been a matter of much debate.

The present study aimed to determine whether enriching the auditory environment for preterm newborns with authentic recordings of their mother’s voice and heartbeat sounds in the first month of life would result in structural alterations in the AC. The rationale driving this question lies in the fact that such enriched maternal sound stimulation would otherwise be present had the baby not been born prematurely.

17.
A series of mono- and dinuclear alkynylplatinum(II) terpyridine complexes containing the hydrophilic oligo(para-phenylene ethynylene) with two 3,6,9-trioxadec-1-yloxy chains was designed and synthesized. The mononuclear alkynylplatinum(II) terpyridine complex was found to display a very strong tendency toward the formation of supramolecular structures. Interestingly, additional end-capping of the terminal alkyne with another platinum(II) terpyridine moiety of varying steric bulk led to the formation of nanotubes or helical ribbons. These desirable nanostructures were found to be governed by the steric bulk on the platinum(II) terpyridine moieties, which modulates the directional metal−metal interactions and controls the formation of nanotubes or helical ribbons. Detailed analysis of temperature-dependent UV-visible absorption spectra of the nanostructured tubular aggregates also provided insights into the assembly mechanism and showed the role of metal−metal interactions in the cooperative supramolecular polymerization of the amphiphilic platinum(II) complexes.

Square-planar d8 platinum(II) polypyridine complexes have long been known to exhibit intriguing spectroscopic and luminescence properties (1–54) as well as interesting solid-state polymorphism associated with metal−metal and π−π stacking interactions (1–14, 25). Earlier work by our group showed the first example, to our knowledge, of an alkynylplatinum(II) terpyridine system [Pt(tpy)(C≡CR)]+ that incorporates σ-donating and solubilizing alkynyl ligands together with the formation of Pt···Pt interactions to exhibit notable color changes and luminescence enhancements on solvent composition change (25) and polyelectrolyte addition (26). This approach has provided access to the alkynylplatinum(II) terpyridine and other related cyclometalated platinum(II) complexes, with functionalities that can self-assemble into metallogels (27–31), liquid crystals (32, 33), and other different molecular architectures, such as hairpin conformation (34), helices (35–38), nanostructures (39–45), and molecular tweezers (46, 47), as well as having a wide range of applications in molecular recognition (48–52), biomolecular labeling (48–52), and materials science (53, 54). Recently, metal-containing amphiphiles have also emerged as building blocks for supramolecular architectures (42–44, 55–59). Their self-assembly has been found to yield molecular architectures of unprecedented complexity through multiple noncovalent interactions upon the introduction of external stimuli (42–44, 55–59).

Helical architecture is one of the most exciting self-assembled morphologies because of its unique functional and topological properties (60–69). Helical ribbons composed of amphiphiles, such as diacetylenic lipids, glutamates, and peptide-based amphiphiles, are often precursors for the growth of tubular structures on an increase in the width or the merging of the edges of ribbons (64, 65). Recently, the optimization of nanotube formation vs. helical nanostructures has attracted considerable interest and can be achieved through a fine interplay of the amphiphilic properties of the molecules (66), the choice of counteranions (67, 68), or the pH of the media (69), which would govern the self-assembly of molecules into desirable aggregates of helical ribbons or nanotube scaffolds.
However, precise control of supramolecular morphology between helical ribbons and nanotubes remains challenging, particularly for polycyclic aromatics in the field of molecular assembly (64–69). Oligo(para-phenylene ethynylene)s (OPEs) with solely π−π stacking interactions are well recognized to self-assemble into supramolecular systems with various nanostructures but rarely result in the formation of tubular scaffolds (70–73). In view of the rich photophysical properties of square-planar d8 platinum(II) systems and their propensity toward formation of directional Pt···Pt interactions in distinctive morphologies (27–31, 39–45), it is anticipated that such directional and noncovalent metal−metal interactions might be capable of directing or dictating molecular ordering and alignment to give desirable nanostructures of helical ribbons or nanotubes in a precise and controllable manner.

Herein, we report the design and synthesis of mono- and dinuclear alkynylplatinum(II) terpyridine complexes containing hydrophilic OPEs with two 3,6,9-trioxadec-1-yloxy chains. The amphiphilic mononuclear alkynylplatinum(II) terpyridine complex is found to show a strong tendency toward the formation of supramolecular structures on diffusion of diethyl ether into dichloromethane or dimethyl sulfoxide (DMSO) solution. Interestingly, additional end-capping of the terminal alkyne with another platinum(II) terpyridine moiety of varying steric bulk results in nanotubes or helical ribbons in the self-assembly process. To the best of our knowledge, this finding represents the first example of the utilization of the steric bulk of the moieties, which modulates the formation of directional metal−metal interactions, to precisely control the formation of nanotubes or helical ribbons in the self-assembly process. Application of the nucleation–elongation model to this assembly process, based on UV-visible (UV-vis) absorption spectroscopic studies, has elucidated the nature of the molecular self-assembly, and more importantly, it has revealed the role of metal−metal interactions in the formation of these two types of nanostructures.
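To make the cooperative-polymerization analysis concrete, the sketch below fits one commonly used elongation-regime expression of a nucleation–elongation (van der Schoot-type) model to a normalized cooling curve, i.e., the degree of aggregation extracted from temperature-dependent UV-vis absorption. This is a generic illustration under stated assumptions, not the authors' exact treatment; the functional form is taken from the commonly cited cooperative model, and the data file name and initial guesses are hypothetical.

import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # gas constant, J mol^-1 K^-1

def elongation_regime(T, alpha_sat, h_e, T_e):
    """Commonly used elongation-regime form (assumed here):
    alpha(T) ~ alpha_sat * (1 - exp[(-h_e / (R*T_e**2)) * (T - T_e)]) for T < T_e,
    with h_e the (negative) elongation enthalpy and T_e the elongation temperature."""
    return alpha_sat * (1.0 - np.exp((-h_e / (R * T_e ** 2)) * (T - T_e)))

# Hypothetical cooling-curve data: temperature (K), normalized degree of aggregation.
T_data, alpha_data = np.loadtxt("cooling_curve.csv", delimiter=",", unpack=True)
mask = alpha_data > 0.05                        # crude restriction to the elongation regime

p0 = (1.0, -50e3, T_data[mask].max())           # guesses: alpha_sat, h_e (J/mol), T_e (K)
params, cov = curve_fit(elongation_regime, T_data[mask], alpha_data[mask], p0=p0, maxfev=10000)
alpha_sat, h_e, T_e = params
print(f"alpha_sat = {alpha_sat:.2f}, h_e = {h_e / 1e3:.1f} kJ/mol, T_e = {T_e:.1f} K")

In a cooperative assembly, the fitted curve should show the sharp, nonsigmoidal onset at T_e that distinguishes nucleation–elongation growth from isodesmic growth.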

18.
In humans, spontaneous movements are often preceded by early brain signals. One such signal is the readiness potential (RP) that gradually arises within the last second preceding a movement. An important question is whether people are able to cancel movements after the elicitation of such RPs, and, if so, until what point in time. Here, subjects played a game where they tried to press a button to earn points in a challenge with a brain–computer interface (BCI) that had been trained to detect their RPs in real time and to emit stop signals. Our data suggest that subjects can still veto a movement even after the onset of the RP. Cancellation of movements was possible if stop signals occurred earlier than 200 ms before movement onset, thus constituting a point of no return.

It has been repeatedly shown that spontaneous movements are preceded by early brain signals (1–8). As early as a second before a simple voluntary movement, a so-called readiness potential (RP) is observed over motor-related brain regions (1–3, 5). The RP was found to precede the self-reported time of the “‘decision’ to act” (ref. 3, p. 623). Similar preparatory signals have been observed using invasive electrophysiology (8, 9) and functional MRI (7, 10), and have been demonstrated also for choices between multiple response options (6, 7, 10), for abstract decisions (10), for perceptual choices (11), and for value-based decisions (12). To date, the exact nature and causal role of such early signals in decision making is debated (12–20).

One important question is whether a person can still exert a veto by inhibiting the movement after onset of the RP (13, 18, 21, 22). One possibility is that the onset of the RP triggers a causal chain of events that unfolds in time and cannot be cancelled. The onset of the RP in this case would be akin to tipping the first stone in a row of dominoes. If there is no chance of intervening, the dominoes will gradually fall one-by-one until the last one is reached. This has been termed a ballistic stage of processing (23, 24). A different possibility is that participants can still terminate the process, akin to taking out a domino at some later stage in the chain and thus preventing the process from completing. Here, we directly tested this in a real-time experiment that required subjects to terminate their decision to move once an RP had been detected by a brain–computer interface (BCI) (25–31).
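The sketch below illustrates, in deliberately simplified form, what a real-time RP detector of the kind described above might look like: a classifier trained offline on one-second pre-movement EEG windows is applied to a sliding buffer and emits a stop signal when the predicted movement probability crosses a threshold. The channel choice, sampling rate, features, and data file are assumptions for illustration and are not taken from the study.

import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 200                       # assumed sampling rate (Hz)
WIN = FS                       # 1-s sliding window

def features(window):
    """Crude RP-like features from one channel (e.g., Cz): late-window mean
    amplitude and linear drift, which capture the slow negativity of an RP."""
    late = window[-WIN // 2:]
    slope = np.polyfit(np.arange(len(window)) / FS, window, 1)[0]
    return np.array([late.mean(), slope])

# Offline training: pre-movement windows (label 1) vs. idle windows (label 0).
# The training file and its arrays are hypothetical placeholders.
train = np.load("rp_training_windows.npz")
X = np.array([features(w) for w in train["windows"]])
clf = LogisticRegression().fit(X, train["labels"])

def online_step(buffer, threshold=0.8):
    """Called on each new sample; `buffer` holds the latest 1 s of single-channel EEG.
    Returns True (emit a stop signal) when predicted movement probability is high."""
    p_move = clf.predict_proba(features(buffer).reshape(1, -1))[0, 1]
    return p_move > threshold

In an experiment like the one described, the timing of each emitted stop signal relative to electromyographic movement onset is what allows a point of no return to be estimated.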

19.
20.
Adolescence is a developmental period associated with an increase in impulsivity. Impulsivity is a multidimensional construct, and in this study we focus on one of the underlying components: impatience. Impatience can result from (i) disregard of future outcomes and/or (ii) oversensitivity to immediate rewards, but it is not known which of these evaluative processes underlie developmental changes. To distinguish between these two causes, we investigated developmental changes in the structural and functional connectivity of different frontostriatal tracts. We report that adolescents were more impatient on an intertemporal choice task and reported less future orientation, but not more present hedonism, than young adults. Developmental increases in structural connectivity strength in the right dorsolateral prefrontal tract were related to increased negative functional coupling with the striatum and an age-related decrease in discount rates. Our results suggest that mainly increased control, and the integration of future-oriented thought, drives the reduction in impatience across adolescence.

Adolescence stands out as a particularly interesting developmental period because impulsivity seems to be greater at this age than during childhood or adulthood (1). This increased impulsivity is a part of healthy development and is thought to be crucial for the acquisition of skills needed for adult life (2, 3). However, increased impulsivity during adolescence also leads to unhealthy outcomes. For instance, compared with both children and adults, more teens need emergency department services secondary to accidents or experimenting with drugs or alcohol (4). Accidents are also the main reason for the increased mortality rate associated with adolescence (5). Certainly, an important contributor to the increase in negative outcomes is the increased access to risky situations (e.g., cars, alcohol, drugs, sexual activity) and decrease in parental oversight (6, 7). Nonetheless, impulsive behavior is elevated during this critical developmental period, highlighting the need to better understand the psychological and neural processes whose maturation across adolescence underlies these changes.

When investigating impulsivity, it is important to recognize that it is a multidimensional construct. Impulsivity can be broken down into at least three independent components: acting without thinking, impatience, and sensation/novelty seeking (6, 8, 9). Importantly, each of these three components shows significant developmental trends across adolescence and each has been independently associated with self-reported risky behavior. For instance, research indicates that impatience is related to increased adolescent substance use and risk-taking (10–13). This study focuses on developmental changes in the construct of impatience.

Intertemporal choice paradigms have proven to be an effective tool for quantifying impatience. Numerous studies have shown that rates of delay discounting measured in these tasks decline with age and are correlated with adolescent academic success (14), substance use (15), conduct disorder (16), and a range of developmental disorders, including attention deficit hyperactivity disorder (ADHD) (17). Research suggests that multiple cognitive and neural processes underlie delay discounting (18–20). On the one hand, more impatient behavior can result from oversensitivity to immediate rewards (21).
For example, several studies show that people are more impatient when an immediate option is present than when both options are in the future (22). In addition, Luo et al. (23) showed that striatal activity is greater when participants are presented with immediate rewards versus preference-matched delayed rewards. On the other hand, more patient behavior may result from control processes that bias attention away from immediate rewards and/or emphasize the importance of future goals (24, 25). Several studies show that individual differences in future orientation are associated with more patient choices (1, 21, 26). Explicitly instructed future orientation leads to (i) within-subject reduction in discount rates (27–29) and (ii) increased activity in the brain’s executive control network (30). Because there are (at least) two possible routes to greater or lesser impulsivity, it is challenging to determine how specific processes contribute to developmental differences by studying behavior alone. Here, we leverage our knowledge of the neural correlates of delay discounting to gain a deeper understanding of developmental changes in impatience.

Neuroimaging studies have consistently shown that delay discounting recruits corticostriatal circuitry (19). Generally, striatal circuits are divided into two networks: a ventral valuation network that is involved in representing the incentive value of the different options, and a dorsal control network that is involved in maintaining future goals and inhibiting prepotent responses (18, 24). Important nodes in the valuation network include regions associated with the mesolimbic dopamine system, particularly the ventral striatum, amygdala, and ventromedial prefrontal cortex. The control network includes the dorsal striatum, dorsal anterior cingulate cortex (dACC), dorsal and ventral lateral prefrontal cortex (dlPFC/vlPFC), and the posterior parietal cortex (19). Recent studies have shown that discount rates are not only associated with regional changes within the valuation and control networks, but also depend on the strength of connections within and between these networks (31). For instance, we recently showed that frontostriatal tracts predicted decreased delay discounting, whereas subcorticostriatal tracts predicted increased delay discounting in adults (21).

Neurodevelopmental models hypothesize that the different rates of maturation of the valuation and control networks account for increased impulsivity in adolescence. These models propose that there is a relatively early-developing valuation system during adolescence, followed by a more slowly maturing cognitive control system (32, 33). Indications of the early development of the valuation system include increases in striatal dopamine levels, dopamine receptor availability, and blood oxygen level-dependent (BOLD) activity related to rewarding stimuli in early adolescence (34, 35) (but see refs. 36 and 37). In addition, many functional imaging studies, using tasks that require cognitive control, have shown increased engagement of lateral prefrontal areas across adolescence (38–40).
These results raise the question of whether adolescents are impulsive because they are (i) more sensitive to immediate rewards, (ii) less capable of relying on the prefrontal cortex for self-control and future orientation, or (iii) both.

The current study was designed to investigate the processes underlying impatient behavior during the transition in and out of adolescence, and to determine how these behavioral changes relate to developmental changes in the structure and function of striatal circuits. We investigated developmental changes in behavior and brain using a delay-discounting task. In addition, we used the Zimbardo Time Perspective Index (ZTPI) (41) to measure developmental changes in psychological components of impatient behavior—specifically, present hedonism (sensitivity to immediate rewards) and future orientation (control processes). Next, we used diffusion tensor imaging (DTI) to characterize different striatal tracts and their connectivity strength (42–45). Finally, we explored the structure–function relationship within structurally defined striatal tracts by performing psychophysiological interaction (PPI) analyses of functional magnetic resonance imaging (fMRI) data collected during the delay-discounting task (Fig. 1A). This multimodal developmental MRI study affords a unique perspective on normative development of impulsive behavior.

Fig. 1. (A) Participants made choices between smaller, sooner (SS) and larger, later (LL) rewards in the delay-discounting task. Participants were required to choose between both immediate and delayed rewards and between two delayed rewards. (B) Behavioral measures were analyzed for three potential dependencies on age: linear increase (mean centered), adolescent peak (i.e., linear increase²), and adolescent emerging (i.e., adolescent peak × mean-centered linear increase). (C) Discount rates estimated from the delay-discounting task decreased with age across adolescence. (D) This change in delay discounting was reflected in an increase in future orientation but not present hedonism in the ZTPI. (E) Specifically, delay discount rates were not significantly correlated with measures of present hedonism, but were negatively correlated with self-reported future orientation. All age-related plots show the best-fitting age trend model in black, with shadowed error bars indicating SEM.
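For concreteness, the sketch below shows a generic way to estimate a per-subject discount rate k from intertemporal choices like those in Fig. 1A, using the standard hyperbolic value function V = A/(1 + kD) and a softmax choice rule fit by maximum likelihood. It is not the authors' exact fitting procedure; the toy choice data and variable names are illustrative.

import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, amt_ss, del_ss, amt_ll, del_ll, chose_ll):
    """params = (log k, log inverse temperature). Hyperbolic subjective values feed a
    logistic choice rule; returns the negative log-likelihood of the observed choices."""
    k, beta = np.exp(params[0]), np.exp(params[1])
    v_ss = amt_ss / (1.0 + k * del_ss)
    v_ll = amt_ll / (1.0 + k * del_ll)
    p_ll = 1.0 / (1.0 + np.exp(-beta * (v_ll - v_ss)))
    p_ll = np.clip(p_ll, 1e-6, 1 - 1e-6)
    return -np.sum(chose_ll * np.log(p_ll) + (1 - chose_ll) * np.log(1 - p_ll))

def fit_discount_rate(amt_ss, del_ss, amt_ll, del_ll, chose_ll):
    """Maximum-likelihood estimate of the hyperbolic discount rate k (per day)."""
    res = minimize(neg_log_likelihood, x0=np.array([np.log(0.01), 0.0]),
                   args=(amt_ss, del_ss, amt_ll, del_ll, chose_ll), method="Nelder-Mead")
    return np.exp(res.x[0])

# Toy example: a subject who takes the larger, later option only at short delays.
amt_ss = np.array([20.0, 20.0, 20.0, 20.0])
del_ss = np.zeros(4)                        # immediate smaller-sooner option
amt_ll = np.array([30.0, 30.0, 30.0, 30.0])
del_ll = np.array([7.0, 14.0, 30.0, 60.0])  # delays in days
chose_ll = np.array([1, 1, 0, 0])
print(f"estimated k = {fit_discount_rate(amt_ss, del_ss, amt_ll, del_ll, chose_ll):.4f} per day")

Larger fitted k values correspond to steeper discounting (more impatience), so an age-related decrease in k is the behavioral signature the study relates to frontostriatal connectivity.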
