Similar Documents
20 similar documents retrieved.
1.
The N1 and P2 event-related potentials (ERPs) are attenuated when the eliciting sounds coincide with our own actions. Although this ERP attenuation could be caused by central processes, it may also reflect a peripheral mechanism: the coactivation of the stapedius muscle with the task-relevant effector, which reduces signal transmission efficiency in the middle ear, reducing the effective intensity of concurrently presented tones, which, in turn, elicit lower amplitude auditory ERPs. Because stapedius muscle contraction attenuates frequencies below 2 kHz, no attenuation should occur at frequencies above 2 kHz. A self-induced tone paradigm was administered with 0.5, 2.0, and 8.0 kHz pure tones. Self-induced tones elicited attenuated N1 and P2 ERPs, but the magnitude of attenuation was not affected by tone frequency. This result does not support the hypothesis that ERP attenuation to self-induced tones is caused by stapedius muscle contractions.

2.
The event-related potential (ERP) correlates of sound detection are attenuated when eliciting sounds coincide with our own actions. The role of attention in this effect was investigated in two experiments by presenting tones separated by random intervals. In the homogeneous condition of Experiments 1 and 2, the same tone was repeated, whereas in the mixed condition of Experiment 1, tones with five different frequencies were presented. Participants performed a time-interval production task by marking intervals with keypresses in Experiment 1, and tried to produce keypress-tone coincidences in Experiment 2. Although the auditory ERPs were attenuated for coincidences, no modulation by the multiplicity of tone frequencies in Experiment 1, or by the task-relevancy of tones and coincidences in Experiment 2, was found. This suggests that coincidence-related ERP attenuation cannot be fully explained by voluntary attentional mechanisms.

3.
It is well known that sensory perception can be attenuated when sensory stimuli are controlled by self-initiated actions. This phenomenon is explained by the consistency between forward models of anticipated action effects and actual sensory feedback. Specifically, the brain state related to the binding between motor processing and sensory perception would have an inhibitory function, gating sensory information via top-down control. Since this brain state could causally influence the perception of subsequent stimuli of different sensory modalities, we hypothesized that pain evoked by nociceptive stimuli following self-initiated tactile stimulation would be attenuated compared to pain following externally determined tactile stimulation. Here, we compared psychophysical and neurophysiological responses to identical nociceptive-specific laser stimuli in two different conditions: a self-initiated tactile sensation condition (STS) and a nonself-initiated tactile sensation condition (N-STS). We observed that pain intensity and unpleasantness, as well as laser-evoked brain responses, were significantly reduced in the STS condition compared to the N-STS condition. In addition, magnitudes of alpha and beta oscillations prior to laser onset were significantly larger in the STS condition than in the N-STS condition. These results confirmed that pain perception and pain-related brain responses were attenuated when the tactile stimulation was initiated by subjects' voluntary actions, and identified neural oscillations reflecting the binding between motor processing and sensory feedback. Thus, our study advances the understanding of the neural mechanisms underlying top-down modulation of the analgesic effect induced by self-initiated tactile sensation, providing a theoretical basis for improving the analgesic effect in various clinical applications.
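Prestimulus alpha and beta magnitudes of the kind compared here are commonly estimated as band-limited power in the window preceding stimulus onset. The sketch below is a minimal illustration of that idea, not the authors' pipeline; the array layout, sampling rate, window length, and band limits are assumptions.

```python
import numpy as np
from scipy.signal import welch

def prestimulus_band_power(epochs, fs, bands=None):
    """Mean spectral power per frequency band in prestimulus EEG segments.

    epochs : ndarray, shape (n_trials, n_channels, n_samples)
        Segments ending at stimulus (e.g., laser) onset.
    fs     : sampling rate in Hz.
    """
    if bands is None:
        bands = {"alpha": (8.0, 12.0), "beta": (13.0, 30.0)}
    nperseg = min(epochs.shape[-1], int(fs))        # Welch segments of at most 1 s
    freqs, psd = welch(epochs, fs=fs, nperseg=nperseg, axis=-1)
    # Average over trials, channels, and the frequency bins inside each band.
    return {name: psd[..., (freqs >= lo) & (freqs <= hi)].mean()
            for name, (lo, hi) in bands.items()}

# Toy comparison between two hypothetical conditions (synthetic data only).
rng = np.random.default_rng(0)
sts = prestimulus_band_power(rng.standard_normal((40, 32, 500)), fs=500.0)
n_sts = prestimulus_band_power(rng.standard_normal((40, 32, 500)), fs=500.0)
print(sts, n_sts)
```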

4.
To examine how people deal with perceivable consequences of their voluntary actions, we recorded event-related potentials (ERPs) during a self-paced, two-choice random generation task. Sixteen participants were asked to press one of two buttons randomly at a regular but self-selected interval of once per 1-2 s. Each button press produced either a 1000-Hz or 2000-Hz tone, but participants were told that the tones were irrelevant to the task. The button-tone combinations were initially fixed, but in subsequent blocks, a button press infrequently produced the tone associated with the opposite button (p=.15). This cognitively mismatched tone elicited the N2, P3, and late positive potential (or positive slow wave) components of the ERP and delayed the timing of the next button press. These results suggest that action effects are difficult to ignore and that an action effect that differs from a performer's expectation may cause task disruption.

5.
This study investigated the influence of action-associated predictive processes on visual ERPs. In two experiments, we sought evidence for sensory attenuation (SA) indexed by ERP amplitude reductions for self-induced stimuli when compared to passive viewing of the same images. We assessed whether SA is (a) present for both ecological and abstract stimuli (pictures depicting hands or checkerboards), (b) modulated by the degree of stimulus predictability (certain or uncertain action-effect contingencies), and (c) sensitive to laterality of hand movements (dominant or subdominant hand actions). We found reduced occipital responses in the early 77-90 ms time interval (C1 component), irrespective of stimulus type, predictability, or the laterality of hand movements. However, the subsequent P1 component was increased (rather than reduced) for all action-associated stimuli. In addition, this P1 effect was influenced by the degree of stimulus predictability for ecological stimuli only. Finally, the posterior N1 component was not modulated by self-initiated actions. Overall, our findings indicate that movement-related predictive processes attenuate early visual responses. Moreover, we propose that amplitude modulations in the P1 time range reflect the interaction between expectation-based SA and attention-associated amplitude enhancements. These results may have implications for assessing the influence of action-associated predictions on visual processing in psychiatric disorders characterized by aberrant sensory predictions and alterations in hemispheric asymmetry, such as schizophrenia.

6.
Self-suppression refers to the phenomenon that sensations initiated by our own movements are typically less salient, and elicit an attenuated neural response, compared to sensations resulting from changes in the external world. Evidence for self-suppression is provided by previous ERP studies in the auditory modality, which have found that healthy participants typically exhibit a reduced auditory N1 component when auditory stimuli are self-initiated as opposed to externally initiated. However, the literature investigating self-suppression in the visual modality is sparse, with mixed findings and experimental protocols. An EEG study was conducted to expand our understanding of self-suppression across different sensory modalities. Healthy participants experienced either an auditory (tone) or visual (pattern-reversal) stimulus following a willed button press (self-initiated), a random interval (externally initiated, unpredictable onset), or a visual countdown (externally initiated, predictable onset, to match the intrinsic predictability of self-initiated stimuli), while EEG was continuously recorded. Reduced N1 amplitudes for self- versus externally initiated tones indicated that self-suppression occurred in the auditory domain. In contrast, the visual N145 component was amplified for self- versus externally initiated pattern reversals. Externally initiated conditions did not differ as a function of their predictability. These findings highlight a difference in sensory processing of self-initiated stimuli across modalities, and may have implications for clinical disorders that are ostensibly associated with abnormal self-suppression.

7.
Research has so far focused on neural mechanisms that allow us to predict the sensory consequences of our own actions, thus also contributing to ascribing them to ourselves as agents. Less attention has been devoted to processing the sensory consequences of observed actions ascribed to another human agent. Focusing on audition, there is consistent evidence of a reduction of the auditory N1 ERP for self- versus externally generated sounds, while ERP correlates of processing sensory consequences of observed actions are mainly unexplored. In a between-groups ERP study, we compared sounds generated by self-performed (self group) or observed (observation group) button presses with externally generated sounds, which were presented either intermixed with action-generated sounds or in a separate condition. Results revealed an overall reduction of the N1 amplitude for processing action- versus externally generated sounds in both the intermixed and the separate condition, with no difference between the groups. Further analyses, however, suggested that an N1 attenuation effect relative to the intermixed condition at frontal electrode sites might exist only for the self but not for the observation group. For both groups, we found a reduction of the P2 amplitude for processing action- versus all externally generated sounds. We discuss whether the N1 and the P2 reduction can be interpreted in terms of predictive mechanisms for both action execution and observation, and to what extent these components might also reflect the feeling of (self) agency and the judgment of agency (i.e., ascribing agency either to the self or to others).

8.
When performing sensory tasks, knowing the potentially occurring goal-relevant and irrelevant stimulus events allows the establishment of selective attention sets, which result in enhanced sensory processing of goal-relevant events. In the auditory modality, such enhancements are reflected in the increased amplitude of the N1 ERP elicited by the onsets of task-relevant sounds. It has recently been suggested that ERPs to task-relevant sound offsets are similarly enhanced in a tone-focused state in comparison to a distracted one. The goal of the present study was to explore the influence of attention on ERPs elicited by sound offsets. ERPs elicited by tones in a duration-discrimination task were compared to ERPs elicited by the same tones in a non-tone-focused attentional setting. Tone offsets elicited a consistent, attention-dependent biphasic (positive-negative, P1-N1) ERP waveform for tone durations ranging from 150 to 450 ms. The evidence, however, did not support the notion that the offset-related ERPs reflected an offset-specific attention set: the offset-related ERPs elicited in a duration-discrimination condition (in which offsets were task relevant) did not significantly differ from those elicited in a pitch-discrimination condition (in which the offsets were task irrelevant). Although an N2 reflecting the processing of offsets in task-related terms contributed to the observed waveform, this contribution was separable from the offset-related P1 and N1. The results demonstrate that when tones are attended, offset-related ERPs may substantially overlap endogenous ERP activity in the postoffset interval irrespective of tone duration, and attention differences may cause ERP differences in such postoffset intervals.

9.
Space is a dimension shared by different modalities, but at what stage spatial encoding is affected by multisensory processes is unclear. Early studies observed attenuation of N1/P2 auditory evoked responses following repetition of sounds from the same location. Here, we asked whether this effect is modulated by audiovisual interactions. In two experiments, using a repetition-suppression paradigm, we presented pairs of tones in free field, where the test stimulus was a tone presented at a fixed lateral location. Experiment 1 established a neural index of auditory spatial sensitivity by comparing the degree of attenuation of the response to test stimuli when they were preceded by an adapter sound at the same location versus 30° or 60° away. We found that the degree of attenuation at the P2 latency was inversely related to the spatial distance between the test stimulus and the adapter stimulus. In Experiment 2, the adapter stimulus was a tone presented from the same location or a more medial location than the test stimulus. The adapter stimulus was accompanied by a simultaneous flash displayed orthogonally from one of the two locations. Sound-flash incongruence reduced accuracy in a same-different location discrimination task (i.e., the ventriloquism effect) and reduced the location-specific repetition suppression at the P2 latency. Importantly, this multisensory effect included topographic modulations, indicative of changes in the relative contribution of underlying sources across conditions. Our findings suggest that the auditory response at the P2 latency is affected by spatially selective brain activity, which is affected crossmodally by visual information.
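As an illustration of how such P2 attenuation is typically quantified, the sketch below computes baseline-corrected mean amplitudes in a P2 latency window for each adapter-distance condition and expresses attenuation relative to the most distant adapter. It is a generic example on synthetic data; the window, baseline, single-channel layout, and the attenuation convention are assumptions, not details taken from the study.

```python
import numpy as np

def mean_amplitude(epochs, fs, tmin_s, window_s=(0.15, 0.25), baseline_s=(-0.1, 0.0)):
    """Baseline-corrected mean ERP amplitude in a latency window (here, around P2).

    epochs : ndarray (n_trials, n_samples), one channel, time-locked to the test tone
    tmin_s : time of the first sample relative to test-tone onset (e.g., -0.1 s)
    """
    times = tmin_s + np.arange(epochs.shape[-1]) / fs
    baseline = epochs[:, (times >= baseline_s[0]) & (times < baseline_s[1])].mean(axis=1, keepdims=True)
    window = (times >= window_s[0]) & (times < window_s[1])
    return (epochs - baseline)[:, window].mean()

# Synthetic epochs for three adapter-test separations; attenuation is expressed
# relative to the 60-degree (most distant) adapter condition.
rng = np.random.default_rng(4)
fs, tmin = 500.0, -0.1
epochs_by_distance = {deg: rng.standard_normal((60, 300)) for deg in (0, 30, 60)}
p2 = {deg: mean_amplitude(ep, fs, tmin) for deg, ep in epochs_by_distance.items()}
attenuation = {deg: p2[60] - p2[deg] for deg in (0, 30)}
print(p2, attenuation)
```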

10.
The suppression of the auditory N1 event-related potential (ERP) to self-initiated sounds has become a popular tool for tapping into sensory-specific forward modeling. It is assumed that processing in the auditory cortex is attenuated due to a match between sensory stimulation and a specific sensory prediction afforded by a forward model of the motor command. The present study shows that N1 suppression was dramatically increased with long (~3 s) stimulus onset asynchronies (SOA), whereas P2 suppression was equal in all SOA conditions (0.8, 1.6, 3.2 s). Thus, the P2 was found to be more sensitive to self-initiation effects than the N1 with short SOAs. Moreover, only the unspecific but not the sensory-specific N1 components were suppressed for self-initiated sounds, suggesting that N1-suppression effects mainly reflect an attenuated orienting response. We argue that the N1-suppression effect is a rather indirect measure of sensory-specific forward models.

11.
The current study investigated the relationship between planning processes and feedback monitoring during music performance, a complex task in which performers prepare upcoming events while monitoring their sensory outcomes. Theories of action planning in auditory-motor production tasks propose that the planning of future events co-occurs with the perception of auditory feedback. This study investigated the neural correlates of planning and feedback monitoring by manipulating the contents of auditory feedback during music performance. Pianists memorized and performed melodies at a cued tempo in a synchronization-continuation task while the EEG was recorded. During performance, auditory feedback associated with single melody tones was occasionally substituted with tones corresponding to future (next), present (current), or past (previous) melody tones. Only future-oriented altered feedback disrupted behavior: Future-oriented feedback caused pianists to slow down on the subsequent tone more than past-oriented feedback, and amplitudes of the auditory N1 potential elicited by the tone immediately following the altered feedback were larger for future-oriented than for past-oriented or noncontextual (unrelated) altered feedback; larger N1 amplitudes were associated with greater slowing following altered feedback in the future condition only. Feedback-related negativities were elicited in all altered feedback conditions. In sum, behavioral and neural evidence suggests that future-oriented feedback disrupts performance more than past-oriented feedback, consistent with planning theories that posit similarity-based interference between feedback and planning contents. Neural sensory processing of auditory feedback, reflected in the N1 ERP, may serve as a marker for temporal disruption caused by altered auditory feedback in auditory-motor production tasks.

12.
Unexpected changes in task-irrelevant auditory stimuli can distract the processing of task-relevant visual information. This effect is accompanied by the elicitation of event-related potential (ERP) components associated with attentional orientation, i.e., the P3a and the reorienting negativity (RON). In the present study we varied the demands of a visual task to test whether the RON component, as an index of attentional reorientation after distraction, is confined to a semantic task requiring working memory. In two ERP experiments we applied an auditory-visual distraction paradigm in which subjects were instructed to discriminate visual stimuli preceded by a task-irrelevant sound, this being either a standard tone (600 Hz, 88%) or a deviant tone (660 Hz, 12%). The visual stimuli were numbers which had to be judged on the basis of a semantic (odd or even) or physical feature (either size or colour). As expected, deviance-related ERP components, namely the mismatch negativity (MMN), P3a, and RON, were elicited. Importantly, the RON was affected by the variation of the task: within the semantic task an early RON and within the physical task a late RON was obtained. These results suggest that the RON component reflects two functionally distinct processes of attentional allocation after distraction: refocusing on task-relevant information at the working memory level, and general reorientation of attention, e.g., preparation for the upcoming task.

13.
Goal-directed action presupposes the prior integration of actions and their perceptual consequences (action-effect binding). One function of action-effect bindings is to select actions by anticipating their consequences. Another, not yet well understood function is the prediction of action-contingent feedback. We used a probabilistic learning task and ERP analyses to compare the processing of explicit, performance-related feedback with the processing of task-irrelevant response-contingent stimuli. Replicating earlier findings, we found that negative performance feedback produced a feedback-related negativity (NFB), presumably related to response outcome evaluation. Interestingly, low-probability but task-irrelevant action effects elicited a signal similar to the NFB, even though it had a shorter duration. Response delays on trials following negative feedback and following low-probability action effects were correlated with one another. These observations suggest that automatically acquired action-effect relations are exploited for anticipating upcoming events. Like task-relevant performance feedback, task-irrelevant action effects serve as a basis for action monitoring processes, presumably mediated by medial frontal cortex.

14.
Unexpected novel sounds can capture our attention and impair performance. Recent behavioral research revealed that only novel sounds that provided target-related (but not task-related) information impaired performance. This poses the question of the automaticity of novelty processing and its expression at the behavioral level. In an auditory-visual oddball paradigm, the informational content of sounds regarding the time and probability of target occurrence was varied. Independent of the informational content, novel and deviant sounds elicited the P3a, an ERP component related to novelty processing. In contrast, impaired performance was observed only if target-related information was provided. Results indicate that distractor sounds are automatically evaluated as potentially significant, but that the consequences for behavior depend on further processes such as the processing of the given information.

15.
During sleep, the brain network processes sensory stimuli without awareness. Stimulation must affect brain networks differently in sleep versus wake, but these differences have yet to be quantified. We recorded cortical activity in stage 2 (SII) sleep and wake using EEG while a tone was intermittently played. Zero-lag correlation measured input to pairs of sensors in the network; cross-correlation and phase-lag index measured pairwise corticocortical connectivity. Our analysis revealed that under baseline conditions, the cortical network, in particular the central regions of the frontoparietal cortex, interacts at a characteristic latency of 50 ms, but only during wake, not sleep. Nonsalient auditory stimulation causes far greater perturbation of connectivity from baseline in sleep than in wake, both in the response to common input and in corticocortical connectivity. The findings have key implications for sensory processing.
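The connectivity measures named here (zero-lag correlation, lag-limited cross-correlation, phase-lag index) are standard quantities that can be sketched compactly. The snippet below is an illustrative implementation, not the authors' analysis code; the sampling rate, lag window, and synthetic signals are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def phase_lag_index(x, y):
    """Phase-lag index (Stam et al., 2007) between two equally sampled signals.
    0 = no consistent phase lead/lag, 1 = perfectly consistent nonzero lag."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.sign(np.sin(dphi))))

def lagged_cross_correlation(x, y, fs, max_lag_s=0.1):
    """Peak of the normalized cross-correlation within +/- max_lag_s seconds,
    returned together with the lag (in seconds) at which it occurs."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    cc = np.correlate(x, y, mode="full") / len(x)
    lags = np.arange(-len(x) + 1, len(x)) / fs
    keep = np.abs(lags) <= max_lag_s
    peak = np.argmax(np.abs(cc[keep]))
    return cc[keep][peak], lags[keep][peak]

# Two noisy 10 Hz signals with a fixed 50 ms offset (synthetic example only).
fs = 250.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * (t - 0.05)) + 0.5 * rng.standard_normal(t.size)
print("zero-lag r:", np.corrcoef(x, y)[0, 1])
print("PLI:", phase_lag_index(x, y))
print("peak xcorr, lag:", lagged_cross_correlation(x, y, fs))
```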

16.
Many everyday activities require time-pressured sensorimotor decision making. Traditionally, perception, decision, and action processes were considered to occur in series, but this idea has been successfully challenged, particularly by neurophysiological work in animals. However, the generality of parallel processing requires further elucidation. Here, we investigate whether the accumulation of a decision can be observed intrahemispherically within human motor cortex. Participants categorized faces as male or female, with task difficulty manipulated using morphed stimuli. Transcranial magnetic stimulation, applied during the reaction-time interval, produced motor-evoked potentials (MEPs) in two hand muscles that were the major contributors when generating the required pinch/grip movements. Smoothing MEPs using a Gaussian kernel allowed us to recover a continuous time-varying MEP average, comparable to an EEG component, permitting precise localization of the time at which the motor plan for the responding muscle became dominant. We demonstrate decision-related activity in the motor cortex during this perceptual discrimination task, suggesting ongoing evidence accumulation within the motor system even for two independent actions represented within one hemisphere.
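The Gaussian-kernel smoothing step described here amounts to a kernel-weighted (Nadaraya-Watson) average of discrete MEP amplitudes evaluated on a regular time grid. Below is a minimal sketch of that idea on synthetic data; the kernel width, time units, and data are assumptions, not values taken from the study.

```python
import numpy as np

def smooth_meps(mep_times_ms, mep_amps, grid_ms, sigma_ms=25.0):
    """Gaussian-kernel-weighted average of discrete MEP amplitudes, evaluated
    on a regular time grid to yield a continuous MEP time course.

    mep_times_ms : probe times of the MEPs (e.g., relative to stimulus onset)
    mep_amps     : corresponding MEP amplitudes
    grid_ms      : time points at which to evaluate the smoothed average
    sigma_ms     : width of the Gaussian kernel (assumed value)
    """
    t = np.asarray(mep_times_ms)[None, :]          # shape (1, n_meps)
    g = np.asarray(grid_ms)[:, None]               # shape (n_grid, 1)
    w = np.exp(-0.5 * ((g - t) / sigma_ms) ** 2)   # Gaussian weights per grid point
    return (w * np.asarray(mep_amps)).sum(axis=1) / w.sum(axis=1)

# Example: MEPs probed at random latencies within the reaction-time interval,
# with excitability rising toward the response (synthetic data only).
rng = np.random.default_rng(2)
times = rng.uniform(0, 600, size=200)                         # ms after stimulus onset
amps = 1.0 + 0.004 * times + 0.3 * rng.standard_normal(200)
grid = np.arange(0, 601, 5.0)
smoothed = smooth_meps(times, amps, grid)
print(smoothed[:5], smoothed[-5:])
```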

17.
In a cross-modal rhyming study with visual pseudoword primes and auditory word targets, we found a typical ERP rhyming effect such that nonrhyming targets elicited a larger N400/N450 than rhyming targets. An orthographic effect was also apparent in the same 350- to 600-ms epoch as the phonological effect: The rhyming effect for targets with rime orthography that did not match their primes' (e.g., tain-"sane") was smaller over the left hemisphere than the rhyming effect for targets with rime orthography that did match their primes' (e.g., nain-"gain"), although the spellings of the auditory word targets were never explicitly shown. Our results indicate that this cross-modal ERP rhyming effect indexes both phonological and orthographic processing, even for auditory stimuli for which no orthography is presented in the task. This pattern of findings is consistent with the notion of coactivation of sublexical orthography and phonology in fluent adult readers as they both read and listen.

18.
Recent work suggests that dissociable activity in theta and delta frequency bands underlies several common ERP components, including the no-go N2/P3 complex, which can better index separable functional processes than traditional time-domain measures. Reports have also demonstrated that neural activity can be affected by stimulus sequence context information (i.e., the number and type of preceding stimuli). Stemming from prior work demonstrating that theta and delta index separable processes during response inhibition, the current study assessed sequence context in a go/no-go paradigm in which the number of go stimuli preceding each no-go was selectively manipulated. Principal component analysis of time-frequency representations revealed differential modulation of evoked theta and delta related to sequence context, where delta increased robustly with additional preceding go stimuli, while theta did not. Findings are consistent with the view that theta indexes simpler initial salience-related processes, while delta indexes more varied and complex processes related to a variety of task parameters.
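Time-frequency PCA of this kind is commonly built from wavelet-based power surfaces of the averaged (evoked) waveforms, with PCA applied to the flattened surfaces so that delta-band and theta-band activity can load on separate components. The sketch below illustrates the general approach on synthetic data; the wavelet parameters, frequency range, and data shapes are assumptions, and this is not the specific decomposition used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

def morlet_tf_power(waveform, fs, freqs, n_cycles=5):
    """Time-frequency power of one averaged (evoked) waveform via complex
    Morlet wavelet convolution. Edge effects at the lowest frequencies are
    ignored in this toy example."""
    power = np.empty((len(freqs), waveform.size))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)                  # temporal width of the wavelet
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t - t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))      # unit-energy normalization
        conv = np.convolve(waveform, wavelet, mode="full")
        start = (wavelet.size - 1) // 2                       # trim to the original length
        power[i] = np.abs(conv[start:start + waveform.size]) ** 2
    return power

# One time-frequency surface per condition average, then PCA on the flattened
# surfaces so that low-frequency (delta-like) and higher-frequency (theta-like)
# activity can load on different components.
fs, freqs = 250.0, np.arange(1.0, 9.0)          # 1-8 Hz spans delta and theta
rng = np.random.default_rng(3)
averages = rng.standard_normal((20, 500))        # 20 synthetic evoked waveforms
surfaces = np.stack([morlet_tf_power(w, fs, freqs) for w in averages])
pca = PCA(n_components=3)
scores = pca.fit_transform(surfaces.reshape(len(surfaces), -1))
loadings = pca.components_.reshape(3, len(freqs), -1)
```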

19.
Previous ERP studies have shown support for the idea that alcohol-related stimuli are particularly salient to individuals who report low sensitivity (LS) to alcohol's effects (a known risk factor for alcohol-related problems), leading such stimuli to spontaneously capture their attention and interfere with self-regulatory goal pursuit. The current study investigated LS individuals' use of reactive and proactive cognitive control in response to alcohol-related stimuli. Participants performed an alcohol Stroop task in which they indicated the font color of alcohol- and nonalcohol-related words while ERPs were recorded. The probability of alcohol and nonalcohol words was manipulated to test predictions derived from Dual Mechanisms of Control theory. Among LS individuals, infrequent alcohol-related words elicited slower responses and larger N2 amplitude, consistent with these stimuli eliciting enhanced reactive control responses. Amplitude of the frontal slow wave (FSW) component, associated with proactive control, was marginally larger among LS individuals when alcohol words were more frequent, but response accuracy was lower. These findings demonstrate that LS individuals experience conflict when presented with task-irrelevant alcohol-related stimuli, even in a context where conflict arguably should not be present. Findings further suggest that LS individuals can effectively implement reactive control to deal with this conflict when it is infrequent but have difficulty implementing proactive control in the context of more frequent conflict.

20.
Abnormalities in self-other voice processing have been observed in schizophrenia, and may underlie the experience of hallucinations. More recent studies demonstrated that these impairments are enhanced for speech stimuli with negative content. Nonetheless, few studies have probed the temporal dynamics of self versus nonself speech processing in schizophrenia and, particularly, the impact of semantic valence on self-other voice discrimination. In the current study, we examined these questions, and additionally probed whether impairments in these processes are associated with the experience of hallucinations. Fifteen schizophrenia patients and 16 healthy controls listened to 420 prerecorded adjectives differing in voice identity (self-generated speech [SGS] versus nonself speech [NSS]) and semantic valence (neutral, positive, and negative), while EEG data were recorded. The N1, P2, and late positive potential (LPP) ERP components were analyzed. ERP results revealed group differences in the interaction between voice identity and valence in the P2 and LPP components. Specifically, LPP amplitude was reduced in patients compared with healthy subjects for SGS and NSS with negative content. Further, auditory hallucination severity was significantly predicted by LPP amplitude: the higher the SAPS "voices conversing" score, the larger the difference in LPP amplitude between negative and positive NSS. The absence of group differences in the N1 suggests that self-other voice processing abnormalities in schizophrenia are not primarily driven by disrupted sensory processing of voice acoustic information. The association between LPP amplitude and hallucination severity suggests that auditory hallucinations are associated with enhanced sustained attention to negative cues conveyed by a nonself voice.
