3.
Manual asymmetries emerge very early in development, and several researchers have reported a significant right-hand bias in toddlers, although this bias fluctuates depending on the nature of the activity being performed. However, little is known about the further development of asymmetries in preschoolers. In this study, patterns of hand preference were assessed in 50 children aged 3–5 years for different activities, including reaching movements, pointing gestures and symbolic gestures. Contrary to what has been reported in children before 3 years of age, we did not observe any difference in the mean handedness indices obtained in each task. Moreover, the asymmetry of reaching was found to correlate with that of pointing gestures, but not with that of symbolic gestures. In relation to the results reported in infants and adults, this study may help decipher the mechanisms controlling the development of handedness by providing measures of manual asymmetries in an age range that has so far been rather neglected.
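The abstract does not define its handedness indices; a common convention is the laterality index HI = (R − L)/(R + L) over right- and left-hand trial counts. A minimal sketch assuming that conventional formula and simulated per-child counts (the trial numbers, bias level, and correlation step are illustrative, not the authors' actual analysis):

```python
import numpy as np
from scipy.stats import pearsonr

def handedness_index(right: int, left: int) -> float:
    """Conventional laterality index: +1 = always right hand, -1 = always left hand."""
    return (right - left) / (right + left)

rng = np.random.default_rng(0)
n_children, n_trials = 50, 10  # hypothetical: 10 trials per task per child

# Simulated right-hand choice counts with a population-level right bias.
reach_right = rng.binomial(n_trials, 0.7, size=n_children)
point_right = rng.binomial(n_trials, 0.7, size=n_children)

hi_reach = np.array([handedness_index(r, n_trials - r) for r in reach_right])
hi_point = np.array([handedness_index(r, n_trials - r) for r in point_right])

# Correlate the two asymmetries across children, as the study does for
# reaching vs. pointing.
r, p = pearsonr(hi_reach, hi_point)
print(f"reaching vs. pointing: r = {r:.2f}, p = {p:.3f}")
```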
4.
Background: Although the efficacy of treatments for spoken verb and sentence production deficits in aphasia has been documented widely, less is known about interventions for written verb and written sentence production deficits.

Aims: This study documents a treatment aimed at improving the production of (a) written subject-verb sentences (involving intransitive verbs) and (b) written subject-verb-object sentences (involving transitive verbs).

Methods & Procedures: The participant, LW, a 63-year-old female speaker with aphasia, had a marked language comprehension deficit, apraxia of speech, relatively good spelling abilities, and no hemiplegia. The treatment used intransitive verbs to elicit subject-verb active sentences and transitive verbs to elicit subject-verb-object active, non-reversible sentences. The treatment was undertaken in the context of current UK clinical practice.

Outcomes & Results: Statistically significant improvements were noted for the trained sets of verbs and sentences. LW's ability to retrieve some non-treated verbs and to construct written sentences also improved. Treatment did not generalise to sentence comprehension or to letter spelling to dictation.

Conclusions: Our participant's ability to write verbs and sentences improved as a result of the treatment.
5.
Request and emblematic gestures, despite both being communicative gestures, differ in terms of social valence: only the former are used to initiate, maintain, or terminate an actual interaction. If such a difference is functionally relevant, a salient social cue, i.e. eye contact, should have different impacts on the neuronal underpinnings of the two types of gesture. We measured blood oxygen level‐dependent signals, using functional magnetic resonance imaging, while participants watched videos of an actor, either blindfolded or not, performing emblems, request gestures, or meaningless control movements. A left‐lateralized network was more activated by both types of communicative gestures than by meaningless movements, regardless of the accessibility of the actor's eyes. Strikingly, when eye contact was taken into account as a factor, a right‐lateralized network was more strongly activated by emblematic gestures performed by the non‐blindfolded actor than by those performed by the blindfolded actor. This modulation possibly reflects the integration of information conveyed by the eyes with the representation of emblems. Conversely, a wider right‐lateralized network was more strongly activated by request gestures performed by the blindfolded than by the non‐blindfolded actor, probably reflecting the conflict between the observed action and its associated contextual information, in which relevant social cues are missing.
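The key comparison here is a gesture-type × eye-contact interaction, which in a standard GLM group analysis is expressed as a contrast over the four condition betas. A minimal sketch of that contrast logic, assuming hypothetical per-subject region-of-interest betas (the condition ordering, values, and subject count are illustrative, not the study's actual design matrix):

```python
import numpy as np
from scipy.stats import ttest_1samp

# Hypothetical per-subject GLM betas for the four cells of the 2x2 design:
# columns = [emblem/eyes-visible, emblem/blindfolded,
#            request/eyes-visible, request/blindfolded]
rng = np.random.default_rng(1)
betas = rng.normal(loc=[1.2, 0.8, 0.7, 1.1], scale=0.5, size=(20, 4))

# Interaction contrast: (emblem_eyes - emblem_blind) - (request_eyes - request_blind)
contrast = np.array([1.0, -1.0, -1.0, 1.0])
subject_effects = betas @ contrast

# One-sample t-test of the interaction effect across subjects.
t, p = ttest_1samp(subject_effects, 0.0)
print(f"gesture-type x eye-contact interaction: t = {t:.2f}, p = {p:.3f}")
```

A significant interaction in opposite directions for the two gesture types is exactly the crossover pattern the abstract describes: eye contact boosting responses to emblems but blindfolding boosting responses to requests.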
6.
In everyday conversation, listeners often rely on a speaker's gestures to clarify any ambiguities in the verbal message. Using fMRI during naturalistic story comprehension, we examined which brain regions in the listener are sensitive to speakers' iconic gestures. We focused on iconic gestures that contribute information not found in the speaker's talk, compared with those that convey information redundant with the speaker's talk. We found that three regions—left inferior frontal gyrus triangular (IFGTr) and opercular (IFGOp) portions, and left posterior middle temporal gyrus (MTGp)—responded more strongly when gestures added information to nonspecific language, compared with when they conveyed the same information in more specific language; in other words, when gesture disambiguated speech as opposed to reinforced it. An increased BOLD response was not found in these regions when the nonspecific language was produced without gesture, suggesting that IFGTr, IFGOp, and MTGp are involved in integrating semantic information across gesture and speech. In addition, we found that activity in the posterior superior temporal sulcus (STSp), previously thought to be involved in gesture‐speech integration, was not sensitive to the gesture‐speech relation. Together, these findings clarify the neurobiology of gesture‐speech integration and contribute to an emerging picture of how listeners glean meaning from gestures that accompany speech. Hum Brain Mapp 35:900–917, 2014. © 2012 Wiley Periodicals, Inc.
7.
Vocabulary acquisition represents a major challenge in foreign language learning. Research has demonstrated that gestures accompanying speech have an impact on memory for verbal information in the speakers' mother tongue and, as recently shown, also in foreign language learning. However, the neural basis of this effect remains unclear. In a within‐subjects design, we compared learning of novel words coupled with iconic versus meaningless gestures. Iconic gestures helped learners retain the verbal material significantly better over time. After the training, participants' brain activity was registered by means of fMRI while they performed a word recognition task. Brain activations to words learned with iconic and with meaningless gestures were contrasted. We found activity in the premotor cortices for words encoded with iconic gestures. In contrast, words encoded with meaningless gestures elicited a network associated with cognitive control. These findings suggest that memory performance for newly learned words is not driven by the motor component as such, but by the motor image that matches an underlying representation of the word's semantics. Hum Brain Mapp, 2011. © 2010 Wiley‐Liss, Inc.
8.
You are queuing at the bus stop and notice that someone suddenly turns her walk into a run: typically, you assume that she wants to catch the bus and may want to tell the driver to wait. Faced with a sudden speed change, rather than considering it bizarre or unnatural, observers attach a meaning to it and act accordingly. In a social context, the speed of a movement often bears as much significance as its form, and can be adapted to convey precise meanings. This pragmatic rule facilitates decoding of non-verbal messages from other individuals, but may not necessarily apply when observing one's own movements, for which intentions should be informative enough. Hence, the range of motion speeds labeled as 'natural' could be broader for other people's actions than for one's own. We explored this possibility through a task in which human observers decided whether the speed of a gesture had been artificially modified. A virtual hand was presented which, unbeknownst to participants, moved according to the kinematics of either the observer or another individual. Although a self/other distinction was never required, participants applied different criteria when dealing with their own gestures compared to other people's, suggesting that the brain implicitly extracts identity information before any overt judgment is produced. Interestingly, observers were reluctant to label movements of another individual as artificial, in keeping with the hypothesis that large variations in movement speed can convey social messages and are therefore not regarded a priori as unnatural.
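The different "criteria" for self versus other can be made concrete with a signal-detection analysis, which the abstract does not spell out; a sketch assuming "artificially modified" is the signal and using hypothetical hit/false-alarm counts (the numbers and the SDT framing are illustrative, not the authors' reported analysis):

```python
from scipy.stats import norm

def sdt_measures(hits: int, misses: int, fas: int, crs: int) -> tuple[float, float]:
    """Sensitivity d' and criterion c from trial counts (log-linear correction)."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (fas + 0.5) / (fas + crs + 1.0)
    z_h, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_fa
    criterion = -0.5 * (z_h + z_fa)  # larger c = more reluctant to say "artificial"
    return d_prime, criterion

# Hypothetical counts: observers rarely call another person's movement "artificial",
# which shows up as a more conservative (larger) criterion, not just lower sensitivity.
print("self :", sdt_measures(hits=40, misses=10, fas=12, crs=38))
print("other:", sdt_measures(hits=28, misses=22, fas=6, crs=44))
```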
9.
Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture, a fundamental type of hand gesture that marks speech prosody, might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously-produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., here activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
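The integration criterion described above is a superadditivity test: the multimodal response must exceed the sum of the two unimodal responses. A minimal sketch of that test on per-subject region-of-interest betas (the subject count, beta values, and ROI are hypothetical, not the study's data):

```python
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(2)
n_subjects = 24

# Hypothetical planum temporale betas for the three relevant conditions.
beta_speech_gesture = rng.normal(2.0, 0.6, n_subjects)
beta_speech_alone   = rng.normal(0.9, 0.5, n_subjects)
beta_gesture_alone  = rng.normal(0.6, 0.5, n_subjects)

# Superadditivity: multimodal response > sum of the unimodal responses.
superadditivity = beta_speech_gesture - (beta_speech_alone + beta_gesture_alone)
t, p = ttest_1samp(superadditivity, 0.0, alternative="greater")
print(f"superadditivity: t = {t:.2f}, p = {p:.4f}")
```

Superadditivity is a deliberately strict criterion: a region that merely responds to both speech and gesture fails it, so passing it is taken as evidence of genuine multisensory synthesis rather than co-activation.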
10.
Everyday communication is accompanied by visual information from several sources, including co‐speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory‐only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech. Hum Brain Mapp, 2009. © 2009 Wiley‐Liss, Inc.