Abstract: | A multivariate pattern-classification system was developed for the study of facial electromyographic (EMG) patterning in 12 female subjects during affect-laden imagery and for posed facial expressions. A parameter-extraction procedure identified the dynamic EMG signal properties that afforded maximal discrimination of self-reported emotion. Discriminant analyses on trialwise EMG vectors allowed assessment of specific EMG-site conformations typifying rated emotions of happiness, sadness, anger, and fear. The discriminability among emotion-specific EMG conformations was correlated with subjective ratings of affective-imagery vividness and duration. Evidence was obtained suggesting that the EMG patterns encoded complex, “blended” reported affective states during the imagery. Classification analyses produced point-predictions of reported emotional states in 10 of the 12 subjects, and provided the first computer pattern recognition of self-reported emotion from psychophysiological responses. |
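The abstract describes discriminant and classification analyses applied to trialwise, multi-site EMG feature vectors. As a rough illustration only (not the authors' original procedure), the sketch below shows how such a trialwise classification might look with linear discriminant analysis and leave-one-trial-out validation; the site count, trial counts, and synthetic feature values are all hypothetical.

```python
# Minimal sketch (not the original analysis code): classify trialwise
# multi-site facial EMG feature vectors into self-reported emotion
# categories with linear discriminant analysis. All shapes, site counts,
# and generated data below are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

n_trials_per_emotion = 20                      # hypothetical trials per emotion
n_sites = 4                                    # hypothetical number of EMG recording sites
emotions = ["happiness", "sadness", "anger", "fear"]

# Synthetic EMG feature matrix: one row per imagery trial, one column per
# recording site (e.g., an integrated amplitude parameter per trial).
X = np.vstack([
    rng.normal(loc=mu, scale=1.0, size=(n_trials_per_emotion, n_sites))
    for mu in ([2, 0, 0, 0], [0, 2, 0, 0], [0, 0, 2, 0], [0, 0, 0, 2])
])
y = np.repeat(emotions, n_trials_per_emotion)

# Leave-one-trial-out classification: predict each held-out trial's reported
# emotion from the remaining trials' EMG-site conformations.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Leave-one-out classification accuracy: {scores.mean():.2f}")
```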