
Interaction of Visual and Auditory Emotional Information during Their Relationship Evaluation (Cited by: 2)
Abstract: By asking participants to judge the relationship between the emotional valences of simultaneously presented visual and auditory information, this study examined how audiovisual emotional information is integrated. The emotional valences of semantics and prosody did not conflict in Experiment 1 but did conflict in Experiment 2. Both experiments consistently found that when the face showed a positive expression, participants judged the relationship between the visual and auditory emotional information more accurately; Experiment 2 further found that when the face showed a negative expression, participants judged this relationship faster on the basis of semantic cues than of prosodic cues. These results suggest that when visual and auditory information are presented simultaneously, the visual information may be processed first and then influence the subsequent processing of the relationship between the two channels.

Auditory information can convey emotional messages through both rhythm and semantic meaning, and in some cases the emotional implications of the two may even be inconsistent. If such complex auditory information is presented simultaneously with visual information such as faces, what kind of interaction occurs when people process it? Investigating this question was the aim of the present study. The sound materials were recorded by two professional speakers, one male and one female. In Experiment 1, the speakers read neutral words with a happy or an angry rhythm, and the participants' task was to judge whether the emotion of the rhythm of the speech was consistent with the facial expression presented at the same time. In Experiment 2, the speakers read positive words with an angry rhythm or negative words with a happy rhythm, and the participants' task was to judge whether either the emotion of the rhythm of the speech or the semantics of the speech was consistent with the facial expression presented at the same time. Forty-three participants (21 females) took part in Experiment 1 and 40 participants (20 females) in Experiment 2. Each experiment included 120 trials. In Experiment 1, half of the trials were consistent and the other half inconsistent; in Experiment 2, half were consistent with the semantic information and the other half with the rhythm information. The key-response mapping was counterbalanced across participants. Repeated-measures ANOVAs, with facial expression and rhythm-emotion consistency as within-subjects factors in Experiment 1 and with facial expression and judgment cue as within-subjects factors in Experiment 2, were performed on participants' mean reaction times and accuracy.

The results revealed that (1) when the facial expression was positive, participants judged the relationship between the information in the visual and auditory channels more accurately [Experiment 1: F(1, 42) = 15.41, p < .001, partial η² = .27; Experiment 2: F(1, 39) = 6.82, p < .05, partial η² = .15]; (2) in Experiment 1, when the valence of the rhythm was consistent with the facial expression, the judgment of the relationship between the information in the visual and auditory channels was faster [F(1, 42) = 37.63, p < .001, partial η² = .47] and more accurate [F(1, 42) = 21.80, p < .001, partial η² = .34]; (3) in Experiment 2, when the facial expression was negative, the semantic cues of the words facilitated participants' responses in judging the relationship between the visual and auditory stimuli more than the rhythm cues did [F(1, 39) = 15.78, p < .001, partial η² = .41].

These results suggest that when visual and auditory stimuli are presented at the same time, the visual information is processed in advance and then affects the processing of the auditory information. Whether or not the emotional valences of the visual and auditory stimuli conflicted, a positive facial expression promoted the cognitive judgment about the relationship between the visual and auditory information. When the emotional valences of the visual and auditory stimuli were congruent, an "easy processing" phenomenon emerged; when they conflicted, a negative facial expression and the semantic information in the auditory channel promoted each other's processing. The present study innovatively examined the separate roles of a word's semantic emotional information and rhythmic emotional information in judging the relationship between visual and auditory stimuli, and it provided initial evidence that, when the visual stimulus was a negative facial expression, the semantic information had a speed advantage while the rhythm information had an accuracy advantage.
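For readers interested in the style of analysis reported above, the following is a minimal sketch, in Python, of a two-way repeated-measures ANOVA of the kind described (facial expression × judgment cue on mean reaction time), assuming one mean value per participant and condition. The data frame, column names, and simulated values are hypothetical illustrations only, not the authors' materials or analysis code; partial η² is recovered from F and its degrees of freedom.

```python
# Illustrative sketch only: a two-way repeated-measures ANOVA resembling the
# design of Experiment 2 (facial expression x judgment cue), using simulated
# data. Column names and values are hypothetical, not the authors' data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)

# 40 participants, 2 facial expressions x 2 judgment cues,
# one mean reaction time (ms) per participant and condition.
subjects = np.repeat(np.arange(40), 4)
face = np.tile(["positive", "positive", "negative", "negative"], 40)
cue = np.tile(["semantic", "rhythm", "semantic", "rhythm"], 40)
rt = rng.normal(900, 80, size=160)  # placeholder reaction times

df = pd.DataFrame({"subject": subjects, "face": face, "cue": cue, "rt": rt})

# Two-way repeated-measures ANOVA on mean RT.
res = AnovaRM(df, depvar="rt", subject="subject", within=["face", "cue"]).fit()
tbl = res.anova_table

# Partial eta squared from F and its degrees of freedom:
# partial eta^2 = (F * df1) / (F * df1 + df2)
tbl["partial_eta_sq"] = (tbl["F Value"] * tbl["Num DF"]) / (
    tbl["F Value"] * tbl["Num DF"] + tbl["Den DF"]
)
print(tbl[["F Value", "Pr > F", "partial_eta_sq"]])
```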
Source: Journal of Psychological Science (《心理科学》; CSSCI, CSCD, Peking University Core Journal), 2016, No. 4, pp. 842-848 (7 pages)
Funding: Supported by the National Natural Science Foundation of China (31271107), the Humanities and Social Sciences Youth Foundation of the Ministry of Education (11YJC90010), and the Liaoning Provincial Innovation Team for Child Personality Development and Education (WT2013007).
Keywords: cross-modal, facial expression, emotional voice, rhythm-semantic conflicts
