Title: The Influence of Lyrics on the Processing of Musical Emotion: A Behavioural and ERP Study
Journal: 心理學報 (Acta Psychologica Sinica)
Authors: 張偉霞 (Zhang Weixia), 王莞琪 (Wang Wanqi), 周臨舒 (Zhou Linshu), 蔣存梅 (Jiang Cunmei)
Publication date: 2018
Volume/issue: 2018(12)
Pages: 1346-1355
Keywords: Musical emotion; Language; Lyrics; N400; LPC
This study investigated the influence of lyrics on the processing of musical emotion. Experiment 1 used an affective priming paradigm in which musical excerpts with or without lyrics served as primes and facial images whose emotion was either congruent or incongruent with the music served as targets; participants were asked to judge the emotion of the target face as quickly and accurately as possible. Regardless of whether the music contained lyrics, listeners responded faster and more accurately in the congruent condition than in the incongruent condition, indicating that they could process the emotional information conveyed by the music. Experiment 2 used electrophysiological measures to further examine the neural mechanisms by which lyrics affect the processing of musical emotion. Although priming effects emerged for the emotion of music both with and without lyrics, music without lyrics elicited an N400 effect in the 250~450 ms time window, whereas music with lyrics elicited an LPC effect in the 500~700 ms time window, indicating that lyrics alter the time course with which the brain processes musical emotion. These findings provide, to some extent, evidence for the study of the relationship between music and language.
Music and language are unique to human beings. It has been suggested that music and language share a common origin as an emotional protolanguage. With the development of socialisation, language evolved into a symbolic communication system with explicit semantics, whereas music became an important means of emotional expression. However, whether language with explicit semantics affects the emotional processing of music remains uncertain. Given that songs contain both melody and lyrics, previous behavioural studies have focused on songs to analyse the influence of lyrics on the processing of musical emotion, but their findings on whether lyrics exert such an influence are relatively contradictory. Thus, the current study used behavioural and electrophysiological measurements to investigate the impact of lyrics on the processing of musical emotion. Experiment 1 examined whether the emotional connotations of music with and without lyrics could be perceived by listeners at the behavioural level. Experiment 2 further investigated whether melodies with and without lyrics evoke different neural responses.

A cross-modal affective priming paradigm was used in Experiments 1 and 2, in which musical excerpts served as primes and emotional faces as targets. To avoid the impact of familiarity, 120 musical stimuli were selected from European opera. Each was sung by a vocalist both with and without lyrics, yielding 240 musical stimuli in two versions as potential primes. A total of 160 facial expressions, affectively congruent or incongruent with the preceding musical stimuli, were selected as potential targets. Three pre-tests were conducted to ensure the validity of the stimuli. Eventually, 60 musical stimuli per version were selected as primes and 120 images were used as targets, resulting in 240 music–image pairs. To ensure that each stimulus appeared only once for each participant, two lists were prepared using a Latin square design; within each list, each prime and target was presented in either the congruent or the incongruent condition. Thus, each list comprised 120 trials, with 30 trials per condition, and the two lists were distributed equally across participants. Forty healthy adults participated in Experiment 1; they were asked to judge as quickly and accurately as possible whether the emotion of the target face was happy or sad, and accuracy and reaction times were recorded. Twenty healthy adults participated in Experiment 2; they judged whether the emotions of the music and the image were congruent or incongruent while their EEG was recorded. ERPs were analysed and compared between conditions in the 250~450 ms and 500~700 ms time windows after target onset.

The results of Experiment 1 showed that, whether faces were primed by music with or without lyrics, participants responded faster and more accurately in the affectively congruent condition than in the affectively incongruent condition, indicating that the emotional connotations of music with and without lyrics could both be perceived. The ERP results of Experiment 2 showed that music with and without lyrics engaged distinct neural mechanisms. Specifically, when faces were primed by music without lyrics, affectively incongruent pairs elicited a larger N400 than affectively congruent pairs in the 250~450 ms time window; when faces were primed by music with lyrics, incongruent pairs elicited a more positive LPC than congruent pairs at 500~700 ms. This finding is consistent with Experiment 1, suggesting that listeners perceived the emotion conveyed by music both with and without lyrics, but that the two differ in the time course of neural processing: the emotional processing of music with lyrics lagged behind that of music without lyrics.

In conclusion, the present results suggest that although the emotional connotations conveyed by music with and without lyrics could both be perceived, the neural processing of emotional connotations in music without lyrics preceded that of music with lyrics. These findings also support the view from the philosophy of music that music without lyrics can express emotion more immediately and directly than music with lyrics, owing to the lack of "translation" from the propositional system. On the other hand, given that lyrics influenced the time course of emotional processing, our results also provide evidence that the emotional processing of music and language may share neural resources to some extent.
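To make the reported analysis step concrete, the following is a minimal sketch, not the authors' pipeline, of how mean ERP amplitudes in the two latency windows named in the abstract (250~450 ms for the N400, 500~700 ms for the LPC, measured from target onset) could be extracted and compared between congruent and incongruent trials. The sampling rate, epoch boundaries, channel count, and array layout are assumptions introduced only for the example.

    import numpy as np

    SFREQ = 500          # assumed sampling rate (Hz)
    TMIN = -0.2          # assumed epoch start relative to face onset (s)
    TMAX = 0.9           # assumed epoch end (s)
    N_SAMPLES = int(round((TMAX - TMIN) * SFREQ))   # 550 samples per epoch

    def window_mean(epochs, t_start, t_end):
        """Mean amplitude in a latency window.
        epochs: array of shape (n_trials, n_channels, n_samples)."""
        i0 = int(round((t_start - TMIN) * SFREQ))
        i1 = int(round((t_end - TMIN) * SFREQ))
        return epochs[:, :, i0:i1].mean(axis=(0, 2))   # one value per channel

    # Placeholder data standing in for one participant's face-locked epochs
    # (30 trials per condition as in the design; 64 channels assumed).
    rng = np.random.default_rng(0)
    congruent = rng.normal(size=(30, 64, N_SAMPLES))
    incongruent = rng.normal(size=(30, 64, N_SAMPLES))

    # N400 window: an effect appears as more negative values for incongruent pairs.
    n400_effect = window_mean(incongruent, 0.250, 0.450) - window_mean(congruent, 0.250, 0.450)
    # LPC window: an effect appears as more positive values for incongruent pairs.
    lpc_effect = window_mean(incongruent, 0.500, 0.700) - window_mean(congruent, 0.500, 0.700)

In terms of the abstract's findings, such a difference score would be negative in the earlier window for music without lyrics (N400 effect) and positive in the later window for music with lyrics (LPC effect).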
 
 
 
 