Investigating the Impact of Sound Angular Position on the Listener Affective State

Emotion recognition from sound signals is an emerging field of research. Although many existing works focus on emotion recognition from music, research on emotion recognition from general sounds remains relatively scarce. One of the key characteristics of a sound event is the spatial position of its source, i.e. the location of the source relative to the acoustic receiver. Existing studies that investigate the relation between source placement and the elicited emotions are limited to distance, front/back localization, and/or specific emotional categories. In this paper we analytically investigate the effect of the source angular position on the listener's emotional state, modeled in the well-established valence/arousal affective space.

Towards this aim, we have developed an annotated sound events dataset using binaurally processed versions of the sound events available in the International Affective Digitized Sounds (IADS) library. All subjective affective annotations were obtained using the Self Assessment Manikin (SAM) approach. Preliminary results obtained by processing these annotation scores suggest a systematic change in the listener's affective state as the sound source angular position changes. This trend is more pronounced when the sound source is located outside the listener's visual field.
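To illustrate the kind of binaural processing described above, the following is a minimal Python sketch (not the authors' actual pipeline) of rendering a mono sound event at a chosen azimuth by convolving it with a left/right head-related impulse response (HRIR) pair; the file names and the HRIR set used are hypothetical placeholders.

```python
# Minimal sketch, assuming a mono clip and a measured HRIR pair for the
# desired azimuth are available as WAV files (placeholder names below).
import numpy as np
import soundfile as sf                  # assumed available for WAV I/O
from scipy.signal import fftconvolve

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with an HRIR pair so the source is
    perceived at the azimuth the pair was measured at."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    out = np.stack([left, right], axis=-1)
    return out / np.max(np.abs(out))    # simple peak normalization

# Hypothetical usage: one sound event placed at 90 degrees to the right.
mono, fs = sf.read("sound_event.wav")         # placeholder file name
hrir_l, _ = sf.read("hrir_az090_left.wav")    # placeholder HRIR files
hrir_r, _ = sf.read("hrir_az090_right.wav")
sf.write("sound_event_az090.wav",
         render_binaural(mono, hrir_l, hrir_r), fs)
```

Repeating such a rendering step over a set of azimuths would yield spatialized stimuli whose SAM valence/arousal ratings can then be compared across angular positions.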
