Abstract
Music significantly influences emotional and cognitive functions, and its direct impact on emotion processing is well documented. The proliferation of music on social media platforms highlights the need to consider music as a context during the processing of other stimuli. This study aimed to understand how specific acoustic features of widely consumed music influence human emotion processing. Acoustic cues (pitch, tempo, and timbre) in popular instrumental music drawn from social media were manipulated to influence the perceived emotion of image stimuli. An innovative online experimental platform was developed and used to collect data from 109 participants (aged 15–62 years). Participants listened to music excerpts in which one of the three acoustic components was altered (high, low, or normal state) while viewing images from the International Affective Picture System, and rated their emotional responses using the Self-Assessment Manikin questionnaire. Statistical analysis revealed that tempo was the strongest acoustic predictor of emotional arousal, with distinct patterns across age and gender; increased tempo was also associated with higher valence ratings. Pitch manipulations had a stronger impact on the arousal levels of older participants, whereas timbre primarily influenced valence perception in younger listeners. Significant gender differences were also observed: male participants were more sensitive to changes in pitch and timbre, while female participants were more responsive to tempo variations. These findings contribute to a deeper understanding of how specific acoustic characteristics of contemporary, widely disseminated music shape emotional perception, and highlight the complex interplay between acoustic cues and individual differences such as age and gender.