Abstract
In this paper, we introduce an interactive background music synthesis algorithm guided by visual content. We adopt a cascading strategy that synthesizes background music in two stages: Scene Visual Analysis and Background Music Synthesis. First, pursuing a deep learning-based solution, we leverage neural networks to analyze the sentiment of the input scene. Second, real-time background music is synthesized by optimizing a cost function that guides the selection and transition of music clips, maximizing both the emotional consistency between the visual and auditory channels and the continuity of the music. Our experiments demonstrate that the proposed approach can synthesize dynamic background music for different types of scenarios. We also conducted quantitative and qualitative analyses of the synthesized results on multiple example scenes to validate the efficacy of our approach.
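To make the second stage concrete, the cost-guided clip selection can be sketched as a greedy search over a clip library. This is a minimal illustration, not the authors' implementation: the valence-arousal emotion representation, the specific cost terms, the weights, and all function names here are assumptions for the sake of example.

```python
import math

# Assumed representation: each clip and the scene are points in a
# 2-D valence-arousal emotion space (a common convention, not
# necessarily the paper's).

def emotion_cost(clip_emotion, scene_emotion):
    # Mismatch between the clip's emotion and the scene's emotion.
    return math.dist(clip_emotion, scene_emotion)

def continuity_cost(prev_emotion, clip_emotion):
    # Penalize abrupt emotional jumps between consecutive clips.
    return math.dist(prev_emotion, clip_emotion)

def select_next_clip(clips, scene_emotion, prev_emotion,
                     w_emotion=1.0, w_continuity=0.5):
    """Greedily pick the clip minimizing a weighted combined cost
    (hypothetical weights; the paper's cost function may differ)."""
    def cost(clip):
        total = w_emotion * emotion_cost(clip, scene_emotion)
        if prev_emotion is not None:
            total += w_continuity * continuity_cost(prev_emotion, clip)
        return total
    return min(clips, key=cost)

# Toy clip library as (valence, arousal) points.
clips = [(0.8, 0.6), (-0.5, 0.2), (0.1, -0.4)]
scene = (0.7, 0.5)  # e.g. a cheerful scene
chosen = select_next_clip(clips, scene, prev_emotion=(0.6, 0.4))
```

In this toy setup the upbeat clip `(0.8, 0.6)` wins, since it is both close to the scene's emotion and a smooth continuation of the previous clip; in the paper, the optimization additionally drives real-time transitions as the scene changes.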
Citation
Wang, Y., Liang, W., Li, W., Li, D., & Yu, L. F. (2020). Scene-Aware Background Music Synthesis. In MM 2020 - Proceedings of the 28th ACM International Conference on Multimedia (pp. 1162–1170). Association for Computing Machinery, Inc. https://doi.org/10.1145/3394171.3413894