A Review of Research on Emotional Music Generation

Authors

  • Yuhang Wang

DOI:

https://doi.org/10.54097/7dzwv369

Keywords:

Emotional music generation; Rule-based; Deep learning; Reinforcement learning.

Abstract

As the scope of artificial intelligence research expands, the potential of emotional music generation within the broader field of music generation has been gradually explored. This paper analyzes rule-based, deep learning, and reinforcement learning methods for emotional music generation. Rule-based methods generate music according to established music theory and compositional patterns; they are interpretable but limited to a narrow range of styles. Deep learning methods generate music by learning the characteristics of music data through neural network models; they produce rich emotional nuance but depend on large amounts of data and offer poor flexibility. Reinforcement learning lets an agent optimize its behavior policy through interaction with the environment to generate music, which is flexible but complex. Comparative analysis shows that the rule-based method can serve as a basic framework to constrain the other methods, deep learning excels in creative diversity and generation quality, and reinforcement learning has clear advantages in generating music with specific emotions. Integrating the three methods is expected to further improve the effect of emotional music generation and provide strong support for music creation.
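To make the rule-based approach concrete, the following is a minimal sketch, not taken from any cited system: hypothetical hand-written rules map a target emotion to a scale, register, and tempo, and a melody is then sampled under those constraints. The specific emotion-to-rule mapping (major/fast for "happy", minor/slow for "sad") is an illustrative assumption.

```python
import random

# Hypothetical rule table: emotion -> (scale as semitone offsets,
# MIDI root note, tempo in BPM). Illustrative values only.
RULES = {
    "happy": ([0, 2, 4, 5, 7, 9, 11], 60, 140),  # C major, mid register, fast
    "sad":   ([0, 2, 3, 5, 7, 8, 10], 48, 70),   # C minor, low register, slow
}

def generate_melody(emotion, length=8, seed=0):
    """Generate `length` MIDI pitches that obey the rules for `emotion`."""
    scale, root, tempo = RULES[emotion]
    rng = random.Random(seed)
    melody = [root]  # rule: begin on the tonic
    for _ in range(length - 1):
        # rule: prefer stepwise motion within the scale for a smooth contour
        step = rng.choice([-1, 0, 1, 2])
        prev_degree = scale.index((melody[-1] - root) % 12)
        new_degree = max(0, min(len(scale) - 1, prev_degree + step))
        melody.append(root + scale[new_degree])
    return melody, tempo
```

Every output pitch is guaranteed to lie on the chosen scale, which illustrates both the interpretability of rule-based generation and its stylistic rigidity: the music can never leave the constraints the rules encode.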


References

[1] Huang C Z A, Vaswani A, Uszkoreit J, et al. Music Transformer: Generating music with long-term structure. In Proceedings of the International Conference on Learning Representations, 2019: 1-14.

[2] Wang M, El-Fiqi H, Hu J, et al. Convolutional neural networks using dynamic functional connectivity for EEG-based person identification in diverse human states. IEEE Transactions on Information Forensics and Security, 2019, 14(12): 3259-3272.

[3] Song T, Zheng W, Song P, et al. EEG emotion recognition using dynamical graph convolutional neural networks. IEEE Transactions on Affective Computing, 2018, 11(3): 532-541.

[4] Zhong P, Wang D, Miao C. EEG-based emotion recognition using regularized graph neural networks. IEEE Transactions on Affective Computing, 2020, 13(3): 1290-1301.

[5] Du G, Su J, Zhang L, et al. A Multi-Dimensional Graph Convolution Network for EEG Emotion Recognition. IEEE Transactions on Instrumentation and Measurement, 2022, 71: 1-11.

[6] Hung H T, Ching J, Doh S, et al. EMOPIA: A multi-modal pop piano dataset for emotion recognition and emotion-based music generation. In Proceedings of the 22nd International Society for Music Information Retrieval Conference. Online: ISMIR, 2021: 318-325.

[7] Alabdulkarim A, Li W, Martin L J, et al. Goal-directed story generation: Augmenting generative language models with reinforcement learning [EB/OL]. (2021-12-16) [2023-01-27]. https://arxiv.org/pdf/2112.08593

[8] Heng W. Graph neural network-based EEG music emotion recognition and generation (Master's thesis, Beijing Jiaotong University), 2023. https://kns.cnki.net/kcms2/article/abstract?v=NK8hpUzgeRUg4JMN1q6WyIjpUeuLh4fHS-JDybN5IHT6mp1hxhjeelNq50gxraMeTKJHcfTlpPDHtD8XpTuF6lHJaqF8gd5BxkricHRSqIrkmlgPBz0fHMNx4FJ7PgdK&uniplatform=NZKPT&language=CHS

[9] Wang X H. Research on Transformer-based emotional music generation (Master's thesis, Wuhan University of Light Industry), 2024.

[10] Shen Z X, Xie X L, Yin H, Yang L, Li H F. Affective music generation with reinforcement learning guided pre-trained models. Journal of Fudan University (Natural Science Edition), 2024, (03): 336-343. doi: 10.15943/j.cnki.fdxb-jns.20240605.001.

Published

11-05-2025

How to Cite

Wang, Y. (2025). A Review of Research on Emotional Music Generation. Highlights in Science, Engineering and Technology, 138, 65-69. https://doi.org/10.54097/7dzwv369