
Mood-Based Emotional Analysis for Music Recommendation

EasyChair Preprint no. 10377

10 pages · Date: June 11, 2023


Music plays a crucial role in enhancing mood and motivation, contributing to overall well-being. Recent studies have emphasized the profound impact of music on human brain activity, with individuals exhibiting positive responses to musical stimuli. Accordingly, people increasingly turn to music that matches their emotional state and personal preferences. This research presents a system that employs computer vision techniques to recommend songs based on the user's mood. By analyzing facial expressions, the system identifies the user's emotional state and automates song selection, saving time and eliminating both manual browsing and the human intervention traditionally associated with mood-based music selection. Using algorithms such as Haar Cascade and a convolutional neural network (CNN), facial features are extracted to enable real-time emotion identification. The use of a device's internal camera further simplifies the system design and reduces cost compared with alternative methods. This paper presents an in-depth exploration of the implementation and advantages of an automated mood-based music recommendation system, highlighting the role of facial expression analysis in enhancing user experience.
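The pipeline the abstract describes — a facial-expression classifier producing an emotion label, which then drives song selection — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the emotion classes, the `recommend` helper, and the playlist mapping are all assumptions, and in practice the class scores would come from the CNN applied to face regions located by the Haar cascade.

```python
import math

# Hypothetical emotion classes a facial-expression CNN might output.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

# Illustrative mood-to-playlist mapping (assumed, not from the paper).
PLAYLISTS = {
    "happy": ["Upbeat Pop Mix"],
    "sad": ["Comfort Acoustic"],
    "angry": ["Calming Instrumentals"],
    "neutral": ["Daily Discovery"],
}

def softmax(logits):
    """Convert raw classifier scores into probabilities."""
    exps = [math.exp(x - max(logits)) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def recommend(logits):
    """Pick the most likely emotion and return its playlist."""
    probs = softmax(logits)
    emotion = EMOTIONS[probs.index(max(probs))]
    return emotion, PLAYLISTS[emotion]

# Example: scores favoring "happy" (in a real system these would be
# CNN outputs for a detected face).
emotion, songs = recommend([2.1, 0.3, -1.0, 0.5])
```

In a full system, the `logits` fed to `recommend` would be produced per video frame, so the playlist can update as the user's expression changes.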

Keyphrases: Convolutional Neural Network, dataset, deep learning, face detection, feature extraction, Haar cascade.

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
  @booklet{EasyChair:10377,
    author = {Roshani Raut and Dhruv Goel},
    title = {Mood-Based Emotional Analysis for Music Recommendation},
    howpublished = {EasyChair Preprint no. 10377},
    year = {EasyChair, 2023}}