
ABSTRACT

 Music emotion recognition (MER) detects the emotional expression that people naturally associate with a music clip.
 It is interdisciplinary research involving not only signal processing and machine
learning, but also auditory perception, psychology, and musicology.
 Experiments will be conducted by measuring the correlation between
diverse emotional expressions and various musical cues.
CONTENTS

1) Introduction
2) Literature Survey
3) Problem Statement
4) Objectives
5) Scope of the Project
6) Methodology
7) Requirements
8) References
INTRODUCTION
 A Music Emotion Recognition (MER) system has been developed because
emotions vary from person to person while listening to the same music.
 Characteristics such as timbre, intensity, and rhythm are used.
 The ambiguity of emotional description is thereby reduced.
 The valence-arousal plane is used, in which emotion states may vary with the
ever-changing melody.
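The valence-arousal plane mentioned above can be sketched in a few lines. This is a minimal illustration, assuming valence and arousal are each scaled to [-1, 1]; the function name, thresholds, and quadrant labels are assumptions chosen to match the common reading of Russell's and Thayer's models, not part of the original system.

```python
# Illustrative sketch: map a (valence, arousal) point to a mood quadrant.
# Quadrant labels follow the usual reading of the valence-arousal plane;
# the zero thresholds are an assumption for this example.

def mood_quadrant(valence, arousal):
    """Map a point in [-1, 1] x [-1, 1] to one of four mood labels."""
    if valence >= 0 and arousal >= 0:
        return "happy"   # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "angry"   # low valence, high arousal
    if valence < 0:
        return "sad"     # low valence, low arousal
    return "calm"        # high valence, low arousal

print(mood_quadrant(0.7, 0.6))    # → happy
print(mood_quadrant(-0.4, -0.5))  # → sad
```

Because the plane is continuous, a song's trajectory through it can track an ever-changing melody rather than pinning the whole piece to one fixed label.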
LITERATURE SURVEY
 Hevner categorized various adjectives into 8 groups, each
representing a class of mood.
 Russell later proposed the circumplex model.
 Thayer also proposed a dimensional model plotted along two axes, with
mood represented by a two-dimensional coordinate system.
 Yi Liu and Yue Gao extracted audio features such as rhythm, timbre, and
intensity, and studied the Gaussian Mixture Model and Support
Vector Machine as classifiers.
 Chia-Chu Liu and team presented an emotion detection and classification
system for pop music.
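The surveyed systems train a classifier (GMM or SVM) on feature vectors such as (rhythm, timbre, intensity). As a minimal stand-in for those models, the sketch below uses a nearest-centroid rule over two mood classes; the feature values, class names, and the centroid rule itself are assumptions for illustration only, not the classifiers from the cited papers.

```python
import math

# Toy mood classifier over hand-made (rhythm, timbre, intensity) vectors.
# Nearest-centroid is used here only as a simple stand-in for the GMM/SVM
# classifiers studied in the literature; all numbers below are invented.

TRAINING = {
    "happy": [(0.9, 0.8, 0.7), (0.8, 0.7, 0.8)],
    "sad":   [(0.2, 0.3, 0.2), (0.1, 0.2, 0.3)],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {mood: centroid(vecs) for mood, vecs in TRAINING.items()}

def classify(features):
    """Return the mood whose centroid is closest in Euclidean distance."""
    return min(CENTROIDS, key=lambda m: math.dist(features, CENTROIDS[m]))

print(classify((0.85, 0.75, 0.7)))  # → happy
```

Swapping the centroid rule for scikit-learn's `GaussianMixture` or `SVC` would bring the sketch in line with the actual models named in the survey.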
PROBLEM STATEMENT
 Music listeners have a tough time creating and trimming playlists manually when
they have hundreds of songs.
 It is also difficult to keep track of all the songs.
 Sometimes songs that are added are never played.
 Users have to manually select songs every time based on interest and mood.
 In existing applications, music is organized into playlists, and a playlist's
songs cannot be modified or altered in one click.
 Users have to manually change or update each song in their playlist every
time.
OBJECTIVES
 Classify songs by mood rather than just by traditional metadata.
 Artificial intelligence acquires the capacity to detect a person's mood from
their emotions.
 Machine learning is a part of artificial intelligence.
 There is a slight difference between mood and emotion.
 Detect the user's mood and match it with suitable music.
SCOPE OF THE PROJECT

 Human mood, cultural background, individual personality, choice of song,
etc. affect the categorization of music.
 Adjectives describing emotions can be ambiguous.
 Choosing a song suiting our mood.
METHODOLOGY

The mood of a piece of music is a basic aspect of it and is useful in music retrieval
systems and mood taxonomy applications.
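A retrieval or taxonomy pipeline starts from low-level cues like the intensity feature mentioned earlier. The sketch below computes one such cue, the root-mean-square (RMS) energy of an audio frame; the synthetic sample values are assumptions, and a real pipeline would read frames from a decoded audio file.

```python
import math

# Minimal sketch of one low-level cue used for mood classification:
# the root-mean-square (RMS) intensity of an audio frame.
# The sample values below are synthetic, for illustration only.

def rms(frame):
    """Root-mean-square energy of a list of audio samples."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

quiet = [0.01, -0.02, 0.015, -0.01]
loud = [0.8, -0.7, 0.9, -0.85]
print(rms(loud) > rms(quiet))  # → True
```

Loud, energetic frames (high RMS) tend toward the high-arousal half of the valence-arousal plane, which is why intensity is a useful cue for mood.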
REQUIREMENTS
HARDWARE REQUIREMENTS
 System : Intel Core i5
 Hard disk : 1 TB
 Monitor : 15" VGA colour
 Mouse : Logitech
 RAM : 4 GB

SYSTEM SOFTWARE
 Operating System : Windows XP / Windows 7
 Software tool : Anaconda
 Coding language : Python
 Database : MySQL
 Hosting : AWS (Amazon Web Services)
EXPECTED OUTCOME
 We can detect the mood of the person.
 Based on the person's mood, we can serve advertisements.
 This helps in promoting and marketing products.
 Example: Amazon ads, YouTube ads, Facebook ads, etc.
REFERENCES
 [1] Vahida Z. Attar, Aniruddha Ujlambkar, "Mood based classification of
Indian Popular Music," Modelling Symposium (AMS), 2012 Sixth Asia,
ISBN: 978-0-7695-4730-5/12, 29-31 May 2012.
 [2] Yi Liu, Yue Gao, "Acquiring mood information from songs in large
music databases," Fifth International Joint Conference on INC, IMS,
and IDC, ISBN: 978-0-7695-3769-6/09, 25-27 Aug. 2009.
 [3] Hanaa M. Hussain, Khaled Benkrid, Huseyin Seker, Ahmet T.
Erdogan, "FPGA Implementation of K-means Algorithm for
Bioinformatics Application: An Accelerated Approach to Clustering
Microarray Data," NASA/ESA Conference on Adaptive Hardware and
Systems, ISBN: 978-1-4577-0599-1/11, 6-9 June 2011.
 [4] JungHyun Kim, SungMin Kim, Won Young Yoo, "Music mood
classification model based on arousal-valence values," ICACT, ISBN:
978-89-5519-155-4, 2011.