
International Conference on Computer Science, Electronics & Electrical Engineering-2015
IDENTIFICATION OF RAGAS AND THEIR
SWARAS
Prithvi Upadhyaya
Electronics and Communications
Srinivas School of Engineering
Mukka, India
[email protected]

Shreeganesh Kedilaya B
Electronics and Communications
Srinivas School of Engineering
Mukka, India
[email protected]
ABSTRACT
In this work, an attempt is made to identify and differentiate the Ragas of Carnatic classical music. The audio input is taken from a Carnatic music piece, such as an alapana, krithi, or arohana-avarohana pattern, and a system is built to identify the Raga associated with it. The objective is to analyze the Raga without prior knowledge of it, to find the swaras present in it using a spatial-domain approach, and to identify the associated Raga at any pitch comfortable for the singer, given the pitch value.
Keywords—Clustering method, Gamaka, Praat software, Raga, Tala.
I. INTRODUCTION
Music is something that enchants everyone, and classical music has its own format of rendering. Indian classical music is broadly divided into Hindustani classical music and Carnatic classical music. Though both share the same framework, they differ in many respects. Indian classical music has a specific technique of rendering. Western music is widely studied, but Indian classical music is not explored as much; this could be because of its difficult rendering technique, and because acquiring the skill to analyze it takes years. It is founded on two elements, Raga and Tala. Raga refers to the tune upon which the melody is built, and Tala refers to the rhythmic pattern that repeats in a cyclic fashion [5]. A Raga is a series of five or more musical notes upon which the melody is made; in the classical music world, the musical notes are called swaras. Identifying the Raga associated with a piece of Indian classical music otherwise requires deep knowledge of the different types of Ragas. A system of this kind will therefore be helpful even for beginners who want to learn about this beautiful art form, Carnatic classical music. Hence, an attempt is made here to identify the Ragas of Carnatic classical music electronically. This work is mainly for art-lovers and learners, to help them identify the Raga associated with any music.
The input can be a vocal or instrumental audio clip. The system is designed to identify the Raga at any pitch of the audio file, given the pitch in which the music is rendered. The input in wave format undergoes filtering, and the audio samples are then segmented. For each segment, a pitch detection algorithm is used to identify the pitch frequency. Knowing the base frequency of the input, the system calculates the relative frequencies to find the notes present in the audio sample. From these notes, the system then displays the name of the Raga.
II. LITERATURE REVIEW
Indian classical music is not extensively studied, whereas western music is explored to a greater extent. In the work proposed by A. S. Krishna [3], some Melakartha Ragas are identified using a Hidden Markov Model as the classifier, with a vocal input database. The database includes only the swara sequence as it appears in the arohana-avarohana of the Ragas. Gaurav Pandey, Chaithanya Mishra and Paul Ipe [2] describe "Tansen", a system developed for identifying any Raga in Hindustani classical music. The main limitation of that work is that only a single pitch, the G pitch, is used for vocal audio.
©2015 ISRASE
ISRASE eXplore Digital Library
In "Swara identification for South Indian classical music" by Rajeshwari Sridhar and T. V. Geetha [4], the Raga is identified by segmenting the audio into integral multiples of the Tala used. The work presented in this paper is similar, but segmentation is not based on the Tala; instead, the segmentation frame length depends on the maximum speed at which the gamakas in the musical notes are rendered.
III. PROPOSED SYSTEM
Fig. 1. Flow diagram of the proposed system.
The flow diagram of the proposed system is shown in Fig. 1. The input can be a vocal or instrumental audio clip. The system is designed to identify the Raga at any pitch of the audio file, given the pitch frequency in which the music is rendered. The input in wave format is sampled first to reduce the data size; the sampling is done at a quality acceptable for a music signal. It is then filtered with a low-pass filter to remove high-frequency noise, and the audio samples are segmented. For each segment, a pitch detection algorithm is used to identify the pitch frequency. Knowing the base frequency of the input, the system calculates the relative frequencies to find the notes present in the audio sample. Knowing the notes, the system outputs the name of the Raga from its database.
A. Filtering
Before any other processing, the audio is sampled at 44.1 kHz, a good sampling rate for a music signal. The input is then low-pass filtered to remove high-frequency noise. Filtering is useful only if the audio is recorded in an open environment prone to noise.
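As a rough illustration of this step, a single-pole IIR low-pass filter can attenuate high-frequency noise. The paper does not specify its filter design, so the structure and the cutoff below are illustrative assumptions, not the authors' implementation.

```python
import math

def lowpass(samples, cutoff_hz, fs=44100):
    """Single-pole IIR low-pass filter (an illustrative stand-in,
    not the paper's filter). alpha follows from the RC constant of
    an analog one-pole filter sampled at fs."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / fs
    alpha = dt / (rc + dt)
    out = []
    prev = 0.0
    for x in samples:
        # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out
```

A DC signal passes through almost unchanged, while a signal alternating at the Nyquist rate is strongly attenuated, which is the behavior wanted for removing high-frequency noise.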
B. Feature Extraction
Feature extraction is done by segmenting the audio into frames of equal, convenient length and identifying the pitch frequency in each frame. The segmentation frame length depends on the speed of the gamakas present in the audio file. The human ear can interpret a sound only if it lasts at least 25 ms, and the frame length is chosen with this in mind. The feature extracted is the pitch histogram, which gives the number of notes present in the audio piece.
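The segmentation described above can be sketched as follows. The 25 ms frame length comes from the paper; the use of non-overlapping frames without windowing is a simplification assumed here for brevity.

```python
def segment(samples, fs=44100, frame_ms=25):
    """Split audio into non-overlapping frames of frame_ms milliseconds.

    25 ms reflects the minimum duration the ear needs to register a
    tone, as noted in the paper. At 44.1 kHz this gives 1102 samples
    per frame; any trailing partial frame is discarded.
    """
    n = int(fs * frame_ms / 1000)
    return [samples[i:i + n] for i in range(0, len(samples) - n + 1, n)]
```

A pitch detection algorithm would then be run on each frame independently.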
C. Note Identification
Once the pitch frequency value in each segment of the audio input is obtained, the system matches it against the range of frequencies of the notes appearing in Carnatic music. That is, the pitch contour is mapped to the possible swaras that can appear in an octave.
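This mapping can be sketched using the frequency ratios of Table 1. The ratios are taken from the paper; nearest-ratio matching is our simplification of the range matching the authors describe.

```python
# Frequency ratio of each swara with Sa, as tabulated in Table 1.
RATIOS = {
    "Sa": 1.0, "Ri1": 1.025, "Ri2": 1.1, "Ri3": 1.19,
    "Ga1": 1.1, "Ga2": 1.19, "Ga3": 1.25, "Ma1": 1.31,
    "Ma2": 1.41, "Pa": 1.485, "Da1": 1.56, "Da2": 1.68,
    "Da3": 1.775, "Ni1": 1.68, "Ni2": 1.775, "Ni3": 1.885,
}

def nearest_swara(pitch_hz, sa_hz):
    """Return the swara whose tabulated ratio best matches pitch/Sa.

    When two swaras share a ratio (e.g. Ri2 and Ga1), the first in
    the table is returned; the classifier stage resolves such ties.
    """
    ratio = pitch_hz / sa_hz
    return min(RATIOS, key=lambda s: abs(RATIOS[s] - ratio))
```

Because every pitch is divided by the singer's Sa before matching, the same table works at any base pitch, which is how the system stays pitch-independent.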
D. Classifier
A clustering algorithm is used for classification, based on the pitch histogram. In the case of vikrithi swaras, the algorithm chooses the most dominant note among the swara's components. Vikrithi swaras are all swaras other than Sa and Pa; Sa and Pa are called prakrithi swaras because they are constant, unlike the vikrithi swaras, which have different components, one of which appears in each Janaka Raga [5].
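The "most dominant component" choice can be sketched as a count over the pitch histogram: among the candidate components of a vikrithi swara, keep the one occurring most often across the frames. This is a minimal reading of the clustering step, not the authors' exact algorithm.

```python
from collections import Counter

def dominant_note(frame_swaras, candidates):
    """Pick the most frequent swara among the candidate components.

    frame_swaras: per-frame swara labels (one entry per audio frame).
    candidates: the components competing for one scale position,
    e.g. {"Ri2", "Ga1"}, which share a frequency band in Table 1.
    """
    counts = Counter(s for s in frame_swaras if s in candidates)
    return counts.most_common(1)[0][0]
```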
E. Database Comparison
Once the notes present are obtained, they are compared with the database. The Raga they best suit is the output Raga.
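A minimal sketch of this comparison follows. The two swara sets are the standard scales of these Melakartha ragas; the paper does not state its matching criterion, so set overlap is our illustrative scoring choice.

```python
# Swara sets of two Melakartha ragas (their standard scales).
RAGA_DB = {
    "Shankarabharanam": {"Sa", "Ri2", "Ga3", "Ma1", "Pa", "Da2", "Ni3"},
    "Kalyani":          {"Sa", "Ri2", "Ga3", "Ma2", "Pa", "Da2", "Ni3"},
}

def best_raga(detected_notes):
    """Return the raga whose swara set overlaps the detected notes most."""
    return max(RAGA_DB, key=lambda r: len(RAGA_DB[r] & set(detected_notes)))
```

With only two entries, ragas differing in a single swara (here Ma1 vs Ma2) are still separable, but this simple overlap score is also why nearby ragas sharing most swaras are hard to distinguish, as noted in the conclusion.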
F. Raga Identification
The name of the computed output Raga, i.e. the matching Raga found in the database, is displayed by the system. The algorithm for the flow of the code is shown in Fig. 2; the code is written in MATLAB.
Fig. 2. Algorithm for the code in MATLAB.
IV. RESULTS AND ANALYSIS
The database chosen has vocal audio clips sung by various singers, both male and female. It also includes instrumental audio clips of mandolin and veena. The audio clips are 2-3 minutes in duration. The code also gives accurate results for real-time audio inputs. Table 1 gives the details of the observations made to train the system for feature extraction.
Table 1. Frequency ratios of each note with the frequency of Sa, as obtained from the experiment.

Position  Swara/Note              Notation  Frequency ratio with Sa
1         Shadja                  Sa        1
2         Shuddha Rishabha        Ri1       1.025
3         Chathushruthi Rishabha  Ri2       1.1
4         Shatshruthi Rishabha    Ri3       1.19
5         Shuddha Gandhara        Ga1       1.1
6         Sadharana Gandhara      Ga2       1.19
7         Anthara Gandhara        Ga3       1.25
8         Shuddha Madhyama        Ma1       1.31
9         Prathi Madhyama         Ma2       1.41
10        Panchama                Pa        1.485
11        Shuddha Daivatha        Da1       1.56
12        Chathushruthi Daivatha  Da2       1.68
13        Shatshruthi Daivatha    Da3       1.775
14        Shuddha Nishada         Ni1       1.68
15        Kaishiki Nishada        Ni2       1.775
16        Kakali Nishada          Ni3       1.885
In Table 1, the values in the last column, i.e. the frequency ratio with respect to Sa, are either the average of the frequency range obtained for each swara or the most prominent value. The most prominent value is obtained from the pitch histogram, as the pitch frequency repeated most often within a small interval of time.
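Given a singer's Sa, the tabulated ratios yield the expected absolute frequency of every swara. The Sa value used below (146.8 Hz, roughly D3) is an arbitrary illustration, not a figure from the paper.

```python
# Frequency ratios with Sa for a few swaras, taken from Table 1.
RATIO_SAMPLE = {"Sa": 1.0, "Ga3": 1.25, "Pa": 1.485, "Ni3": 1.885}

def swara_frequencies(sa_hz):
    """Scale each tabulated ratio by the singer's Sa frequency (Hz)."""
    return {name: round(sa_hz * r, 1) for name, r in RATIO_SAMPLE.items()}
```

For example, with Sa at 146.8 Hz, Pa falls near 218 Hz; changing the singer's Sa rescales all swara frequencies proportionally, which is why only the base pitch needs to be known.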
Fig. 3. Plot of pitch frequencies obtained in MATLAB for the Kalyani Raga arohana-avarohana pattern played on mandolin.
V. CONCLUSION
The system was trained for two parent (Janaka) or Melakartha Ragas and tested using MATLAB, giving an accuracy of 80% for vocal and 85% for instrumental audio. The drawback of this work is that nearby Ragas cannot be detected. As future work, the system can be enhanced with other, more robust classifiers and with a higher-accuracy algorithm to distinguish the nearby Ragas.
REFERENCES
[1] Ranjani H. G., Arthi S. and T. V. Sreenivas, "Carnatic music analysis: shadja, swara identification and raga verification in alapana using stochastic models", IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, October 16-19, 2011, New Paltz, NY.
[2] Gaurav Pandey, Chaithanya Mishra and Paul Ipe, "Tansen: A system for automatic Raga identification", IICAI, 2003.
[3] A. Srinath Krishna, P. V. Rajkumar, K. P. Saishankar and Mala John, "Identification of Carnatic Raagas using Hidden Markov Models", 9th IEEE International Symposium on Applied Machine Intelligence and Informatics, January 27-29, 2011, Smolenice, Slovakia.
[4] Rajeshwari Sridhar and T. V. Geetha, "Swara Identification for South Indian Classical Music", 9th International IEEE Conference on Information Technology.
[5] P. Sambamoorthy, "South Indian Music", The Indian Music Publishing House, 1998.