Bill Manaris : Fall 2005 / CSCI 221 Homework 1

Assigned Date: Monday, Sep. 5, 2005 (sec 2 +1 day)
Due Date: Wednesday, Sep. 21, 2005 (sec 2 +1 day)
Due Time: Noon

Last modified on August 24, 2006, at 09:45 AM


This assignment focuses on data abstractions, object-oriented design, biosignals, biofeedback, data sonification ... and having fun!


Music and Nature/Nature and Music

According to Scaletti [1],

[t]he idea of representing data in sound is an ancient one. For the ancient Greeks music was not an art-for-art's sake, practiced in a vacuum, but a manifestation of the same ratios and relationships as those found in geometry or in the positions and behaviors of the planets.

Pythagoras, Plato, and Aristotle worked on quantitative expressions of proportion and beauty, such as the golden ratio. Pythagoreans, for instance, quantified harmonious musical intervals in terms of proportions (ratios) of the numbers 1, 2, 3, 4 and 5. This scale became the basis for the well-tempered scales refined by J.S. Bach and others - the scales used in Western music.

Data Sonification

According to Wikipedia [2],

Sonification is the use of non-speech audio to convey information or perceptualize data. Due to the specifics of auditory perception, such as temporal and pressure resolution, it forms an interesting alternative to visualization techniques, gaining importance in various disciplines. It has been well established for a long time already as Auditory Display in situations that require a constant awareness of some information (e.g. vital body functions during an operation).
One of the first successful applications of sonification is the well-known Geiger counter, a device that measures ionizing radiation. The rate of audible clicks depends directly on the radiation level in the immediate vicinity of the device.


You will write a set of programs that convert biosignals to "music".

To do so, you will employ a set of classes from jMusic, a programming library for sound and music applications written in Java.

First you need to set up jMusic on your computer.

Sonification of Data

The data files you will sonify contain:

  1. skin conductance level (SCL) data,
  2. raw heart (blood pressure) data,
  3. peak values of raw heart data, and
  4. time intervals between peaks.

The raw data is captured many times per second. For example, see sample data.

Figures 1 and 2 visualize these sample data.

Fig. 1 Sample raw heart data.

Fig. 2 Sample SCL data.

Your job is to convert such data to "music".

There is no single "correct" way to map data to sound. The trick is to decide which aspects of the data you would like to make easily perceivable, and map those to sound. Implement each of the following:

  1. Map raw heart data to note pitch (remember to scale to integer range 0-127). Keep note duration constant (e.g., EN). For example, here is a possible sonification of Fig. 1. Also, here is a possible sonification of the B1E sample.
  2. Map difference between subsequent intervals to note duration (remember to map to a real range, such as 0.1-1.5). Map difference between subsequent peak values to pitch. For example, here is a possible sonification of the B1E sample.
  3. Map local variability of each interval to note pitch and/or duration. Local variability for each (i-th) interbeat interval is the deviation of the heart rate from the local average,
    d[i] = abs(tNN[i] - average(tNN, i)) / average(tNN, i)
where tNN is the sequence (array) of intervals between peaks, abs is the absolute value, and average(tNN, i) returns the average of intervals within a narrow, 5-beat wide window (i.e., average of tNN values from indices i-4 to i) [3]. For example, here is a possible sonification of the B1E sample.
Bonus: SCL may be sonified similarly (Hint: use a different MIDI instrument). Play both "melodies" in parallel. Explore combinations (e.g., use (2) for heart data and (1) for SCL).
Bonus: Map intervals between peaks of raw heart data to note duration. Map numeric difference between high and low peak ("strength" of heart beat) to note pitch. (Hint: look at three values in-a-row to filter out "false" peaks.)
Bonus: Research the Internet for ideas on how to map the numbers to music. Implement those as separate methods. Reference your source(s) in your method documentation and project README file.
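As a starting point for tasks (1) and (2), the scaling can be done with simple linear (min-max) mappings. The sketch below is my own illustration, not part of the assignment; the method and class names (SonifyMath, toPitch, toDuration) are hypothetical, and the resulting integers/reals would then be fed to jMusic notes.

```java
// Hypothetical helper class: linear min-max scaling of raw signal values
// into the integer MIDI pitch range 0-127, and into a real-valued
// duration range such as 0.1-1.5.
public class SonifyMath {

    /** Scales value from [min, max] into the integer MIDI pitch range [0, 127]. */
    public static int toPitch(double value, double min, double max) {
        if (max == min) return 64;                // flat signal: pick a middle pitch
        double t = (value - min) / (max - min);   // normalize to [0, 1]
        return (int) Math.round(t * 127);         // stretch to [0, 127]
    }

    /** Scales value from [min, max] into a real duration range [lo, hi]. */
    public static double toDuration(double value, double min, double max,
                                    double lo, double hi) {
        if (max == min) return (lo + hi) / 2;
        double t = (value - min) / (max - min);
        return lo + t * (hi - lo);
    }

    public static void main(String[] args) {
        double[] heart = {1.2, 3.8, 2.5};         // made-up sample values
        for (double v : heart) {
            System.out.println(toPitch(v, 1.2, 3.8));
        }
    }
}
```

Each raw heart value would become the pitch of one note of constant duration (e.g., EN), per task (1); the same normalization, targeted at a real range, gives the durations needed for task (2).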
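For task (3), the local-variability formula translates directly into code. The following sketch (class and method names are my own, not prescribed by the assignment) computes d[i] over the 5-beat window tNN[i-4..i] described above:

```java
// Hypothetical helper class computing the local variability
// d[i] = |tNN[i] - average(tNN, i)| / average(tNN, i),
// where average(tNN, i) is the mean of tNN[i-4] .. tNN[i].
public class Variability {

    /** Average of the 5-beat window ending at index i (precondition: i >= 4). */
    public static double average(double[] tNN, int i) {
        assert i >= 4 : "need at least 5 beats before computing the window average";
        double sum = 0;
        for (int k = i - 4; k <= i; k++) {
            sum += tNN[k];
        }
        return sum / 5;
    }

    /** Deviation of the i-th interbeat interval from the local average. */
    public static double localVariability(double[] tNN, int i) {
        double avg = average(tNN, i);
        return Math.abs(tNN[i] - avg) / avg;
    }
}
```

The resulting d[i] values, once scaled with the same kind of min-max mapping used for the other tasks, can drive pitch and/or duration.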


Classes to submit: a class containing the methods that convert data to pitch or duration (see above), plus a system driver class.

Your code should be fully documented via javadoc. Use assert to test method preconditions.
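As a minimal sketch of precondition checking with assert (the class and method names here are hypothetical, chosen only for illustration):

```java
// Illustrative precondition check: the method documents and enforces
// that its input is already normalized to [0.0, 1.0].
public class MapData {

    /** Maps a normalized value in [0.0, 1.0] to a MIDI pitch in [0, 127]. */
    public static int normalizedToPitch(double t) {
        assert t >= 0.0 && t <= 1.0 : "precondition violated: t must be in [0,1], got " + t;
        return (int) Math.round(t * 127);
    }
}
```

Remember that assertions are only checked when the JVM is started with the -ea flag.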

Include the following in each class:

       Certification of Authenticity:

       I certify that this submission is entirely my own work, 
       as per course collaboration policy.

       Name: ________________________ Date: ___________


  1. Open your BlueJ project.
  2. Open (edit) each source file and generate the class interface (javadoc). This can be done within the editor window either by pressing CTRL/J , or selecting the Interface drop-down menu item (on the right). (Note: This is necessary to generate your documentation for grading.)
  3. Under the Project menu, click Create Jar File... . In the dialog box that opens, select Include Source, and press Continue.
  4. Email the generated .jar file to the instructor by the due date and time.


Here are some biosignals to play with. Note that these signals contain both SCL and raw heart data, as well as intervals and peak values.


1. Quote from Carla Scaletti, "Sonification - An Ancient Idea Made Feasible by New Technology", ACM SIGGRAPH '93 - Course Notes 81, Aug. 1993, p. 4.2.

2. Sonification, Wikipedia entry, accessed Sep. 2, 2005.

3. J. Kalda, M. Sakki, M. Vainu, M. Laan, "Zipf's Law in Human Heartbeat Dynamics", arXiv:physics/0110075 v1 26 Oct 2001.
