
initial encoding at peripheral levels to its source-based representation at more central levels. Not only can this improve our basic understanding of auditory processing, but it can also suggest ways in which humans can optimize their performance in detecting and evaluating signals of interest within their acoustic environment.

6.4.4 Absolute Pitch, Absolute Tempo, Absolute Loudness

Daniel Levitin

Broadly speaking, my research is concerned with the psychology of structure and perceptual organization. How does the brain organize the world around us, create categories, and parse a dense perceptual field? To answer these questions, I have been examining principles of visual and auditory perception (how the brain groups basic elements into objects).

More specifically, my current research projects include work on:

• absolute pitch, including issues about learning, etiology, and categorical perception

• circular statistical models for psychological research

• vowel perception

• memory for musical events

• perception of simultaneity of events (intra-modally and cross-modally)

• music perception and Williams syndrome patients

• tone deafness/tune deafness, dysmelodia, and amusia

• the search for visual perceptual primitives

For more information, please see http://www-ccrma.stanford.edu/~levitin/research.html.

6.5 Machine Recognition in Music

6.5.1 Optical Recognition of Printed Music: A New Approach

Walter Hewlett

Recent projects in optical recognition of printed music have tended to give top priority to the extraction of pitch symbols (i.e., noteheads). Noteheads give some information about duration (i.e., they are filled or unfilled), but definitive duration information also requires the accurate reading of stems, flags, and beams. Symbols for articulation (staccato marks, dynamics, slurs, and so forth) are sometimes ignored if the intended use of the scanned material is in sound applications.

In an effort to create a scanning front-end for the CCARH databases of classical music, which are stored in an attribute-rich format (MuseData) to support notation, sound, and analysis, we have taken the following approach: large objects are identified first. This clarifies contextual properties that may bear on pitch (key signatures, clef changes, octave-transposition signs), duration (beams, stems, and flags), and articulation (slurs, ornaments, and the like). The pitch content of the notehead is the last item to be recognized and completes the representation.
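
The ordering can be made concrete with a short sketch. The Python below is purely illustrative and is not the CCARH implementation: the Symbol and Notehead records, the recognize_measure function, and the assumption that symbol detection has already produced such records are all hypothetical simplifications, intended only to show why noteheads are interpreted last, after the context they depend on is known.

from dataclasses import dataclass
from typing import List


@dataclass
class Symbol:
    """A large object found in the first pass (hypothetical record)."""
    kind: str              # "key_signature", "octave_sign", "beam", "slur", ...
    value: object = None   # e.g. number of sharps, or octave shift


@dataclass
class Notehead:
    """A notehead found in the final pass (hypothetical record)."""
    staff_position: int    # diatonic steps above the bottom staff line
    filled: bool           # filled vs. unfilled head


def recognize_measure(large_objects: List[Symbol], noteheads: List[Notehead]):
    """Interpret noteheads only after the contextual pass is complete."""
    # Pass 1: gather context from large objects -- the properties bearing
    # on pitch, duration, and articulation described in the text.
    key_signature, octave_shift = 0, 0
    beamed, articulations = False, []
    for obj in large_objects:
        if obj.kind == "key_signature":
            key_signature = obj.value
        elif obj.kind == "octave_sign":
            octave_shift = obj.value
        elif obj.kind in ("beam", "stem", "flag"):
            beamed = True
        elif obj.kind in ("slur", "ornament", "staccato"):
            articulations.append(obj.kind)

    # Pass 2: noteheads last; their meaning depends on the context above.
    notes = []
    for head in noteheads:
        notes.append({
            "staff_position": head.staff_position + 7 * octave_shift,
            "key_signature": key_signature,
            "filled": head.filled,      # rough duration cue
            "beamed": beamed,
            "articulations": list(articulations),
        })
    return notes


# Example: two beamed, slurred eighth notes under an 8va sign.
context = [Symbol("octave_sign", 1), Symbol("beam"), Symbol("slur")]
heads = [Notehead(4, True), Notehead(6, True)]
print(recognize_measure(context, heads))

In the actual front-end the records would of course come from image analysis of the scanned page rather than being constructed by hand; the sketch only captures the large-objects-first, noteheads-last ordering.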
