
6.4 Controllers and Musical Instruments

6.4.1 The Metasaxophone Project

C. Matthew Burtner

The Metasaxophone Project was formed in 1997 to explore applications of the extended saxophone. The project simultaneously pursues research in computer music, composition, and performance practice. This demonstration will focus on the metasaxophone's use of sensor technology and embedded systems to enable real-time expressive control of virtual strings. Musical examples will be drawn from my recent composition, S-Trance-S (2001).

For more information please visit: http://www.metasax.com/.
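The abstract does not specify how the sensors drive the virtual strings, so the following is only a minimal Python sketch of one common approach: a continuous sensor value (here a hypothetical key-pressure reading scaled to 0..1) shaping the excitation and sustain of a Karplus-Strong plucked-string model. All names and parameter mappings are assumptions, not the Metasaxophone's actual design.

    import numpy as np

    def pluck_string(freq_hz, duration_s, pressure, sr=44100):
        """Karplus-Strong pluck whose loudness and sustain are driven by a
        sensor value `pressure` in 0..1 (hypothetical key-pressure reading)."""
        n = max(2, int(sr / freq_hz))                        # delay-line length sets pitch
        delay = (2.0 * np.random.rand(n) - 1.0) * pressure   # noise-burst excitation
        loss = 0.990 + 0.009 * pressure                      # harder press -> longer sustain
        out = np.empty(int(duration_s * sr))
        for i in range(out.size):
            out[i] = delay[i % n]
            # averaging low-pass plus loss factor, written back into the delay line
            delay[i % n] = loss * 0.5 * (delay[i % n] + delay[(i + 1) % n])
        return out

    # e.g. a firm press plucks a 220 Hz string with a longer decay:
    samples = pluck_string(220.0, 2.0, pressure=0.8)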

6.4.2 TouchSound: Haptics in Sound Editing

Lonny Chu

Recent studies in haptics have shown that force-feedback interfaces can improve user efficiency and accuracy while decreasing the cognitive load required to accomplish computer tasks. These results can be greatly beneficial to music as we strive to create interfaces that allow the user to become immersed in the musical experience without being overly conscious of specific physical gestures. Whereas current sound editing systems require the musician to use devices such as keyboards and mice, passive scroll wheels, or passive joysticks, TouchSound uses a force-feedback mouse, a vibrotactile mouse, and a force-feedback knob to investigate how programmable forces can improve the sound editing experience.

The two primary goals of TouchSound are:

1. Show that force-feedback interfaces improve user performance in editing sound.
2. Explore the design processes necessary for creating pertinent haptic effects, or haptic icons, that will assist the musician in using the environment.

For the first goal, experiments will be performed to measure performance in basic sound editing tasks such as locating the onset and offset of a sound sample. Various haptic effects such as detents, pops, textures, walls, and damping will be used to construct the haptic environment while the user is tracked in accomplishing tasks such as locating defined points in the sound. Additionally, subjective data will be collected to assess whether haptics also increases user satisfaction and decreases stress levels. Future work will then investigate issues involving the design of haptic icons for artistic purposes.
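As an illustration of one of the haptic effects named above, the sketch below computes a simple detent force: a spring-like pull toward the nearest marked point (for example, a detected onset) whenever the editing cursor is within a capture radius. The gain, radius, and units are placeholders of my own; the abstract does not describe TouchSound's actual force models or device interface.

    def detent_force(cursor_s, detents_s, k=2.0, radius_s=0.05):
        """Signed force pulling the cursor toward the nearest detent position.

        cursor_s  -- current cursor position in seconds
        detents_s -- detent positions in seconds (e.g. onsets/offsets)
        k         -- spring stiffness (device-specific units, placeholder)
        radius_s  -- capture radius; outside it, no force is applied
        """
        if not detents_s:
            return 0.0
        nearest = min(detents_s, key=lambda d: abs(d - cursor_s))
        offset = nearest - cursor_s
        if abs(offset) > radius_s:
            return 0.0
        return k * offset   # Hooke's-law pull, sent to the force-feedback device

    # e.g. cursor at 1.23 s near an onset at 1.20 s -> small pull back toward it:
    force = detent_force(1.23, [1.20, 2.50])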

6.4.3 The Accordiatron: A New Gestural MIDI Controller

Michael Gurevich

The Accordiatron is a MIDI controller for interactive performance based on the paradigm of a conventional squeeze box or concertina. It senses and encodes the gestures of a performer using the standard communication protocol of MIDI, allowing for flexible mappings of performance data to sonic parameters. When used in conjunction with a real-time signal processing environment, the Accordiatron can become an expressive, versatile musical instrument. It features a combination of discrete and continuous sensor data, providing the subtle expressiveness and control necessary for interactive music.
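Since the abstract emphasizes encoding both continuous and discrete gestures as standard MIDI, here is a minimal sketch of that kind of mapping: a continuous sensor reading (a bellows-separation sensor is my assumption, not a documented detail) is scaled into a 7-bit Control Change message, and a discrete button event becomes a Note On/Off. Only the raw MIDI bytes are built; transmitting them is left to whatever MIDI library the host environment provides.

    def cc_message(channel, controller, sensor_value, lo, hi):
        """Scale a continuous reading from [lo, hi] into a MIDI Control Change."""
        span = hi - lo
        value = int(round(127 * (sensor_value - lo) / span)) if span else 0
        value = max(0, min(127, value))
        return [0xB0 | (channel & 0x0F), controller & 0x7F, value]

    def note_message(channel, note, pressed):
        """Encode a discrete button press/release as Note On / Note Off."""
        status = (0x90 if pressed else 0x80) | (channel & 0x0F)
        return [status, note & 0x7F, 100 if pressed else 0]

    # e.g. map a hypothetical 10-bit bellows sensor to CC 11 (expression):
    msg = cc_message(channel=0, controller=11, sensor_value=512, lo=0, hi=1023)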

