Much of what we might call ‘high-art music’ sits at the difficult end of listening for contemporary audiences. Concepts such as pitch, meter and even musical instruments often have little to do with such music, in which all sound is typically considered to possess musical potential. As a result, such music can be challenging for educationalists, since students have few familiar pointers for discovering and understanding the gestures, relationships and structures in these works. This paper describes ongoing projects at the University of Hertfordshire that adopt an approach of mapping interactions within visual spaces onto musical sound. These mappings provide a causal explanation for the patterns and sequences heard, whilst their web interoperability opens up the potential for distance-learning applications. Although these projects have so far mainly driven pitch-based events using MIDI or audio files, it is hoped to extend the ideas, using appropriate technology, into fully developed composition tools that aid the teaching of both the appreciation/analysis and the composition of contemporary music.
Publication status: Published, 2003