Capturing the acoustic response of historical spaces for interactive music performance and recording

2004 ◽ Vol 116 (4) ◽ pp. 2484-2484
Author(s): Wieslaw Woszczyk ◽ William Martens
2009 ◽ Vol 33 (4) ◽ pp. 69-82
Author(s): Dan Overholt ◽ John Thompson ◽ Lance Putnam ◽ Bo Bell ◽ Jim Kleban ◽ ...

1997 ◽ Vol 102 (5) ◽ pp. 3182-3182
Author(s): Vijay Iyer ◽ Jeff Bilmes ◽ Matt Wright ◽ David Wessel

2015 ◽ Vol 20 (3) ◽ pp. 340-348
Author(s): Felipe Otondo

The role of spatial design in music has become more prominent in recent years, largely because of the affordability of powerful software and hardware tools. Although spatial audio tools are now widely used in studios and concert halls, there are only a few examples of robust, comfortable wearable sound systems with a suitable acoustic response. A wireless body-worn loudspeaker prototype featuring original costume elements, a hybrid full-range loudspeaker array and an improved acoustic response was designed and implemented. The size, shape and acoustic performance of the prototype were optimised using data gathered from anechoic measurements and interviews with performers and audiences. Future developments of this project will consider the implementation of an extended multi-channel performance platform to explore sonic and spatial relationships created by several wearable devices on stage, synchronised with a multi-loudspeaker diffusion system.


2009 ◽ Vol 14 (2) ◽ pp. 197-207
Author(s): Georg Essl ◽ Michael Rohs

Mobile phones offer an attractive platform for interactive music performance. We provide a theoretical analysis of the sensor capabilities via a design space and show concrete examples of how different sensors can facilitate interactive performance on these devices. These sensors include cameras, microphones, accelerometers, magnetometers and multitouch screens. The interactivity through sensors in turn informs aspects of live performance as well as composition through persistence, scoring, and mapping to musical notes or abstract sounds.
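As an illustration of the kind of sensor-to-note mapping the abstract describes (this sketch is not drawn from the paper; the scale choice and axis convention are assumptions), a phone's accelerometer tilt can be quantised onto a pentatonic scale:

```python
import math

# C-major pentatonic scale as MIDI note numbers (illustrative choice).
PENTATONIC = [60, 62, 64, 67, 69]

def tilt_to_note(ax, ay, az):
    """Map accelerometer readings (in g) to a scale degree via device pitch."""
    # Pitch angle of the device, in radians within [-pi/2, pi/2].
    pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
    # Normalise to [0, 1] and quantise onto the scale.
    t = (pitch + math.pi / 2) / math.pi
    index = min(int(t * len(PENTATONIC)), len(PENTATONIC) - 1)
    return PENTATONIC[index]
```

Tilting the device through its range then sweeps through the scale degrees, one simple instance of "mapping to musical notes" from continuous sensor data.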


2020 ◽ pp. 86-88
Author(s): Rafael Ramirez ◽ Sergio Giraldo ◽ Zacharias Vamvakousis

Active music listening is a way of listening to music through active interactions. In this paper we present an expressive brain-computer interactive music system for active music listening, which allows listeners to manipulate expressive parameters in music performances using their emotional state, as detected by a brain-computer interface. The proposed system is divided into two parts: a real-time system able to detect listeners' emotional state from their EEG data, and a real-time expressive music performance system capable of adapting the expressive parameters of music based on the detected listeners' emotion. We comment on an application of our system as a music neurofeedback system to alleviate depression in elderly people.
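A minimal sketch of the second stage, assuming (as EEG-based emotion work commonly does, though the paper does not specify this) that the detected state is expressed as valence and arousal values, which are then mapped to two expressive parameters; the coefficients here are illustrative, not the authors':

```python
def emotion_to_expression(valence, arousal, base_tempo=100.0, base_loudness=0.5):
    """valence and arousal in [-1, 1]; returns (tempo in BPM, loudness in [0, 1])."""
    # Higher arousal speeds the performance up; low arousal slows it down.
    tempo = base_tempo * (1.0 + 0.3 * arousal)
    # Positive valence and high arousal both raise the dynamic level.
    loudness = base_loudness + 0.25 * valence + 0.25 * arousal
    return tempo, max(0.0, min(1.0, loudness))
```

Run in a loop against a real-time EEG classifier, such a mapping closes the neurofeedback loop: the performance changes as the listener's detected state changes.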


Leonardo ◽ 2014 ◽ Vol 47 (3) ◽ pp. 260-261
Author(s): Roger T. Dean

Serial music, which is mainly non-tonal, superimposes compositional freedom onto an unusually rigorous process of pitch-sequence transformations based on 'tone rows': a row is usually a sequence of notes using each of the 12 chromatic pitches once. Compositional freedom comprises forming chords from the sequences and, in multi-strand music, simultaneously presenting different segments of pitch-sequences. The present project coded a real-time serial music composer for automatic or interactive music performance. This Serial Keyboardist Collaborator can perform keyboard music which is impossible for a human to realize. Surprisingly, it was also useful in making more tonal music based on the same rigorous pitch-sequence generation.
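The classical tone-row transformations underlying such a system can be stated compactly. The following sketch (the example row is arbitrary, not one used by the project) implements transposition, inversion and retrograde over the 12 pitch classes:

```python
# Pitch classes are integers 0-11 (C = 0, C# = 1, ..., B = 11).

def transpose(row, n):
    """Shift every pitch class up by n semitones, modulo the octave."""
    return [(p + n) % 12 for p in row]

def invert(row):
    """Mirror each interval of the row around its first pitch class."""
    first = row[0]
    return [(2 * first - p) % 12 for p in row]

def retrograde(row):
    """Play the row backwards."""
    return list(reversed(row))

# An arbitrary example row: each of the 12 pitch classes appears exactly once.
row = [0, 11, 7, 8, 3, 1, 2, 10, 6, 5, 4, 9]
```

Because each transformation is a bijection on the pitch classes, every derived form (prime, inversion, retrograde, retrograde-inversion, at any transposition) remains a valid row containing all 12 pitches once.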

