
PATRON-GENERATED MUSIC: STIMULATING COMPUTER MUSIC EXPRESSION IN NIGHTCLUBS

Yago de Quay
Faculdade de Engenharia da Universidade do Porto
R. de Mouzinho da Silveira 317 2, 4050-421 Porto, Portugal
yagodequay@gmail.com
Abstract. Recent studies suggest that nightclubs are ideal places for multimedia user participation; however, little work has been done to adequately implement interactive music systems. The objective of this study is to test the use of Motion Capture, Music Information Retrieval, and Body-music Mapping in fostering musical expression by patrons in nightclubs. The system was implemented at three nightclub events in Portugal and Norway, and feedback was collected from patrons, DJs, and nightclub staff. Results suggest that the use of different Motion Capture systems and simple Body-music Mappings encourages higher participation, and that Music Information Retrieval can correctly extract the harmonic content of Western club music. The study concludes that this system can provide a real-world alternative for musical interaction in mainstream nightclubs.

Keywords: music information retrieval, motion capture, body-music mapping, nightclubs, interaction

Introduction
Nightclubs are popular and excellent multimedia venues for user participation, as pointed out by Bayliss, Lock & Sheridan [1]. Nonetheless, Gates, Subramanian & Gutwin [2] state that little is known about how interaction occurs inside these venues, and current implementations of interactive systems in this context are limited to non-musical audience-to-audience interaction. Furthermore, Bertini, Magrini & Tarabella [3], Lee, Nakra & Borchers [4], and Futrelle & Downie [5] observed that a gap exists between interactive music research results and general real-life interactive music practice. The limited literature on musical interaction in dance clubs, such as the work by Ulyate & Bianciardi [6], deliberately sidesteps the current role of the DJ. This paper combines the domains of Music Information Retrieval (MIR), Motion Capture (MoCap), and Body-music Mapping to suggest a practical systemic approach for stimulating patrons' musical expression in nightclubs. The system was presented at three events in Norway and Portugal between April and October 2010.

Nightclubs
The first event debuted in a small suburban Portuguese club frequented mostly by teenagers with a preference for trance music. The second and third events took place in a renowned nightclub in the center of Norway's capital, also heavily frequented by teenagers, with a preference for top 40 dance hits. Feedback was collected from patrons, DJs, and nightclub staff, as well as from field tests. Based on information provided by the staff, over 400 people attended the three events.

Motion Capture
Motion Capture (MoCap) refers to the process of storing human movement in a digital format. MoCap technologies are either optical, relying on computer vision techniques, or non-optical, based on sensors. Applications are mostly limited to the film industry, the military, and medicine [7-9]. In their study of controllers used for computer music interfaces, Marshall & Wanderley [10] concluded that, depending on the musical task, users express a preference for certain sensors. Both optical and non-optical MoCap technologies were implemented at the three events and mapped to sound effects and musical events. The optical system featured an Optitrack infrared camera mounted on the ceiling, pointed at a platform against a wall (Fig. 1). Motion was analyzed by calculating the quantity of motion, that is, the sum of all active pixels in the video feed. The non-optical system consisted of two Wii remotes (Fig. 2). Accelerometers inside these controllers provided information on how they were being swung. To stimulate interaction, lights were positioned on the raised platform where the Optitrack camera was pointing, and colorful ribbons were attached to the Wii remotes.
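For illustration, a minimal sketch of the quantity-of-motion computation follows, assuming a frame-differencing pipeline in Python with OpenCV; it is not the installation's actual implementation, and the pixel threshold and camera index are illustrative values.

import cv2

MOTION_THRESHOLD = 25   # assumed per-pixel difference (0-255) counted as "active"

def quantity_of_motion(prev_gray, gray):
    # Absolute difference between consecutive frames, thresholded
    # into a binary mask of "active" (moving) pixels.
    diff = cv2.absdiff(gray, prev_gray)
    _, active = cv2.threshold(diff, MOTION_THRESHOLD, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(active)   # sum of all active pixels

cap = cv2.VideoCapture(0)             # stand-in for the IR camera feed
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    qom = quantity_of_motion(prev_gray, gray)   # fed to the mapping stage
    prev_gray = gray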

Figure 1 - Patron dancing on the platform

Figure 2 - Participants swinging the Wii remotes

Music Information Retrieval
Music in a nightclub is inherently unpredictable: the theme of an event, the expected audience, the club layout, the club owner, and the club promoter are some of the several factors that influence the DJ's selection of songs [2]. According to Gouyon et al. [11], Music Information Retrieval (MIR) aims at understanding and modeling, with the help of computers, aspects of the sound and music communication chain. The MIR software was built in the Max/MSP programming environment and relied on three pitch trackers: zsa.freqpeak~, segment~, and analyzer~. By interpolating the values of each pitch tracker, the software was able to extract in real time the harmonic content of the songs played by the DJ and provide a selection of suitable notes and chords. The MoCap systems would then manipulate the notes and chords extracted from the music.
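One way such interpolation might work is sketched below in Python; this is a rough stand-in for the Max/MSP patch, assuming a median-based consensus over the trackers' estimates and a snap to pitch classes judged harmonically suitable. The combination rule and all values are assumptions.

import math
import statistics

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def hz_to_midi(freq_hz):
    # Standard conversion: 440 Hz maps to MIDI note 69 (A4).
    return 69 + 12 * math.log2(freq_hz / 440.0)

def consensus_note(estimates_hz, allowed_pitch_classes):
    # The median is robust to one tracker misfiring.
    center = statistics.median(hz_to_midi(f) for f in estimates_hz)
    candidate = round(center)
    # Snap to the nearest MIDI note whose pitch class is harmonically suitable.
    return min(range(candidate - 6, candidate + 7),
               key=lambda n: (n % 12 not in allowed_pitch_classes,
                              abs(n - candidate)))

# Example: the trackers hover around 220 Hz while the song implies A minor (A, C, E).
note = consensus_note([219.7, 221.0, 220.4], {9, 0, 4})
print(NOTE_NAMES[note % 12], note)   # -> A 57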

Body-music Mapping
Miranda & Wanderley [12] define mapping as the liaison or correspondence between control parameters (derived from performer actions) and sound synthesis parameters. Mappings may involve the clarity of and latency between a system's input and output, interaction complexity, predictability, and psychological associations between movement and sound. The quantity of motion detected by the Optitrack infrared camera was compared against a threshold to trigger notes, while acceleration data from each Wii remote separately triggered notes and manipulated the center frequency of a band-pass filter acting on chords. Notes and chords were selected by the Music Information Retrieval patch and played on two local speakers assigned to each motion capture installation. Only people close to the MoCap systems could hear the notes and chords, leaving the main dance floor unaffected.
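A compact Python sketch of these two mappings follows; the trigger threshold, acceleration range, and filter sweep range are assumptions, not the values used at the events.

import math

QOM_TRIGGER = 5000                           # assumed active-pixel count that fires a note
ACCEL_MAX_G = 3.0                            # assumed usable swing intensity in g
FILTER_LO_HZ, FILTER_HI_HZ = 200.0, 4000.0   # assumed band-pass sweep range

def qom_triggers_note(prev_qom, qom):
    # Fire only on the upward crossing of the threshold, so a patron
    # holding still above the threshold does not retrigger the note.
    return prev_qom < QOM_TRIGGER <= qom

def accel_to_filter_hz(ax, ay, az):
    # Normalize the acceleration magnitude, then scale exponentially:
    # equal increments of effort give equal musical intervals of sweep.
    mag = min(math.sqrt(ax * ax + ay * ay + az * az), ACCEL_MAX_G) / ACCEL_MAX_G
    return FILTER_LO_HZ * (FILTER_HI_HZ / FILTER_LO_HZ) ** mag

print(qom_triggers_note(4200, 5600))       # True: note triggered
print(accel_to_filter_hz(1.0, 2.0, 1.5))   # harder swings sweep the filter higher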

Results
The three events were carried out in two nightclubs with different infrastructures, music styles, and club light effects. The MIR software performed well, and no off-tune notes were observed. The MoCap camera worked best when filming against a wall, which eliminated background visual noise. To avoid interference with club lights, the camera had to accept infrared (IR) light and filter out visible light. The Bluetooth communication used by the Wii remotes was robust and extended up to 10 meters. The remotes themselves were durable and withstood rough handling.

Patrons, DJs, nightclub staff, and direct observation provided feedback on how the systems were being used. Findings suggest that simple mappings, fast sonic response, and visual feedback worked best. Users found the Wii remotes very easy to manipulate and the movement-to-sound relationship clear. The Optitrack camera's purpose was more obscure because it lacked visual feedback. Some patrons did not engage with the systems because the interaction required them to move too much. Initially, users were captivated by the sounds they produced; most, however, soon lost interest because of the limited range of sound effects.

Discussion
This paper suggested a method of combining music information retrieval, motion capture technologies, and body-music mapping to foster ad hoc music making by patrons inside nightclubs. Patrons could make music through an Optitrack infrared camera and Wii remotes, while MIR software provided harmonic content by analyzing the DJ's music. The most challenging part consisted of developing tools that could adapt to each nightclub's environment and stimulate user participation. Although proven robust, the MIR patch was only tested with Western music and might not be suitable for other styles; in his summary of progress in the field of MIR, Downie [13] argues that there is limited research on non-Western music. Work by Blaine & Perkis [14], Tahiroglu & Erkut [15], and Feldmeier & Paradiso [16] on group computer music making suggests that further work can be done on implementing group music expression tools in public venues. The results encouraged more work on visual feedback, audio effects, and passive systems for observers. They suggest that interactive events may become an attractive entertainment alternative in nightclubs, and that the tools are robust enough for DJs, musicians, club owners, and researchers to apply and build upon. Most importantly, it seems that innovative ways of using new technologies are steadily overcoming barriers to the adoption of interactive music and pushing the boundaries of participation in nightclubs.

Conclusion
Nightclubs are ideal multimedia places for user participation. Yet despite nightclubs' emphasis on music, there has been little implementation of interactive music systems, and the ones that exist have sidestepped the current role of the DJ. This study's results suggest that Motion Capture, Music Information Retrieval, and Body-music Mapping can provide a practical alternative for computer music expression in nightclubs. However, further work is needed to extend and replicate these findings, and to develop social tools for music making.

Acknowledgements
Thanks to Kristian Nymoen for helping develop part of the computer vision tools, to COST SID for partially funding the project, and to the fourMs lab (Oslo) for providing many of the materials.

Bibliography
1. Bayliss, A., S. Lock, and J.G. Sheridan. Augmenting expectation in playful arena performances with ubiquitous intimate technologies. In PixelRaiders 2. 2005. Sheffield.
2. Gates, C., S. Subramanian, and C. Gutwin. DJs' perspectives on interaction and awareness in nightclubs. In Proceedings of the 6th Conference on Designing Interactive Systems. 2006, ACM: University Park, PA, USA. p. 70-79.
3. Bertini, G., M. Magrini, and L. Tarabella. An Interactive Musical Exhibit Based on Infrared Sensors. In Computer Music Modeling and Retrieval, R. Kronland-Martinet, T. Voinier, and S. Ystad, Editors. 2006, Springer Berlin / Heidelberg. p. 92-100.
4. Lee, E., T.M. Nakra, and J. Borchers. You're the conductor: a realistic interactive conducting system for children. In Proceedings of the 2004 Conference on New Interfaces for Musical Expression. 2004, National University of Singapore: Hamamatsu, Shizuoka, Japan. p. 68-73.
5. Futrelle, J. and S. Downie. Interdisciplinary Research Issues in Music Information Retrieval: ISMIR 2002. Journal of New Music Research, 2003. 32(2): p. 121-131.
6. Ulyate, R. and D. Bianciardi. The interactive dance club: avoiding chaos in a multi participant environment. In NIME '01: Proceedings of the 2001 Conference on New Interfaces for Musical Expression. 2001, National University of Singapore.
7. Furniss, M. Motion Capture. 2004. Available from: http://web.mit.edu/comm-forum/papers/furniss.html.
8. Skogstad, S.A.v.D., A.R. Jensenius, and K. Nymoen. Using IR Optical Marker Based Motion Capture for Exploring Musical Interaction. 2010, Oslo: The University of Oslo.
9. Kitagawa, M. and B. Windsor. MoCap for Artists: Workflow and Techniques for Motion Capture, ed. P. Temme. 2008, Burlington, MA: Focal Press.
10. Marshall, M. and M. Wanderley. Evaluation of Sensors as Input Devices for Computer Music Interfaces. In Computer Music Modeling and Retrieval, R. Kronland-Martinet, T. Voinier, and S. Ystad, Editors. 2006, Springer Berlin / Heidelberg. p. 130-139.
11. Gouyon, F., et al. Content processing of music audio signals. In Sound to Sense, Sense to Sound: A State-of-the-Art in Sound and Music Computing, P. Polotti and D. Rocchesso, Editors. 2008, Logos Verlag Berlin GmbH: Berlin.
12. Miranda, E.R. and M.M. Wanderley. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. 2006: A-R Editions, Inc. 295 p.
13. Downie, J.S. MIR/MDL Evaluation: Making Progress. In The MIR/MDL Evaluation Project White Paper Collection, edition #3: Establishing music information retrieval (MIR) and music digital library (MDL) evaluation frameworks: Preliminary foundations and infrastructures. 2003, Illinois.
14. Blaine, T. and T. Perkis. The Jam-O-Drum interactive music system: a study in interaction design. In Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques. 2000, ACM: New York City, New York, United States. p. 165-173.
15. Tahiroglu, K. and C. Erkut. ClaPD: A testbed for control of multiple sound sources in interactive and participatory contexts. In The PureData Convention. 2007. Montreal, Canada.
16. Feldmeier, M. and J.A. Paradiso. An Interactive Music Environment for Large Groups with Giveaway Wireless Motion Sensors. Computer Music Journal, 2007. 31(1): p. 50-67.
