Abstract
Music is an important part of our lives. People enjoy listening to music, and many of us take particular pleasure in creating it. Computers have extended many aspects of our musical experience: listening to, recording, and creating music is now easier and more accessible to a wide range of users. Conversely, many computing applications exploit music to better support interaction with users. Listening to music, however, is generally a passive experience: although we may adjust many parameters, the music we listen to usually does not reflect our response, or does so only roughly.

In this paper we present a flexible framework that enables active creation of instrumental music based on the implicit dynamics and content of human-computer interaction. Our approach is application independent and provides a mapping of musical features to an abstraction of user interaction. This mapping is based on an analysis of the dynamics and content of the human-computer interaction. In contrast to most existing interactive music composition tools, which require explicit interaction with the system, we provide a more flexible solution that implicitly maps user interaction parameters to various musical features.
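As a rough illustration of what such an implicit mapping might look like, the sketch below translates hypothetical interaction measurements (typing rate, mouse activity, idle time) into tempo, note velocity, and pitch range. The parameter names, value ranges, and linear mappings are illustrative assumptions only and are not taken from the paper.

```python
# Minimal sketch (not from the paper): one plausible way to map implicit
# interaction parameters to musical features. All names, ranges, and the
# linear mappings below are assumptions for illustration only.

from dataclasses import dataclass


@dataclass
class InteractionState:
    """Implicitly observed interaction parameters (hypothetical)."""
    typing_rate: float      # keystrokes per second
    mouse_activity: float   # normalized pointer movement, 0.0-1.0
    idle_time: float        # seconds since last input


@dataclass
class MusicalFeatures:
    """Musical features driven by the interaction (hypothetical)."""
    tempo_bpm: float
    velocity: int           # MIDI-style note velocity, 0-127
    pitch_range: int        # span of usable pitches in semitones


def map_interaction_to_music(state: InteractionState) -> MusicalFeatures:
    """Illustrative implicit mapping: busier interaction -> faster, louder music."""
    tempo = 60.0 + min(state.typing_rate, 8.0) * 10.0    # 60-140 bpm
    velocity = int(40 + state.mouse_activity * 80)        # 40-120
    pitch_range = 12 if state.idle_time > 10.0 else 24    # narrower when idle
    return MusicalFeatures(tempo_bpm=tempo, velocity=velocity, pitch_range=pitch_range)


if __name__ == "__main__":
    # Example: a fairly active user session
    features = map_interaction_to_music(
        InteractionState(typing_rate=4.5, mouse_activity=0.6, idle_time=1.2))
    print(features)
```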
Original language | English |
---|---|
Title of host publication | Proceedings of the 13th Annual ACM International Conference on Multimedia |
Place of Publication | New York, USA |
Publisher | Association for Computing Machinery, Inc |
Pages | 996-1004 |
ISBN (Print) | 1-59593-044-2 |
Publication status | Published - 2005 |
Event | 13th Annual ACM International Multimedia Conference, Singapore - Duration: 6 Nov 2005 → 11 Nov 2005 |
Conference
Conference | 13th Annual ACM International Multimedia Conference |
---|---|
Period | 6/11/05 → 11/11/05 |
Other | 13th Annual ACM International Multimedia Conference, November 06-11, 2005, Singapore |