Abstract
This paper explores the use of virtual textural terrains as a means of generating haptic profiles for force-feedback controllers. This approach breaks from the paradigm established in audio-haptic research over the past few decades, in which physical models within virtual environments are designed to transduce gesture into sonic output. We outline a method for generating multimodal terrains using basis functions, which are rendered into monochromatic visual representations for inspection. This visual terrain is traversed using a haptic controller, the Novint Falcon, which in turn receives force information based on the grayscale value at its location in the virtual space. As a performer traverses the image, the levels of resistance vary, and the image is realized as a physical terrain. We discuss the potential of this approach to afford engaging musical experiences for both the performer and the audience, as demonstrated through numerous performances.
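The core mapping described in the abstract can be illustrated with a short sketch: a grayscale terrain is built as a sum of basis functions (Gaussian bumps are used here purely as an example; the paper does not specify which basis functions the authors chose), and the pixel value under the controller's position is scaled to a resistive force. The function names, the terrain parameters, and the maximum force value are all hypothetical and are not taken from the paper.

```python
import numpy as np

def make_terrain(size=256, n_bases=8, seed=0):
    """Generate a grayscale terrain as a sum of Gaussian basis functions."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[0:size, 0:size]
    terrain = np.zeros((size, size))
    for _ in range(n_bases):
        cx, cy = rng.uniform(0, size, 2)          # random bump center
        sigma = rng.uniform(size / 16, size / 4)  # random bump width
        terrain += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    terrain -= terrain.min()
    terrain /= terrain.max()  # normalize to [0, 1] grayscale
    return terrain

def resistance_at(terrain, x, y, max_force=9.0):
    """Map the grayscale value under the cursor to a resistive force.

    max_force is an assumed ceiling in newtons, not a value from the paper.
    """
    h, w = terrain.shape
    xi = int(np.clip(x, 0, w - 1))
    yi = int(np.clip(y, 0, h - 1))
    return terrain[yi, xi] * max_force

terrain = make_terrain()
force = resistance_at(terrain, 128, 64)  # force felt at this cursor position
```

In a live setting, `resistance_at` would be polled in the device's control loop as the performer moves the handle, so brighter regions of the image would be felt as stiffer resistance.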
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 38-41 |
| Number of pages | 4 |
| Journal | Proceedings of the International Conference on New Interfaces for Musical Expression |
| State | Published - 2017 |
| Event | International Conference on New Interfaces for Musical Expression, NIME 2017 - Copenhagen, Denmark. Duration: May 15, 2017 → May 19, 2017 |
Keywords
- Cross modal mapping
- Haptic interfaces
- Multimodal interaction
- Performance
- Terrain
ASJC Scopus subject areas
- Control and Systems Engineering
- Signal Processing
- Instrumentation
- Music
- Human-Computer Interaction
- Hardware and Architecture
- Computer Science Applications