Virtual reality (VR), augmented reality (AR), and mixed reality (MR) systems present users with realistic simulations. The most common application is gaming, but did you know that these technologies can also be used to create virtual musical instruments (VMIs)?
VMIs may or may not mimic traditional physical musical instruments. Physical instruments can produce beautiful music, but they can also be hard to learn and are bound by physical and acoustic constraints. By removing many of these constraints, VMI designers can focus on new challenges such as
- making the instrument easy to learn,
- adapting to different performers,
- producing new electronic sounds,
- operating at the phrase or composition level rather than the note level,
- supporting performers with musical intelligence, and
- allowing collaboration among multiple players.
This September 2017 Computing Now theme presents articles and a video on VMI 3D user interface design and implementation. Although some VMIs are still in the preliminary design stage, these instruments show great potential in the areas of musical games, music education, and live music performance.
VMI Technical Challenges
Even when a VMI imitates a traditional instrument, users cannot always play it in exactly the same way. Technical challenges include:
- Precise finger tracking and mapping to note markers. Most VMIs use only hand gestures. Although advanced sensors can track fingers, tracking multiple fingers in 3D free space is still difficult. In addition, limited sensor resolution constrains the number of note markers an instrument can offer (for example, it’s hard to represent all of a piano’s keys). The sketch after this list makes the mapping problem concrete.
- Trigger and gesture strategies. Most physical instruments have mechanisms both to trigger notes and to control them continuously. Even with reasonable 3D finger tracking mechanisms, offering effective control over multiple parameters in real time is difficult.
- Hardware device setup. The positions and orientations of video sensors must be chosen to avoid finger occlusion and achieve precise tracking and note triggering. Sometimes, multiple cameras are required (for example, tracking a violin player’s left and right hands might require two cameras).
- Haptic feedback. Most musicians rely on the force feedback a physical instrument provides to control musical expression such as volume and vibrato. Force feedback is difficult to implement in VMIs, and although audio feedback helps, interfaces without some haptic feedback can be difficult to use.
- Sound resource selection. Most VMIs use the Musical Instrument Digital Interface (MIDI) with off-the-shelf sound libraries, which offers only limited control and a limited range of sounds. However, it’s possible to develop sound synthesis algorithms that offer finer control.
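To make the first two challenges concrete, here is a minimal sketch (in Python) of mapping a tracked fingertip to its nearest note marker and triggering a note when the fingertip comes within a tolerance radius. The marker positions, trigger radius, and simulated tracker sample are assumptions for illustration; a real VMI would stream fingertip positions from a sensor at a high rate and send MIDI messages rather than printing.

```python
import math

# Assumed note markers: 3D positions (in meters) mapped to MIDI note numbers.
# In a real VMI these would come from the instrument's scene layout.
NOTE_MARKERS = {
    (0.00, 1.0, 0.3): 60,  # C4
    (0.05, 1.0, 0.3): 62,  # D4
    (0.10, 1.0, 0.3): 64,  # E4
}

TRIGGER_RADIUS = 0.02  # 2 cm of tolerance for noisy finger tracking


def nearest_marker(fingertip):
    """Return (midi_note, distance) for the marker closest to the fingertip."""
    return min(
        ((note, math.dist(pos, fingertip)) for pos, note in NOTE_MARKERS.items()),
        key=lambda pair: pair[1],
    )


def on_finger_sample(fingertip):
    """Trigger a note when a tracked fingertip enters a marker's radius."""
    note, dist = nearest_marker(fingertip)
    if dist <= TRIGGER_RADIUS:
        print(f"note_on {note}")  # a real system would send a MIDI message here


# Simulated tracker sample; a real sensor would stream these continuously.
on_finger_sample((0.051, 1.001, 0.299))  # near the D4 marker, so note 62 fires
```

Even this toy version exposes the trade-off described above: shrinking the trigger radius reduces false triggers but demands more tracking precision, and packing in more markers (a full piano) forces the radii, and thus the required precision, down further.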
Future technological advancements will undoubtedly address many of these challenges.
The Articles
Florent Berthaut, Victor Zappi, and Dario Mazzanti’s “Scenography of Immersive Virtual Musical Instruments” analyzes various stage setups for immersive VMIs based on six factors: audience visibility, musician visibility, audience immersion, musician immersion, gesture continuity, and virtual-physical merging. The authors suggest that presentation and context are important in VMI application and appreciation.
“ChromaChord: A Virtual Musical Instrument” presents ChromaChord, which uses a VR headset (the Oculus Rift DK2) and a hand-tracking camera (a Leap Motion controller). Additional sensors detect headset motion to accurately align position markers and adjust instrument settings. Author John Fillwalk illustrates the value of considering sensor and display capabilities and limitations when designing VMIs.
In “Interval Player: Designing a Virtual Musical Instrument using In-Air Gestures,” Wallace Lages and his colleagues describe another VMI that uses Leap Motion but takes a different approach to triggering melody notes. In their VMI, hand gestures control the direction and velocity of interval changes, where an interval is the distance between two notes and the largest allowed interval is five. The non-dominant hand can play harmony.
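As a hedged illustration of interval-based triggering (the authors’ actual mapping may differ), the sketch below derives each melody note from the previous one using a gesture-supplied direction and an interval size capped at five:

```python
# Scale positions rather than raw pitches: each gesture supplies a direction
# ("up" or "down") and an interval size (1 to 5), and the next note follows
# from the current one. The scale and the cap are assumptions for illustration.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79]  # MIDI note numbers


def next_index(current, direction, interval):
    """Step up or down the scale by `interval` degrees, capped at 5."""
    interval = max(1, min(5, interval))
    step = interval if direction == "up" else -interval
    return max(0, min(len(C_MAJOR) - 1, current + step))


idx = 4  # start on G4
for direction, interval in [("up", 2), ("up", 3), ("down", 5)]:
    idx = next_index(idx, direction, interval)
    print(C_MAJOR[idx])  # prints 71, 76, 67
```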
Marcio Cabral and his colleagues present their VMI in “Crosscale: A 3D Virtual Musical Instrument Interface.” Touchable virtual spheres map to notes and chords, and pitches appear on a 2D grid; simple trajectories on a pitch grid can produce musical patterns more readily than on a 1D keyboard layout. The system uses Oculus Rift for visualization and Razer Hydra for gesture input.
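The sketch below suggests why a 2D grid can make musical patterns easier to trace than a 1D keyboard; the particular layout (columns step through scale degrees, rows jump by octaves) is an assumption for illustration, not necessarily Crosscale’s actual mapping.

```python
# Map a grid cell (row, col) to a MIDI pitch: columns walk the major scale,
# rows shift by octaves. A straight-line gesture along one row then produces
# a scale, and small diagonal moves produce common melodic patterns.
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within a major scale


def grid_pitch(row, col, root=48):
    """Return the MIDI pitch at grid cell (row, col) in the assumed layout."""
    octave, degree = divmod(col, len(MAJOR_STEPS))
    return root + 12 * (row + octave) + MAJOR_STEPS[degree]


# One row traversed left to right yields an ascending major-scale octave.
print([grid_pitch(1, c) for c in range(8)])  # [60, 62, 64, 65, 67, 69, 71, 72]
```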
In “Cirque des Bouteilles: The Art of Blowing on Bottles,” Daniel Zielasko and his colleagues describe a 3D interface in which users blow air into virtual bottles using a microphone; a virtual air stream visually indicates the blowing strength. Users select bottles with a Leap Motion controller, and Oculus Rift DK2 displays a virtual hand model that maps to the user’s gestures.
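One simple way to turn microphone input into blowing strength, sketched below, is to gate the signal’s root-mean-square (RMS) level by a noise floor. The estimator and its constants are assumptions for illustration; the published system may derive breath strength differently.

```python
import math


def blow_strength(samples, noise_floor=0.02):
    """Estimate blowing strength in [0, 1] from one mono audio frame."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return max(0.0, min(1.0, (rms - noise_floor) / (1.0 - noise_floor)))


# Simulated frames: a quiet room vs. a strong blow into the microphone.
print(blow_strength([0.01] * 256))       # 0.0 -- below the noise floor
print(blow_strength([0.5, -0.5] * 128))  # ~0.49 -- a sustained blow
```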
Alec G. Moore and his colleagues highlight their immersive music composition system in “Wedge: A Musical Interface for Building and Playing Composition-Appropriate Immersive Environments.” The user drags keyboard notes to desired positions, stacking them vertically to create chords. This structure helps users focus on composing sequences and chords instead of on individual notes.
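A minimal sketch of that structure, with slot indices and note values assumed for illustration: notes dragged to the same horizontal slot are grouped and sound together as a chord.

```python
from collections import defaultdict

# Each placed note records the horizontal slot it was dragged to.
placed = [(0, 60), (0, 64), (0, 67), (1, 62), (1, 65)]  # (slot, midi_note)

chords = defaultdict(list)
for slot, note in placed:
    chords[slot].append(note)  # vertical stacking within a slot forms a chord

for slot in sorted(chords):
    print(f"slot {slot}: play {sorted(chords[slot])} together")
```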
Nikolas Burks, Lloyd Smith, and Jamil Saquer developed a virtual bass xylophone using Microsoft Kinect sensors. “A Virtual Xylophone for Music Education” details how the VMI tracks physical mallets in 3D space and uses the relative positions of user-drawn bars to trigger corresponding notes.
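The following sketch (with assumed geometry) shows the core of such hit detection: a note triggers when the tracked mallet tip crosses a bar’s top surface moving downward within the bar’s horizontal bounds, and the downward speed maps to strike loudness.

```python
# User-drawn bars: each maps a horizontal range and a top surface to a note.
BARS = [
    {"note": 48, "x_range": (0.00, 0.08), "top_y": 0.90},  # C3
    {"note": 50, "x_range": (0.09, 0.17), "top_y": 0.90},  # D3
]


def detect_hit(prev_pos, curr_pos):
    """Return (midi_note, velocity) if the mallet tip struck a bar, else None."""
    (px, py), (cx, cy) = prev_pos, curr_pos
    for bar in BARS:
        lo, hi = bar["x_range"]
        crossed_down = py > bar["top_y"] >= cy  # passed down through the surface
        if crossed_down and lo <= cx <= hi:
            velocity = min(127, int((py - cy) * 2000))  # faster swing, louder note
            return bar["note"], velocity
    return None


# Two consecutive tracker samples of the mallet tip (x, y), in meters.
print(detect_hit((0.05, 0.95), (0.05, 0.88)))  # (48, 127): a hard strike on C3
```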
Understanding associations between sounds and images is important for improving VMIs. Rob Hamilton’s “Perceptual Coherence as an Analytic for Procedural Music and Audio Mappings in Virtual Space” analyzes how listeners form associations between animated avatars and sounds by varying the relationships between avatar motion properties and interactively synthesized sound properties.
The Industry Perspective
In this month’s video, Rob Hamilton, an assistant professor of music and media at Rensselaer Polytechnic Institute, discusses connections between video games and musical instruments. His artistic pursuits have led to new perspectives on what it means to compose, play, and experience music. Rather than adapt VR technologies to emulate musical instruments, he and his collaborators have extended virtual environments with creative, interactive sonifications. Exploring these virtual spaces and objects constitutes a new kind of musical performance.
Read the full-text video transcript (PDF).
Conclusion
With VMIs, digital control and audio signal processing let us explore musical instruments and interactions in exciting new ways. As with the evolution of physical instruments, VMI designers will undoubtedly incorporate new technologies into future designs, drawing on better sensors, haptic feedback, displays, acoustic transducers, and sound synthesis methods.
Guest Editors
Roger Dannenberg is a professor of computer science, art, and music at Carnegie Mellon University. His research in computer music includes the development of automatic computer accompaniment systems, languages and systems for sound synthesis and composition, and the widely used open source audio editor Audacity. Dannenberg composes experimental electronic works and opera, and he plays the trumpet. Contact him at rbd@cs.cmu.edu.
Timothy K. Shih is a professor at the National Central University, Taiwan. His research interests include multimedia computing and distance learning. Shih has served as an associate editor of IEEE Transactions on Learning Technologies and IEEE Transactions on Multimedia. He is a Computing Now advisory board member. Contact him at timothykshih@gmail.com.