Mogees project delivers haptic symphony (w/ video)

(PhysOrg.com) -- Creative sound-making is a fluid, ever-changing art, encompassing everything from troupes that bang on every hard surface imaginable to creators of electronic music, to musicians who craft their notes to reflect real conversation, and now to a new phenomenon: the Mogees.

The Mogees is a project that stems from the department of computing at Goldsmiths, University of London, where researcher Bruno Zamborlin collaborates with a team at IRCAM in Paris to experiment with new methods of “gestural interaction” for making sounds in novel ways. The project has released a video that, besides delighting every four-year-old on the planet, opens the minds of researchers. The video shows a contact microphone and audio processing software being used to turn assorted surfaces into gesture-recognizing touch interfaces: a tree trunk, a glass panel at a bus stop, an inflated balloon. Different gestures control different sounds.

Wooden panels ring like a bicycle bell; playing on a balloon sounds like a crystal ornament chiming in the wind. Slapping, brushing, hitting, or finger-tapping other surfaces reveals sounds ranging from tribal strings to pianos in the heat of a musical narrative.

The Mogees project turns any surface into a gestural musical interface, using the button-like silver microphone and audio processing software. But just how does it work? ExtremeTech carries one of the more lucid attempts at an explanation: the contact microphone actually houses multiple microphones, creating a stereo image of the sound being made. A cable carries the signal to a PC, where software analyzes the finger vibrations and converts them into gestures. A visual programming language (MaxMSP) then turns the gestures into sounds.
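As a rough illustration of that signal path, here is a minimal Python sketch; the thresholds, feature choices, and function names are all assumptions for the sake of the example and do not come from the Mogees code itself. Frames of vibration audio from a contact microphone are reduced to simple features, which then decide which gesture was performed.

```python
import numpy as np

# Hypothetical sketch of the pipeline described above: frames of audio
# from a contact microphone are analyzed, and simple features decide
# which gesture was performed. Thresholds and features are illustrative.

ONSET_THRESHOLD = 0.02  # assumed RMS energy level that counts as a touch


def frame_features(frame: np.ndarray) -> tuple[float, float]:
    """Crude per-frame features: RMS energy and normalized spectral centroid."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame))  # normalized frequencies, 0..0.5
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return rms, centroid


def classify_gesture(frame: np.ndarray) -> str:
    """Toy mapping from a vibration frame to a gesture label."""
    rms, centroid = frame_features(frame)
    if rms < ONSET_THRESHOLD:
        return "none"  # too quiet: no touch detected
    # Bright, percussive hits have higher spectral centroids than soft rubs.
    return "tap" if centroid > 0.1 else "brush"
```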

“Mogees is an interactive gestural-based surface for realtime audio mosaicing,” is the somewhat intimidating definition that appears on the Department of Computing site at Goldsmiths. A more helpful explanation of what is going on, however, follows:

“When the performer touches the surface, Mogees analyses the incoming audio signal and continuously looks for its closest segment within the sound database. These segments are played one after the other over time: this technique is called concatenative synthesis.”

A surface can be played with the hands or any other tool, for example, and Mogees will always try to find a corresponding sound. The technique can also be applied to other sound sources, such as the voice or acoustic and electric instruments.
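A minimal Python sketch of the concatenative synthesis idea as the quote describes it, assuming a pre-analyzed database of short sound segments with matching feature vectors (the class and names are illustrative, not the Mogees implementation): each incoming feature vector is matched to its nearest database segment, and the matched segments are played one after the other.

```python
import numpy as np

# Sketch of concatenative synthesis: for each incoming audio segment,
# find the database segment whose features are closest, then concatenate
# the matches over time. All names here are illustrative.


class ConcatenativeSynth:
    def __init__(self, segments: list[np.ndarray], features: np.ndarray):
        # segments: short 1-D sample arrays; features: one row per segment
        self.segments = segments
        self.features = np.asarray(features)

    def match(self, incoming: np.ndarray) -> np.ndarray:
        """Return the stored segment nearest to the incoming feature vector."""
        distances = np.linalg.norm(self.features - incoming, axis=1)
        return self.segments[int(np.argmin(distances))]

    def render(self, feature_stream) -> np.ndarray:
        """Play the matched segments one after the other over time."""
        return np.concatenate([self.match(f) for f in feature_stream])
```

In this framing, “playing” a tree trunk with the hands simply supplies a new feature stream, which is why any surface, or even a voice, can drive the same sound database.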

Zamborlin began the project because he liked the idea of being able to touch a real surface when creating electronic music. “Touching real surfaces allows users to experience haptic feedback on what they do, enhancing their relationship with the device.”

Researcher and developer Norbert Schnell is named as part of the Mogees effort, and the project also makes reference to its use of the “MuBu environment for MaxMSP.”

Max is a visual programming language for music and multimedia; electronic musicians use it to build unique sound-making tools. The program is highly modular, with most routines in the form of shared libraries. MuBu is a sound description buffer for real-time interactive audio processing.
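MuBu’s actual interface is graphical and lives inside Max, but the underlying idea of a sound description buffer might be sketched in Python roughly like this; this is purely a conceptual guess at the data structure, not MuBu’s API. Audio samples are stored alongside time-aligned descriptor tracks so that real-time processes can query both together.

```python
import numpy as np

# Conceptual sketch of a "sound description buffer": audio samples kept
# together with per-frame descriptor tracks (e.g. energy, pitch), so a
# real-time process can ask "what does the sound look like right here?"


class SoundDescriptionBuffer:
    def __init__(self, samples: np.ndarray, hop: int):
        self.samples = samples
        self.hop = hop  # samples between consecutive descriptor frames
        self.descriptors: dict[str, np.ndarray] = {}

    def add_track(self, name: str, values: np.ndarray) -> None:
        """Attach a time-aligned descriptor track, one value per frame."""
        self.descriptors[name] = values

    def at_time(self, sample_index: int) -> dict[str, float]:
        """Descriptor values for the analysis frame covering a sample position."""
        frame = sample_index // self.hop
        return {
            name: float(track[min(frame, len(track) - 1)])
            for name, track in self.descriptors.items()
        }
```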

More information: www.brunozamborlin.com/mogees/

© 2011 PhysOrg.com
