Still trying to catch up with my notes from the Mobile Music Workshop. I doubt I'll manage to post everything; the workshop's blog and its flickr account and tag might help you fill in the blanks.
Yolande Harris and GPS trace of a boat at anchor
One of my favourite talks was Yolande Harris' presentation of Taking Soundings – Investigating Navigations and Orientations in Sound. She explained how she connected live sounds to a GPS receiver and discovered that, although the receiver doesn't move, the sounds reveal constant changes in the satellite readings. She then mapped the traces of these sounds over a period of time, and it turns out that although the device is static, the readings indicate movements over relatively large distances. In that context, even a building seems to be a “mobile” entity, which raises questions such as: what does the word “mobile” actually mean? Is the entire system actually floating?
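To get a feel for the scale of that drift, here is a quick sketch that computes how far a stationary receiver appears to wander between logged fixes. The coordinates below are made up for illustration; a real log would come from the receiver itself:

```python
# Minimal sketch: how far does a *stationary* GPS receiver appear to move?
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical fixes logged over time from a receiver that never moved.
fixes = [
    (52.3702, 4.8952),
    (52.3703, 4.8950),
    (52.3701, 4.8955),
    (52.3704, 4.8949),
]

# Apparent drift: distance of each later fix from the first one.
for lat, lon in fixes[1:]:
    print(f"{haversine_m(fixes[0][0], fixes[0][1], lat, lon):.1f} m from start")
```

Even tiny changes in the fourth decimal place of a coordinate translate to tens of metres on the ground, which is exactly the kind of "movement" a static receiver registers.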
Atau Tanaka, Guillaume Valadon and Christophe Berger presented their paper Social Mobile Music Navigation Using The Compass, an interface that seeks to fuse elements of proximal interaction, geographic localization and social navigation to allow groups of WiFi-equipped phone users to intuitively find friends, network connectivity or new music. Precursors of the project include TunA and Push!Music, Malleable Mobile Music, and net_derive. Compass would facilitate the music-sharing tendencies witnessed when students use bluetooth or IR to exchange music on their mobile phones. A user of Compass seeking music to listen to turns to the Compass to search for friends who might be nearby, selects the friend he or she wants to contact, and follows the Compass direction to walk within range of that friend. The system then proposes that the two users bootstrap a proximal network. Once this spontaneous private network is established, the two users compare playlists based on various musical criteria, and a song of interest to the first user is copied over the phone's WiFi connectivity.
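The paper describes this as a sequence of steps, so a hedged sketch of the interaction loop might look like the following. Every class and method name here is hypothetical, invented for illustration, not the authors' API:

```python
# Hedged sketch of the Compass interaction loop described above.
from dataclasses import dataclass, field

@dataclass
class Peer:
    name: str
    bearing_deg: float          # direction to walk, shown on the compass
    in_range: bool = False      # True once the phones can see each other
    playlist: set[str] = field(default_factory=set)

def find_friend(peers: list[Peer], who: str) -> Peer:
    """Step 1: the user picks a nearby friend from the compass display."""
    return next(p for p in peers if p.name == who)

def exchange(me: Peer, friend: Peer) -> set[str]:
    """Steps 2-4: walk within range, bootstrap the proximal network,
    then compare playlists; songs the user lacks are candidates to copy."""
    if not friend.in_range:
        raise RuntimeError(f"keep walking towards {friend.bearing_deg:.0f} deg")
    return friend.playlist - me.playlist   # songs to fetch over WiFi

me = Peer("alice", 0.0, playlist={"track_a"})
bob = Peer("bob", 110.0, in_range=True, playlist={"track_a", "track_b"})
print(exchange(me, find_friend([bob], "bob")))   # -> {'track_b'}
```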
Dan Wilcox demoed the robotcowboy, a human-computer mobile performance that consists of a “one-man band” wearable computer system which allows him to perform computer-based music without being tied down by the computer on stage. It is composed of a computer and input devices such as MIDI controllers, game controllers, and environmental sensors.
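A rig like that has to funnel very different input devices into the same kind of control values. Here is a minimal sketch of such a mapping layer; the device names and ranges are invented for illustration, not taken from Wilcox's paper:

```python
# Hedged sketch: heterogeneous controllers normalised to one control range.

def normalize(value: float, lo: float, hi: float) -> float:
    """Map a raw device reading into the 0.0-1.0 control range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

# Raw readings and their native ranges for three hypothetical devices.
inputs = {
    "midi_knob":    (64,   0,    127),   # 7-bit MIDI controller value
    "gamepad_x":    (0.3, -1.0,  1.0),   # analogue stick axis
    "light_sensor": (612,  0,    1023),  # 10-bit environmental sensor
}

# Every device ends up as a 0-1 value a synth parameter can consume.
controls = {name: normalize(v, lo, hi) for name, (v, lo, hi) in inputs.items()}
print(controls)
```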
Wilcox's paper (PDF) mentions previous music projects that focused on soundwalks, such as Sonic City, Sonic Interface, Bodycoder – a body sensor array that controls live sounds through the Max/MSP environment – the MIT musical jacket, and CosTune (PDF) – a wireless jam session in which users wear mobile gestural instruments such as gloves, a jacket and pants.
Bernhard Garnicnig and Gottfried Haider took us to a nearby park for a demonstration of Craving, a Spatial Audio Narrative. Wearing headphones and carrying a portable computer running software that determines their position via GPS, users listen to voices and sounds placed at precise locations in the park.
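The underlying mechanics are easy to picture: each clip is pinned to a coordinate and plays when the listener's GPS fix falls within its radius. A minimal sketch, with hypothetical coordinates and file names:

```python
# Minimal sketch of position-triggered playback for a spatial audio walk.
import math

def flat_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; fine over park-sized distances."""
    r = 6371000.0  # mean Earth radius in metres
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

sounds = [  # (latitude, longitude, trigger radius in metres, clip)
    (48.2086, 16.3726, 15.0, "voice_bench.wav"),
    (48.2090, 16.3731, 10.0, "footsteps_path.wav"),
]

def audible_at(lat, lon):
    """Return the clips that should be playing at this GPS fix."""
    return [clip for slat, slon, radius, clip in sounds
            if flat_distance_m(lat, lon, slat, slon) <= radius]

print(audible_at(48.2087, 16.3727))  # near the first sound -> its clip plays
```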
Image of the robotcowboy stolen from Christophe Berger's flickr stream.