Here are my notes on the talk that Atau Tanaka gave about Mobile Music: Creating New Musical Forms for New Infrastructures at Futuresonic. I was particularly thrilled to hear him speak, as one of his works, Bondage, was my favourite piece at last year’s edition of Ars Electronica.
Part of his current research draws on Mobile Music workshops held over three years and revolves around re-creating the experience of mobile music, making it more creative than its current market manifestations (mainly ringtones).
Tanaka hopes to blend his expertise as an IA researcher and artist to come up with new applications for wireless infrastructures. Clearly, grafting the iPod onto the mobile phone hasn’t been much of a success: Motorola iTunes wasn’t a commercial hit, no mobile operator wanted to play along, and the product lacked “vision.” In the 1970s, the Walkman was used by many people to create a private sonic universe; it allowed them to live in their own sphere, isolated from their surroundings. Now, in the 2000s, we have the mobile phone. The device is inherently a networking tool: it facilitates communication and connection with others.
So on the one hand we have the personal sphere and on the other the networking device. What stands between the two? How can we use Human Computer Interaction, Mixed Reality technology and social software to reconcile the two?
So far new mediums have led to new musics:
– the format of the 45 rpm single helped rock’n’roll bloom (Elvis);
– the 33rpm had a similar effect on the “concept album” (The Beatles);
– MTV (images) boosted music videos;
– CDs (75 min.) allowed for longer albums with more songs (Michael Jackson, Madonna);
– MD: user editing (karaoke);
– MP3: peer-to-peer. Like the MD, it has led to a dematerialization of music.
The iPod hasn’t changed music, it has just added a convenience factor.
What characterizes the mobile element is the fact that it’s on the move and dynamically follows users; it also allows users to make, share, locate and listen to music. Tanaka’s goal is to create a new form of music that allows the new mediums to find their “artistic voice” — to give mobile music what is called in musical terms “idiomatic composition” (music written for a particular instrument: composing music for piano rather than violin, for example).
Music mobility calls for interaction and connectivity. These two characteristics are absent from the iPod experience, although music inherently exhibits them: we play in groups, go to concerts together, share our favourite music with friends. Music is a living form of cultural expression, not a commodity to be sold and copied in a file system.
Tanaka wants to bring “musicking” to mobility. Musicking, a term coined by Christopher Small, is the activity of taking part in music in any capacity, of enjoying it in an active way.
Tanaka then gave an overview of his work to illustrate how he has so far explored the concepts of interaction and connectivity.
He first showed the BioMuse (1990-), a bioelectrical musical instrument that allows the performer to create music with muscular and neural activity. Tanaka was the first musician to be commissioned to work with the biomusical interface created at Stanford University. He used it during the performances of the Sensorband he formed in 1993 together with Zbigniew Karkowski and Edwin van der Heide. In this sensor instrument ensemble, there’s no drummer, no singer, no guitar player. Each member uses his body as an instrument to play music. Tanaka plays the BioMuse. Karkowski plays an invisible cage of infrared beams that, when broken, trigger a sample of sounds. Van der Heide plays a MIDI conductor using joystick-like controls.
Ten years later, Cécile Babiole, Laurent Dailleau, and Tanaka created a dynamic sound/image environment called Sensors_Sonics_Sights. Using sensors and gestures, the trio create a work of sound and sight, a laptop performance that goes beyond the laptop through the intensity of bodies in movement. Babiole generates images and uses ultrasound sensors, Tanaka plays the BioMuse again, and Dailleau plays the theremin.
Along with performances, Tanaka also worked on installations, one of them is Constellations that connects the physical space of a gallery to the imaginary space of the internet through sound and image. “Visitors in the gallery navigate an onscreen universe of planets, invoking audio to stream into the gallery. The planetary system is the interface to a library of soundfiles existing on servers throughout the internet. Each planet represents a contribution from a different composer. The sounds coming from the network space resonate in the acoustical space of the gallery, connecting these two universes.”
Global String and Constellations
In 2002, he worked with Kasper Toeplitz on the Global String, a musical instrument wherein the network is the resonating body of the instrument, through the use of a real-time sound-synthesis server.
The musical string spans the world; its resonance circles the globe, allowing musical communication and collaboration among the people at each connected site. A physical string is connected to a virtual string on the network; it stretches from the floor to the ceiling of the space. On the floor is one end: the earth. Up above is the connection to the network, to one of the other ends somewhere else in the world. Vibration sensors translate the analog pulses into digital data.
Now Tanaka wants to come up with an artistic take on the mobile music experience that would be simple enough to be enjoyed by a child (a child who has never played a piano can still have a lot of fun with one) and complex enough to be appreciated by a professional musician. Unlike video games, you wouldn’t be able to set its user level.
The project will use IA: sensor technology, accelerometers and gyroscopes will allow for body input. The system would capture this live gesture and feed it into the mobile device. All this would, of course, be enjoyed in a social context. Each member of a community of users would have such a device and move freely (they can even be far away from each other). Three elements that form the core of locative media will be integrated in the musical work: mobility, location awareness and social networking.
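To make the sensor-to-sound idea concrete, here is a minimal, purely illustrative sketch of how an accelerometer gesture might be smoothed and mapped to a musical parameter such as pitch. All names, ranges and the mapping itself are my own assumptions for illustration; they do not describe the actual Net_Derive system.

```python
# Hypothetical sketch: smoothing raw accelerometer readings and
# mapping their magnitude onto MIDI note numbers. Illustrative only,
# not Tanaka's actual implementation.

def smooth(samples, alpha=0.3):
    """Exponential moving average to tame sensor jitter."""
    out, prev = [], samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

def accel_to_midi(magnitude, lo=36, hi=84):
    """Map an acceleration magnitude in 0..2 g to a MIDI note number."""
    clamped = max(0.0, min(2.0, magnitude))
    return round(lo + (clamped / 2.0) * (hi - lo))

# Simulated gesture: the device swings from rest to a sharp movement.
readings = [0.0, 0.1, 0.5, 1.2, 1.9, 1.0, 0.3]
notes = [accel_to_midi(m) for m in smooth(readings)]
```

In a networked version along the lines Tanaka describes, each user's note stream would also be stamped with location and shared with the group, so the social layer shapes the resulting music.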
The final piece already has a name: Net_Derive. It will be launched in Paris this autumn (on October 6 and 7 at Maison Rouge, Paris). Net_Derive will be a piece of “musicking mobility” that extends IA music beyond the stage and concert hall.
Last image from this PDF document.