Given my notoriously campy taste in music, you will be relieved to know that i'm going to carefully avoid reviewing the music side of Barcelona's International Festival of Advanced Music and Multimedia Art. What's left then? Fashion, a bit of advertising and the SonarMàtica exhibition.

Sonar's participants' fashion sense was tamer than i expected this year. Hop! Hop! Let's move on to the festival's advertising campaigns, which have so far shown an unconstrained taste for the shocking, the surprising and the amazing. Taxidermied animals, smileys, people with pee stains on their pants, creatures of worrying genetic heritage, notorious fraudsters and even Maradona have starred in Sonar's posters and promotional videos. Have a look at the photo set of Sonar's most provocative ad campaigns and at the video that the festival created back in 2001. That year, broadcasters refused to air the original video but didn't object to this ridiculously censored version.

One of the images for Sonar 2008

This time, the Sónar image is on the safe side but it is nevertheless striking. The heroes of the posters and video are cute majorettes from the world of dreams, who have lost their bearings in the land of the living as a result of calls from a fiendish telephone booth. Follow their 14-minute-long adventures:

SonarMàtica is actually what usually brings me to Sonar. The title of the exhibition this year was Mecànics. It aimed to give a platform to some of the driving forces behind today's artistic and mostly DIY creation: centres of production based in Barcelona (with notable exceptions such as MediaLab Prado in Madrid), which were given the opportunity to showcase ongoing and postgraduate projects but also to organize workshops, tours and open rehearsals.

Mecànics is the third and final exhibition in the SonarMàtica XIXth Century trilogy, a research project drawing comparisons between the nineteenth and the twenty-first century. Unlike the two previous exhibitions, Et Voilà!, which highlighted the relationship between magic and technology, and Future Past Cinema, which looked at the recovery of pre-film formats in contemporary art, Mecànics had a fairly diluted identity. The reason probably lies in the fact that the exhibition was showcasing the best of what Barcelona's art production centres make rather than exploring a defined theme with brilliance and cohesion. The result is a rollercoaster that leads you from gems to strikingly weak pieces.

Lovers of interactive tables were having a blast this year

I caught myself thinking i shouldn't have bothered. This edition of SonarMàtica had decided to write off SonarCinema, Digital à La Carte and also the artist talks and debates i had enjoyed so much last time i was there (unless, damn!, i missed them). A few projects i (re)discovered in the exhibition made it worth the trip though:

The Sounds of Science (los sonidos de la ciencia), developed by Jay Barros during MediaLab Prado's Interactivos?'09: Garage Science workshop, uses off-the-shelf and mostly recycled equipment to create audiovisual remixes of sounds and images captured from the urban micro-environment, to "lay down" some beats and frequencies that serve as a musical score for a visual display of what exists beyond the realm of our everyday vision. At the heart of the project is a home-made microscope designed with a CCD sensor from a camera and the lens from a CD player. Image processing programs analyze various samples of protozoa gathered in urban environments and turn them into algorithms which provide the basis for visual and sound composition.
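The capture-to-sound chain described above can be sketched in a few lines. This is a minimal Python illustration of one plausible mapping, splitting a microscope frame into bands of brightness that drive pitch; the workshop's actual algorithm is not documented here and the function is a hypothetical stand-in:

```python
def frame_to_frequencies(frame, n_bands=8, f_min=110.0, f_max=880.0):
    """Split a grayscale frame (a list of rows, pixel values 0..255) into
    horizontal bands and map each band's mean brightness to a pitch
    between f_min and f_max. Brighter bands play higher notes."""
    rows_per_band = len(frame) // n_bands
    freqs = []
    for b in range(n_bands):
        band = frame[b * rows_per_band:(b + 1) * rows_per_band]
        level = sum(sum(row) for row in band) / (len(band) * len(band[0]) * 255.0)
        freqs.append(f_min + level * (f_max - f_min))
    return freqs

# Synthetic stand-in for a captured frame: dark field with one bright "cell"
frame = [[0] * 64 for _ in range(64)]
for y in range(8, 16):
    for x in range(20, 30):
        frame[y][x] = 200
freqs = frame_to_frequencies(frame)
```

The band containing the bright protozoan-like blob then sounds above the silent background bands, which stay at the base frequency.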


L'Orquestra dels Luthiers Drapaires (the Luthiers Drapaires Orchestra) is made of spectacular robotic instruments that have been created out of technological waste found on rubbish dumps and in the street. Telenoika has dissected the waste and enhanced it with a little help from circuit prototyping and acoustic research.

"Luthiers Drapaires" is proof that the waste we generate provides enough raw material to build sophisticated devices. Besides, the growing amount of tools and information available online provide everyone with the possibility to access the knowledge needed to turn rubbish into artworks.

Video by mediateletipos

For Sónar, the orchestra was composed of a percussion set made of electromagnetic pistons; a theremin made from two radios; an adapted television which works as an oscilloscope; a guitar made of string, a crate of wine and the engines from a hair removal machine; and a set of automated tubular bells.

Prepared Turntable, 2008

Yuri Suzuki brought some much-needed poetry to the exhibition. He displayed some of his charming Physical Value of Sound pieces but also a 2004 piece called Jelly Fish Theremin, in which the movement of a fish in a horizontal bowl controls the sound, air-conditioning, visual image and lighting.

Small goldfish were swimming inside the instrument at Sonar but the original work used a jellyfish: "I used jellyfish as the control center, since jellyfish are made up of 98% water, and I thought that the will of the water would be reflected in the movement of the jellyfish, if only a little. If we were able to create a space controlled by jellyfish, wouldn't it be the ultimate place of relaxation?"

And if you understand Japanese...

For a pretty accurate and smart review of the exhibition with videos, just run to mediateletipos.

Image on the homepage courtesy Yuri Suzuki.


Here's another project developed last month during Vision Play, one of the Interactivos? workshops organized by Medialab Prado in Madrid. This time i asked Horacio González and Paola Guimerans to tell us something about Biophionitos, a project they developed together with Igor González and other collaborators.


Biophionitos generates artificial life using a system similar to the zoetrope, an early animation device that produces an illusion of action from a rapid succession of static pictures. Horacio González, Paola Guimerans and Igor González added to the concept a touch of Processing and a whiff of Arduino to develop an interface able to create a physical animation which runs in old-style but interactive phenakistoscopes (one reacts to your caresses, another wakes up when you talk to it, etc.)

The virtual pet created with the system is made of a limited series of simple polygons which the program modifies in order to give the drawing what looks like biological life.

The artists also uploaded a tutorial to make your own Biophionitos.

Note to Spanish-speaking readers and in particular to the lovely people at TECAT (a great blog about media art i just discovered courtesy of Marcos, os lo recomiendo) who have kindly translated some of the Interactivos? posts into Spanish: i pasted at the bottom of this post the original answers of Horacio and Paola. They wrote to me in Spanish and i translated their text into English.


Can you tell us what lies behind the name Biophionitos? Why did you decide to call the project this way?

Well, this is a strange story. Horacio and i have been working together as a team for a few years under the name VHPlab. A few months ago, he came to spend some time in New York and we had the opportunity to travel together to San Francisco.

On our way back, in the plane, we started developing the project and thinking about the physical visualization of an auto-generative image. After much discussion, we managed to shape the idea and, for various reasons, we agreed that the image should allude to a living being. We therefore decided that it would be an animal, like the one that appears on our logotype.

The project matured conceptually and the time came to give it a name. At that moment, we knew that the prefix 'bio' had to be part of the name and we decided to use a game to complete the name.

Our objective was to invent a name that doesn't exist, just like when you are a kid and call something you don't know by a word you've invented. During the process we were reminded of Fiona, a very special child, the daughter of friends whom we had just met on that trip to San Francisco. Although it might sound surprising, we also started thinking of one of the longest and strangest words of the Spanish language: Parangaricutirimicuaros. It is a word almost impossible to pronounce, coming from a tongue twister, that Horacio and his sister used to mispronounce as Paranguanitos when they were kids. We used a fragment of each word to build the name of the project: bio-phio-nitos.


Why did you choose to keep a "vintage" and early cinema look to the project? How important is the retro design for biophionitos?

There is a side of the design which is retro and it is related to a series of old little games and toys that we had in mind while we were developing the project. One of them is the Cinexin. The idea of taking some inspiration from a zoetrope was nevertheless related to a conceptual objective that we had set ourselves right from the start: we wanted to reflect on the form in which a generative animation should be presented.

Many artists who work with Processing have to face the issue of finding a way to pass from the digital world to the physical one, of overcoming the limit set by the screen. Our intention was to use a totally physical and very rudimentary interface to display an animation. We wanted the interface to highlight the simple principle that makes any animation work, while revealing what usually stays hidden: the trick.

An animation is nothing but a sequence of images very similar to each other. However, as this is usually imperceptible, the spectator sees them as something difficult to understand and magical, something that is beyond their reach.

We believe that technology must be open and we also believe that making a technology open source is not enough to make it truly open. Technology must be accessible and understandable; users should be able to use it consciously. With Biophionitos, we have tried to develop a self-explanatory technology that reveals itself, that throughout its development unmasks its history and functioning.

In addition to the fact that the final design is retro and inspired by toys from previous decades, we also refer to the idea of the virtual pet and to the new digital toys. The idea of creating an interactive version of Biophionitos and the fact that spectators have the possibility to create their own virtual pet evoke the Tamagotchi and the way users relate to them in order to keep them alive. We assigned phrases charged with sentiment to each box so that users would feel some kind of empathy when they activate the animation and would somehow feel the magic that those first inventions had.


What was the biggest challenge you met with when developing the project and how did you overcome it?

The project has various dimensions and readings. It was difficult to make them tangible, to have team members agree on their meanings and to establish a series of priorities that had to be met during the 15 days of the Madrid workshop.

The presentation to Interactivos meant that we had to clarify how the work would be distributed and the time necessary to fulfill each task. The team worked perfectly well. Horacio was in charge of developing the Processing application. Igor took care of the electronic, mechanical and Arduino side of the project. I was more active on the design and creative aspects of the work. But the help of the various collaborators was essential to enable us to complete the project on time.

On the other hand, Horacio couldn't come to Madrid during Interactivos?. We had to work at a distance, discussing over Skype or mobile phone.

Working from a distance is always frustrating; there is no direct relationship between what you ask and what you get in exchange. You must invest much more time in communication. Some issues and decisions taken on the spot only end up emerging in the course of a conversation: one side had taken them for granted while the other side had no idea about them. Everything happens with a delay and during the waiting time misunderstandings happen. Each time i made a modification i wanted to know how it would affect the final result inside the zoetrope. However, i couldn't see it until someone printed the new Biophionito, recorded it on video and uploaded it to Youtube. Very often, they would just call me and try to describe what could be seen in order to accelerate the process, but trying to imagine in my head what they were explaining was tremendously difficult, almost hilarious.


I'm interested in the generative part of the work. How does it work exactly? What kind of data do you feed the system? Why not draw the little creatures yourself?

It's the beta version of the program and it needs to be improved; we developed it over the course of the two-week Interactivos? workshop in Madrid. Each pet is made of a series of polygonal vertebrae; the first vertebra is the head of the animal and the rest constitute its body. When the pet moves, the body follows the head in a decelerated way. Therefore, according to the direction and speed of the movement, the pet will stretch or shrink.

The idea is very simple: each vertebra is made of two segments. The first segment of each vertebra is attached to the previous vertebra and the second segment of each vertebra follows the first segment. As the response of the second segment is not identical to the movement of the first segment, each vertebra will shrink or stretch according to the speed and direction of the previous vertebra.
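The "decelerated follow" described above boils down to each vertebra moving a fixed fraction of the way toward its predecessor on every frame. A minimal Python sketch of that principle (the original is a Processing application; the easing factor of 0.35 is an invented value):

```python
def step_pet(head, body, follow=0.35):
    """Advance one animation frame: each vertebra moves a fraction
    ('follow') of the distance toward the vertebra in front of it.
    A fast-moving head stretches the chain; a resting head lets the
    body bunch back together."""
    updated = [head]
    for (x, y) in body:
        px, py = updated[-1]                      # already-updated predecessor
        updated.append((x + (px - x) * follow,
                        y + (py - y) * follow))
    return updated[1:]

# The head jumps to the right; the body trails behind, stretched out.
body = [(0.0, 0.0), (0.0, 0.0)]
body = step_pet((10.0, 0.0), body)
```

After one step the first vertebra has covered 35% of the gap to the head, and the second only 35% of the gap to the first, which is exactly the stretch-and-shrink behaviour Horacio describes.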

For the moment, when users create their pet, they can add vertebrae and shape it as they wish. It's a fairly rough way of drawing but for a first version it worked fairly well.

Our objective is to end up converting each of the points that compose the segments of the vertebrae into vertices of a single Bézier curve. That way the drawing will be more detailed and free; its profile will be curved rather than polygonal.

The inclination, distance and size of the various segments that form the pet condition the way each vertebra stretches and shrinks. Although the path is always the same, each pet behaves in its own way, because when they move each vertebra has its own peculiar way of reacting. This totally modifies the way we perceive the movements of the pet.


Do you plan to develop the project any further?

We would love to see people make their own Biophionitos at home, using the D.I.Y. tutorial, and we hope it could generate some online feedback about the results. Besides, after Sonarmàtica, we have been invited to give a few workshops about Biophionitos. It could be an interesting opportunity to enrich the project and hear about the perspective of the users.

For future versions, we would like to have the program analyze the way users design. It would use characteristics such as the number of vertebrae or their proportions and, on that basis, alter the path of the pet or the way some vertebrae connect to each other. In any case, the movement of the pet is deeply influenced by the final interface. Each zoetrope can only contain 16 different images, so the animation must be very simple and cyclic. Our objective is not to create a complex or spectacular animation but to develop an easy-to-use and fun interface that will generate a pet which, through its functioning, reveals the mystery behind animation.
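The 16-image constraint means each pet's motion has to be sampled as one closed cycle, so the last printed frame flows seamlessly back into the first when the drum spins. A sketch of how such a loop could be generated; the frame count is the only number taken from the interview, and the sinusoidal "swim" motion is an invented example:

```python
import math

def zoetrope_cycle(amplitude=20.0, n_frames=16):
    """Sample one full period of a sinusoidal swimming motion into
    exactly n_frames poses. Because the sampled interval covers a
    whole period, frame n_frames - 1 leads straight back to frame 0."""
    return [amplitude * math.sin(2 * math.pi * k / n_frames)
            for k in range(n_frames)]

poses = zoetrope_cycle()
```

Any motion curve works as long as it is periodic over the 16 slots; a non-periodic curve would visibly "jump" once per rotation of the zoetrope.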

Thanks Paola and Horacio!

More biophionitos images.

Now for the Spanish version


Can you tell us what lies behind the name Biophionitos? Why did you decide to call the project this way?

Paola > Bueno es una historia curiosa. Horacio y yo llevamos trabajando en equipo desde hace varios años bajo el nombre de VHPlab (www.vhplab.net).Hace unos meses, el vino a pasar una temporada a NY y tuvimos la oportunidad de viajar juntos a San Francisco.

Durante el vuelo de vuelta, comenzamos a desarrollar el proyecto y a reflexionar sobre la visualización física de una imagen auto-generativa. Después muchas discusiones, logramos dotar de forma a la idea y por diferentes motivos, entendimos que la imagen debería hacer alusión a un ser vivo. Entonces, decidimos que fuera un animal, como el que aparece en nuestro logotipo.

El proyecto fue madurando conceptualmente y llegó el momento de buscar un nombre. En ese momento, teníamos claro que el prefijo "bio" debía formar parte del nombre, así que decidimos completar el resto de la palabra partir de un juego.

Nuestro objetivo era inventar un nombre que no existiera, como cuando eres un niño y denominas a algo que no conoces con una palabra inventada. Durante el proceso nos acordamos de Fiona una niña muy especial, hija de unos amigos que acabábamos de conocer durante nuestro viaje a San Francisco. Por raro que parezca también nos vino a la cabeza una de las palabras mas largas y raras que existen en castellano: Parangaricutirimicuaros. Se trata de una palabra imposible de pronunciar, proveniente de un trabalenguas, que Horacio y su hermana solían pronunciar erróneamente como Paranguanitos, cuando eran pequeños. Utilizamos un fragmento de cada palabra para construir el nombre del proyecto; bio-phio-nitos.

Why did you choose to keep a "vintage" and early cinema look to the project? How important is the retro design for biophionitos?

Horacio > Hay un aspecto del diseño que es retro y está relacionado con toda una serie de juguetes antiguos que tuvimos en mente cuando desarrollamos el proyecto, como el Cinexin (http://usuarios.lycos.es/los80/id64.htm). Sin embargo, la idea de inspirarnos en un zootropo estaba relacionada con un objetivo conceptual que nos planteamos desde el principio; queríamos reflexionar sobre la forma en que se presenta una animación auto-generativa.

Muchos artistas que trabajan con Processing se encuentran con el problema de cómo trascender del mundo digital al físico, de cómo superar la limitación que supone la pantalla. Nuestra intención era utilizar un soporte enteramente físico y muy rudimentario para mostrar una animación. Queríamos que el propio soporte pusiese en evidencia el sencillo principio que hace funcionar a cualquier animación, revelando aquello que normalmente queda oculto; el truco.

Una animación no es más que una secuencia de imágenes muy similares entre si. Sin embargo, como normalmente esto resulta imperceptible, el espectador las percibe como algo incomprensible y mágico, algo que está más allá de su alcance. Nosotros creemos que la tecnología debe ser abierta y además creemos que la tecnología no es abierta únicamente por el echo de ser open source. La tecnología debe ser accesible, comprensible, los usuarios deben poder hacer un uso consciente de la misma. En Biophionitos hemos intentado desarrollar una tecnología auto-explicativa, que se revele a sí misma, que en su desarrollo, recoja su historia y su funcionamiento.

Paola > Por otro lado, añadir que aunque el diseño final es retro y está inspirado en juguetes de hace décadas, también contemplamos como referencia de la idea de la mascota virtual y de los nuevos juguetes digitales. La idea de crear una versión interactiva de Biophionitos y el hecho que el espectador tuviera la posibilidad de crear su propia mascota virtual, nos recordó a los Tamagochi y a como el usuario se relaciona con ellos para mantenerlos vivos. Asignamos frases cargadas de sentimientos a cada una de las cajas para conseguir cierta empatía por parte del espectador en el momento de activar la animación y así transmitir de algún modo la magia que tenían estos primeros inventos.

What was the biggest challenge you met with when developing the project and how did you overcome it?

Paola > El proyecto tiene muchas dimensiones y lecturas diferentes. Resultó difícil darle forma, que todos los miembros del equipo lo entendiesen del mismo modo y establecer una serie de prioridades, para garantizar que se pudiese llevar a cabo durante los quince días del taller en Madrid.

Presentarlo a Interactivos requería tener muy claro el reparto de trabajo y los tiempos de desarrollo. El equipo funciono muy bien, porque que cada uno se hizo cargo de una parte clave del proyecto. Horacio se encargó del desarrollo de la aplicación en Processing. Igor de la parte mecánica y electrónica, de Arduino. Y yo, del diseño y la creatividad. En cualquier caso, la ayuda de los diferentes colaboradores fue determinante para poder terminar en tan poco tiempo.

Por otro lado, Horacio no pudo venir a Madrid durante Interactivos. Tuvimos que trabajar a distancia, manteniendo conversaciones a través de Skype o del movil.

Horacio > Trabajar de forma remota es siempre algo frustrante, porque no hay una relación directa entre lo que demandas y lo que recibes. Es necesario invertir mucho más tiempo en la comunicación. Siempre hay cuestiones y decisiones tomadas en el momento, que una de las dos partes del equipo presupone y la otra desconoce hasta que aparecen en una conversación. Todo sucede en diferido y durante los tiempos de espera suele haber malentendidos. Cada vez que yo hacia un cambio, estaba deseando conocer como afectaba al resultado final en el zootropo. Sin embargo, no podía verlo hasta que alguien imprimía el nuevo Biophionito, lo grababa en video y lo subía a Youtube. En muchos casos me llamaban y trataban de describir como se veía para acelerar el proceso, pero resultaba tremendamente difícil, casi cómico, tratar de hacerse a la idea.

I'm interested in the generative part of the work. How does it work exactly? What kind of data do you feed the system? Why not draw the little creatures yourself?

Horacio > Es una primera versión del programa, aun necesita muchas modificaciones porque se desarrolló a lo largo de las dos semanas que duró interactivos en Madrid. Cada mascota se compone de una serie de vértebras poligonales; la primera de las vértebras es la cabeza del animal y el resto componen su cuerpo. Cuando la mascota se mueve, el cuerpo sigue a la cabeza de forma decelerada. Así, en función de la dirección y velocidad del movimiento, la mascota va estirándose y encogiéndose.

La idea es muy simple cada vértebra está compuesta de dos segmentos. El primer segmento de cada vértebra está pegado a la vértebra anterior y el segundo segmento de cada vértebra sigue al primer segmento. Como la respuesta del segundo segmento no es idéntica al movimiento del primer segmento, cada vértebra se encoge o estira en función de la velocidad y dirección a la que se mueve la vértebra anterior.

De momento, cuando el usuario crea su mascota puede ir añadiendo vértebras y dándoles la forma que desee, es un sistema un poco rudimentario de dibujo pero para una primera versión ha funcionado bastante bien.

Con el tiempo, nuestra intención es convertir cada uno de los puntos que componen los segmentos de las vértebras en vértices de una única curva bezier. Así será posible hacer un dibujo mas detallado y libre obteniendo un perfil curvo y no poligonal.

La inclinación, la cercanía y el tamaño de los distintos segmentos que componen la mascota condicionan la forma en que cada vértebra se estira y encoge. Aunque el recorrido es siempre el mismo, ninguna mascota se comporta del mismo modo, porque al girar cada una de las vértebras tiene una forma particular de reaccionar. Esto modifica enteramente la forma que percibimos el movimiento de la mascota.

Do you plan to develop the project any further?

Paola > Nos gustaría que la gente empezase a hacer Biophionitos en su casa gracias al D.I.Y. y que esto generase cierta clase de feedback con los resultados a través de la web. Además, después de estar en Sonarmatica, ha surgido la oportunidad de impartir algunos workshops sobre Biophionitos. Puede ser una oportunidad interesante para enriquecer el proyecto y conocer la perspectiva de los usuarios.

Horacio > Queremos que el programa analice la forma que dibuja el usuario, en futuras versiones. Que utilice ciertas características como el número de vértebras o su proporción, para alterar el recorrido de la mascota o la forma en que unas vértebras se relacionan unas con otras. En cualquier caso el movimiento de la mascota está muy condicionado por el soporte final. Cada zootropo puede contener únicamente 16 imágenes diferentes, por lo que la animación debe ser muy sencilla y cíclica. Nuestro objetivo no es hacer una animación compleja ni espectacular, sino desarrollar una interfaz sencilla y divertida para crear una mascota que, en su funcionamiento, revele al usuario el misterio de la animación su naturaleza.

opensourcery is what you get when you throw a master of bewitching installations and a "real" magician right into a workshop dedicated to magic and illusion.

The workshop was Interactivos?, which took place in June at MediaLab Madrid. The illusionists are Zachary Lieberman and Mago Julián ("Julian the Magician" in English).

opensourcery is a performance which marries camera-based technology with old-fashioned close magic to manipulate a live video image seamlessly and create new tricks. The custom-developed software is completely open source (thus the title) and designed as a starting point for imagining a new language of tricks and techniques for magical expression.

A few questions to Zack:

The coin follows the finger, then moves on its own + The magic eye reveals the chosen card

I suspect that it was the first time that you worked with a magician. How did the collaboration go? Did Mago Julián come with an idea asking for some technical help or did you develop the whole concept together? And did he teach you a few tricks you plan to use in your future work?

First of all, the collaboration came out of the excellent advice of José Luis de Vicente and Oscar Abril, two Sonar curators who noticed a similarity between how my performance Drawn works and how the magician Mago Julián uses an overhead camera to perform close magic.

It was the first time I've ever worked with a magician, and it was really surprising to see the differences in how we work and make work.
For example, magicians are very secretive about their techniques - it took a long time for Julián to warm up to showing me some of the "behind the scenes" while we were working together. I still have absolutely no idea how he does many of the things he does live. In contrast, I was eager to demonstrate all of the techniques and hidden systems that make the projects I've worked on, like Drawn or Manual Input Sessions, work.

However, there are many, many similarities too - magicians are essentially hackers, in most cases of physical objects but also of mental processes, and my software hacking and his object hacking worked completely well in parallel; we completely understood each other from day one. Also, we both like to create a sense of wonder in the audience.

Since we didn't have a lot of time to develop the project during interactivos, we started with Drawn, and he spent some time learning and performing with it. I also spent time examining his close magic performance, and learning about the kinds of things he might need in a performance. What was amazing is that his magic is so good, he really doesn't need any help with technology, so it was a very nice starting point. We started to identify needs - for example, to take a snapshot of an object and reveal / hide that snapshot when it is covered, so that his act can have a certain amount of freedom. I recoded a great deal of Drawn (in order to make a clean, open source project) and we spent a lot of time just playing with different effects and ideas.

An amazing thing that happened was that Mago Julián and Punkie (his wife and partner in the act) started completely hacking the software. They would take different bugs or problems and flip them into remarkable tricks. Every day I would come to the workshop and Julián would be like, "We have to show you this!" The last trick, for example, when Julián reveals the card through a magic eye was completely based on bugs in the rendering. I cringed the first time he showed me (as a programmer I hate to see those bugs) but the cringe quickly became a huge laugh.

Video (excerpt) of the performance at Sonar / SonarMática 2007:

How does opensourcery work technically?

It's software programmed in C++, using the openFrameworks library, that takes a live video image, composites it with synthetic graphics and then reprojects the result to create something which is seamless and looks just like live video. The software is based on Drawn, and is completely open source.

During the performance a second operator (in this case the Magician's wife) works backstage to control the software, but it could be programmed to work with wireless devices or switches.
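The core move, compositing synthetic pixels into the live camera frame so the projection still reads as untouched video, can be sketched independently of openFrameworks. A toy Python version operating on flat pixel lists; the real software is C++ and works on camera textures, so this is an illustration of the idea, not the project's code:

```python
def composite(live, synthetic, mask):
    """Per-pixel blend: where mask is 1.0 show the synthetic layer
    (e.g. a stored snapshot of an object), where it is 0.0 pass the
    live camera pixel through untouched. Reprojecting the blended
    frame makes the trick invisible to the audience."""
    return [l * (1.0 - m) + s * m for l, s, m in zip(live, synthetic, mask)]

live      = [100, 100, 100, 100]   # camera pixels
synthetic = [200, 200, 200, 200]   # stored snapshot
mask      = [0.0, 0.0, 1.0, 1.0]   # the backstage operator reveals the right half
out = composite(live, synthetic, mask)
```

The backstage operator mentioned above would, in this sketch, simply be the person editing the mask: revealing or hiding the snapshot is just flipping mask regions between 0 and 1.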

Do you plan to go any further in the development of this piece?

Yes, one of the nice outcomes of the Sonar performances is that we have been invited to several magic festivals. For me this is very exciting because, while I typically work in new media festivals, I have never even been to a magic festival, let alone performed in one. We are going to develop several new tricks and refine the current ones.

Additionally, we have made the software completely open source, and in early fall we will make a manual and tutorials available so that anyone who wants to perform these tricks (or develop new ones) should be able to. We look forward to other people participating or using the software. While a magician almost never reveals his tricks, we want to do the exact opposite.

Thanks Zack!

*Previous episodes: I Thought Some Daisies Might Cheer You Up, Delicate Boundaries, Palimpsesto and Augmented Sculpture v 1.0. Interview with Marcos García from MediaLab Madrid.

As part of Sonarama's celebration of "the year of Japan," Toshio Iwai had been invited to give a performance of the Nintendo musical game Electroplankton and of the sound + light musical instrument Tenori-On.

I don't need much to be convinced that the man is a genius. He could come on stage and do absolutely nothing else than dance flamenco and i would still clap my hands enthusiastically. People with more common sense than me were nevertheless mesmerized by his show. And i've got good news for you if you were not in Barcelona. Toshio Iwai will give another performance in Manchester as part of the Futuresonic festival at Academy 2, on Friday 21st July. I won't miss that, can't think of a better way to celebrate Belgium's National Day ;-)


The point wasn't to make wonderful music that you could listen to with your eyes closed but to show the possibilities of the instruments he developed. There were three of them on stage:

Iwai and Yu Nishibori, who is in charge of the Tenori-on project at YAMAHA, were both playing Tenori-on and Electroplankton, while Naoaki Kojima played Sound-Lens.

Sound-Lens, created by Iwai in 2001, is a mobile art piece which converts light into sound. First realised as an installation, Sound-Lens was used at Sonar as a musical instrument. In the installation version, participants are given a Sound-Lens receiver and headset. They can then walk around the exhibit in search of light sources fixed on the walls and ceilings. When the receiver lens is held up against the lights, sounds hidden in the lights can be heard through the headset. Furthermore, each participant can transform the sounds interactively by moving the Sound-Lens receiver around, adjusting its distance or angle in relation to the light sources.
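Hiding sounds in light, as described above, essentially amounts to modulating a lamp at audio rates and letting a photodetector recover the waveform. A guessed-at Python sketch of that decoding principle; Iwai's actual circuit is not documented here, and the 440 Hz flicker is an invented example:

```python
import math

def light_to_audio(samples, gain=1.0):
    """Decode a stream of light-intensity readings the way a photodiode
    front end might: subtract the DC offset contributed by steady ambient
    light, then amplify the remaining modulation into an audio signal."""
    dc = sum(samples) / len(samples)
    return [gain * (s - dc) for s in samples]

# A lamp flickering at 440 Hz around a constant ambient level:
rate = 8000
light = [0.5 + 0.2 * math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
audio = light_to_audio(light)
```

Moving the lens closer to or further from a light source would scale the modulation depth, which is one plausible reading of how distance and angle transform the sound.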


For the Sonarama performance, 25 blue-green LEDs were placed in a matrix on a stand in the center of the stage. The LEDs are designed to play a musical scale, so Naoaki Kojima played Sound-Lens like a musical instrument by moving the lens vertically or horizontally.

More about the performance: Luca Barbeni reviewed it on teknemedia (in Italian), I put some images on flickr and i just read at Chris' place that the Tenori-on now has a weblog.

The Sonar festival isn't just music, clubbing and dancing. There were also tons of wmmna-esque works to eyeball at the SonarMática exhibition. Curated by Drew Hemment, José Luis de Vicente, Óscar Abril Ascaso and Advanced Music, the show, called Always On, is the third episode of a SonarMática series that focuses on the representation of territory.

The first edition of the series, in 2004, was dedicated to Micronations; last year's, called Randonnée, offered a glance at 21st century landscaping, from new figurativism to augmented reality, virtual architecture and datascapes.

This year the exhibition revolved around mobile culture and location projects.

During his talk at the Santa Mònica Art Centre on Friday, Jose Luis de Vicente explained the origin of the title Always On. Oscar Abril Ascaso proposed the term "On" because it means "on, activated, switched on" but in Catalan it also means "where." "On" thus suggests both an idea of place and an idea of activity.

rnd#06 underworld

De Vicente added that the idea of urban territory nowadays also encompasses a very crucial, ubiquitous yet intangible element: connectivity. He mentioned several books: Hertzian Tales by Anthony Dunne, Design Noir: The Secret Life of Electronic Objects by Anthony Dunne and Fiona Raby, and Me++ by William Mitchell, and showed the always fascinating work of Richard Fenwick: rnd#06 underworld, a short video that shows a day in any given city, with the networks of communications and broadcasts superimposed over the city like a veil.

In the past the walls of a city were its most important element, a symbol of its cultural and political power. The most powerful elements of a city today are its fluid spaces, made of transports (the map of the London Underground is more emblematic of the city than any other map), electromagnetic waves, radio communications, mobile phone communications, etc. Some cities today define themselves by their TV or radio tower and no longer by their cathedral (cf. Berlin or Shanghai.) The transition from the cathedral to the antenna can sometimes take a very ironic form: the support pole for the golden angel weathervane on Guildford Cathedral in the UK is actually a mobile mast and carries several antennas.

The first ideas of fluid space emerged in the '60s with Superstudio's works and Archigram's Plug-In City.

The topography of networks doesn't coincide with the physical topography (cf. Graz' mobile phone landscape.)

Back to the exhibition itself: it focuses on locative media and on technological and cultural works based on establishing a relationship between information and location.

Some projects were, for the first time, taking Sonar participants out into the streets: i already mentioned Blast Theory's Day of the Figurines (after three days of constant texting i'm now very low on phone credit but also missing my figurine, what's happening to her? will i be able to find her again in September when Blast Theory presents the full-fledged version of the game in Berlin?). Other outdoor projects:

- Akitsugu Maebayashi's fascinating Sonic Interface. Equipped with a computer in a backpack and headphones, you follow a guide through the city streets, shopping malls, or the underground. The sound you perceive through the headphones reflects the actual urban soundscape but with some surprises: the noises either come in mosaic or are amplified or repeated. Perceiving a shift between sight and sound, the subject finds himself in a new universe, liberated from unified perception.

Sonic Interface and Life: A User's Manual

- Michelle Teran's Life: A User's Manual uses a very common tech device to "hack" into surveillance cameras and see, on her own screen, what they see, as an art performance. Every day at 9 pm, Michelle Teran invited people to follow her on a "surveillance hacking" tour of Barcelona.

- Counts Media's famous Yellow Arrow;
- and Geocaching, an outdoor treasure-hunting game in which participants use a GPS receiver or other navigational techniques to hide and seek containers (called "geocaches" or "caches") anywhere in the world.

Inside the exhibtion:

- Antoni Abad's beautiful zexe.net, a project that allows people with disabilities, prostitutes, gypsies and taxi drivers to broadcast from mobile phones;

zexe.net and TTSM

- Alejandro Duque's TTSM (Typewriter Tracklog Sewing Machine) uses a GPS device to track and save the data of a journey without destination. Artist's links: [k.0]_lab, s.o.u.p.

- Jeremy Wood - "GPS Drawing"- Meridians;

- Jens Brand's gPod / G-Player;

- a Zapped! kit by Preemptive Media. They were even showing a cockroach with an RFID tag on its back. Last year, Preemptive Media had attached reprogrammed RFID tags to roaches that, if placed in a Wal-Mart store, would taint its RFID database. The group distributed the roaches at the show's opening, sending them home with gallerygoers in Styrofoam coffee containers.

- Proboscis (UK), Urban Tapestries / Social Tapestries;

Bio Mapping

- Christian Nold's Bio Mapping, which is one of my favourite projects ever! People are sent into the streets with a Bio Mapping tool to record their Galvanic Skin Response, a simple indicator of emotional arousal, in conjunction with their geographical location. Using Google Earth, Bio Mapping indicates by the height of each track point the individual's physiological arousal at that geographic location.

- Mark Shepard's Tactical Sound Garden Toolkit;

- Socialfiction's .walk, which raises regimentation to an art form by giving instructions for a walk through a city. These instructions correspond to an algorithm and can be traced back to a simple computer programme;

- The Interpretive Engine for Various Places on Earth, by Jeff Knowlton and Naomi Spellman, is a location-based narrative that relies on Wi-Fi to tell a story specific to user location.
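To give an idea of how literally a .walk can be "traced back to a simple computer programme", here is a hedged sketch of such a psychogeographic program. The grid of streets, the instruction format and the specific "2nd left, 2nd right, 2nd left" loop are assumptions for illustration, not Socialfiction's actual code.

```python
# A minimal sketch of a .walk-style program: the walker lives on an
# idealized grid of streets and executes a looping list of instructions
# such as "walk past 2 streets, then turn left". All details here
# (grid, instruction format) are assumptions for illustration.

# heading vectors: north, east, south, west
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]

def dot_walk(program, steps, start=(0, 0)):
    """Trace the grid positions visited while running `program` in a loop.

    `program` is a list of (streets_to_walk_past, turn) pairs, where turn
    is 'left' or 'right'; each street crossed is one grid unit.
    """
    x, y = start
    heading = 0  # start facing north
    path = [(x, y)]
    for i in range(steps):
        walk, turn = program[i % len(program)]       # loop the program
        dx, dy = HEADINGS[heading]
        x, y = x + dx * walk, y + dy * walk          # walk past `walk` streets
        heading = (heading + (1 if turn == 'right' else -1)) % 4
        path.append((x, y))
    return path

# A classic-style .walk: 2nd left, 2nd right, 2nd left, repeat.
print(dot_walk([(2, 'left'), (2, 'right'), (2, 'left')], steps=6))
```

Running the same three-line "program" for more steps produces a drifting, deterministic path through the city, which is exactly the regimented-walk idea described above.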

I really enjoyed Always On: the great mix of outdoor and indoor projects, the calm of the exhibition space just below the Sonar by Day frenzy, and a focus on a particular theme so clear that it put the individual projects into a broader perspective. I guess i need to see more shows like that, i'm so used to festivals that showcase the "latest" and the "coolest" or to new media art exhibitions with a very vague theme.

Jens Brand's G-Player (Global Player) works like a CD player, but instead of playing CDs, it plays the globe. The device knows the position of more than a thousand satellites and enables you, by the use of a virtual 3D planetary model, to listen to an imaginary trace of a selected flying object.


SonarMática was showing the portable version of the G-Player, unsurprisingly called gPod. Select one satellite from the menu (400 satellite orbits are available) and the device will analyse in real time the topographical profile of the region that satellite is flying over at the moment. gPod/G-Player then translates this data into sound: oceans have no sound, flat topographies produce high frequencies and mountainous regions low ones.
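The sonification rule is simple enough to sketch in a few lines. Everything below beyond the stated rule (oceans silent, flat land high-pitched, mountains low-pitched) is an assumption: the frequency range, the linear interpolation and the 8000 m ceiling are made up for illustration, not taken from Brand's device.

```python
# Hedged sketch of a gPod-style sonification: terrain elevation under a
# satellite's ground track is mapped to a tone. Oceans are silent, flat
# land is high-pitched, mountains are low-pitched. The frequency range
# and linear mapping are illustrative assumptions.

def elevation_to_frequency(elevation_m, f_low=110.0, f_high=1760.0,
                           max_elevation_m=8000.0):
    """Map terrain elevation (metres) to a tone frequency in Hz.

    Sea level or below -> None (oceans make no sound); higher terrain
    -> lower pitch, linearly interpolated between f_high and f_low.
    """
    if elevation_m <= 0:
        return None  # ocean: no sound
    t = min(elevation_m / max_elevation_m, 1.0)
    return f_high - t * (f_high - f_low)

# An imaginary ground track: sea, coast, plain, foothills, high peak.
profile = [-200, 5, 300, 2500, 8000]
print([elevation_to_frequency(e) for e in profile])
```

Fed with a real elevation profile sampled along the satellite's track, a loop like this would produce the descending drone you'd expect when the satellite crosses a mountain range.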

If you have an old iPod that you want to get rid of, Brand might modify it for you with his "satnav" application.

More images.
