Interview of Marc Böhlen
Marc Böhlen's website has provided me with some amazing stories ever since I started blogging: from the Open Biometrics Project that I posted back in 2004, to the Universal Whistling Machine, first prize at Vida 7.0, and the cursing Amy and Klara.
Marc is trained in Stone Masonry (sic), Art History and Electrical Engineering and Robotics. He has been an invited speaker at Cornell University, Harvard University, The Ludwig-Maximilians-Universität München, the Banff New Media Institute and the Royal Institute of Technology in Stockholm, etc. Marc is currently directing the MediaRobotics Lab at the University of Buffalo (Department of Media Study).
A few days ago, I read about his Shoeveillance project, a surveillance system that tracks pedestrian traffic in public buildings at... feet level. I wanted to know more about it, so I thought it made a good excuse for an interview:
Certainly! My approach overlaps to a strong degree with that of art-activism. It is also different. If you leave the development of technology to market forces alone, the solutions leave out too many market-irrelevant, but important, interesting and pleasurable factors. Realtechsupport tries to cast engineering problems in a light that includes their social/cultural context, and tries to make this a design parameter that needs to be 'solved' (addressed, discussed, resolved) with the same diligence as the isolated technical problem.
Shoeveillance is a good example of this approach. Shoeveillance is capable of monitoring the passage of people into and out of a room (and by extension a building). It acknowledges the fact that, in this world, this might be necessary [There is a fire in a 100-story building – is there anyone on the 91st floor?]. At the same time, it counters data-creep and prevents data that is not "needed" from being collected. [The camera sees only feet and legs up to knee height; the machine algorithm finds directionality of motion and objects that look like shoes only].
In this regard Shoeveillance is related to previous work in biometrics. The Open Biometrics Project included the design of a fingerprint analysis system that made its probabilistic results apparent so one could watch the machine in action.
A friend of mine told me about an interaction designer who had devised a way for people living in a not very posh neighborhood of London to pass through the streets of their area while avoiding the gaze of the CCTV network. But it turned out that people were not happy with the idea; they actually liked being on surveillance camera. Do you think a system like Shoeveillance could make everyone happy: maximum data collection and minimum invasion?
Yes. Let's make everyone happy! Shoeveillance plays with the desire to be seen to some degree. Parading your shoes is a special kind of pleasure, an accepted form of exhibitionism. It is one I would like machines we share the world with to be fluent in. We will have to wait for compliments, though; the appreciation of good shoes is beyond AI today. In the HCI (Human-Computer Interaction) community, some people speak of 'shy sensors', sensors with low-bandwidth input (such as a button) from which you derive information based on the sensor's location. If you want to know if someone is sitting in a chair without watching them on a camera, for example, you put a button in the chair. Shoeveillance, however, takes in high-bandwidth data (streaming video). It is tamed physically (by its position on the ground and its lens system) and disciplined programmatically (by its algorithm) to notice nothing but shoes: incapable of being invasive, yet geared to be persistently and maximally shoe-centric. This is a new kind of problem solving, I think.
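The directionality idea Böhlen describes — a floor-level camera whose algorithm only reports whether a shoe-like blob crossed into or out of a room — can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code; the function name, the pixel threshold, and the sign convention (increasing y means entering) are all assumptions:

```python
# Hypothetical sketch of Shoeveillance-style direction finding.
# Input: the vertical centroid positions (in pixels) of one tracked
# shoe-like blob across successive video frames. Output: a crossing
# direction, with everything else (faces, bodies, identity) never seen.

def crossing_direction(y_positions, threshold=50):
    """Return 'in', 'out', or None from the net vertical displacement
    of a tracked blob. The 50-pixel threshold filters out loitering
    and detection jitter; its value is an assumption."""
    if len(y_positions) < 2:
        return None
    net = y_positions[-1] - y_positions[0]
    if net > threshold:
        return "in"      # blob drifted toward the far edge of the frame
    if net < -threshold:
        return "out"     # blob drifted back toward the near edge
    return None          # displacement too small to count as a crossing

# A blob moving steadily down the frame registers as an entry:
print(crossing_direction([10, 60, 120, 200]))   # in
print(crossing_direction([200, 120, 60, 10]))   # out
```

In a real system the centroids would come from a blob detector constrained to knee height, but the privacy property lives in exactly this kind of reduction: high-bandwidth video in, one ternary value out.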
Shoeveillance has found a first application in 8-bit Architecture. Can you tell us something about that project? How does Shoeveillance fit into 8-bit Architecture?
8-bit Architecture is a concept for a new joint between lived spaces and synthetic systems in the widest sense. To date, our new technologies are add-ons to the buildings in which we live and they enforce a master-slave relationship. The "smart home" got it all wrong. We don't need more convenience in our lives.
I like one of your older projects, Advanced Perception. Are you still investigating the field of "animal machine interaction design"?
Yes! That project really set the path for my work. The Universal Whistling Machine project continues the thrust with the question of communication beyond species boundaries. And the Glass Bottom Boat will move from land to sea creatures. I think we should replace HCI (Human Computer Interaction) with HAMI (Human Animal Machine Interaction). Animal activism meets robotics. The new world must have space for all of us.
How affected were the chickens by the presence of the robot?
At first, indifferent. Once the robot moved, they were very frightened. But having the robot 'announce' its motion prior to actually moving (by activating the motors for a fraction of a second so as to make a noise) made the chickens accept the machine much more readily. The robot was then instructed to avoid the feeding corner, so the chickens had their reserved territory. That seemed to help as well.
The results of these robot-chicken experiments were presented to both the scientific and the art communities. Did they react differently?
I had a famous chef (Rudi Stanish) cook omelets from the eggs of the chickens. In the gallery we had a taste-the-interaction session. People enjoyed the omelets and, hopefully, thought about a future world where animals, humans and robots roam freely.
I presented parts of the work in two scientific venues. There the interest was in "robots in adverse environments" (you can't sell omelets to the scientists). They were keen to hear about how to deal with the messy side of things (dirt, chicken droppings, reliability over time, control mechanisms). In the end, though, the discussion also turned to the 'high-end' topics of shared spaces for different species. But the technical diversion was necessary to get to that point.
You seem to mingle with the artistic as well as the scientific communities. What's your crowd? How open is the scientific community to art projects?
My crowd: mixed. I think people who are trying to (experimentally) find ways of living with the fallout from automation technologies respond to my work. I am not a visual artist. The science community is selectively appreciative of the work. I do solve problems and deal with the same kind of messy and complicated information processing issues the engineering sciences do. The criticism I get from the sciences is that I am weak on evaluating my "results". But I am really not into handing out questionnaires and running factorial-design experiments. I understand the critique, and understand where it comes from, but limit how far I accept it in order not to lose the thrust of the work.
On the other hand, one does have to really understand the questions the sciences are concerned about. You cannot speak the dialect of art (and expect to be understood) when you play the science game. Otherwise you are relegated to the role of a beautifier (and your voice is not taken seriously beyond that task). This was an important insight into really getting discussions going across disciplines: you have to be in the disciplines. This is a huge problem with much of the current "art meets science" endeavors. It really takes more than talk to move between different domains of knowing.
Other recent projects on your website deal with the replication of human features in artificial systems as well. Do you feel that the quest for the machine that will look and sound exactly like we do is senseless?
Yes, the quest for the synthetic system that looks, feels, acts and sounds like us is a dead end, in my view. Synthetic systems designers (from literature, cybernetics, robotics, and AI) have always been attracted to the mimesis of human features. Mostly because humans see humans as the pinnacle of evolution. If you are going to make synthetic life, why not work off the 'best' example you can find, the human, so the argument goes. But there is a different argument I am more attracted to. What the machine affords (in the sense of what it is capable/incapable of) is fundamentally different from how we are.
Machines are their own species; they are aliens, in a way. Sensors can see and hear things outside of our human perceptual boundaries. We have no access to microseconds as a computer does. Entities that can access this kind of 'stuff' are different, in a similar way as a dog that can hear sounds I cannot hear is different from me because of that.
More on the projects discussed here at Realtechsupport.