Four years before the Apple PowerBook got “motion sensored” (and repurposed), Jonah Brucker-Cohen‘s LiveWindow installation attempted to translate the physicality of the real world into the virtual. Any vibration on the floor around the computer was relayed to a browser window that could be accessed via the Internet. If the room sensed vibration, the window began to shake and its text began to fall.
An earthquake sensor on the floor measured vibration in the room. When someone hit the monitor or jumped, the Geophone sensors sent a signal to the computer, which relayed it over the network to a server, which in turn passed the message on to the browser window. On receiving the message, the window triggered some local JavaScript that shook the window and made the text fall.
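The browser-side behavior might be sketched roughly as below. This is a hypothetical reconstruction, not the project's actual code: the function names and the 20-pixel amplitude scaling are assumptions, and `window.moveBy` stands in for whatever window-manipulation calls the original JavaScript used.

```javascript
// Hypothetical sketch: map a vibration amplitude (0..1) from the
// Geophone message to a burst of window-position offsets.
function shakeOffsets(amplitude, steps) {
  const max = Math.round(amplitude * 20); // assumed: up to 20 px of jitter
  const offsets = [];
  for (let i = 0; i < steps; i++) {
    // alternate direction so the window ends up back near its origin
    const sign = i % 2 === 0 ? 1 : -1;
    offsets.push({ dx: sign * max, dy: sign * max });
  }
  return offsets;
}

// Apply the offsets to the browser window (no-op outside a browser).
function shakeWindow(amplitude) {
  for (const { dx, dy } of shakeOffsets(amplitude, 10)) {
    if (typeof window !== "undefined") window.moveBy(dx, dy);
  }
}
```

With an even number of steps, the alternating offsets cancel out, so the window jitters violently but returns to roughly where it started.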
LiveWindow also worked with other inputs: a light sensor for ambient light changed the background color of the window, a microphone picked up room volume and changed the size of the window, and the amount of movement in the room caused the window to move around accordingly.
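The sensor-to-window mappings above could look something like the following sketch. All of these function names, ranges, and the grayscale/linear mapping choices are illustrative assumptions, not details from the original installation.

```javascript
// Hypothetical mapping from a normalized light reading (0..1) to a
// window background color: darker room -> darker grayscale background.
function lightToBackground(light) {
  const v = Math.round(light * 255);
  const hex = v.toString(16).padStart(2, "0");
  return "#" + hex + hex + hex;
}

// Hypothetical mapping from normalized room volume (0..1) to a window
// dimension in pixels: louder room -> larger window.
function volumeToSize(volume, minPx, maxPx) {
  return Math.round(minPx + volume * (maxPx - minPx));
}
```

In a browser, the results would presumably be fed to calls like `document.body.style.background` and `window.resizeTo`.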
LiveWindow is one of the Physical Web Interfaces projects by Brucker-Cohen.
Via Coin-operated.