Another one in my obsessive list of interactive tables…
Lumisight Table, aka Interactive View-Dependent Display Table Surrounded by Multiple Users, by Yasuaki Kakehi from the University of Tokyo, can display different information in each direction on a shared screen and capture multiple users’ gestures simultaneously.
When you use a computer, your eyes are focused on a display and your hands are restricted to the keyboard and mouse. But if you’re collaborating with others, nonverbal communication channels, such as eye contact, facial expressions, and the handling of physical objects, are lost.
Around the Lumisight Table, users stay close enough to maintain nonverbal communication while they collaborate through a computer, since the display is physically single but visually multiple.
The Lumisight Table could be used for new kinds of video games, such as poker, chess or mahjong, but also for networked applications and for artistic creation and interaction.