Virtual chopsticks

Yoshifumi Kitamura and his team at the Human Interface Engineering Laboratory at Osaka University are investigating a variety of new tools by changing the software parameters of a tool through a standardized interface. These experiments should help them design comfortable user interfaces and analyze the processes that occur while people learn to use new tools.
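To make the idea of "one standardized interface, many tools via software parameters" a bit more concrete, here is a minimal sketch of how such a setup could look in code. All of the names and parameters below are my own assumptions for illustration, not anything from Kitamura's system.

```python
from dataclasses import dataclass

@dataclass
class VirtualToolParams:
    """Hypothetical software parameters that could define a virtual tool."""
    length: float        # tool length (assumed unit: metres)
    stiffness: float     # resistance when the tool closes on an object
    tip_friction: float  # how easily a grasped object slips

class VirtualTool:
    """Assumed standardized interface: every tool maps the same hand
    measurements to a tool state, while its behaviour is tuned by params."""

    def __init__(self, params: VirtualToolParams):
        self.params = params

    def update(self, joint_angles: list[float]) -> dict:
        # Placeholder mapping from measured finger joint angles to a tool
        # state; a real system would use calibrated hand kinematics.
        opening = sum(joint_angles) / max(len(joint_angles), 1)
        return {"opening_angle": opening, "length": self.params.length}

# Swapping the parameters would turn the "same" tool into a different one
# without changing the interface the user's hand motion drives.
chopsticks = VirtualTool(VirtualToolParams(length=0.22, stiffness=0.1, tip_friction=0.8))
tweezers = VirtualTool(VirtualToolParams(length=0.10, stiffness=0.5, tip_friction=0.3))
```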

[Image: manipulate2ss.jpg]

They first tested the system with chopsticks because, although they are very simple in form, they have multiple functions. I am a complete moron in front of chopsticks, but it seems that their tool enables even users who cannot handle real chopsticks properly to operate the virtual chopsticks in a way that matches their mental image, driven by multiple joint angles measured from their finger motion.
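To give a rough idea of what "operating virtual chopsticks from measured joint angles" might mean in practice, here is a tiny, purely illustrative sketch. The joint counts, the linear mapping and the function names are assumptions of mine, not details from the Osaka University work.

```python
import math

# Illustrative assumption: a glove reports the index finger's three joint
# angles and the thumb's two, all in radians.
INDEX_JOINTS = 3
THUMB_JOINTS = 2

def chopstick_opening(index_angles, thumb_angles, max_opening_deg=40.0):
    """Map measured finger flexion to how far the virtual chopsticks open,
    so the on-screen motion follows the user's intended pinch.

    The more the fingers flex, the more the chopsticks close.  The linear
    mapping below is an assumption made for illustration only.
    """
    flexion = sum(index_angles) + sum(thumb_angles)
    max_flexion = (INDEX_JOINTS + THUMB_JOINTS) * math.pi / 2  # assumed range
    closed_fraction = min(max(flexion / max_flexion, 0.0), 1.0)
    return max_opening_deg * (1.0 - closed_fraction)

# Example: a lightly flexed hand gives mostly-open chopsticks (~37 degrees).
print(chopstick_opening([0.2, 0.1, 0.1], [0.1, 0.1]))
```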

Generating sign language from the user's intuitive hand actions is a possible application.

It kills me but it comes from Nicolas again!