Multi-Touch Interaction Research

Jefferson Y. Han, together with Philip L. Davidson, Casey M.R. Muller, and Ilya D. Rosenberg, has developed a new project investigating bi-manual, multi-point, and multi-user input on a graphical interaction surface.

Multi-touch sensing enables you to interact with a system with more than one finger at a time, as in chording and bi-manual operations. Such a system can also accommodate several users simultaneously, which is useful for larger interaction scenarios such as interactive walls and tabletops.
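One practical requirement behind chording and bi-manual input is keeping each finger's identity stable from frame to frame, so that a contact seen at a new position is recognized as the same touch that moved rather than a new one. The sketch below shows a common greedy nearest-neighbor heuristic for this; it is an illustration of the general technique, not the project's actual tracking code, and the coordinate units and `max_dist` threshold are assumptions.

```python
import math

def assign_ids(prev, points, max_dist=0.5):
    """Match each new contact point to the nearest previous touch ID.

    prev     -- dict mapping touch ID -> (x, y) from the last frame
    points   -- list of (x, y) contacts detected in the current frame
    max_dist -- maximum movement (in the same units as x, y) for a
                contact to be considered the same finger (assumed value)

    Returns a dict mapping touch ID -> (x, y) for the current frame.
    Contacts with no previous touch within max_dist get fresh IDs.
    """
    used, result = set(), {}
    next_id = max(prev, default=-1) + 1
    for (x, y) in points:
        best, best_d = None, max_dist
        for tid, (px, py) in prev.items():
            d = math.hypot(x - px, y - py)
            if tid not in used and d < best_d:
                best, best_d = tid, d
        if best is None:          # no previous touch close enough: new finger
            best = next_id
            next_id += 1
        used.add(best)
        result[best] = (x, y)
    return result

# Two fingers move slightly; a third finger touches down.
prev = {0: (0.0, 0.0), 1: (1.0, 1.0)}
now = assign_ids(prev, [(1.05, 1.0), (0.02, 0.0), (2.0, 2.0)])
```

A greedy pass like this is adequate for well-separated fingers; crossing or merging contacts would call for globally optimal matching (e.g. the Hungarian algorithm).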

The force-based sensing technology provides high resolution and scalability, allowing for sophisticated multi-point widgets in applications large enough to accommodate both hands and multiple users.

The drafting-table-style implementation shown in the images measures 36″ × 27″, is rear-projected, and has a sensing resolution of approximately 0.1″ at 50 Hz. Applications receive event and stroke information via the lightweight OSC protocol over UDP.
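OSC messages are a simple binary format (per the OSC 1.0 specification: a NUL-padded address string, a NUL-padded type-tag string, then big-endian arguments) carried here in UDP datagrams. The sketch below encodes and decodes a minimal float-only OSC message over a loopback socket; the `/touch` address and the (id, x, y) argument layout are assumptions for illustration, since the page does not document the actual message schema.

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def encode_msg(address: str, args):
    """Encode an OSC message whose arguments are all float32."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)          # OSC is big-endian
    return msg

def decode_msg(data: bytes):
    """Decode an OSC message containing only float32 arguments."""
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    off = (end // 4 + 1) * 4                 # skip padding after address
    tend = data.index(b"\x00", off)
    tags = data[off:tend].decode("ascii")    # e.g. ",fff"
    off = (tend // 4 + 1) * 4                # skip padding after type tags
    args = [struct.unpack_from(">f", data, off + 4 * i)[0]
            for i, t in enumerate(tags[1:]) if t == "f"]
    return address, args

# Round-trip a hypothetical touch event (id, x, y) over UDP loopback.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("127.0.0.1", 0))
port = sock.getsockname()[1]
sock.sendto(encode_msg("/touch", [1.0, 0.25, 0.5]), ("127.0.0.1", port))
addr, vals = decode_msg(sock.recv(1024))
sock.close()
```

In practice an application would use an existing OSC library rather than hand-rolling the framing, but the format is small enough that a UDP receive loop plus a parser like this is a workable minimum.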