Tele-Reality in the wild

Scientists at the University of California, San Diego have developed a technique for mixing images and video feeds from mobile cameras in the field to provide remote viewers with a virtual window into a physical environment. Reality Flythrough dynamically stitches together still images and live video feeds to create a 3D environment.
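The article doesn't describe how the transitions are computed, but a simple way to picture the "walking camera pan" idea is interpolating between two camera poses and rendering frames along the way. The sketch below is purely illustrative, assuming 2D poses of the form `(x, y, heading_radians)`; none of these names come from Reality Flythrough itself.

```python
import math

def interpolate_pose(p0, p1, t):
    """Linearly interpolate position and heading between two camera
    poses -- the kind of path a transition could follow to simulate a
    person walking from one vantage point to another.
    Poses are (x, y, heading_radians); t runs from 0.0 to 1.0."""
    x0, y0, h0 = p0
    x1, y1, h1 = p1
    # rotate the heading the short way around the circle
    dh = (h1 - h0 + math.pi) % (2 * math.pi) - math.pi
    return (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t, h0 + dh * t)

def walking_pan(p0, p1, steps):
    """Sample intermediate poses for rendering a smooth transition."""
    return [interpolate_pose(p0, p1, i / steps) for i in range(steps + 1)]
```

In a real system each sampled pose would be used to warp and blend the nearest source imagery; here only the camera path itself is sketched.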


The application fills gaps in camera coverage with the most recent still images captured during camera pans. The software then blends the imagery using transitions that simulate a person walking while panning a camera, even when one of the images is a still frame. If stale imagery is undesirable, the fill-in images can be omitted, rendered in sepia, or overlaid with an icon indicating the photo's age.
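The selection and staleness logic described above can be sketched as follows. This is a hypothetical reconstruction, not the actual Reality Flythrough code: the `Frame` type, the age thresholds, and the sepia-strength formula are all assumptions chosen to mirror the behavior the article describes.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A captured image for one viewpoint (pixel data elided)."""
    viewpoint: str     # camera pose / region this frame covers
    timestamp: float   # capture time, in seconds
    live: bool         # True if the frame comes from a live feed

def fill_gaps(frames, now, max_age=None, sepia_after=30.0):
    """Pick the best frame per viewpoint and annotate staleness.

    A live feed always wins over stills; among stills, the most
    recent one is kept. Stills older than ``max_age`` seconds are
    omitted entirely; the rest get a sepia strength that grows with
    age and a simple age label, mirroring the display options the
    article mentions. Names and thresholds are illustrative.
    """
    best = {}
    for f in frames:
        cur = best.get(f.viewpoint)
        if cur is None or (f.live, f.timestamp) > (cur.live, cur.timestamp):
            best[f.viewpoint] = f

    views = {}
    for vp, f in best.items():
        age = now - f.timestamp
        if f.live:
            views[vp] = {"frame": f, "sepia": 0.0, "age_label": "live"}
        elif max_age is not None and age > max_age:
            continue  # drop stale fill-in imagery entirely
        else:
            sepia = min(1.0, max(0.0, (age - sepia_after) / sepia_after))
            views[vp] = {"frame": f, "sepia": sepia,
                         "age_label": f"{age:.0f}s old"}
    return views
```

A renderer would then composite these per-viewpoint frames, using the sepia strength (or the age label as an icon) to cue the viewer that a patch of the scene is not live.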

“Reality Flythrough creates the illusion of complete live camera coverage in a physical space. It’s a new form of situational awareness, and we designed a system that can work in unforgiving environments with intermittent network connectivity,” said UCSD professor Bill Griswold.

“With virtual tourism, for instance, you could walk down the streets of Bangkok to see what it will be like before getting there,” added Neil McCurdy. “Another really cool application is pre-drive driving instructions. Imagine going to your favorite mapping website, where currently you get a set of instructions to turn left here or right there, and instead, you can ‘fly’ through the drive before doing it.”

Video.
Via Eurekalert.