The picture taken by the mobile phone camera is analyzed in real time on the phone, and the player's real foot is shown on a virtual pitch on the mobile's display. Now the player only has to score. Direction and speed are determined by the real movement of the player's foot.
PDA version of the game on Pasta and Vinegar.
The British government has announced a plan to develop ways of scrubbing carbon dioxide from the emissions of coal and gas-fired power stations and pumping it beneath the seabed to reduce the impact of fossil fuels on the climate. The greenhouse gas would be stored in depleted North Sea oil and gas fields within ten years.
Carbon capture, also called sequestration, involves passing flue gases from power stations through chemical solvents to remove the carbon dioxide. The gas is then compressed to liquefy it and sent by pipeline to oil or gas rigs. There, it is pumped underground into strata once filled with the fossil fuels.
If the schemes prove successful, they could reduce greenhouse emissions from power stations by up to 85 per cent.
Norway has been running a pilot sequestration project since 1996, in which more than a million tonnes of carbon dioxide have been pumped into empty oil strata in a stable and sustainable fashion.
Via The Times.
Monkeybridge is a collaborative Augmented Reality game, where users do not have direct influence on the characters' behaviour; instead they indirectly control their movement by providing the agents with building blocks to walk on above the virtual ocean.
The characters make autonomous decisions based on their observation of the AR environment in which they are embedded. They can choose the path they walk on; decide how to get from one platform to another, e.g. climb or jump when there is a slight difference in height between platform edges; automatically choose the straightest path from several available tiles; and fall into the water if there is no suitable piece of landing stage to walk on.
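The decision rules described above can be sketched in a few lines. This is purely illustrative; the function name, thresholds and tile representation are assumptions, not Monkeybridge's actual code.

```python
# Hypothetical sketch of the kind of autonomous decision logic the
# Monkeybridge agents could use; names and thresholds are made up.

def choose_action(heading, tiles, climb_limit=0.5):
    """Pick the next move for a character standing at a platform edge.

    heading     -- the character's current walking direction (unit 2-D vector)
    tiles       -- list of (direction, height_delta) pairs for reachable tiles,
                   where direction is a unit 2-D vector toward the tile
    climb_limit -- max height difference the character can climb or jump
    """
    if not tiles:
        return "fall"                      # no landing stage: into the water

    # Prefer the tile whose direction deviates least from the current
    # heading, i.e. the straightest available path.
    def deviation(tile):
        direction, _ = tile
        dot = heading[0] * direction[0] + heading[1] * direction[1]
        return -dot                        # higher dot product = straighter

    direction, height_delta = min(tiles, key=deviation)
    if abs(height_delta) > climb_limit:
        return "fall"                      # edge too far away to negotiate
    if height_delta > 0:
        return "climb"                     # slight step up between edges
    if height_delta < 0:
        return "jump"                      # slight step down
    return "walk"
```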
The game serves as a pilot application to examine how "smart" software and hardware components capable of observing and reacting to events in the physical and virtual world can be useful in AR applications.
Sparks is an ambient social networking interface that uses light to facilitate salient conversations by linking strangers who may not know each other but share mutual interests.
Before entering the Sparks environment, each user pre-selects a number of interests from a pool of keywords. Within the environment, Sparks projects the keywords in an aura on the floor around the user. The aura follows the user within the environment, and augments the visual cues people use to capture initial impressions about another person. However, the aura alone is insufficient to create connections between people in a large group. To help guide distant users with similar interests together, the common descriptors on their respective auras are connected by illuminated paths. To help them locate each other, the path’s thickness modulates to indicate proximity of connected individuals. Furthermore, to differentiate between individuals the user has or has not previously met, the link changes color.
Users can also interact with the paths by sending pulses along them to signal others with a shared interest. To send a pulse, they tap the interest projected within their aura. When the pulse reaches the recipient, a glow appears on the corresponding word.
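The path behaviours described above (thickness driven by proximity, color by acquaintance, pulses travelling along a link) could be modelled roughly as follows. Every name and formula here is an assumption for illustration; the actual Sparks implementation is described in the linked PDF.

```python
# Illustrative model of the Sparks path logic; formulas are assumptions.
import math

def path_thickness(pos_a, pos_b, max_thickness=12.0, scale=5.0):
    """The illuminated path grows thicker as the two connected
    users approach each other."""
    distance = math.dist(pos_a, pos_b)
    return max_thickness / (1.0 + distance / scale)

def path_color(already_met):
    """Differentiate links to people the user has or has not met."""
    return "warm" if already_met else "cool"

def pulse_arrival_time(path_length, speed=2.0):
    """Time for a tapped pulse to travel the path before the
    corresponding keyword glows in the recipient's aura."""
    return path_length / speed
```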
When the system detects a group, a group pad automatically forms in between the individuals to indicate an ongoing conversation. When the group dissolves, the group pad fades away.
PDF of the research.
TARBoard (Tangible Augmented Reality System for Table-top Game Environment) uses augmented reality and a tangible user interface to let users play board or card games in a more interactive way.
TARBoard consists of a glass table, two cameras and a mirror. Markers are attached to the back side of the cards (the front side shows the creature). A first camera tracks the image of the markers reflected in the mirror below the table. Another camera, the "augmenting" one, provides a 3D model of the creature when the card is flipped. A player cannot see the cards played by the other player.
Each player has a set of cards, each representing a creature such as a dragon, a wolf or a goblin. Each creature has its own characteristics, such as health, power and special skills.
The attacker turns over one of his cards and places it near the battle zone where the creature is augmented. Then, the defender turns over one of his cards and places it near the battle zone to defeat the first creature. When two creatures fight, they attack each other in turn. The one which runs out of health is eliminated. Its card is removed from the deck.
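The turn-based battle rule above can be sketched in a few lines of code. This is a minimal illustration, not TARBoard's implementation; the class, stat values and alternation scheme are assumptions.

```python
# Minimal sketch of the battle rule described above; all names and
# stat values are hypothetical.

class Creature:
    def __init__(self, name, health, power):
        self.name = name
        self.health = health
        self.power = power

def battle(attacker, defender):
    """Creatures attack each other in turn until one runs out of
    health; the eliminated creature's card is removed from the deck."""
    current, other = attacker, defender
    while attacker.health > 0 and defender.health > 0:
        other.health -= current.power      # current creature strikes
        current, other = other, current    # turns alternate
    return attacker if attacker.health <= 0 else defender

dragon = Creature("dragon", health=10, power=4)
goblin = Creature("goblin", health=6, power=2)
eliminated = battle(dragon, goblin)        # the weaker goblin falls first
```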
3D audio feedback will be implemented in a later version to enhance the realism of the game.
PDF presenting the project.
The Straw-like User Interface, developed at the University of Electro-Communications in Tokyo, allows users to virtually experience the sensations of drinking. Such sensations are created by referencing sample data of actual pressures, vibrations, and sounds produced by drinking from an ordinary straw attached to the system.
The system transmits pressure changes to the straw, which applies vibrations to the mouth. The pressure changes are created by a valve in the interface. If the valve is closed, the pressure increases. If the valve is open, the pressure decreases. Also, when the speaker inside the interface vibrates, the straw attached to it receives the vibration and transmits it to the lips.
Pressure changes and sounds from real-world drinking experiences are recorded and reproduced by the interface. A pressure sensor installed near the straw gathers pressure values, and a small microphone acquires audio data.
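Putting the two paragraphs above together, playback could amount to turning a recorded pressure trace into open/close valve commands. This is only a sketch under assumed names and units, not the system's actual control loop.

```python
# Illustrative playback loop for the straw interface: recorded pressure
# samples drive a binary valve. Names, units and the ambient threshold
# are assumptions.

def valve_commands(recorded_pressures, ambient=100.0):
    """Convert a recorded pressure trace into valve commands.

    Closing the valve lets the pressure felt in the straw build up;
    opening it lets the pressure drop, as described above.
    """
    commands = []
    for target in recorded_pressures:
        if target > ambient:
            commands.append("close")   # raise the pressure in the straw
        else:
            commands.append("open")    # let the pressure fall back
    return commands
```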
The application could be used to enhance the gaming experience, enable distance communication via touch, augment sense perception among the elderly and physically challenged, and of course develop new beverages.