Using the infrared camera in the Wii remote and a head-mounted sensor bar (two IR LEDs), you can accurately track the location of your head and render view-dependent images on the screen. This effectively transforms your display into a portal to a virtual environment. The display reacts to head and body movement as if it were a real window, creating a convincing illusion of depth and space.
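The geometry behind the trick is simple: the IR camera reports the pixel coordinates of the two LEDs, the apparent separation between the dots shrinks as the head moves away (giving depth), and the midpoint of the pair gives the head's horizontal and vertical offset. Below is a minimal sketch of that calculation in Python; the camera resolution, field of view, and LED spacing are assumed values rather than measured ones, and a real implementation would also need a Bluetooth library to read the Wii remote's IR dot reports.

```python
import math

# Assumed camera parameters (hypothetical, roughly matching the Wii
# remote's published IR camera specs): 1024x768 sensor, ~45 degree
# horizontal field of view.
CAM_WIDTH_PX = 1024
CAM_HEIGHT_PX = 768
RAD_PER_PX = math.radians(45.0) / CAM_WIDTH_PX

# Assumed physical spacing of the two head-mounted IR LEDs, in meters;
# measure your own sensor bar.
LED_SEPARATION_M = 0.20

def head_position(dot1, dot2):
    """Estimate the head's (x, y, z) position in meters, relative to
    the camera, from the pixel coordinates of the two IR dots.

    dot1, dot2: (x_px, y_px) tuples as reported by the IR camera.
    """
    # The fixed LED pair subtends a smaller angle the farther away it
    # is, so the dots' angular separation gives depth.
    sep_px = math.hypot(dot1[0] - dot2[0], dot1[1] - dot2[1])
    sep_rad = sep_px * RAD_PER_PX
    z = (LED_SEPARATION_M / 2.0) / math.tan(sep_rad / 2.0)

    # The midpoint of the dot pair, re-centered on the camera axis,
    # gives the head's angular offset; scale by depth for position.
    mid_x = (dot1[0] + dot2[0]) / 2.0 - CAM_WIDTH_PX / 2.0
    mid_y = (dot1[1] + dot2[1]) / 2.0 - CAM_HEIGHT_PX / 2.0
    x = math.sin(mid_x * RAD_PER_PX) * z
    y = math.sin(mid_y * RAD_PER_PX) * z
    return x, y, z

# Example: two dots near the image center, 100 px apart, read as a
# head roughly 2.6 m from the camera and centered on it.
print(head_position((462, 384), (562, 384)))
```

The estimated position would then drive an off-axis projection of the 3D scene, so that the image shifts correctly as the viewer moves and the screen behaves like a window.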
Technological, social, and market forces have converged to create fertile new ground for designers and engineers to plow. The price of processing power has dropped, and sensors are readily available.
Touchscreens on our mobile devices, ATMs, and airline check-in kiosks have taught us to expect to manipulate things on-screen with our hands. Games have shown us that we can make gestures in space to control objects on-screen. Public restrooms are, believe it or not, test laboratories for interactive gestures: placing your hands under a faucet to turn it on, waving your hand to dispense a paper towel, stepping into a room to turn on the lights.
All of these things have ushered in a new era of interaction design, one where gestures on a surface and in the air replace (or at least supplement) keyboards, mice, and styli. This new era, however, means that those who design and develop more "traditional" systems need to expand their skills, adding knowledge of kinesiology, sensors, ergonomics, physical computing, touchscreen technology, and new interface patterns.