OmniTouch is an intriguing proof-of-concept developed by researchers from Carnegie Mellon University and Microsoft Research. It is a wearable projection and sensing system that turns virtually any surface into an interactive touchscreen.
For example, you can project a keypad onto your hand or a nearby wall and dial a number just as you would on a touchscreen phone. What’s different here, of course, is that your “phone” is wherever you choose to project it, using a laser pico-projector and camera worn on your shoulder. With OmniTouch, you could just as easily “project” an incoming e-mail onto the back of a restaurant menu and read it there.
This near-magical concept is based on a depth-sensing camera that tracks finger movements. No special calibration or training is needed; the technology is smart enough to figure out whether you’ve “clicked” on an app or some other selection, or whether your finger is merely hovering above the icons. You can even “drag” to define the size of the screen you want the image projected onto.
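The click-versus-hover distinction described above can be sketched as a simple depth comparison: if the fingertip’s distance from the camera is close enough to the distance of the surface behind it, it is treated as a touch. The function name and threshold values below are illustrative assumptions, not the actual OmniTouch implementation:

```python
# Hypothetical sketch of depth-based "click" detection.
# Thresholds and names are illustrative assumptions only.

HOVER_MM = 20.0   # fingertip within this gap above the surface counts as hovering
CLICK_MM = 5.0    # fingertip within this gap counts as a touch ("click")

def classify_finger(fingertip_depth_mm: float, surface_depth_mm: float) -> str:
    """Classify a fingertip as 'click', 'hover', or 'away' based on the
    gap between its measured depth and the projected surface's depth."""
    gap = surface_depth_mm - fingertip_depth_mm  # distance above the surface
    if gap <= CLICK_MM:
        return "click"
    if gap <= HOVER_MM:
        return "hover"
    return "away"
```

For instance, a fingertip measured at 598 mm in front of a surface at 600 mm would register as a click, while one at 585 mm would merely hover.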
This is a new way to look at mobile interaction. While tablets and smartphones already offer similar functionality, they are all constrained by the size of their capacitive displays.
OmniTouch, by contrast, provides graphical, interactive, multi-touch input on surfaces of your choosing. Prototypes have tested the concept on drawing apps, map panning, a full projected QWERTY keyboard, and even hierarchical menu lists.
If it becomes commercially available, it could hugely benefit people with visual or motor impairments that prevent them from using conventional touchscreen devices.
For more information on the OmniTouch system, take a look at the Microsoft Research article. The video below shows some of its most impressive features.