Augmented reality on smartphones may well be a killer app, but MIT’s Fluid Interfaces group is already moving beyond the confines of a GPS-capable cellphone to create a data-driven “Sixth Sense.”
The group, part of MIT’s Media Lab, designed a device that gathers data on the user’s surroundings, searches the Internet for related information, aggregates the results, and presents them back to the user on a display. Think of it as a metadata system for real life.
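To get a feel for that architecture, here is a minimal Python sketch of the gather, search, aggregate, present loop. Every function in it is a hypothetical stand-in for illustration only, not anything from the actual prototype.

```python
# Conceptual sketch of the gather -> search -> aggregate -> present loop.
# Every helper below is a hypothetical stub, not a real API.

def capture_scene():
    return "book: Ubiquitous Computing"           # stand-in for a webcam frame

def recognize_objects(scene):
    return [scene.split(": ")[1]]                 # stand-in for image recognition

def web_lookup(obj):
    return {"title": obj, "rating": 4.5}          # stand-in for an Internet query

def aggregate(results):
    return "; ".join(f"{r['title']}: {r['rating']}/5" for r in results)

def project(summary):
    print(f"[projector] {summary}")               # stand-in for the pico-projector

project(aggregate([web_lookup(o) for o in recognize_objects(capture_scene())]))
```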
Dr. Pattie Maes demonstrated the system at the TED conference. It comprises an off-the-shelf webcam, mirrors, a smartphone, and a pico-projector, all hung on a lanyard. The device recognizes the movements of the user’s hands via the webcam (helped by color-coded finger gloves worn on the index finger and thumb), enabling gesture commands such as the classic “frame” gesture, which makes the device snap a photo.
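Tracking colored finger markers like this is a standard computer-vision trick. The Python/OpenCV sketch below shows one plausible way to do it; the HSV color ranges, the blob-size threshold, and the crude “two markers at opposite corners” test for the frame gesture are assumptions for illustration, not details of the MIT prototype.

```python
import cv2
import numpy as np

# Assumed HSV ranges for two colored finger markers (tune for real lighting).
MARKER_RANGES = {
    "blue":   ((100, 120, 70), (130, 255, 255)),
    "yellow": ((20, 120, 70), (35, 255, 255)),
}

def find_marker(hsv, lo, hi):
    """Return the centroid of the largest blob in the given HSV range, or None."""
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    # OpenCV 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    if cv2.contourArea(c) < 200:                 # ignore small noise blobs
        return None
    m = cv2.moments(c)
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    points = [find_marker(hsv, lo, hi) for lo, hi in MARKER_RANGES.values()]
    if all(points):
        (x1, y1), (x2, y2) = points
        # Crude "frame" heuristic: two markers at opposite corners of a
        # reasonably large rectangle are treated as a photo command.
        if abs(x1 - x2) > 150 and abs(y1 - y2) > 100:
            cv2.imwrite("snapshot.jpg", frame)
    cv2.imshow("sixth-sense sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```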
In a bookstore, the device could recognize a book the user picks up (either by image recognition or RFID) and project information onto it, such as its Amazon rating or annotated notes. A newspaper would prompt the device to search for relevant news video clips, while a person the user doesn’t recognize might prompt the display to show their contact details, and so forth.
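The bookstore scenario boils down to mapping a recognized object to an online record and projecting the result. As a rough illustration, the snippet below looks up a book by ISBN via the public Open Library books API, standing in here for the Amazon rating lookup described above; the ISBN is an arbitrary example value.

```python
import json
import urllib.request

def book_info(isbn: str) -> str:
    """Fetch basic book metadata for an ISBN from the Open Library books API."""
    url = ("https://openlibrary.org/api/books"
           f"?bibkeys=ISBN:{isbn}&format=json&jscmd=data")
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    record = data.get(f"ISBN:{isbn}", {})
    title = record.get("title", "unknown title")
    authors = ", ".join(a["name"] for a in record.get("authors", []))
    return f"{title} by {authors}" if authors else title

# In the device, this string would be projected onto the book itself.
print(book_info("9780262134729"))   # example ISBN, chosen arbitrarily
```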