Will gesture recognition transform our mobile phone, TV, and computer experiences? Certainly leading companies in the space, such as PrimeSense and EyeSight Mobile Technologies, believe so. Technology-savvy consumers may see this giant wave of innovations coming: gesture recognition, à la the TV drama NCIS, whose cutting-edge video walls let people grab and expand files almost physically with their hands, is likely coming to a device near you, bringing with it an endless array of gesture-controlled apps and games.
The latest news is that app developers are beginning to consider transforming their apps with a new user interface – one that uses gesture control instead of the touchscreen, much like that found in the Wii or Kinect gaming systems, but more advanced. If new phones come with a gesture mode, then what’s next for automotive dashboard controls, kitchen appliances, and light switches?
An amazing array of applications is possible with this technology, but given that even a touchscreen can register accidental touches, very careful thought must be given to the gestures used to control these devices. One wouldn’t want to walk past the television swinging one’s arms and accidentally trigger a pay-per-view movie – industry professionals refer to such unintended input as “noise.” The goal is a system that reliably recognizes the difference between noise and a deliberate gesture. If you cannot scratch your chin in front of these cameras without consequences, customers will not be pleased.
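To make the noise-versus-gesture distinction concrete, here is a minimal, hypothetical sketch of one common rejection strategy: only accept a gesture when the recognizer reports it with high confidence across several consecutive camera frames. The class name, thresholds, and labels are all illustrative assumptions, not taken from any vendor’s actual SDK.

```python
class GestureDebouncer:
    """Hypothetical noise filter: a gesture fires only when the underlying
    recognizer reports the same label with high confidence for several
    consecutive frames. Brief accidental motions never build a long enough
    streak, so they are discarded as noise."""

    def __init__(self, min_confidence=0.8, min_frames=5):
        self.min_confidence = min_confidence  # per-frame confidence cutoff
        self.min_frames = min_frames          # frames required to accept
        self.streak_label = None              # label seen on current streak
        self.streak_len = 0                   # length of current streak

    def update(self, label, confidence):
        """Feed one frame's raw recognizer output; return the accepted
        gesture label, or None while the input still looks like noise."""
        if confidence >= self.min_confidence and label == self.streak_label:
            self.streak_len += 1              # streak continues
        elif confidence >= self.min_confidence:
            self.streak_label, self.streak_len = label, 1  # new streak
        else:
            self.streak_label, self.streak_len = None, 0   # reset on noise
        if self.streak_len >= self.min_frames:
            return self.streak_label
        return None
```

Under this scheme, walking past the camera while swinging your arms produces low-confidence or inconsistent labels from frame to frame, so no streak forms and nothing fires; a deliberate, sustained swipe does. Real systems layer more sophisticated temporal models on top, but the debouncing idea is the same.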
Hopefully, given the graphics acceleration being used in the latest gesture recognition technology, greater precision will be in place for mass-market applications. According to an article on CNET, the next round of shipments from Lenovo, Toshiba, and Philips will include products with the latest gesture recognition technology from EyeSight built in. Only time will tell whether it truly catches on, whether people are willing to switch to a new type of control, and what the biggest challenges in this space really are.