Everyone remembers the scene in Minority Report where Tom Cruise controls a computer with hand gestures. A decade later, perceptual computing is finally starting to become reality. According to Intel, perceptual computing will fundamentally change how people interact with their PCs in intuitive, natural, and engaging ways. Developers can create exciting new applications that take advantage of close-range hand and finger tracking, speech recognition, face analysis, and augmented reality.
Samsung brings perceptual computing to the mobile space
The highly successful Samsung Galaxy S4 brings gesture controls to the forefront. The S4 implements perceptual computing using both hand and eye tracking. Users can scroll web pages simply by looking up or down, and looking away while watching a video pauses playback. Air gestures allow commands such as scrolling without touching the screen. These features are good party tricks and handy when you don't want to touch the screen, but the touchscreen is still the most accurate and responsive interface. That will improve over time.
Intel Capital to create $100 million perceptual computing fund
At the Computex computer trade show in Taiwan this week, Intel executive Tom Kilroy gave a keynote speech. The vision is to integrate "human-like sensing technology into devices, ultimately delivering more natural, intuitive, and immersive computing experiences." To that end, Intel Capital announced it will create a $100 million fund to finance perceptual computing, with investments made over the next two to three years. The company will look at touch apps, imaging, gesture, voice, emotion sensing, and biometrics.
Xbox One Kinect
The upcoming Xbox One Kinect will feature a higher-resolution 1080p camera, a major upgrade from the VGA sensor on the original Kinect. The new motion controller processes 2 GB of data per second to accurately read your environment. It's claimed to be accurate enough to read finger gestures and even heartbeats.
Wi-Fi signals enable gesture recognition throughout entire home
University of Washington computer scientists have developed gesture-recognition technology that brings whole-home gesture control a step closer to reality. The researchers have shown it's possible to leverage the Wi-Fi signals already around us to detect specific movements, with no body-worn sensors or cameras required.
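The physical principle behind this kind of sensing is the Doppler effect: a hand moving through the room shifts the frequency of the Wi-Fi energy it reflects, and the sign and size of that shift reveal the motion. The sketch below is only an illustration of that principle, not the UW team's actual system; the threshold and gesture names are invented for the example.

```python
# Illustrative sketch of Doppler-based gesture sensing (not the UW
# researchers' implementation). A hand moving at ordinary speeds shifts
# a 5 GHz Wi-Fi carrier by only a few hertz, which is why the receiver
# must perform very fine-grained frequency analysis.

C = 3e8        # speed of light, m/s
F_WIFI = 5e9   # 5 GHz Wi-Fi carrier, Hz

def doppler_shift_hz(hand_speed_mps: float) -> float:
    """Frequency shift of a signal reflected off a hand moving at
    hand_speed_mps (positive = toward the receiver). The factor of 2
    accounts for the round trip: transmitter -> hand -> receiver."""
    return 2 * hand_speed_mps * F_WIFI / C

def classify(shift_hz: float, threshold_hz: float = 1.0) -> str:
    """Toy classifier: the sign of the shift separates push from pull."""
    if shift_hz > threshold_hz:
        return "push"   # hand moving toward the receiver
    if shift_hz < -threshold_hz:
        return "pull"   # hand moving away
    return "idle"

# A 0.5 m/s push shifts the 5 GHz carrier by roughly 17 Hz.
print(classify(doppler_shift_hz(0.5)))    # push
print(classify(doppler_shift_hz(-0.5)))   # pull
```

The tiny magnitude of the shift (tens of hertz against a 5 GHz carrier) is the core engineering challenge such systems solve.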
PointGrab takes a software approach
PointGrab is a leading provider of advanced hand-gesture recognition software that works with a standard 2D camera. Its technology enables TVs, PCs, smartphones, tablets, all-in-one devices, and more to be operated through a natural user interface (NUI) using hand gestures alone. This software-only approach allows gesture controls to be added to legacy devices without advanced hardware such as 3D cameras.
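PointGrab's algorithms are proprietary, but the general idea behind 2D-camera gesture recognition can be sketched: compare successive frames, find the pixels that changed, track the centroid of that motion, and map its trajectory to a command. A minimal illustration using NumPy on synthetic grayscale frames (the thresholds and gesture names are invented for the example):

```python
import numpy as np

def motion_centroid(prev, curr, thresh=30):
    """Centroid (row, col) of pixels that changed between two grayscale
    frames (2D uint8 arrays), or None if nothing moved."""
    diff = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    if not diff.any():
        return None
    rows, cols = np.nonzero(diff)
    return rows.mean(), cols.mean()

def swipe_direction(frames, min_move=5):
    """Map horizontal centroid motion across a frame sequence to a gesture."""
    centroids = [c for prev, curr in zip(frames, frames[1:])
                 if (c := motion_centroid(prev, curr)) is not None]
    if len(centroids) < 2:
        return "none"
    dx = centroids[-1][1] - centroids[0][1]  # net horizontal motion, pixels
    if dx > min_move:
        return "swipe_right"
    if dx < -min_move:
        return "swipe_left"
    return "none"

# Synthetic input: a bright 10x10 "hand" moving left to right
# across a dark 64x64 frame.
frames = []
for x in (5, 20, 35, 50):
    f = np.zeros((64, 64), dtype=np.uint8)
    f[27:37, x:x + 10] = 255
    frames.append(f)

print(swipe_direction(frames))        # swipe_right
print(swipe_direction(frames[::-1]))  # swipe_left
```

Real systems layer hand-shape detection, lighting robustness, and temporal filtering on top of this kind of pipeline, but the frame-difference-and-track skeleton shows why no depth sensor is strictly required.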
In an exclusive interview, Assaf Gad, PointGrab's Vice President of Marketing and Product, indicated the company's focus is embedding the software in TVs as well as smartphones, tablets, and computers. According to Gad, PointGrab's software has already shipped in 20 million devices. In a demo with Windows 8, an iPad, and an iPhone, Gad was able to navigate various applications, games, and camera functions using simple hand gestures. Follow me on Facebook and check back later to read more about PointGrab's implementation, including video demos.