Intel and AMD take another step into the future with perceptual computing

Intel and AMD have shown demos at CES 2013 that rely on voice, gestures and eye movement. Has the future of computing already moved beyond touch?


What if you could navigate your tablet while cooking dinner by swiping your hands through the air? Or connect your computer to your display with a literal snap of your fingers?

Both Intel and AMD teased this long-anticipated future at CES 2013. Intel, which lumps motion and voice controls under the term “perceptual computing,” showed a number of tech demos from the company’s labs. In one, a user playing “Where’s Waldo” with an eye-tracking camera was able to “find” the famous character simply by pausing his gaze on him. Another demo used hand movements to grab a companion cube in Portal 2 and place it on top of a button.

A 3D motion-tracking camera made these demonstrations possible. Intel promised that it will be more affordable than Microsoft’s Kinect and will work over a single USB connection (the Kinect requires additional external power).

AMD showed a video with more practical implications at its conference. One short clip depicted a cook navigating a recipe on a tablet by swiping through the air. Another focused on media navigation via gestures. These features will be enabled on some systems shipping with the company’s latest A8 and A10 mobile APUs. No external camera is required.

The technology is still rough, but it is improving rapidly. We’ve seen its promise in motion-based controllers like Microsoft’s Kinect, but those were just the first step. This is the next. We’re excited to see an alternative to touch, and we believe the backing of both major x86 processor companies – as well as numerous smaller companies on the show floor – could make perceptual computing the Next Big Thing at CES 2014.


Source: digitaltrends[dot]com
