Paul Watson, PC Technician Friday, May 24th 2013
Gesture Recognition Isn’t As Simple As Having The Right Drivers
Part of the Kinect experience is being able to control gameplay using your body instead of a separate hardware device or controller. The Kinect was designed to read body movements from across the room. PC gamers differ from console gamers in that respect: most sit right in front of their screens, not exactly the sweet spot for gesture recognition.
The question most PC gamers have is, “How can gesture recognition technology based on the Kinect be incorporated into the PC gaming experience?” It’s a question that has yet to be answered successfully in terms of products on the market.
It’s pretty clear that the Kinect – at least in the form you see it in stores – won’t be integrated into the PC for gaming purposes anytime soon, but the concept of gesture recognition may be closer than some gamers realize. The trick will be developing gesture recognition software (and hardware) that focuses primarily on the user’s hands and works in close proximity to the PC.
Gesture recognition is a staple of mobile devices. But mobile devices have the benefit of the user’s hand making direct contact with a touchscreen. A Kinect-based technology for PCs probably wouldn’t be a touchscreen affair. So what might it look like?
The Leap Motion controller has been much talked about as a potential answer to the question of device control in a small, short-distance 3D space. Earlier this month, Double Fine exhibited Dropchord, an app designed for the Leap Motion controller, which is slated to debut in July.
It remains to be seen how intuitive the controller will be for the user, and what level of skill is required to make the controller work. But it’s a first step toward gesture control for PC games, and may provide an answer to the “Kinect” question for the PC platform.
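To make the idea concrete, here is a minimal sketch of what close-range, hands-only gesture control can boil down to: a sensor streams per-frame hand positions, and software classifies short windows of motion into gestures a game can bind to actions. This is a hypothetical illustration, not the actual Leap Motion SDK – the `HandFrame` type, thresholds, and gesture names are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch: real hand-tracking APIs deliver a stream of per-frame
# hand data; here we fake one frame type to show the classification step.

@dataclass
class HandFrame:
    """One sample of a tracked hand: palm position in millimetres."""
    timestamp_ms: int
    palm_x: float  # left/right offset relative to the sensor
    palm_y: float  # height above the sensor

def detect_swipe(frames, min_distance_mm=80.0, max_duration_ms=300):
    """Classify a horizontal swipe from a short window of hand samples.

    Returns "swipe-right", "swipe-left", or None. The distance and
    duration thresholds are illustrative guesses, not tuned values.
    """
    if len(frames) < 2:
        return None
    dx = frames[-1].palm_x - frames[0].palm_x
    dt = frames[-1].timestamp_ms - frames[0].timestamp_ms
    if dt > max_duration_ms or abs(dx) < min_distance_mm:
        return None
    return "swipe-right" if dx > 0 else "swipe-left"

# A fast 100 mm rightward motion over 200 ms reads as a swipe;
# a slow drift over 400 ms does not.
samples = [HandFrame(0, 0.0, 150.0),
           HandFrame(100, 50.0, 152.0),
           HandFrame(200, 100.0, 151.0)]
print(detect_swipe(samples))  # swipe-right
```

The interesting design problem hinted at above is exactly this layer: choosing thresholds so that deliberate gestures register reliably while ordinary hand movement at a desk does not – which is where the intuitiveness and skill questions come in.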
Photo Credit: Leap Motion