Microsoft Research has launched a new project called 'Code Space', which combines Kinect's capabilities with in-air and touch-based gestures.
Designed to make developer meetings more fun and productive, 'Code Space', still a work in progress, lets people interact using touch-based mobile phones, tablets and even plain surfaces to move objects on the main screen.
Microsoft acknowledges that Kinect's motion-tracking technology extends beyond the gaming domain and is being used to create new ways of interacting with machines.
'Code Space' uses Kinect's motion-tracking capabilities together with in-air and touch gestures in small-group developer meetings to 'democratise access, control, and sharing of information across multiple personal devices and public displays.'
"Our system uses a combination of a shared multi-touch screen, mobile touch devices, and Microsoft Kinect sensors. We describe cross-device interactions, which use a combination of in-air pointing for social disclosure of commands, targeting and mode setting, combined with touch for command execution and precise gestures," said Microsoft in a blog post.
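The division of labour the quote describes — in-air pointing to select a target and set a mode, with touch for command execution — can be sketched as a simple event combiner. This is purely a hypothetical illustration; the class and method names below are assumptions and do not come from Microsoft's system:

```python
# Hypothetical sketch of the cross-device interaction model described above:
# Kinect-tracked in-air pointing sets the target and mode (socially visible),
# while a touch on a personal device executes the command precisely.

class CrossDeviceSession:
    def __init__(self):
        self.target = None  # display object currently pointed at
        self.mode = None    # pending command mode, e.g. "copy" or "annotate"

    def on_air_point(self, target, mode):
        """In-air pointing: discloses intent and sets the interaction context."""
        self.target = target
        self.mode = mode

    def on_touch(self, device, payload):
        """Touch gesture: executes the command set up by the pointing context."""
        if self.target is None or self.mode is None:
            return None  # no pointing context yet; ignore the touch
        return f"{self.mode} {payload} from {device} to {self.target}"

session = CrossDeviceSession()
session.on_air_point(target="shared screen", mode="copy")
result = session.on_touch(device="tablet", payload="slide 3")
print(result)  # copy slide 3 from tablet to shared screen
```

The point of the split is that pointing is coarse but visible to the whole group, while touch is private and precise, so neither channel alone triggers an action.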
In a demo, developers found that combining hand gestures with touch-based commands made interaction noticeably more productive and easier to conduct.