This system combines touch and air-gesture interaction to support co-located, small-group developer meetings by democratizing access to, control of, and sharing of information across multiple personal devices and public displays. A shared multi-touch display, mobile touch devices, and Microsoft Kinect sensors work in tandem: in-air pointing provides targeting, mode setting, and social disclosure of commands, while touch provides command execution and precise gestures.
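To make the division of labor between the two modalities concrete, the following sketch shows one plausible way to fuse them: coarse in-air pointing selects a target and sets a mode, and a subsequent touch executes the command. All class and method names here are hypothetical illustrations, not an actual Kinect or display API.

```python
# Minimal sketch, assuming pointing samples (e.g., from a tracked hand joint)
# and touch-down events arrive as separate streams per user.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Mode(Enum):
    SELECT = auto()
    ANNOTATE = auto()
    TRANSFER = auto()


@dataclass
class PointingEvent:
    """In-air pointing sample: which object the user's ray hits, and in what mode."""
    user_id: str
    target_id: Optional[str]
    mode: Mode


@dataclass
class TouchEvent:
    """Touch-down on the shared display or a personal device."""
    user_id: str
    x: float
    y: float


class HybridInputFusion:
    """Fuses coarse in-air pointing (targeting, mode setting) with
    precise touch (command execution)."""

    def __init__(self) -> None:
        # Last pointing state per user: (target, mode).
        self._pointing: dict[str, tuple[Optional[str], Mode]] = {}

    def on_pointing(self, ev: PointingEvent) -> None:
        # Pointing is socially visible: collaborators can see what a user
        # is about to act on before any command actually executes.
        self._pointing[ev.user_id] = (ev.target_id, ev.mode)

    def on_touch(self, ev: TouchEvent) -> None:
        target, mode = self._pointing.get(ev.user_id, (None, Mode.SELECT))
        if target is None:
            return  # touch with no air-selected target: treat as a local touch
        self._execute(ev.user_id, target, mode)

    def _execute(self, user_id: str, target_id: str, mode: Mode) -> None:
        print(f"{user_id}: {mode.name} on {target_id}")


if __name__ == "__main__":
    fusion = HybridInputFusion()
    fusion.on_pointing(PointingEvent("alice", "code-review-panel", Mode.ANNOTATE))
    fusion.on_touch(TouchEvent("alice", 0.42, 0.17))  # executes ANNOTATE on the target
```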
For example, in-air gestures allow a user to grab content from the shared display and transfer it to other devices such as notebooks, tablets, and smartphones.
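One way such a grab-and-drop gesture could be realized is by serializing the completed gesture as a transfer message and announcing it on the local network; the message schema and UDP-broadcast transport below are illustrative assumptions, not part of the system described above.

```python
# Minimal sketch: a completed "grab from display, drop on device" gesture
# becomes a JSON message broadcast to devices in the meeting room.
import json
import socket
import uuid


def make_transfer_message(user_id: str, content_id: str, target_device: str) -> bytes:
    """Builds a JSON transfer message for a completed grab-and-drop gesture."""
    return json.dumps({
        "msg": "content-transfer",
        "transfer_id": str(uuid.uuid4()),
        "user": user_id,
        "content": content_id,    # identifier of the item grabbed off the display
        "target": target_device,  # device the drop gesture pointed at
    }).encode("utf-8")


def send_transfer(message: bytes, host: str = "255.255.255.255", port: int = 9999) -> None:
    """Broadcasts the transfer message over UDP on the local network."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(message, (host, port))


if __name__ == "__main__":
    msg = make_transfer_message("alice", "diagram-3", "alice-tablet")
    send_transfer(msg)
```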