Our hands can move in countless different ways, yet when it comes to using touchscreen devices, we seem to perform the same gestures over and over: the same swipe to the left and right, pinch to zoom, and so on. Not that there’s anything wrong with that, but there’s only so much we can do with the same moves across different applications.
And because what we do with our hands largely mimics mouse-based desktop commands, the folks over at Microsoft are looking for ways to integrate more natural movements into gesture-based controls instead. The researchers have come up with a new gesture system nicknamed “Rock & Rails”, in which users still perform the familiar finger gestures, but gain the added benefit of being able to form a shape with the other hand to constrain the movement.
Basically, this allows users to add new functions to an app without cluttering up the interface with new tools – just form a recognizable shape with your hand and let the touchscreen detect it. In the Rock & Rails system, a user can press a closed fist (the “rock”) onto the screen to pin an object in place, while the other hand resizes it uniformly with the regular spreading gesture. Another example is placing the edge of the hand flat on the surface of the screen, creating a virtual ruler (the “rail”) that objects can be aligned against. It’s all very interesting stuff, and it definitely shows how much of the touchscreen we’ve yet to make full use of.
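To make the idea a little more concrete, here is a minimal sketch (not Microsoft’s actual implementation) of how a touch system could tell a fingertip apart from a “rock” (fist) or “rail” (hand edge) contact based purely on the size and shape of the contact footprint. The Contact fields and the thresholds are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ContactShape(Enum):
    FINGERTIP = auto()  # ordinary touch point
    ROCK = auto()       # closed fist pressed on the screen
    RAIL = auto()       # edge of the hand laid flat on the screen


@dataclass
class Contact:
    """One raw touch contact reported by the digitizer (hypothetical fields)."""
    width_mm: float   # bounding-box width of the contact area
    height_mm: float  # bounding-box height of the contact area


def classify_contact(c: Contact) -> ContactShape:
    """Guess the hand shape from the contact footprint.

    Illustrative rules: a fingertip is small, a fist ("rock") is a large,
    roughly square blob, and the hand edge ("rail") is a long thin strip.
    """
    long_side = max(c.width_mm, c.height_mm)
    short_side = min(c.width_mm, c.height_mm)
    aspect = long_side / short_side

    if long_side < 15:
        return ContactShape.FINGERTIP
    if aspect > 3:              # long and thin -> edge of the hand
        return ContactShape.RAIL
    return ContactShape.ROCK    # large and blobby -> closed fist


if __name__ == "__main__":
    samples = [
        Contact(width_mm=9, height_mm=11),    # fingertip
        Contact(width_mm=45, height_mm=40),   # fist
        Contact(width_mm=90, height_mm=14),   # hand edge
    ]
    for s in samples:
        print(s, "->", classify_contact(s).name)
```

Once a contact is classified this way, an application could treat a rock as “pin the object under it” and a rail as “constrain dragging or alignment along its axis”, which is the general spirit of what the Rock & Rails demo shows.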
Head over to the Microsoft Research website to watch the video demonstration of Rock & Rails.