Gesturizing Everything: A Look into Applicability

To bring new readers up to speed, the Android developer documentation defines touch gestures as follows:

A “touch gesture” occurs when a user places one or more fingers on the touch screen, and your application interprets that pattern of touches as a particular gesture. There are correspondingly two phases to gesture detection:
1. Gathering data about touch events.
2. Interpreting the data to see if it meets the criteria for any of the gestures your app supports.
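
For readers who want to see what those two phases look like in practice, here is a minimal sketch using Android's GestureDetector API. The activity name and log tag are illustrative only; the activity forwards raw touch events to the detector (phase one), and the listener decides whether they form a recognized gesture (phase two).

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class GestureDemoActivity extends Activity {
    private GestureDetector detector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // The listener is where phase two happens: interpreting gathered
        // touch data as a specific gesture (here, a fling).
        detector = new GestureDetector(this,
                new GestureDetector.SimpleOnGestureListener() {
                    @Override
                    public boolean onFling(MotionEvent e1, MotionEvent e2,
                                           float velocityX, float velocityY) {
                        Log.d("GestureDemo", "Fling detected, vX=" + velocityX);
                        return true;
                    }
                });
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Phase one: hand every raw touch event to the detector so it can
        // gather the data it needs.
        return detector.onTouchEvent(event) || super.onTouchEvent(event);
    }
}
```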

The driving factor behind gesture technology is the sheer volume of interaction between humans and computers and our growing dependence on it. Gesture recognition takes that interaction and extends it beyond traditional input methods. Nearly every industry could benefit from incorporating gestures, whether touch-based or vision-interpreted, into its products and processes.

Gestures are not only the futuristic flourishes of Tom Cruise films but a way to bring a cleaner user experience to people using mobile phones, tablets, and computer monitors. Gone are the days of clunky game controllers causing Nintendinitis or PlayStation thumb; gestures are poised to free users from a wired approach and offer immersive experiences similar to virtual reality.

Currently, gestures can be applied as standalone devices, as add-ons, or written directly into code, as with GestureKit. Many startups are taking on the challenge of providing gesture recognition technology to improve the UX, and to do so they are identifying where the limitations of current technology leave room for improvement. GPS systems in automobiles are one example. Voice control, touch screens, and traffic notifications are commonplace GPS features, but whether the device is integrated into the car console or suctioned to the window, the user is still stuck with a clumsy UX that compromises the safety of the driving experience. New GPS systems could instead present information unobtrusively on the windshield, within the driver's line of sight. Gestures offer a way to ease the physical demands of technology and refocus the UX on safety and fun.

