Sunday, 5 January 2014

The future of mobile is motion

By Ioannis Verdelis

When the Samsung Galaxy S4 debuted, there was a lot of hoopla over its Air Gesture and Eye Scroll features. With Air Gesture, users can wave a hand over the phone to answer a call or skip to the next track. Eye Scroll uses eye movement to scroll up or down a page.

Many users ultimately found the features underwhelming, but that hardly matters. These features, still in their infancy, mark another milestone for mobile.

The problem with “smart” devices

When we took computers and crammed them into tiny phones, we brought with us the way we had always done things. The problem — as any smartphone user knows — is that it’s hard to control an application’s interface when it’s centered on tiny buttons. There’s no tactile feedback, and visual feedback is often insufficient. Victims of autocorrect know what I’m talking about.
If users get frustrated with these devices, they might give them up altogether. It’s imperative that user interface design become the core focus.

Make UI design your focus

The user experience can make or break a new technology. Before diving into the development of a new app or device, spend time at a whiteboard. What problems are specific to mobile? What new opportunities do current and future generations of devices offer? How would you do business if you were thinking about it fresh? What hindrances did desktops impose that mobile can avoid? Pay attention to patterns, and empower users with the controls they need.
It’s also important to incorporate trends into your design; without them, you’ll be playing catch-up with your competitors. When our company focuses on UI design today, for example, we consider whether we can build in swiping gestures instead of requiring customers to tap a small button. Mailbox demonstrated the power of this, and iOS 7 picked it up. Building on these interaction patterns from the start gives you more functionality, along with the small touches that set you apart.
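To make that concrete, here is a minimal sketch, in Kotlin for Android, of how a swipe could replace a small tap target on a list row. The distance threshold, the class name, and the archive/snooze handlers are all assumptions for illustration, not Mailbox’s or any other app’s actual implementation.

import android.view.MotionEvent
import android.view.View
import kotlin.math.abs

// Minimal sketch: turn a horizontal drag across a list row into an action,
// so users don't have to hit a tiny button. The threshold is an assumption;
// production code would scale it by screen density and add animation.
class SwipeToActListener(
    private val onSwipeLeft: () -> Unit,
    private val onSwipeRight: () -> Unit
) : View.OnTouchListener {

    private var downX = 0f

    override fun onTouch(v: View, event: MotionEvent): Boolean = when (event.actionMasked) {
        MotionEvent.ACTION_DOWN -> {
            downX = event.x            // remember where the finger landed
            true
        }
        MotionEvent.ACTION_UP -> {
            val dx = event.x - downX
            if (abs(dx) > 120f) {      // moved far enough to count as a swipe
                if (dx > 0) onSwipeRight() else onSwipeLeft()
                true
            } else {
                v.performClick()       // short movement: fall back to a tap
                false
            }
        }
        else -> false
    }
}

// Hypothetical usage on a mail row, in the spirit of Mailbox:
// row.setOnTouchListener(SwipeToActListener(
//     onSwipeLeft = { archive(message) },
//     onSwipeRight = { snooze(message) }
// ))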

The future in motion

Some user interface interactions are slowly being taken over by voice. But with 85 percent of iOS 7 users saying they haven’t used Siri, it’s clear that “silent interaction” is still an important factor in how people use smart devices.
Some companies are already implementing gestures in their interfaces:
  • Dolphin browser: Dolphin is a Web browser that lets you map drawn gestures to specific websites. For instance, you can set it to open Google when you draw a “G” or your wife’s blog when you draw a heart. It’s much more efficient than typing full URLs.
  • Keyboard innovation: Some keyboards replace dedicated keys, such as the space bar and backspace, with gestures. Want to delete that last word? Swipe once to the left. Need to type a name? A swipe up temporarily disables autocorrect (see the sketch after this list).
  • Aviate home screen: Aviate organizes my apps based on where I am and what time of day it is. When I wake up in the morning, one flick brings up my morning-routine apps. Considering that U.S. smartphone users average 41 apps per phone, this makes navigating those apps extremely simple.
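As a hedged illustration of the keyboard item above (not any real keyboard’s API), the mapping itself can be tiny: each coarse swipe direction is bound to an editing action. The editor buffer and names like deleteLastWord() are invented here for the example.

// Hypothetical sketch of a gesture keyboard's action mapping: swipes replace
// dedicated keys. The editor buffer and action names are stand-ins, not a real API.
enum class SwipeDirection { LEFT, RIGHT, UP, DOWN }

class GestureKeyboardActions(private val editor: StringBuilder) {

    // When false, the next word is committed exactly as typed.
    var autocorrectEnabled = true
        private set

    fun onSwipe(direction: SwipeDirection) {
        when (direction) {
            SwipeDirection.LEFT  -> deleteLastWord()           // swipe left: delete the last word
            SwipeDirection.RIGHT -> editor.append(' ')          // swipe right: insert a space
            SwipeDirection.UP    -> autocorrectEnabled = false  // swipe up: type a name verbatim
            SwipeDirection.DOWN  -> autocorrectEnabled = true   // swipe down: restore autocorrect
        }
    }

    private fun deleteLastWord() {
        val trimmed = editor.trimEnd()
        val cut = trimmed.lastIndexOf(' ') + 1   // 0 when there is no earlier space
        editor.setLength(0)
        editor.append(trimmed, 0, cut)
    }
}

For instance, with the buffer holding “meet you their”, onSwipe(SwipeDirection.LEFT) removes the last word so it can be retyped, and onSwipe(SwipeDirection.UP) lets an unusual name through without correction.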

Motion is everywhere

It’s not just our smartphones that will soon be controlled by motion. Three years ago, Microsoft launched Kinect, a motion-sensing add-on for its Xbox console that uses an RGB camera, a depth sensor, and a microphone array to let players control a video game without a controller. The Leap Motion controller takes accuracy further: it plugs into a computer and lets users control the system with mid-air hand gestures.
With more device manufacturers building gesture controls into their operating systems, users are becoming accustomed to using them daily. Startups that want to build the best apps will have to include motion controls in their interfaces to compete.

Motion is the future, so you better get moving.
