Designing for touch-based mobile user interfaces requires new thinking and an expanded design vocabulary. It’s no longer about soft-keys and d-pads — it’s about having space to play.
Through Punchcut’s work with touch, our interaction design, visual design and motion design teams have collected the following insights from their disciplines on what it takes to create satisfying touch UI experiences.
1. Design for immediate access.
Touchscreens allow users to jump from point A to point B with a single tap, rather than step incrementally through menu items the way key-based UIs required. As a result, core navigation and calls to action have to be made very obvious. Lay out screens so that the most important menu items are the most prominent, not simply the topmost in a stepped list. Design with quick taps in mind, and use finger travel as a mechanism to guide the user. Capitalize on the ideas of direct manipulation in the physical world that people naturally bring to touch devices, and engage them whenever possible. Use touch keyboard-based input as a last resort.
2. Keep gestures smart and simple.
Because people naturally bring familiarity with direct manipulation to touch devices, it is important to implement gestural controls that respond exactly as a user would expect. Taps and flicks are essential ingredients at the UI's foundation. Additional gestures, for the most part, are not naturally discoverable, and should be used sparingly and accompanied by explicit instructions. Pair them with redundant buttons, and allow the additional gesture to serve as a shortcut to the same functionality.
Distinguish between global, system-level gestures and local, app-level gestures. There is more freedom to add unique gestures to the touch vocabulary within an application, but global gestures must necessarily be intuitive to keep the overall UI experience straightforward and navigable.
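The taps and flicks described above can be sketched as a simple classifier over a finished touch stroke. This is a minimal illustration, not any platform's actual recognizer; the threshold values are assumptions chosen for the example.

```python
from dataclasses import dataclass

# Illustrative thresholds (assumptions, not platform standards).
TAP_MAX_DISTANCE_PX = 10.0   # movement at or below this reads as a tap
FLICK_MIN_VELOCITY = 0.5     # px/ms; faster movement reads as a flick

@dataclass
class TouchStroke:
    distance_px: float   # total finger travel
    duration_ms: float   # time between touch-down and touch-up

def classify(stroke: TouchStroke) -> str:
    """Label a completed stroke as a tap, flick, or drag."""
    if stroke.distance_px <= TAP_MAX_DISTANCE_PX:
        return "tap"
    velocity = stroke.distance_px / max(stroke.duration_ms, 1.0)
    return "flick" if velocity >= FLICK_MIN_VELOCITY else "drag"
```

A short, near-stationary contact classifies as a tap; the same travel distance reads as a flick or a drag depending on how quickly the finger moved.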
3. Leverage clear mental models.
The touch experience is an intimate interaction with the content and UI space. By creating a system where interactions — touches, flicks, drags — are subject to common rules of physical motion — inertia, bounce, gravity — the user is transported into an interface world that reinforces their expectations and as a result becomes real, tangible and believable.
Dimensionality helps offset the experience of interacting with the flat aspect of the screen. Transition animations help confirm an action has taken place, and give users the sense they have gone "deeper" into an application context or shifted over to a parallel task. Keep transitions simple and quick to allow the user to focus on the task at hand, rather than be distracted by loud special effects that call dramatic attention to themselves. Also, keep in mind: Touch does not allow for focus or hover states to cue users in, so iconography and other touchable elements should stand out clearly from page content. Hardware keys — Home, Back, Menu — can be effective ways of letting the user know they are leaving the current on-screen context to access another.
4. Design for real hand sizes.
Fingers come in all sizes. Build interfaces to match. Touch targets and the objects that surround them should be sized appropriately. Tolerances between buttons are driven by the size of the button and the likelihood of accidentally hitting an adjacent element. Generally speaking, the smaller the buttons, the bigger the gaps needed between them. By the same principle, there is usually some tolerance flexibility at the edge of the screen, where the finger is only partially on screen and the target is easier to hit. Many in-dash automobile navigation systems successfully use this method.
It is also important to remember that the physical size of a hit target depends on the screen's pixel density: a target of fixed pixel dimensions shrinks on higher-resolution displays. Therefore on-device testing is critical in measuring the actual size of the resulting interface. A common goal for touchable elements is 10 millimeters minimum, regardless of resolution.
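The relationship between physical target size and pixel density is simple arithmetic, sketched below. The 10 mm floor comes from the guideline above; the example pixel densities are hypothetical values chosen for illustration.

```python
import math

MM_PER_INCH = 25.4

def min_target_px(min_mm: float, ppi: float) -> int:
    """Pixels needed for a target to measure min_mm on a display of ppi."""
    return math.ceil(min_mm / MM_PER_INCH * ppi)
```

On a hypothetical 326-ppi display, a 10 mm target needs 129 px; on a 163-ppi display, only 65 px. The same pixel count would render at half the physical size on the denser screen, which is why on-device measurement matters.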
5. Touch feedback is key.
Visual feedback is critical, especially when considering situations when touching an object obscures the target a user set out to touch. Creating a visual feedback system that takes this into account and is still visible after the user has released their touch is one effective way to remedy this situation. Audible feedback is often used as a secondary mechanism when visual feedback might not be noticed. However, because it is common for mobile devices to be set to silent, audible feedback should be used in conjunction with other feedback. When creating audible cues, specific ranges of sounds are recommended to cut through the din of the user’s natural environment. Haptic feedback can offset the difficulty some users face in transitioning from a key-based device to a flat touchscreen. The haptic response can give the user the illusion that a key has been pressed by producing a small vibration under the user’s finger. While vibration technology can provide a spectrum of tactile experiences to the UI designer, one must also recognize that the user may opt to turn off this feedback to conserve battery life.
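The two visual tactics above — drawing feedback where the finger cannot cover it, and keeping it visible briefly after release — can be sketched together. The offset distance, linger time, and class names are assumptions for illustration, not a real platform API.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative values (assumptions, not platform standards).
FEEDBACK_OFFSET_PX = 40.0    # draw feedback above the contact point
LINGER_MS = 200.0            # keep feedback on screen after touch-up

@dataclass
class TouchFeedback:
    x: float                               # contact point, in pixels
    y: float
    released_at_ms: Optional[float] = None  # None while finger is down

    def position(self) -> tuple:
        """Place the highlight above the finger so the hand never hides it."""
        return (self.x, self.y - FEEDBACK_OFFSET_PX)

    def visible(self, now_ms: float) -> bool:
        """Stay visible while touched, and linger briefly after release."""
        if self.released_at_ms is None:
            return True
        return now_ms - self.released_at_ms <= LINGER_MS
```

After touch-up, the feedback persists for the linger window, giving the user a moment to confirm which target they actually hit.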
The truth of the matter is touch is a gratifying experience, one that strengthens the bond between people and their devices. Once emotional black boxes with buttons for input, devices now have the potential to be manipulated like familiar objects in the natural world. Yet translating that potential into a well-executed touch experience takes keen attention to detail. Through Punchcut’s understanding of these details, we work toward a world where people feel like their devices are aware of and responding to them, instead of the other way around. As touch moves from emergence to the expected mode of input, Punchcut has helped companies do the little things right on the way to realizing digital experiences that unfold naturally.