Posts Tagged ‘touch’

Touch interface for good

Hong Kong May 22-23, 2009

Touch screen NYC ticket vending machine (photo credit: yuen_long / flickr)

Here’s a comment I posted on the Changemakers AshokaTech discussion board in response to the question “How would you use touch-screen technology for good?”

Following our blog post on touch-screen technology, I’d love to hear about your ideas on how we can make such technology work for the social sector, or if you’ve heard of organizations that are already doing so.

The great thing about touch interfaces is their ease of use. They are intuitive. I moved to Korea two years ago, and learning to type in Korean on a keyboard or a cell phone was not without its pain. Computers have a keyboard and mouse as their main input devices, but using them has to be learned. Not so with a touch-enabled device.

Touch-enabled devices are closer to how the real world works. You press buttons directly on the screen rather than moving a pointer with a mouse to a graphic that represents a button and clicking the mouse, which in turn simulates pressing the button on the screen.

Touch devices are intuitive to use. Look at a well-designed touch-enabled subway ticket vending machine. Thousands of tourists use the ones in New York every day without ever having used them before.

Computers 20-30 years ago were only accessible to those who were trained to use them. Windows (or Mac OS) was a big step, but it still has a learning curve. Use an iPod Touch for the first time and the learning experience is actually enjoyable. That’s the power an intuitive touch interface brings.

So the potential is huge. For those who do not use PCs or laptops every day, it can be a way to overcome the digital barrier and bridge the digital divide. For kids, it’s a more intuitive educational device. For the elderly, it’s a more humane interface, especially for those with arthritis. For developing countries, it’s a better way to access information.

Displays, touch screens and processing power are becoming cheaper every day. I spent a whole week without my laptop while it was in repair, surviving on my iPod Touch. It was possible, and it opened my eyes to a future of smaller touch-enabled devices that are as powerful as PCs but infinitely more portable and intuitive to use.

I would love to hear more about the actual application in the social sector.

The Satisfying Touch UI Experience

It’s a little embarrassing, but I get a lot of my insights from watching TED presentations. Blame it on the combination of my 2-hour commute, my iPod Nano, and TED providing video podcasts.

In a fascinating presentation, neurologist Vilayanur Ramachandran talks about how the brain works with sensory input. What stuck with me was something toward the end of his talk:

Something very interesting is happening in the angular gyrus, because it is the crossroads between hearing, vision and touch, and it became enormous in humans. I think it is the basis of many uniquely human abilities such as abstraction, metaphor and creativity.

With interfaces, it is important to get sensory feedback. For example, right now I am typing on a keyboard. This action creates tactile feedback (the key depresses), auditory feedback (it clicks), and visual feedback (letters appear on the screen). Without realizing it, we feel satisfaction when this sensory feedback is properly provided. When typing on a keyboard does not produce letters on the screen, or the letters are somehow delayed, we have an emotional response: one of frustration.

Touch experience on the iPhone and LG Prada phone

With the iPhone there is no tactile or haptic feedback. (Some phones do have haptic feedback in the form of light vibrations.) To compensate for missing one of the three kinds of feedback necessary for a good interface, it provides strong feedback through the remaining two. When you use the dialer on the iPhone, it produces a strong color change (visual feedback) and a dial tone (auditory feedback) whenever you touch the keys. The same thing happens when you use the on-screen QWERTY keyboard: to make up for the absence of a tactile key-pressing sensation, the iPhone provides visual feedback in the form of keys popping up and auditory feedback in the form of a tapping sound.
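This compensation idea can be sketched in a few lines of TypeScript. The channel names and intensity numbers here are purely my own illustration of the principle, not any real phone API:

```typescript
// Model of the compensation principle: when a feedback channel
// (e.g. tactile) is unavailable, strengthen the remaining ones.
type Channel = "tactile" | "auditory" | "visual";

function planFeedback(available: Channel[]): Record<Channel, number> {
  const all: Channel[] = ["tactile", "auditory", "visual"];
  const missing = all.filter((c) => !available.includes(c));
  const plan: Record<Channel, number> = { tactile: 0, auditory: 0, visual: 0 };
  // Base intensity 1 per available channel; redistribute the share of
  // the missing channels across whatever remains.
  const boost = missing.length / available.length;
  for (const c of available) plan[c] = 1 + boost;
  return plan;
}

// iPhone-style device: no tactile channel, so the visual and auditory
// feedback are strengthened to compensate.
const iphone = planFeedback(["auditory", "visual"]);
```

A keyboard-style device with all three channels would get the base intensity on each; in the iPhone-style case the missing tactile share is redistributed to color changes and sound, which is roughly what the strong key pop-ups and tap sounds do.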

Compare the iPhone experience to the LG Prada phone. The Prada phone provides haptic feedback (you feel a slight vibration at your fingertips) and visual feedback; however, the color change in the interface is weak (it tries to stay “cool” by using grey tones), and the auditory feedback is always the same no matter what you do (the same bell sound). The result is that the Prada phone delivers a less satisfying touch UI experience than the iPhone.

A large part of the satisfaction of using a touch UI comes from providing appropriate feedback. Another large part comes from which everyday metaphor you adopt and present to users. Watching Ramachandran’s talk made me realize that there is a deeper neurological basis for what constitutes a satisfying touch UI experience: our brains are wired to take in sensory feedback and develop an emotional response to it (sometimes without us realizing it).

What Do You Want to Be, Touch UI?

Lately I’ve been thinking a lot about touch-based user interfaces for mobile phones, for a project I’ve been involved in.

Louis Kahn, one of the most influential architects of our time, and subject of an amazing documentary film, once said:

“What do you want, Brick?”

He was alluding to the fact that each material has properties and limitations and wants to be used a certain way. Whether it is materials, systems, or UIs, each has certain affordances you can either acknowledge and work with, or work against.

Don Norman also describes a similar attitude towards the design of products in his influential The Design of Everyday Things:

The term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. A chair affords (“is for”) support and, therefore, affords sitting.

When designing a touch user interface for mobile phones, where do you start? You can start by looking at what Apple has done so wonderfully with the iPhone. Or you can look at how to improve the current mobile UI and make it touch-enabled. Both lead to very restricted designs, since they can’t escape what either Apple or the current mobile UI has set up as its affordances.

One needs to ask, “So, what do you want to be, mobile phone touch UI?”

In my mind, some of what it wants to be is the following (these are its affordances):

  • It wants large touch targets: Fingers are less precise than keys, and there is a minimum touch area below which targets become hard to hit.
  • It wants a simple page layout: Touch requires immediate feedback and quick transitions to subsequent pages. There shouldn’t really be anything to navigate on a page. The interface should be “tap, tap, tap”, i.e. a quick progression of pages to finish the task the user is trying to accomplish.
  • It wants limited choices: More choices on a page mean more things to touch, which makes touching with precision harder. In the web-page paradigm it may be better to present more options on a page; in a mobile touch interface, with limited screen area and limited touch targets, it may make more sense to provide fewer choices and more “in-between” pages.
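The first affordance can be made concrete with a small sketch. Apple’s Human Interface Guidelines recommend touch targets of at least 44×44 points; the check below uses that figure, and the `TouchTarget` shape and labels are hypothetical examples, not types from any real framework:

```typescript
// Apple's HIG-recommended minimum touch target size, in points.
const MIN_TARGET_PT = 44;

interface TouchTarget {
  label: string;
  width: number;  // points
  height: number; // points
}

// Return the labels of any targets too small to touch reliably.
function undersizedTargets(targets: TouchTarget[]): string[] {
  return targets
    .filter((t) => t.width < MIN_TARGET_PT || t.height < MIN_TARGET_PT)
    .map((t) => t.label);
}

const screen: TouchTarget[] = [
  { label: "Call", width: 60, height: 44 },
  { label: "Help link", width: 30, height: 20 },
];
// undersizedTargets(screen) → ["Help link"]
```

A check like this is the kind of rule a touch UI design review could automate: any control that fails it either grows, or gets moved to its own less crowded page.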

It is also important to select the right everyday metaphor for touch UI elements. Metaphors let users recognize how to use something without learning it, since they are already familiar with it. On the iPhone you see sliders (unlocking the phone), dials (selecting a date), and buttons.

One great source of metaphors for a touch UI is, of all things, baby toys, for several reasons:

  • Interactive elements are brightly colored, allowing the user (the baby) to locate and initiate an action.
  • Interactive elements are easy to touch, pull or twist and have large target areas, taking into account the user’s lack of mastery over motor functions and their pudgy fingers.
  • Interactive elements provide clear feedback to reward the users and provoke them to repeat the action.
  • The objects are not overly complex and choices for manipulation are simple.

It is no wonder a baby can use an iPhone interface.