Posts Tagged ‘usability’

Culture gap: no 4th floor

F(ourth) floor is 4th floor in Korea

No 13th floor (photo credit: eggrollstan)

The 4th floor in Korea has the same status as the 13th floor does in the US.

The Korean pronunciation of “4” is “sa”, which is also the pronunciation of the Chinese character for “death”; hence the “F” (for “fourth”) instead of “4” in elevators.

Quite silly really.

IDEO Method Cards widget

IDEO Method Cards

IDEO’s excellent Method Cards for human-centered design are available as a widget for Mac OS X users. (Saves you $50)

It’s available as a download in the right column of the news page, and it’s really easy to miss. Incidentally, Fast Company has an article: “IDEO Method Cards Turn, Um, 7!”

Happy birthday Method Cards!

The Satisfying Touch UI Experience

It’s a little embarrassing, but I get a lot of my insights from watching TED presentations. Blame it on the combination of my two-hour commute, my iPod Nano, and TED’s video podcasts.

In a fascinating presentation, neurologist Vilayanur Ramachandran talks about how the brain works with sensory input. What stuck with me was towards the end of his talk:

Something very interesting is happening in the angular gyrus, because it is the crossroads between hearing, vision and touch, and it became enormous in humans. I think it is the basis of many uniquely human abilities such as abstraction, metaphor and creativity.

With interfaces, it is important to get sensory feedback. For example, right now, I am typing on a keyboard. This action creates tactile feedback (the key depresses), auditory feedback (it clicks), and visual feedback (letters appear on the screen). Unknowingly, we feel satisfaction when this sensory feedback is properly provided. When typing on a keyboard does not produce letters on the screen, or the letters are somehow delayed, we have an emotional response – one of frustration.

Touch experience on the iPhone and LG Prada phone

With the iPhone there is no tactile or haptic feedback. (Some phones do provide haptic feedback in the form of light vibrations.) To compensate for missing one of the three feedback channels necessary for a good interface, it provides strong feedback through the remaining two. When you use the dialer on the iPhone, it provides a strong color change (visual feedback) and a dial tone (auditory feedback) whenever you touch the keys. The same thing happens when you use the on-screen QWERTY keyboard: to make up for the absence of a tactile key-pressing sensation, the iPhone provides visual feedback in the form of keys popping up and auditory feedback in the form of a tapping sound.
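As a rough illustration (not how the iPhone actually implements it), here is what that kind of compensating feedback might look like in a web-based on-screen keyboard. The element IDs, class names, and sound file are hypothetical placeholders:

```ts
// Sketch: compensating for missing tactile feedback on a touch keyboard
// by pairing every key press with strong visual and auditory feedback.
// "#screen", ".key", "pressed", and "tap.mp3" are made-up names for this example.

const tapSound = new Audio("tap.mp3"); // auditory feedback (assumed asset)

function onKeyTouched(key: HTMLElement, char: string, output: HTMLInputElement): void {
  key.classList.add("pressed");                          // visual feedback: strong color change
  setTimeout(() => key.classList.remove("pressed"), 150);

  tapSound.currentTime = 0;                              // auditory feedback: tap sound on every press
  void tapSound.play();

  output.value += char;                                  // visual feedback: the character appears immediately
}

document.querySelectorAll<HTMLElement>(".key").forEach((key) => {
  key.addEventListener("touchstart", () => {
    const output = document.querySelector<HTMLInputElement>("#screen");
    if (output) onKeyTouched(key, key.dataset.char ?? "", output);
  });
});
```

The point of the sketch is simply that every touch triggers at least two strong, immediate responses, so the missing tactile channel is less noticeable.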

Compare the iPhone experience to the LG Prada phone experience. The LG Prada phone provides haptic feedback (you feel a slight vibration at your fingertips) and visual feedback, but the color change in the interface is weak (it tries to stay “cool” by using grey tones), and the auditory feedback is always the same no matter what you do (the same bell sound). The result is that the Prada phone offers a less satisfying touch UI experience than the iPhone.

A large part of the satisfaction of using a touch UI comes from providing appropriate feedback. Another large part comes from which metaphor from everyday life you adopt and present to the users. Watching Ramachandran’s talk made me realize that there is a deeper neurological basis for what constitutes a satisfying touch UI experience: our brains are wired to take in sensory feedback and develop an emotional response to it (sometimes without us realizing it).

Better Personas: Data Driven Design Research

Data-driven personas

Todd Warfel has an inspiring presentation on persona creation. Go to the presentation on SlideShare and view it full screen. In case you are wondering what those green and blue lines on his personas are, here’s the answer.

Another of Todd’s presentations I enjoyed was Goal Oriented Data Driven Design, which incorporates parts of Barry Schwartz’s Paradox of Choice in explaining design based on usability rather than capability.

Progressive Disclosure

Progressive disclosure is quickly becoming a staple in my user experience vocabulary. It’s one of those impressive terms that cause people to say “ooh!”. It has an authority about it that seems to squash the ever-recurring user experience dilemma: how to fit complex functionality into limited screen real estate.

This is how Jakob Nielsen describes it in his Alertbox column:

Progressive disclosure defers advanced or rarely used features to a secondary screen, making applications easier to learn and less error-prone.
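As a rough sketch of what this might look like in a simple web settings screen (the element IDs and labels here are placeholders I made up, not any particular product’s markup), the advanced options sit behind a single toggle on the primary screen:

```ts
// Sketch of progressive disclosure: advanced, rarely used options stay hidden
// behind one toggle, so the initial screen shows only the core features.
// "#advanced-options" and "#show-advanced" are hypothetical element IDs.

const advancedPanel = document.querySelector<HTMLElement>("#advanced-options");
const toggle = document.querySelector<HTMLButtonElement>("#show-advanced");

if (advancedPanel && toggle) {
  advancedPanel.hidden = true; // primary screen: core features only

  toggle.addEventListener("click", () => {
    // Disclose the advanced features only when the user explicitly asks for them.
    advancedPanel.hidden = !advancedPanel.hidden;
    toggle.textContent = advancedPanel.hidden
      ? "Show advanced options"
      : "Hide advanced options";
  });
}
```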

Progressive disclosure works wonderfully well with search engine interfaces, and Google has used this to its full advantage, making it a trademark of its zen-like front page that displays only a search bar.

It also seems like an ideal solution for mobile handsets, which suffer from ever-increasing complexity but need to fit all their functionality into a screen the size of a business card.

Not so fast. As with all seductive terms, one has to be cautious. Progressive disclosure is not a silver bullet for UX woes, and it does not provide license to forgo careful study of how users actually use devices and interfaces: what is the core set of features that should be displayed on the initial screen? How and when should the secondary screen with the advanced features be accessed?

Simplicity is what interface designers strive for and what users want. But simplicity is never simple, and it is certainly not an excuse to be lazy.