Tuesday, December 4, 2012

Usability problem of the day (URLs on Facebook and Google+)

In the courses I teach about human-computer interaction, I typically open each class with an example of a usability problem. I'm putting these online, in case others find them useful.

A few months ago Stephanie Buck collected a list of pet peeves from Mashable staff: 20 Things Your Most Annoying Friends Do on Facebook.

Here's number 20:
20. Redundant Links
Pet peeve: people who don't remove the URL once they've copy/pasted it into a status update. Result: ilooklikebarackobama.com on top of ilooklikebarackobama.com. Hurts my eyes.


Let's translate:
  1. You paste a link into the Facebook status box.
  2. Facebook automatically adds a prettified version below.
  3. You undo what you just did.
  4. You post your status.
And if you don't bother with Step 3, or if you forget, you get the blame for hurting someone's eyes. This hardly seems fair--Facebook could easily add a facility to strip out your link just as automatically as they add a preview of the linked Web site.
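The automatic stripping suggested above isn't hard to build. Here's a minimal sketch in Python (not Facebook's actual code; the function name, URL pattern, and matching rules are all illustrative assumptions) that removes a pasted URL from a status once a preview for that same URL has been attached:

```python
import re

# Illustrative URL matcher: absolute URLs or bare domains like example.com/path.
URL_PATTERN = re.compile(r'https?://\S+|(?:\w+\.)+\w{2,}/?\S*')

def strip_redundant_url(status_text, preview_url):
    """Drop the pasted URL from the status if it matches the attached preview's URL."""
    def normalize(url):
        # Compare loosely: ignore the scheme and any trailing slash.
        return url.split('://')[-1].rstrip('/')

    def replace(match):
        candidate = match.group(0).rstrip('.,!?')
        if normalize(candidate) == normalize(preview_url):
            return ''
        return match.group(0)

    return URL_PATTERN.sub(replace, status_text).strip()
```

For example, `strip_redundant_url("Check this out! ilooklikebarackobama.com", "http://ilooklikebarackobama.com/")` returns `"Check this out!"`, leaving the preview to carry the link.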

But this isn't unusual when it comes to user interfaces. People seem quicker to blame other human beings than poorly designed systems (or the designers of those systems).

A good historical example can be seen in the partial meltdown of the Three Mile Island nuclear power plant in 1979. Time magazine summarized the event a few months later with an article subtitled, "Human error is to blame." But further analysis by human factors experts showed that the control room was a usability nightmare, with critical controls installed on both the fronts and backs of panels, inconsistent indicators (for some valves, a red light meant an "open" status; for others, a red light meant "closed"), and a hundred alarms going off at once during the emergency.

Human error is often to blame, but when things go wrong it's not always the fault of the last person in line. Google "human error is to blame" and you'll find stories about gas explosions, voting machine errors, missing dead bodies, escaped prisoners, and train derailments. Read closely, though, and you might eventually think, "Shouldn't we be able to predict and prevent such mishaps more often?" That is, the problems are often with ill-thought-out procedures, complex equipment, and assumptions about ideal human performance.

The lesson for my students is that blaming users for problems with the systems they design is passing the buck. Whose responsibility is it, after all?

2 comments:

  1. The last line says it all.
    And when it gets down to where the rubber meets the road, do we need to assign blame (pretty stigmatized) or should we just skip that part and move on to fixing it and teaching everyone a better way in the process?

  2. Hey, Kelly! Nice to hear from you. Yes, that's what we should do: solve problems.