Reality Check

Examples:

Context:  The user is performing an action that may have destructive or non-obvious side effects, especially if that action isn't reversible.  This may be one of the Convenient Environment Actions, for instance, or one of the Localized Object Actions, or a Composed Command; less commonly, it may also result from direct manipulation through a WYSIWYG Editor or a Control Panel.

Problem:  How can the artifact protect itself and the user from these kinds of actions, while allowing the user to have the final say over whether or not an action is performed?

Forces:

Solution:  Before the action is performed, tell the user what the side effects of the action will be, and ask the user to confirm that that's what they really want to do.  Don't simply parrot back the action request -- this won't tell the user anything they don't already know, unless the action request was accidental in the first place.  Instead, give them an intelligent analysis of what the action may do, in case they did not anticipate the potential side effects.

Resulting Context:   Once the user has seen a few Reality Checks in appropriate places, they may become used to them, and assume that the artifact will tell them whenever they're about to do something bad.  On the one hand, this is good -- the user develops trust in the artifact, and feels freer to experiment with it.  On the other hand, it raises the bar for the designer:  make sure you have Reality Checks wherever they're needed, or the user's trust will be rudely shattered the first time the system lets them do something bad!

Notes:  This is a very computer-centric pattern, since other media don't generally have the capability to understand the implications of a given action, nor react to it appropriately.  Humans do, though -- what are some good examples?  Executive secretaries?  Athletic coaches?

Don Norman, in The Design of Everyday Things, points out that reflexively asking a user if they really want to do a given action doesn't work.  To paraphrase one of his examples:

   User:      "Remove file Foo."
   Computer:  "Do you really want to remove file Foo?"
   User:      "Of course."  (That's what I just told you, you idiot!)
   Computer:  "File Foo removed."
   User:      "Oops."

Now what if the computer said instead, "If you remove file Foo, you will permanently lose all your custom settings for the application FooMaker.  Do you want to do this?"
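The contrast between the two dialogues can be sketched in code.  This is a minimal illustration, not part of the original pattern text: the side-effect registry, the file name "Foo", and the function names are all hypothetical, standing in for whatever analysis the real artifact would perform.

```python
import os

# Hypothetical registry mapping file names to known side effects of removal.
SIDE_EFFECTS = {
    "Foo": "you will permanently lose all your custom settings for FooMaker",
}

def reality_check_prompt(name):
    """Build an informative prompt instead of parroting the request back."""
    effect = SIDE_EFFECTS.get(name)
    if effect:
        # Intelligent analysis: tell the user what the action will actually do.
        return f"If you remove file {name}, {effect}. Do you want to do this? [y/N] "
    # No known side effects: fall back to a plain confirmation.
    return f"Remove file {name}? [y/N] "

def remove_file(path, confirm=input):
    """Remove `path`, but give the user the final say after a Reality Check."""
    prompt = reality_check_prompt(os.path.basename(path))
    if confirm(prompt).strip().lower() != "y":
        return False  # user declined; nothing happens
    os.remove(path)
    return True
```

The point of the sketch is that `reality_check_prompt` draws on knowledge the user may lack (the registry of side effects), while `remove_file` still leaves the final decision with the user.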


Comments to:  jtidwell@alum.mit.edu
Last modified May 17, 1999

Copyright (c) 1999 by Jenifer Tidwell.  All rights reserved.