automated social engineering at its best (maybe?)
Bill Sconce
sconce at in-spec-inc.com
Mon Aug 2 18:12:01 EDT 2004
Hi, Ben -
<heckle> time. Hope you don't mind. Too much.
On Thu, 29 Jul 2004 21:52:35 -0400 (EDT) bscott at ntisys.com wrote:
> .. Exactly. Or, more broadly stated, "people generally just
> don't want to be bothered to think".
How true. True of everyone, no?
Especially (and justifiably) true when they/I/you are distracted
by juggling other tasks. Does reading e-mail get, or even deserve to
get, your full attention, for every single message?
> People need to realize that not thinking is harmful, even dangerous.
Yup. "They" need to realize that.
> Example: Every year people get hit by trains. *It's a train.*
Wow! Not only are they not thinking. They must not be *listening*.
A *train*! (You'd think the noise would be a clue. :)
> It isn't like they can sneak up on you unexpectedly. They generally
> follow the tracks. Yet people still find themselves in the position
> of being hit by them.
True. Only once though.
Seriously, this is a bad analogy. Opening e-mail is not at all like
a train. An e-mail client which allows and encourages you to execute
a piece of code by clicking on an icon (when the whole interface of
the "operating system" forces you to click on icons all day) is
completely different from a train.
What we actually have here is mortally flawed software design. Some
blame for clicking falls to the user, but only a small part. It's
not reasonable to expect "hey-there's-a-train-coming" thinking when
opening each and every e-mail. You can expect it if you wish, but
it's just not going to happen.
Sooner or later an e-mail will look enough like something legitimate,
or the user will be thinking about three or four other things while
clicking through e-mail, and zap. Even you or me - if we're using
that kind of software.
Let's stop blaming the user for not thinking. Pundits and vendor
shills will probably blame the user forever, but we should know better.
We know enough about software to recognize what works and what blows
up, and software which blows up is not the user's fault.
</heckle>
-Bill
P.S. A possibly better analogy than trains, from aviation: In
the early days each airplane manufacturer designed the layout and
feel of instruments and controls in their own favorite way. In
theory, you could say (and it WAS said) that pilots should "think"
before flipping a switch. One famous example was a popular model of
airplane whose flap and landing-gear switches had the same shape,
and whose positions on the panel were swapped relative to those on
other makes of airplane.
Sure, the pilot should "think". But embarrassing moments kept
happening, year after year. (Embarrassing because retracting the
gear after landing results in noisy and difficult taxiing, whereas
retracting the flaps is normal.) Other examples caused more than
embarrassment; sometimes people died. Because pilots WOULD (and
still will) occasionally "forget" to "think". One example was an
air-carrier accident in South America caused by an autopilot which
could be set for descent using either feet per minute or
degrees. The pilots had to think very carefully, every time - the
difference between FPM and degree modes was the absence or presence
of a decimal point on the gas-discharge display. One night the
degree mode intersected a mountain.
Sure, pilots are expected to think. Yet every aircrew uses checklists.
And the message about the consequences of confusing cockpit layouts
eventually sank in: layouts are standardized now, and flap and gear
handles have shapes so different that you can tell them apart in
total darkness.
We don't tolerate an aircraft flaw which increases the chance of
passengers being killed when someone doesn't "think" hard enough.
Would you get on an airliner which was designed like Windows,
however good you thought the pilots were at thinking?