In strictest overconfidence
By David Stauffer
In a look at possible explanations for discounter Kmart's loss of market share under the leadership of CEO Joseph Antonini, the Wall Street Journal reported that "attitude may have made a bigger difference than strategy.... Antonini didn't think others could tell him much about the business... he bristled at criticism and... didn't do much hiring of managers from outside the company who might challenge him."
If this assessment is correct, Antonini was the victim of overconfidence: the understandable and all-too-human tendency to accept our own experiences and beliefs as reliable truths. We humans, it seems, are almost irresistibly motivated to use even the flimsiest evidence as the basis for unshakable conviction.
Further, we are
convinced that we have a solid basis for our conviction,
when in fact the vast majority of us are stunningly overconfident. And even when we are incontrovertibly proven
wrong by events or the weight of other opinions, we are
inclined to go in a direction that compounds rather than
alleviates the damage resulting from our original
position.
These powerful
tendencies are most vexing to people who are frequently
called on to make decisions and those whose careers have
been marked by steady advancement toward the top ("I
wouldn't have been promoted for being wrong, would I?"). Fortunately, if you can accept these tendencies as part of the package that comes with being human, you can neutralise their ill effects and sometimes even turn them to your advantage, as superstar leaders Jack Welch of General Electric and Andy Grove of Intel have done.
Decision research shows
that we groundlessly bolster belief in the extent of our
knowledge. Among the more prominent ways we do this:
• We remember little of what we experience. University of Bamberg psychology professor Dietrich Dörner, in his book The Logic of Failure, observes, "Human memory may have a very large capacity, but its inflow capacity is rather small. What we perceive at any given moment may be rich in content, colourful and clear in its contours. The moment we close our eyes, however, a great deal of that richness instantly disappears; unclear and pale outlines remain".
• We remember
selectively. In The Challenger Launch Decision, Boston
College sociology professor Diane Vaughan observes that
even highly skilled and experienced professionals have
been shown to selectively accept or dismiss new facts and
evidence to fit their existing world view. The result:
decisions based on preconceptions rather than current
circumstances.
• We remember prejudicially. Eliot R. Smith, a Purdue University psychology professor, studies the effects of exemplars: mental pictures (of, say, a least-liked schoolteacher) that we unknowingly carry with us and apply to others. The danger? If a new acquaintance, perhaps a prospective new hire, reminds you of one of your exemplars, even "below the level of consciousness," then you will judge the new person by what you think of the exemplar. Smith relates findings of an experiment in which participants had an encounter with someone with a particular hairstyle who was mean to them. Later, when presented with a new, not-unfriendly person with the same hairstyle, the participants judged that person to be unfriendly. "It doesn't make sense," Smith notes, "but it still has an effect".
• We change what we expected to fit with what actually happened. Vaughan observes that evidence of the design flaw that led to the 1986 Challenger tragedy was first seen in 1977. But the "deviancy" from expected performance was "normalised": engineers found an explanation for it. Subsequent manifestations of the same problem over the years were treated the same way; deviances from past performance were, in fact, predicted.
We toss these and other
monkey wrenches into what we swear is
"knowledge", giving us a gigantic case of
overconfidence, according to J. Edward Russo, a Cornell
University professor of management who conducts executive
seminars on managerial decision-making. He and Wharton
School research director Paul J.H. Schoemaker have
administered "overconfidence quizzes" to
thousands of managers around the world, consistently
finding that the managers grossly overestimate what they
claim to know.
Russo says that the
attitude of one bank's chief loan officer is
typical. "He claimed that he and the loan officers
who worked for him would score very high on a quiz which
he himself assessed as being germane to his industry and
not particularly difficult. 'No one is more realistic than a banker,' he said. He then took the
test and failed miserably. He had his loan
officers take the same test. Every one of them
flunked".
In a 1992 Sloan
Management Review article, Russo and Schoemaker
explained that their overconfidence quiz "measures
something called metaknowledge: an appreciation of what
we do know and what we do not know.... We draw on our
metaknowledge when we conclude that we have enough
information and are ready to make a decision now.... No
group of managers we tested ever exhibited adequate
metaknowledge; every group believed it knew more than it
did about its industry or company. Of the 2,000-plus
individuals to whom we have given a 10-question quiz...
fewer than 1 per cent were not overconfident".
One group that appears
to have fallen victim to a very public case of costly
overconfidence is the General Motors corporate design
staff who redesigned the 1991 Chevrolet Caprice. Numerous
press accounts noted the staff's rejection of
negative comments on the prototype, such as this one in
the Washington Post: "The design staff... had
a reputation for being resentful of marketers, engineers
and other corporate outsiders who had the gumption to
tell them what was and was not attractive". The
article goes on to report that when a focus group also
voiced its strong objections to the proposed new Caprice
exterior, design staff VP Charles M. Jordan
"didnt like the clinics reaction.
We were excited about the design, he said.
We decided not to do anything about it. We believed
in the design.... All the car guys liked the
design".
Caprice sales for 1991
amounted to half the anticipated volume, and in 1995, the
model was discontinued.
But the Caprice saga is
distinguished in the annals of overconfidence only by its
very public unfolding. "This is the stuff of human
nature," Russo observes. "It is apparently
worldwide. We've tested in Europe and Asia as well
as North America. Asians, admired in many ways for their
business acumen, are, if anything, even more
overconfident than Americans. It's not culturally specific, in terms of either societal or organisational culture".
It's difficult, not to mention personally risky, for a subordinate to try to get a message of reality through to the overconfident-and-proud-of-it boss. "There's no way to convince that boss to hear what he doesn't want to hear," Russo laments. "It would be like the Hollywood mogul who was reputed to have said, 'I don't want a bunch of yes-men around me; anyone can say no if he's willing to lose his job'".
Nonetheless, Russo says,
"Overconfidence isnt all bad. It fills our
need to believe in our abilities. It can contribute to a
palpable optimism, which has motivational value. And an
argument could be made that risky projects are undertaken
when some key people have an unrealistic belief in their
chances of success".
We've seen that we
can easily be led to errant decision-making by believing
we possess a level of knowledge about our industry or
field of expertise that we, in fact, overrate by 100 per
cent or more.
Now for the really bad
news: Human nature doesn't just lead us to mistaken
judgments by routes such as those discussed above; it
invites us to follow our initial misjudgment with
responses that, at best, do nothing to correct our course
or, at worst, lead to tragedy. Thus Vaughan recognises,
"The O-ring failure that destroyed the Challenger
was preceded by a year-long series of sporadic O-ring
problems".
A series of decisions
with increasingly poor outcomes is called a "doom
loop" by Eileen C. Shapiro, president of the
Cambridge, Massachusetts, consulting group Hillcrest
Group Inc., in her forthcoming book The Seven Deadly
Sins of Business. In an interview, Shapiro said,
"Many executives get caught in this downward spiral,
where unquestioned and incorrect beliefs lead them to
compound rather than eliminate their past mistakes. Look
at what happened when consumers began to turn away from
microwave ovens that were progressively loaded with more
and more features, options and complexity. The response
by some manufacturers was not to simplify things, but to
offer cooking classes, more elaborate instructions, and, incredibly, still more features".
"The single most
important step you can take is to allow your beliefs to
be called beliefs and not truths," says Shapiro.
Russo agrees, adding, "You have to have that crack
in the door: the boss recognising at least the
possibility of his fallibility. It involves an absence of
self-delusion that is itself closely related to
overconfidence: the leader who isn't afraid to admit what he doesn't know. Jack Welch is reportedly
such a person. If so, the phenomenal value he has added
for GE probably results at least in part from his
willingness to acknowledge he may not have all the
answers".
Russo also reports that
some executives may overcome overconfidence with
awareness alone. He cites a study in which half of the
participants in a negotiation exercise were warned about
overconfidence: "Compared to the unwarped
group," he says, "those forewarned were 30 per
cent more likely to reach a negotiated agreement instead
of having to turn to costly arbitration, and they
achieved net dollar benefits that were 70 per cent
higher".
Similarly, "good
managers devise their own solutions to the problems of
overconfidence," Russo continues. "That head
loan officer who had been so overconfident of his
knowledge devised a 'competitor alert' file for his team of loan officers; they were required to contribute to it and read everyone's compiled
submissions. Only three weeks after initiating the file,
one of the officers was alerted to the possible defection
of a major client to another bank. He contacted the
client and convinced him not to switch, saving $160,000
in annual revenue".
"As strange as it
may seem," says Alcock, "experience is often a
poor guide to reality. Critical thinking helps us
question our experience and avoid being too readily led
to believe what is not so. In effect, we come to a conclusion and then say, 'Hold on, how do I know this?' We suspend judgment. We can't apply this to everything we do; for example, standing at the supermarket dairy case and thinking, 'This looks like a quart of milk, but is it really?' But we can
apply it in the case of decisions that are outside our
areas of expertise".
Russo reflects the
consensus of opinion in concluding that "group
judgements, on average, (are) better than individual
judgements" in decision exercises. But he and other
experts caution that listening to others involves its own
pitfalls. Foremost among these is the natural tendency to
seek out people (and literature, anecdotes, etc.) who
agree with us. In his book Judgment in Managerial
Decision Making, Kellogg School professor Max H.
Bazerman mentions the special susceptibility of corporate
leaders to this tendency: "Prior successes often
reinforce this behaviour. Almost by definition, a track
record of success means there is lots of evidence
confirming what one believes".
Alcock warns of another
pitfall: "You have to watch out for those people
making the same mistakes you might be making. The
corporate leader, above all, must be wary; he is likely
to be hearing from people who want to curry favour and may do so by not raising problems or objections to the boss's idea".
The danger of such
behaviour is particularly acute in the presence of a
forceful and strong-willed boss. The tough-guy reputation
of former Eastern Airlines CEO Frank Lorenzo, for
example, is said to have so intimidated top executives
that they dared not defy him. A former colleague says
lieutenants would strain to catch clues of Lorenzo's
position on an issue and then rush to get on board with
the boss.
There are several ways
in which you might be able to gauge and adjust for the
yes-man factor. First, lay out problems and perhaps
alternative responses, but don't state an opinion
until others have stated theirs. Second, state an opinion
and gather comments from aides, then drop the issue for a
week or two. Bring it up again by taking an opposite
stance, and see who switches with you. Third, be truly
receptive to dissenting opinion. Go beyond lip service;
thank subordinates immediately and publicly for stating
their objections. And fourth, formalise dissent and
counter-argument; this measure is unanimously advocated
by the experts. Russo, for example, champions
"counter-argumentation, which has proven so valuable
in several studies that it's probably smart to insist that one or two people take a devil's advocate role, even when your team's opinion on an
issue is unanimous.... Other studies have found that,
when listing pros and cons, the cons do the most good in
countering overconfidence".
Questioning yourself enough to avoid overconfidence, but not so much as to cause needless self-doubt, is a tricky balancing
act to maintain, especially since we are rarely willing
to admit that we could be the victims of our own hubris.
But if the alternative is to realise at some point in the
future that you could have avoided a mistake, it suddenly
seems easier to take a deep breath and ask yourself: How
do I know?
(Courtesy
Span)