Coincidences set up mental error
By Kimberly Patch, Technology Research News
February 11/18, 2004
Ever fix something with a specific remedy, only to realize later that your fix had nothing to do with the problem resolving itself?
It may seem obvious after the fact that pressing a particular sequence of computer keys has nothing to do with avoiding an automatic eight-second delay, but because you seemed to be fixing the problem by taking a specific action, the scenario made perfect sense at the time: it fit your working mental model.
Researchers from the Universities of Newcastle upon Tyne and York in England have shown that this psychological phenomenon -- a person's tendency to assume that a sequence of events is causally related when those events fit the mental model he or she constructs to filter information -- can cause highly trained operators to make high-consequence mistakes without noticing them.
The researchers are developing human-computer interfaces designed to sidestep the problem. Such interfaces could lead to safer machines, including airplanes, according to the researchers.
Mental models are shortcuts -- simplified pictures of reality
that help the brain cope with the complexity of the outside world. "Because
the human brain does not have the capacity to process [every] single piece
of information in our environment, it has to filter out what it does not
want," said Denis Besnard, a research associate at the University of Newcastle
upon Tyne. This explains the tendency to filter out surrounding sounds
when reading an interesting book, he said.
Our limited processing capacity causes us to unconsciously aim
for solutions that are good enough instead of perfect solutions, said
Besnard. We also implicitly accept that we cannot understand everything
in our environment, he said.
This cognitive economy strategy shapes reality into a distorted
picture -- the mental model. "This picture is strongly influenced by our
goals and we use it to decide about what is happening, [and] what will
happen if we do this or that," said Besnard.
Although the model allows us to function in a complicated environment,
it also harbors potential problems. "When something that we expect happens,
we tend to believe that we understand the causes," said Besnard. In thinking
we understand "we act under a confirmation bias whereby we rely on instances
that confirm our predictions instead of trying to find the cases in which
our expectations do not hold," he said.
Even operators of critical processes, such as aircraft pilots, use these incorrect mental models, said Besnard. "Problems can then be felt to be understood when they are not... leading to mishaps and accidents," he said.
Given this mental phenomenon, coincidences can become dangerous in certain situations. Consider a pilot who tries to solve an engine problem by erroneously shutting off a correctly functioning engine just as the vibrations from the faulty engine happen to subside, said Besnard.
The researchers examined such an instance, which caused a British Midland Airways Boeing 737-400 to crash near Kegworth, England in 1989, killing 47 people. The vibrations were caused by a fan blade detaching from one of the plane's engines, damage that also sent fumes into the cabin. According to the cockpit flight recorder, the flight crew was not sure which engine was affected, but throttled one of the engines back and eventually shut it down. The fumes cleared and the vibration dropped at the same time. "This co-occurrence led them to [wrongly] believe that they had solved the problem," said Besnard.
The crew decided to make an emergency landing. As they flew toward the airport, the engine that was still running showed an abnormal level of vibration for several minutes, but the flight crew did not notice the problem, according to Besnard. Only when they had begun the landing approach, and the engine's vibrations became much worse and were accompanied by a loss of power and a fire warning, did the crew attempt to restart the other engine, and by that time it was too late.
The pilot and co-pilot survived, and later said they did not remember seeing any signs of high vibration on the engine vibration indicators, even though at least one of them said he was in the habit of regularly scanning those indicators.
A better understanding of the phenomenon promises to help in designing airplane interfaces that allow pilots to more easily notice mistakes, even when those mistakes fit their mental models, said Besnard. "One of the things flight crews need to know is when they err," he said. "It seems obvious but it's not. Because of the complexity and tempo of the interaction aboard modern aircraft, pilots sometimes [make] mistakes they do not notice."
The researchers are currently looking at the conditions under
which operators can lose their grasp on a situation, said Besnard.
The ultimate goal is to enhance the ergonomics of the cockpit,
said Besnard. "This requires... a dialogue and collabora[tion], if possible,
with cockpit designers," he said.
In general, modern cockpits should provide pilots with a picture
of what is likely to happen in the near future and suggest solutions,
said Besnard. "Pilots have too much data to process," he said. "Pro-active
pilot-centered support systems... that provide feedback to pilots about
events that may occur in the future and for which actions will be needed,"
would help, but have not been implemented in commercial aircraft, he said.
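The article does not describe how such a support system would work internally, but the idea of warning about "events that may occur in the future" can be sketched in a toy program. The fragment below is purely illustrative and is not the researchers' software: it fits a short linear trend to recent engine-vibration readings and warns whenever the projection would cross a limit within the next minute, so a worsening engine gets flagged even if the crew believes the problem is already solved. The engine names, thresholds, and data are invented for the example.

# Illustrative sketch only: a toy "pro-active" monitor that projects a short
# trend in engine-vibration readings and warns before a limit is crossed.
# All names, thresholds, rates, and data below are invented for the example.

from collections import deque

VIBRATION_LIMIT = 4.0      # hypothetical indicator level at which action is needed
LOOKAHEAD_SECONDS = 60     # how far ahead the monitor projects the trend
WINDOW = 5                 # number of recent samples (one per second) used for the trend

def projected_value(samples, lookahead):
    """Fit a simple linear trend to the recent samples and extrapolate it
    'lookahead' seconds past the most recent sample."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    slope = 0.0 if denom == 0 else sum(
        (x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) / denom
    return mean_y + slope * (n - 1 - mean_x + lookahead)

def monitor(readings):
    """Yield a warning whenever an engine's projected vibration level would
    exceed the limit within the lookahead window."""
    recent = {engine: deque(maxlen=WINDOW) for engine in readings[0]}
    for t, sample in enumerate(readings):
        for engine, value in sample.items():
            recent[engine].append(value)
            if len(recent[engine]) == WINDOW:
                projected = projected_value(list(recent[engine]), LOOKAHEAD_SECONDS)
                if projected > VIBRATION_LIMIT:
                    yield (t, engine, value, projected)

if __name__ == "__main__":
    # Invented data: engine 1's vibration creeps upward while engine 2 stays quiet.
    readings = [{"engine 1": 1.0 + 0.05 * t, "engine 2": 0.4} for t in range(30)]
    for t, engine, value, projected in monitor(readings):
        print(f"t={t}s {engine}: now {value:.2f}, "
              f"projected {projected:.2f} within {LOOKAHEAD_SECONDS}s -- check {engine}")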
It will be between 5 and 10 years before the researchers' methods are ready for practical use, partly because of the complexity of the work, its critical nature, and certification issues, said Besnard.
Besnard's research colleagues were David Greathead of the University of Newcastle upon Tyne and Gordon Baxter of the University of York in England. The work appeared in the January 2004 issue of the International Journal of Human-Computer Studies. The research was funded by the UK Engineering and Physical Sciences Research Council (EPSRC).
Timeline: 5-10 years
Funding: Government
TRN Categories: Human-Computer Interaction
Story Type: News
Related Elements: Technical paper, "When Mental Models Go Wrong: Co-Occurrences in Dynamic, Critical Systems," International Journal of Human-Computer Studies, January 2004