Software fuse shorts bugs
By
Kimberly Patch,
Technology Research News
Software failure is a huge problem. Software
bugs cost the U.S. economy close to $60 billion a year, according to the
National Institute of Standards and Technology.
Many of the problems are caused by conditions that software designers
didn't anticipate. "Most failures occur when we take software outside
its comfort zone" of foreseeable conditions, said George Candea, a researcher
at Stanford University.
Historically, researchers and developers have tried to cut down
errors by widening the comfort zone and doing extensive testing. This
doesn't always work, however, because there are always untested scenarios,
and in some of these the software invariably fails.
Candea has proposed an alternative approach: make sure that the
software indicates that something is wrong rather than returning an answer
that doesn't make sense. The method calls for adding two pieces of software
to any given program: a fuse to protect against unexpected input and an
output guard to protect against unexpected output.
The approach serves to "coerce the reality surrounding the software
to conform to what designers expected," said Candea. It is part
of a broad research effort aimed at making software dependable and eventually
self-healing.
The method has the potential to be much cheaper than fixing software
bugs and can be implemented by entities other than the original software
vendor, said Candea.
The software fuse ensures that everything that the given program
encounters conforms exactly to what the product is designed to anticipate
and can handle without failing. It is akin to the fuse on an electric
circuit, which disables the circuit when the circuit is exposed to current
that exceeds its capacity.
This covers three types of unexpected input: input of an unexpected
size, input of unexpected content, and input that arrives at an unexpected
rate.
The majority of the CERT Coordination Center's security and availability
compromise alerts in the past few years have been due to buffer overflows,
according to Candea. The recent SQL Slammer worm, for example, used input
longer than expected to overwrite the key portion of a program with attacker
code in order to gain control of the computer and use that control to
spread to other computers.
Denial-of-service vulnerabilities related to HTML parsing that
have dogged the Apache Web server and the Squid proxy cache fall under
the second category, said Candea.
And Internet service failures, including CNN.com's failure on
the morning of September 11, 2001, when its traffic more than doubled within
fifteen minutes, fall under the third category. The CNN failure was not
due to the amount of traffic, which the site was prepared to handle. The
failure happened because the system was not able to adapt to the drastic
change in traffic quickly enough.
Instead of attempting to anticipate these problems and testing
for them, Candea's approach would prevent bug-triggering inputs from entering
or propagating through the system.
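To make that concrete, a fuse for the three categories above can be pictured as a thin wrapper that inspects every request before it reaches the protected program. The Python sketch below is illustrative only; the class name, the limits, and the wrapped handler are assumptions for the sake of the example, not Candea's implementation.

```python
import time

class InputFuseError(Exception):
    """Raised instead of letting out-of-spec input reach the program."""

class SoftwareFuse:
    """Screens input for unexpected size, content, and arrival rate."""

    def __init__(self, handler, max_size=1024, allowed_chars=None, max_per_second=100):
        self.handler = handler                # the protected program, a black box
        self.max_size = max_size              # largest input it was designed for
        self.allowed_chars = allowed_chars    # characters its parser expects
        self.max_per_second = max_per_second  # fastest rate it can absorb
        self._window_start = time.time()
        self._seen_this_second = 0

    def feed(self, data: str):
        # 1. Unexpected size: the kind of overlong input behind buffer overflows.
        if len(data) > self.max_size:
            raise InputFuseError("input larger than the program anticipates")

        # 2. Unexpected content: characters the program was never meant to parse.
        if self.allowed_chars is not None and any(c not in self.allowed_chars for c in data):
            raise InputFuseError("input contains unexpected content")

        # 3. Unexpected rate: requests arriving faster than the program can adapt to.
        now = time.time()
        if now - self._window_start >= 1.0:
            self._window_start, self._seen_this_second = now, 0
        self._seen_this_second += 1
        if self._seen_this_second > self.max_per_second:
            raise InputFuseError("input arriving at an unexpected rate")

        # Input conforms to expectations, so it may reach the program.
        return self.handler(data)
```

Like an electrical fuse, the wrapper trips with an explicit error rather than letting out-of-spec input through.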
On the other end, the output guard ensures that the software's
output conforms to what it is supposed to be returning to the user or
a downstream software module. "If the product fails and produces wrong
output, the guard will prevent it from lying," said Candea.
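An output guard can be pictured the same way: a wrapper between the program and whatever consumes its results, refusing to forward an answer that fails a sanity check. Again, the sketch below is a hypothetical illustration; the validate predicate stands in for whatever notion of correct output a designer can specify.

```python
class OutputGuardError(Exception):
    """Raised when the program produces output it should not be returning."""

class OutputGuard:
    """Refuses to pass on output that does not conform to expectations."""

    def __init__(self, handler, validate):
        self.handler = handler    # the protected program, treated as a black box
        self.validate = validate  # predicate capturing "output conforms to spec"

    def call(self, data):
        result = self.handler(data)
        # Signal failure explicitly rather than letting a wrong answer
        # propagate to the user or to a downstream module.
        if not self.validate(result):
            raise OutputGuardError("program produced out-of-spec output")
        return result
```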
The approach is as simple as saying that if your program fails
when you give it certain inputs or constrain certain resources, then don't,
and it likely will not fail, said Candea. "Instead of fixing the product
that fails when given wrong inputs, fix the inputs," he said. "It's kind
of like when you go to the doctor and say your arm hurts when you raise
it, and he says, 'Well, then, don't raise it.'"
And just as most doctors will not suggest that you simply stop raising
your arm, traditional dependability researchers shy away from suggesting
that the reality surrounding a program be constrained, said Candea.
The approach is pragmatic, however, said Candea. Using the approach
in combination with focusing on software quality should yield better results
than either approach in isolation, he said.
The fuse and guard software provide guarantees that can be verified
formally, said Candea. "For example, if the software fuse ensures that
my program never receives strings longer than expected, then the software
product becomes immune to buffer overflow attacks," he said. "Similarly,
if the output guard ensures that the product never generates more than
20 requests per second, then I know for sure that whatever I hook up to
the product will never receive more than 20 requests per second."
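The second of those guarantees, a hard cap on the rate of outgoing requests, could be enforced by a guard that simply refuses to exceed the budget. The sketch below is an assumed illustration of that idea; the 20-requests-per-second figure comes from the example above, while the blocking strategy is an arbitrary choice.

```python
import time

class RateCapGuard:
    """Ensures a downstream module never sees more than a fixed number of
    requests per second, one of the formally checkable guarantees described
    above. A hypothetical sketch, not Candea's implementation."""

    def __init__(self, downstream, max_per_second=20):
        self.downstream = downstream
        self.max_per_second = max_per_second
        self._window_start = time.time()
        self._sent_this_second = 0

    def send(self, request):
        now = time.time()
        if now - self._window_start >= 1.0:
            self._window_start, self._sent_this_second = now, 0
        if self._sent_this_second >= self.max_per_second:
            # Wait out the rest of the current one-second window rather
            # than exceed the cap the downstream module relies on.
            time.sleep(1.0 - (now - self._window_start))
            self._window_start, self._sent_this_second = time.time(), 0
        self._sent_this_second += 1
        return self.downstream(request)
```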
The difficult part of developing fuses and output guards for a
given piece of software is capturing the notions of correct
input and output, said Candea.
The method treats software modules as black boxes whose inner
workings are unknowable. This is particularly useful in environments where
older software that can't be easily fixed needs to be integrated with
newer software, according to Candea.
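Because both wrappers see only a module's inputs and outputs, they can in principle be layered around legacy code without touching its source. The snippet below shows the idea, reusing the hypothetical SoftwareFuse and OutputGuard classes sketched earlier; the legacy parser is a stand-in.

```python
# A stand-in for existing, field-tested code that cannot easily be changed.
def legacy_parse(request: str) -> dict:
    return {"request": request}

# Layer the guard and fuse around the black box; no source changes needed.
guarded = OutputGuard(legacy_parse, validate=lambda result: isinstance(result, dict))
fused = SoftwareFuse(guarded.call, max_size=512, max_per_second=100)

# Callers now get either a conforming answer or an explicit error,
# never a plausible-looking wrong one.
print(fused.feed("GET /index.html"))
```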
The approach requires that software predictability be measured
so that reasonable decisions can be made about trade-offs between factors
like predictability, performance and cost.
It is potentially much cheaper than fixing bugs and rewriting
software, however, said Candea. Bug fixes often introduce more bugs, and
rewritten code also suffers because it hasn't been tested and debugged
as thoroughly as code that has been deployed in the field for years, he
said.
The method could be implemented in practical applications in three
to six years, according to Candea.
The research was funded by Stanford University.
Timeline: 3-6 years
Funding: University
TRN Categories: Software Design and Engineering
Story Type: News
Related Elements: Technical paper, "Predictable Software
-- a Shortcut To Dependable Computing?" posted on the Computing Research
Repository (CoRR) at http://arxiv.org/abs/cs.OS/0403013