Design handles iffy nanocircuits
By Kimberly Patch, Technology Research News
Scientists are getting better at growing
molecular-scale nanotubes and nanowires, which is paving the way for packing
trillions of ultrasmall circuits on computer chips.
These tiny circuits pose challenges that don't show up at larger
scales, however. One of the biggest has to do with the number of defects
in a device.
Nanoscale circuits are more sensitive to temperature changes,
cosmic rays and electromagnetic interference than today's circuits. "For
such a densely-integrated circuit to perform a useful computation, it
has to deal with the inaccuracies and instabilities introduced by fabrication
processes and the tiny devices themselves," said Jie Han, a research assistant
at Delft University of Technology in the Netherlands.
If it is difficult to make a defect-free device that has several
million electrical circuits, it is orders of magnitude more difficult
to manufacture devices with trillions of circuits. Today's Pentium 4 computer
chips contain about 42 million transistors.
Rather than having to discard rising numbers of devices that don't
make the grade, researchers are exploring ways to build defect tolerance
into electronics so the hardware will work even when it contains a sizable
percentage of faulty circuits. "Future nanoelectronics architectures will
have to be able to tolerate an extremely large number of defects and faults,"
said Han.
Han and a Delft University colleague have come up with an architecture
that combines circuits that have redundant logic gates and interconnections
with the ability to reconfigure structures at several levels on a chip.
The combination of redundancy and reconfigurability allows the system
to identify and discard faulty signals caused by minor errors, and to
reconfigure around more severe problems.
Redundant circuits allow one or a few circuits to produce errors
without the logic gate as a whole giving the wrong result. "If modest
errors occur, some components are carrying on faulty signals while others
in the majority are still working properly," said Han.
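To make the voting idea concrete, the following is a minimal Python sketch, not the researchers' actual design, of a gate whose redundant copies are combined by majority vote; the copy count and fault rate are illustrative:

```python
# A minimal sketch, not the researchers' design: a logic gate built from
# redundant copies whose outputs are combined by majority vote, so one
# or a few faulty copies cannot flip the overall result.
import random

def redundant_gate(gate, inputs, copies=5, fault_rate=0.002):
    """Run `copies` instances of `gate`; each instance suffers a
    transient fault (an output flip) with probability `fault_rate`.
    The majority of the copies decides the overall output."""
    votes = 0
    for _ in range(copies):
        out = gate(*inputs)
        if random.random() < fault_rate:  # this copy's output is flipped
            out = not out
        votes += out                      # True counts as 1, False as 0
    return votes > copies // 2            # majority wins

# A redundant NAND gate, the building block von Neumann analyzed.
nand = lambda a, b: not (a and b)
trials = 100_000
wrong = sum(redundant_gate(nand, (True, True)) != nand(True, True)
            for _ in range(trials))
print(f"wrong outputs: {wrong} of {trials}")  # far rarer than the 0.2% per-copy rate
```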
The chip's reconfigurability compensates for errors that have
a larger effect, said Han. "If the effect of errors is tremendous, some
computing modules will fail," he said. In this case, the hierarchical
reconfigurability compensates for the loss.
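A rough sketch of the reconfiguration step, assuming a hypothetical post-manufacturing self-test that flags bad modules; the remapping scheme below is illustrative rather than the paper's algorithm:

```python
# A rough sketch, assuming a hypothetical self-test that marks each
# physical module good or bad; the remapping is illustrative, not the
# paper's algorithm.

def reconfigure(defect_map, logical_needed):
    """defect_map[i] is True if physical module i failed its self-test.
    Returns a logical -> physical mapping using only good modules,
    or None if too many modules are defective."""
    good = [i for i, bad in enumerate(defect_map) if not bad]
    if len(good) < logical_needed:
        return None  # not enough working modules left to reconfigure
    return {logical: good[logical] for logical in range(logical_needed)}

# Eight physical modules with modules 2 and 5 defective; six are needed.
print(reconfigure([False, False, True, False, False, True, False, False], 6))
# -> {0: 0, 1: 1, 2: 3, 3: 4, 4: 6, 5: 7}
```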
The researchers' simulations show that the architecture is capable
of tolerating a relatively large number of faults, said Han. It can
handle a failure rate of 0.2 percent, which is much higher than
today's chip designs can tolerate, he said. A trillion-circuit chip based
on the architecture should work reliably as long as at least 100 billion
of its circuits are functioning, he said.
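For scale, simple arithmetic on those figures (a back-of-the-envelope calculation, not the paper's model) puts the number of faulty circuits a trillion-circuit chip could carry at about two billion:

```python
# Back-of-the-envelope arithmetic on the article's figures.
total_circuits = 10**12   # a trillion-circuit chip
failure_rate = 0.002      # the 0.2 percent rate the architecture tolerates
print(f"faulty circuits tolerated: {int(total_circuits * failure_rate):,}")
# -> faulty circuits tolerated: 2,000,000,000
```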
The architecture has potential for correcting both permanent defects
in a chip and transient faults, which may be common in systems based
on nanometer-scale devices, Han said.
The idea of redundant logic circuits goes back to the father of
modern computing, John von Neumann, but relying only on redundancy to
catch errors requires an impractical number of circuits. Adding reconfigurability
to the architecture cut down considerably on the amount of redundancy
needed, said Han.
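A rough binomial calculation, not von Neumann's original multiplexing analysis, suggests why: the number of voting copies each gate needs climbs quickly once all of a trillion-gate chip's gates must be simultaneously reliable:

```python
# My own binomial calculation, not von Neumann's multiplexing analysis:
# how many voting copies a gate needs before every gate on a
# trillion-gate chip is likely to behave at once.
from math import comb

def majority_error(p, n):
    """Probability that more than half of n copies err (binomial tail)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(n // 2 + 1, n + 1))

p = 0.002  # per-copy error rate, matching the article's 0.2 percent
for n in (3, 5, 9, 15, 21):
    e = majority_error(p, n)
    bound = min(1.0, e * 10**12)  # union bound over a trillion gates
    print(f"{n:2d} copies -> gate error {e:.1e}, "
          f"P(some gate fails) <= {bound:.2e}")
# Even at this modest error rate, roughly 15-fold redundancy per gate is
# needed before a trillion gates are all likely to work, which is the
# kind of overhead that adding reconfiguration helps avoid.
```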
Defect and fault tolerance is clearly a very important problem
as devices get smaller, said Seth Goldstein, an associate professor of
computer science at Carnegie Mellon University.
The reconfigurable nature of the researchers' approach is "absolutely
essential" to solving this problem, said Goldstein. "If you can reconfigure
around defects then you can have less waste -- you can figure out where
the defects are and... route around them," he said.
The researchers are heading in the right general direction, and
this is important for the field, but they may not have gone far enough
in making the devices reconfigurable, said Goldstein. "They have some
reconfigurable resources but their basic plan is to... present a defect-free
circuit after they've done one step of post-manufacturing reconfiguration,"
said Goldstein.
It remains to be seen how feasible this approach is, versus an
even more flexible approach that allows for more reconfiguration, according
to Goldstein. The approach "seems to be fairly brute-force," he said.
"If you look at the overhead that they have to pay for the end result
it seems like it's not much better."
The researchers are working on further simulations in order to
make the architecture more efficient, said Han. "The goal is to achieve
maximum fault-tolerance by using minimum component redundancy," he said.
The architecture could be used for computers made with any type
of circuit, including fantastically small devices made from molecules,
nanowires or carbon nanotubes, and single-electron tunneling circuits,
said Han. Nanowires and carbon nanotubes can be as small as one nanometer
in diameter, the size of a row of 10 hydrogen atoms. A nanometer
is one millionth of a millimeter.
The architecture could be ready to be applied to practical devices
in less than two years, according to Han.
Han's research colleague was Pieter Jonker. The work appeared
in the January 16, 2003 issue of Nanotechnology. The research was funded
by Delft University of Technology.
Timeline: < 2 years
Funding: University
TRN Categories: Nanotechnology; Integrated Circuits; Architecture
Story Type: News
Related Elements: Technical paper, "A Defect- and Fault-tolerant
Architecture for Nanocomputers," Nanotechnology, January 16, 2003.