Study reveals Net's parts
By Kimberly Patch, Technology Research News
July 2/9, 2003
As the Internet grows, it is becoming increasingly
important for software makers and information managers to adapt to the
network's basic patterns rather than its current configuration. Researchers
studying the workings of the Internet have found several of its structural
secrets, but Internet simulations still differ from the real thing.
Researchers from the Nordic Institute for Theoretical Physics
and the Niels Bohr Institute in Denmark, Brookhaven National Laboratory,
and the Norwegian University of Science and Technology have uncovered
another fundamental Internet attribute: it has an underlying modular
structure regulated by the number of sites, or nodes, that link to a given
node.
The researchers used a method similar to the algorithm underlying
the search engine Google to measure Internet modularity. But rather than
the usual method of measuring connected nodes, the researchers focused
on links between nodes, mapping out a picture of links linking to links.
They found that the Internet has about 100 modules that correspond
roughly to countries, and the farthest points from each other are Russia
and U.S. military sites, according to Kasper Astrup Eriksen, who carried
out the research at the Nordic Institute for Theoretical Physics and is
now a researcher at Lund University in Sweden.
The work promises to improve the accuracy of Internet simulators,
and could help strengthen the Net by pointing out where to reinforce links
between weakly connected modules.
Past research has shown that the Internet is a scale-free network,
meaning it has a few well-connected nodes and many nodes with only a few
links. "For the Internet the rule is approximately... for every 1,024
nodes with one link, there are 256 nodes with two links, 64 nodes with
four links, 16 nodes with eight links, four nodes with 16 links, etcetera,"
said Eriksen.
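To make the quoted rule concrete: those counts correspond to a distribution in which the number of nodes with a given number of links falls off as the square of that number, so each doubling of the link count divides the number of such nodes by four. The short Python sketch below is illustrative only, not the researchers' code; the starting figure of 1,024 single-link nodes is taken from the quote.

    def node_count(k, n1=1024):
        """Hypothetical count of nodes with k links, assuming the count
        falls off as 1/k**2 from n1 nodes with a single link."""
        return n1 / k**2

    for k in (1, 2, 4, 8, 16):
        print(f"degree {k}: about {node_count(k):.0f} nodes")
    # degree 1: about 1024 nodes
    # degree 2: about 256 nodes
    # degree 4: about 64 nodes
    # degree 8: about 16 nodes
    # degree 16: about 4 nodes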
In general, scale-free networks exhibit preferential attachment,
meaning the more links a node already has, the more rapidly it collects
additional links. If a node has two existing links, for example, it is
twice as likely to be linked to again as a node with only one existing
link.
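The following is a minimal sketch of preferential attachment in general, the standard growth picture rather than anything specific to this study; the function name and toy graph are hypothetical. Each arriving node attaches to an existing node chosen with probability proportional to its current number of links.

    import random

    def grow_network(n_nodes, seed=0):
        """Toy preferential-attachment growth: each new node attaches to one
        existing node drawn with probability proportional to its degree."""
        rng = random.Random(seed)
        edges = [(0, 1)]      # start from a single link between nodes 0 and 1
        # One list entry per link end: nodes with more links appear more often
        # and so are picked more often.
        link_ends = [0, 1]
        for new_node in range(2, n_nodes):
            target = rng.choice(link_ends)
            edges.append((new_node, target))
            link_ends.extend([new_node, target])
        return edges

    print(grow_network(10))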
The Nordic/Brookhaven/Niels Bohr team looked at the structure
a little differently, focusing on links between nodes rather than the
nodes themselves. Connections can be thought of as being between links
rather than nodes, so that a connection to a node with a lot of links
is actually a connection to the ends of many links, said Eriksen.
Looking at the structure this way, and keeping in mind that every
link end has the same probability of being picked, the links at highly
connected nodes "often get a free ride when a new link is connected to one
of the other link ends at the same node," he said.
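The equivalence Eriksen describes can be checked with a small experiment on a hypothetical toy graph rather than Internet data: when every link end is equally likely to be picked, a node is picked in proportion to how many link ends it owns, so a hub's links ride along on its many ends.

    import random
    from collections import Counter

    # Hypothetical toy graph: a hub with three links to leaves a, b and c.
    edges = [("hub", "a"), ("hub", "b"), ("hub", "c")]
    link_ends = [end for edge in edges for end in edge]   # two ends per link

    rng = random.Random(1)
    draws = Counter(rng.choice(link_ends) for _ in range(60000))
    print(draws)
    # The hub turns up roughly three times as often as any single leaf because
    # it owns three of the six link ends: picking link ends uniformly is the
    # same as picking nodes in proportion to their number of links.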
The researchers used a variation on the random walker diffusion
method to detect the modularity of the network from the connected links
point of view.
Picture a person exploring a network by walking along its links,
said Eriksen. "Whenever the walker comes to a node, he picks at random
one of the link ends emanating from that node," he said. If many walkers
are placed on a network and each makes its decisions independently of the
others, they will eventually reach equilibrium: if there are twice as
many walkers as links, each link will, at any given moment, carry an average
of two walkers traveling in opposite directions.
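The walker picture can be simulated in a few lines. The sketch below uses a hypothetical four-node graph rather than Internet data; at equilibrium each node holds walkers in proportion to its number of link ends, which is what makes the two-walkers-per-link average come out.

    import random
    from collections import Counter

    def walker_occupancy(adjacency, n_walkers, n_steps, seed=0):
        """Independent walkers repeatedly step to a randomly chosen neighbor;
        their long-run distribution over nodes is proportional to each node's
        number of link ends (its degree)."""
        rng = random.Random(seed)
        nodes = list(adjacency)
        positions = [rng.choice(nodes) for _ in range(n_walkers)]
        for _ in range(n_steps):
            positions = [rng.choice(adjacency[p]) for p in positions]
        return Counter(positions)

    # Hypothetical toy graph: a triangle a-b-c plus a pendant node d on a.
    adjacency = {"a": ["b", "c", "d"], "b": ["a", "c"], "c": ["a", "b"], "d": ["a"]}
    counts = walker_occupancy(adjacency, n_walkers=8000, n_steps=100)
    print(counts)  # roughly 3000 at a, 2000 each at b and c, 1000 at d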
The key to uncovering structural traits of the Internet is studying
how this ensemble of walkers slowly reaches equilibrium, said Eriksen.
For example, picture a network shaped like North and South America,
with very few links, or roads, connecting the two parts of the network,
or landmasses. A walker starting in North America and turning left and
right at random is not very likely to find the road going to South America,
said Eriksen.
"What we observed is that first the walkers within North and South
America individually come to an equilibrium... and then later on the number
of walkers within each country [reaches] its long-term mean," said Eriksen.
Because the walkers reach equilibrium within a single area, or
module, first, the method can be used to detect existing modules of the
Internet and to assess the degree of isolation of an individual module,
according to Eriksen.
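In matrix terms, this slow approach to equilibrium shows up in the eigenvectors of the random-walk transition matrix whose eigenvalues lie just below one. The sketch below is a minimal dense-matrix illustration of that idea on a hypothetical six-node network, not the researchers' implementation: two tightly knit groups joined by a single link are separated by the sign of the slowest-decaying mode.

    import numpy as np

    def slow_diffusion_modes(adjacency_matrix, n_modes=1):
        """Return the slowest-decaying diffusion modes of the random walk:
        eigenvectors of the walk matrix with eigenvalues just below 1.
        Walkers equilibrate inside a module long before they equilibrate
        between modules, so these modes split the network into modules."""
        A = np.asarray(adjacency_matrix, dtype=float)
        T = A / A.sum(axis=1, keepdims=True)    # row-stochastic walk matrix
        eigvals, eigvecs = np.linalg.eig(T.T)   # walker densities evolve with T.T
        order = np.argsort(-eigvals.real)       # eigenvalue 1 first, slow modes next
        keep = order[1:1 + n_modes]
        return eigvals.real[keep], eigvecs.real[:, keep]

    # Hypothetical network: two three-node cliques joined by one weak link (2-3).
    A = np.array([[0, 1, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0, 0],
                  [1, 1, 0, 1, 0, 0],
                  [0, 0, 1, 0, 1, 1],
                  [0, 0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 1, 0]])
    vals, vecs = slow_diffusion_modes(A)
    print(vals)                  # close to 1: the mode that decays most slowly
    print(np.sign(vecs[:, 0]))   # opposite signs on nodes 0-2 and nodes 3-5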
According to the researchers' simulations, the underlying modular
structure of the Internet roughly corresponds to individual countries.
"We found that the Internet indeed is modular and we identify the large
part of this modularity history in the political and geographical divisions
in the real world," said Eriksen. The last place the walkers reached equilibrium,
for instance, was between Russia and U.S. military sites, he said. "These
are thus the... two parts of the network that are most separated from
each other."
To carry out the study, the researchers had to adjust some existing
algorithms to develop a practical way to run simulations of many walkers,
Eriksen said. "Just... running the simulation of random walkers is not
the fastest way to calculate the diffusion modes and identify the modules
[and is not] feasible for huge networks," said Eriksen.
They found a way to pose the problem so that the time the algorithm
took to run roughly doubled, rather than increasing exponentially, each
time the network doubled in size, he said.
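The article does not spell out the reformulation, but one standard way to get this kind of scaling is to keep the walk matrix sparse and use an iterative eigensolver that touches each link only a bounded number of times per iteration. The sketch below shows one such approach using SciPy's sparse eigensolver; it is a hedged stand-in, not necessarily the team's method.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def slowest_modes_sparse(adj, n_modes=3):
        """Keep the random-walk matrix sparse and ask an iterative eigensolver
        for only the few largest-magnitude eigenvalues; each iteration costs
        time roughly proportional to the number of links."""
        adj = sp.csr_matrix(adj, dtype=float)
        inv_degree = sp.diags(1.0 / np.asarray(adj.sum(axis=1)).ravel())
        T = inv_degree @ adj                    # sparse row-stochastic walk matrix
        eigvals, eigvecs = spla.eigs(T.T, k=n_modes + 1, which="LM")
        order = np.argsort(-eigvals.real)
        return eigvals.real[order[1:]], eigvecs.real[:, order[1:]]

On a network with millions of links, extracting only a handful of modes this way keeps memory and runtime roughly proportional to the number of links, which is broadly consistent with the scaling Eriksen describes.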
The visual results of the simulations were star-like shapes. Straight
lines radiating from the center indicated independent modules.
Traditionally, there are two strategies for determining the modularity
of the Internet: bottom-up or top-down, said Eriksen. The first approach
groups the most similar nodes into a module. The researchers' work falls
into the second, which subdivides the network into modules.
Visualizing modularity is a step toward making a coarse-grained
description of the Internet that can be used to better understand its
architecture and how and where to improve its connectivity, said Eriksen.
The researchers' method could help make Internet topology generators,
or simulators, more accurate, according to Eriksen. "We have devised methods
to see if the artificial network generators are capable of this," he
added.
As the Internet grows and changes, it is increasingly important
to take its basic patterns into consideration, said Eriksen. "When you
devise [software like] routing rules for the Internet, you not only want
[it] to do well on today's Internet, but also on the Internet of tomorrow,
which at the current speed of development might be quite different," he
said. "It is... not good if your algorithms only function efficiently
due to... special linkage patterns present today but not tomorrow."
Eriksen's research colleagues were Ingve Simonsen of the Nordic
Institute for Theoretical Physics (NORDITA), Sergei Maslov of the Brookhaven
National Laboratory and Kim Sneppen, now at NORDITA. The work appeared
in the April 11, 2003 issue of Physical Review Letters. The research
was funded by NORDITA, Brookhaven National Laboratory and the Norwegian
University of Science and Technology.
Timeline: Unknown
Funding: Government
TRN Categories: Internet; Physics
Story Type: News
Related Elements: Technical paper, "Modularity and Extreme
Edges of the Internet," posted in the Physics Archive at arxiv.org/abs/cond-mat/0212001.