Network similarities run deep

By Kimberly Patch, Technology Research News

Networks exist everywhere -- in living systems and social groups as well as among computers. What's more, there are deep similarities among networks as diverse as the World Wide Web and the chemical goings-on of a bacterium.

A pair of scientists working to map out what networks have in common have come up with a theory that uses the tools of calculus to uncover the ways that networks develop. The work could lead to both better search algorithms for the Web and better means of disrupting the chemical networks of cells for medicinal purposes.

The work builds on previous research that found that the links in networks are not random. Whether the network is of chemicals in a cell, interactions among people, or Web pages, once the network grows to a certain size, it develops hubs where links congregate. These larger hubs exist in all networks the researchers have examined, said Albert-László Barabási, a physics professor at the University of Notre Dame.

"The World Wide Web, the actor network, the science citation network and many other networks in nature follow the same pattern... show some kind of self organizing process ... that is different from simple random networks," said Barabási.

In the actor network, each actor is a node, and actors are linked to each other when they play together in a movie. The similar science citation network maps out scientists who have worked together. These correspond to Web pages connected by hyperlinks and, in natural systems, chemicals connected by reactions.

In the actor and scientist networks, certain people emerge who are linked to many more people than average. "If [an actor] is very popular, they will acquire links at an even higher rate because every casting director would like to have them in the movie. Some of these processes can contribute to make the hubs even larger," said Barabási.
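The growth rule Barabási describes -- newcomers preferring already-popular nodes -- can be sketched in a few lines of Python. This is an illustrative reconstruction of the general "rich get richer" mechanism, not the researchers' code; the network size, seed, and function names are arbitrary choices.

```python
import random

def grow_network(n=2000, m=2, seed=42):
    """Grow a network one node at a time; each newcomer links to m
    existing nodes chosen with probability proportional to degree."""
    rng = random.Random(seed)
    degree = {i: 0 for i in range(m + 1)}
    targets = []                      # each node appears once per link it holds,
    for i in range(m + 1):            # so a uniform pick is degree-biased
        for j in range(i):            # start from a small fully connected core
            degree[i] += 1
            degree[j] += 1
            targets += [i, j]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:        # sample m distinct, degree-biased targets
            chosen.add(rng.choice(targets))
        degree[new] = 0
        for t in chosen:
            degree[new] += 1
            degree[t] += 1
            targets += [new, t]
    return degree

deg = grow_network()
hub = max(deg.values())               # the biggest hub
avg = sum(deg.values()) / len(deg)    # a typical node
```

Running this, the largest hub ends up holding many times the average number of links, even though every node played by the same rules -- the self-organizing pattern the researchers see in real networks.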

The Web has larger hubs as well. In living systems, certain chemicals, like water and ATP, play a role in large numbers of reactions, while a very large number of chemicals have just one or two roles. A simple bacterium, for example, uses between 1000 and 2000 chemicals, Barabási said.

This pattern is a continuum, with a few extremely large hubs stepping down rapidly into a very long tail of nodes that each have fewer and fewer links. This pattern corresponds to the continuum model, which allows the researchers to "mathematically describe how these networks are growing... and to predict the characteristics of the network," said Barabási.
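In outline, the continuum calculation treats a node's number of links as a smooth function of time: a node arriving at time $t_i$ gains links at a rate proportional to how many it already has. A compressed sketch of the standard derivation:

```latex
\frac{\partial k_i}{\partial t} = m\,\frac{k_i}{\sum_j k_j} = \frac{k_i}{2t},
\qquad\Rightarrow\qquad
k_i(t) = m\left(\frac{t}{t_i}\right)^{1/2},
```

where $m$ is the number of links each new node brings and $\sum_j k_j = 2mt$ is the total number of link ends at time $t$. Because early nodes (small $t_i$) grow fastest, the distribution of links across nodes works out to a power law, $P(k) \sim 2m^2 k^{-3}$: a few huge hubs and a long tail of sparsely linked nodes, exactly the continuum the article describes.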

The Notre Dame researchers' work uses one of two ways to describe such systems, said Jon Kleinberg, assistant professor of computer science at Cornell. A discrete model treats nodes and links as distinct objects; a continuous model treats them as approximations. Each model has its pluses and minuses.

"Fundamentally the network is a discrete object -- it has an actual number of nodes, so it's not really infinitely divisible, but ... once you approximate things ... as continuous you have at your disposal the tools of differential equations -- calculus. And so the question is going to be is your system big enough that you can really model it continuously," said Kleinberg.

If the models were describing how water boils rather than how networks are constructed, a discrete model would have the impossible task of tracking the trajectory of each molecule as the water heated up, while the continuum theory would use aggregate properties of the system to derive statistically that the water will boil at a certain temperature after a certain amount of heat is added, said Kleinberg.

The continuous approach is well-suited to describe a pot of boiling water because it contains about 10 million trillion molecules. With that many objects "you don't lose much by approximating," said Kleinberg.

At this point the Web is too big to model discretely, but still much smaller than a pot of boiling water. "We're at a point where it's interesting to look at both approaches... and it's reassuring that the two approaches are getting similar answers," said Kleinberg.

The Notre Dame researchers have used the continuum theory to look at several traits of networks to see whether the way networks develop affects this basic model. "Now that we understand that [networks] are vaguely similar to each other, they follow the same mathematical principles, [we're trying] to understand the details of this process," Barabási said.

They have found that the basic structure of networks makes them fairly impervious to random errors, but if several of the largest hubs are simultaneously attacked, the whole network is at risk. "If you go for the big nodes, not randomly, but you take simultaneously down, say, one percent of the biggest nodes in the system, then the system will break apart into pieces," said Barabási.
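The contrast between random failure and a targeted attack on the hubs is easy to demonstrate in simulation. The sketch below (an illustration under assumed parameters, not the researchers' experiment) grows a scale-free network, then compares the largest surviving connected component after removing one percent of nodes at random versus removing the one percent with the most links.

```python
import random
from collections import defaultdict

def scale_free_edges(n=1000, m=2, seed=1):
    """Grow a scale-free network by preferential attachment; return its edges."""
    rng = random.Random(seed)
    edges = [(0, 1), (0, 2), (1, 2)]           # small seed triangle
    targets = [0, 1, 2, 0, 1, 2]               # node multiplicity = degree
    for new in range(3, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets += [new, t]
    return edges

def giant_component(edges, removed):
    """Size of the largest connected component after removing some nodes."""
    adj = defaultdict(list)
    for a, b in edges:
        if a not in removed and b not in removed:
            adj[a].append(b)
            adj[b].append(a)
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            size += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        best = max(best, size)
    return best

edges = scale_free_edges()
degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1
nodes_by_degree = sorted(degree, key=degree.get, reverse=True)

k = 10                                          # one percent of 1000 nodes
rng = random.Random(7)
random_hit = giant_component(edges, set(rng.sample(nodes_by_degree, k)))
targeted_hit = giant_component(edges, set(nodes_by_degree[:k]))
```

Random removal barely dents the giant component, while taking out the same number of top hubs fragments the network far more -- the robust-yet-fragile behavior Barabási describes.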

Networks have several other traits in common, he said. Networks are often growing systems. "You're always adding new Web pages, new people to society," he said. In addition, when a new link is added, the most connected nodes are generally preferred.

In a biological system, however, "we don't know why these parallels appear because they're very tricky ... there is evolution which is supposed to optimize the system [but] it's not clear" how such a complicated process affects the network, Barabási added.

It is also apparent that networks share a small diameter, said Kleinberg. "You can get from one point to any other point with very few hops -- that's also coming out as we look at this."
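The few-hops property is straightforward to check with a breadth-first search. The sketch below (again an illustration with assumed parameters) grows a thousand-node scale-free network and measures hop distances from one node to every other.

```python
import random
from collections import deque, defaultdict

def scale_free_adj(n=1000, m=2, seed=3):
    """Grow a scale-free network by preferential attachment; return adjacency sets."""
    rng = random.Random(seed)
    adj = defaultdict(set)
    targets = [0, 1, 2, 0, 1, 2]                # node multiplicity = degree
    for i, j in [(0, 1), (0, 2), (1, 2)]:       # small seed triangle
        adj[i].add(j)
        adj[j].add(i)
    for new in range(3, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            adj[new].add(t)
            adj[t].add(new)
            targets += [new, t]
    return adj

def hops(adj, start):
    """Breadth-first search: hop distance from start to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

adj = scale_free_adj()
dist = hops(adj, 0)
avg_hops = sum(dist.values()) / len(dist)
max_hops = max(dist.values())
```

Even with a thousand nodes and only a couple of links per newcomer, every node sits just a handful of hops from every other, because the hubs act as shortcuts.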

There are also ways networks differ, Kleinberg said. For instance, while a social network is symmetrical, meaning relationship links are usually two-way, Web links are asymmetrical. "Lots of people link to Yahoo's homepage... but Yahoo doesn't reciprocate," he said.

The Notre Dame researchers' work is already starting to be applied to Web searching, Barabási said. "The Google search engine is using some of these principles already. The more we understand about the topology of the system, the better we can design tools to work with the systems, whether it's a search engine or it's a [chemical] that's trying to kill a cell," he said.

Barabási's research colleague is Réka Albert, also of the University of Notre Dame. They published a technical paper about the research in the December 11, 2000 issue of Physical Review Letters. The work was funded by the National Science Foundation (NSF).

Timeline:   Now
Funding:   Government
TRN Categories:   Networking; Internet
Story Type:   News
Related Elements:  Technical paper, "Topology of Evolving Networks: Local Events and Universality," Physical Review Letters, December 11, 2000; Technical papers posted at; Related technical paper, "Network Robustness and Fragility: Percolation on Random Graphs," Physical Review Letters December 18, 2000


January 3, 2001

© Copyright Technology Research News, LLC 2000-2006. All rights reserved.