Remote monitoring aids data access
By Kimberly Patch, Technology Research News
One of the ongoing challenges facing scientists
and business people is how to access and visualize the vast amounts of
data modern technology allows us to collect.
Researchers from Sandia National Laboratories have found a way to work
with large amounts of data over networks in near real-time. The researchers'
prototype uses the Internet to give people access to very large sets of
data stored thousands of miles away and allows them to manipulate the
data with a lag time of less than one tenth of a second.
Remote access schemes tend to focus on moving data, said John Eldridge,
a principal member of technical staff at Sandia National Laboratories.
"Either they send the data set in its entirety or they transmit the image
geometry so that the remote computer can render and display the image
through its own video adapter," he said.
The Sandia method doesn't transfer data at all, but instead transfers
the video signal that normally carries image information from a computer
to its monitor. "The video card is designed to put out a video signal
to a local monitor... we extend the signal," said Eldridge.
The approach can speed remote access, said Eldridge. "Where large data
sets... are involved, it may be more efficient to move the video signal
rather than the data," he said.
The interactive remote-visualization hardware could allow doctors to remotely view
and manipulate very large images, such as magnetic resonance imaging (MRI)
files, according to Eldridge. It could also allow people who work in other
fields that involve very large amounts of data -- like geophysical
modeling and financial services -- to view and manipulate data remotely,
he said.
Lag time makes sharing and manipulating remote data difficult, said Eldridge.
"The simplest example of this is in trying to coordinate the movement
of the computer's mouse with a mouse pointer on the display," he said.
"As the delay between the action and the apparent response increases,
the interactivity and usability decreases; as the processing delay approaches
0.1 seconds, [it] noticeably affects a user's interactivity," he said.
To decrease lag time, the group took advantage of today's graphics cards
for video games, which render two-dimensional and three-dimensional images
very quickly. These images are typically fed to nearby monitors. The researchers
found a way to instead move the video signal across the Internet.
The researchers' encoder/decoder hardware attaches to a computer's video
adapter. It digitizes the video signal, compresses the digitized
data stream, and then formats the data stream into standard network protocol
packets of data, said Eldridge. The card then sends the packets to a Gigabit
Ethernet interface card, which transmits the packets across a network.
At the remote location, the researchers' hardware receives the packets,
rebuilds the data into a video stream, and translates the video signal
for a locally-attached video monitor, said Eldridge.
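The flow Eldridge describes can be pictured in software, though the Sandia system performs it in dedicated hardware attached to the video card. In the minimal sketch below, the TCP transport, zlib compression, and length-prefixed message format are illustrative assumptions, not details of the prototype.

```python
import socket
import zlib

# Minimal software sketch of the capture -> compress -> packetize -> rebuild
# flow. The Sandia prototype does this in dedicated encoder/decoder hardware;
# the use of TCP, zlib, and a 4-byte length prefix here are assumptions made
# only to illustrate the steps.

def _recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed mid-frame")
        buf += chunk
    return buf

def send_frame(conn: socket.socket, frame_bytes: bytes) -> None:
    """Compress a digitized video frame and send it as a length-prefixed message."""
    payload = zlib.compress(frame_bytes)
    conn.sendall(len(payload).to_bytes(4, "big") + payload)

def receive_frame(conn: socket.socket) -> bytes:
    """Reassemble and decompress a frame to drive a locally attached monitor."""
    length = int.from_bytes(_recv_exact(conn, 4), "big")
    return zlib.decompress(_recv_exact(conn, length))
```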
The main challenge to speeding things up was to perform the encoding and
decoding process in near real-time, Eldridge said.
To do this, the researchers changed the way the data is stored in memory
and the way the hardware performs frame differencing, said Eldridge. Because
each frame of a video looks a lot like the frame before, it saves a lot
of time to only transmit image changes. The system performs frame differencing
by looking at differences between successive video frames; it then transmits
only those differences across the network.
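As a rough illustration of the idea (the prototype performs this step in logic hardware; the tile granularity and update format below are assumptions), frame differencing amounts to comparing each region of the new frame against the previous one and sending only the regions that changed:

```python
# Illustrative frame differencing in software; the prototype does this in
# reprogrammable logic. The 1 KB tile granularity and (offset, bytes) update
# format are assumptions.

TILE = 1024  # bytes of the frame buffer compared at a time

def frame_diff(prev: bytes, curr: bytes):
    """Yield (offset, changed_bytes) for regions that differ between two
    successive frames; unchanged regions are never transmitted."""
    for offset in range(0, len(curr), TILE):
        tile = curr[offset:offset + TILE]
        if tile != prev[offset:offset + TILE]:
            yield offset, tile

def apply_diff(prev: bytes, updates) -> bytes:
    """On the receiving end, patch the previous frame with the transmitted
    changes to reconstruct the current frame."""
    frame = bytearray(prev)
    for offset, tile in updates:
        frame[offset:offset + len(tile)] = tile
    return bytes(frame)
```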
The time required to capture frames for the differencing process is responsible
for most of the response-time delay, said Eldridge. For video screens
that refresh at 60 hertz, or times per second, the encoder/decoder hardware
completes the frame differencing step in about 32 milliseconds, he said.
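A back-of-the-envelope check, assuming the roughly 32 milliseconds is dominated by scanning out the two successive frames being compared (an inference, not something the researchers stated):

```python
# Rough timing check under the assumption above.
refresh_hz = 60
frame_period_ms = 1000 / refresh_hz   # ~16.7 ms to scan out one frame
print(2 * frame_period_ms)            # ~33.3 ms for two frames, near the quoted 32 ms
```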
The researchers used reprogrammable logic chips to process and compress
the video image. "Since the processing is performed in logic hardware
it is very quick," said Eldridge. "The hardware extracts a great deal
of performance from each clock cycle," he added.
The signal can be compressed further by removing redundant timing information
and even reducing the frame update rate, if necessary, said Eldridge.
The raw information rate from a computer's video display is about 2.5
gigabits, or billion bits, per second for a screen resolution of 1280
by 1024 pixels and a 60 hertz refresh rate. The researchers' prototype achieved
network transfer rates of between 70 and 800 megabits per second, said
Eldridge.
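Those figures can be reproduced with simple arithmetic, assuming 32 bits per pixel on the video interface; the last lines show the reduction implied by going from the raw video rate down to the observed network rates:

```python
# Reproducing the quoted rates; the 32 bits per pixel is an assumption.
width, height, refresh_hz, bits_per_pixel = 1280, 1024, 60, 32

raw_bps = width * height * refresh_hz * bits_per_pixel
print(raw_bps / 1e9)                          # ~2.5 Gbit/s raw video signal

for network_mbps in (70, 800):                # prototype's observed range
    reduction = raw_bps / (network_mbps * 1e6)
    print(network_mbps, round(reduction))     # implied reduction of roughly 36x to 3x
```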
The idea of transferring just the video signal has the nice property that
the required bandwidth, though high, is limited and doesn't go up with
the complexity of what is being visualized, said Peter Schröder, a professor
of computer science and applied and computational mathematics at the California
Institute of Technology. "However, one would have to do a careful bandwidth
analysis to see where that cutoff point is," he said.
The technology's potential usefulness hinges on its cost, Schröder added.
"The commodity hardware business is very unforgiving of custom solutions
with only a few potential customers," he said.
The researchers are aiming to transfer the technology to a partner who
can commercialize it, said Eldridge. They are also aiming to make the
scheme work with multi-tiled displays.
The prototype could be packaged into a commercial product in 6 to 12 months,
said Eldridge.
Eldridge's research colleague was Lyndon Pierson. The research was funded
by Sandia National Laboratories.
Timeline: 6-12 months
Funding: Government
TRN Categories: Networking; Graphics; Internet; Data Representation
and Simulation
Story Type: News
Related Elements: None