Teamed computers drive big display

By Chhavi Sachdev, Technology Research News

Remember the big board that dominated the Pentagon’s War Room in Dr. Strangelove? On it, tiny warplanes inched relentlessly towards Russia and a nuclear nightmare. Researchers at Sandia National Laboratories have built a high-resolution, large-screen display that renders graphics more than 50 times faster than current rendering systems. The display will be used for, among other things, simulating nuclear weapons like those that closed out Kubrick’s masterpiece.

The 10- by 13-foot display turns large sets of visual data into three-dimensional images. “The data is continuously being rendered such that it can be rotated or panned or zoomed... it's fully three dimensional. Move the mouse and the application will update the… fully three-dimensional data,” said Carl Leishman, a principal member of technical staff at Sandia. Users can also change and animate objects, he said.

The system takes six seconds to render a complicated image containing about 470 million of the triangles that graphics programs use to build up the shapes and edges of images, Leishman said. Current high-end graphics systems would take five minutes to do the same, he said.
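Those two figures account for the 50-fold speedup claim: five minutes against six seconds. A quick back-of-the-envelope check in Python, using only the numbers quoted in the article:

```python
# Sanity check of the speedup implied by the article's figures.
triangles = 470_000_000       # triangles in the complicated test image
cluster_seconds = 6           # Sandia cluster render time
high_end_seconds = 5 * 60     # conventional high-end system: five minutes

speedup = high_end_seconds / cluster_seconds
rate_m_per_s = triangles / cluster_seconds / 1e6
print(f"speedup: {speedup:.0f}x")                               # 50x
print(f"cluster rate: {rate_m_per_s:.0f} million triangles/s")  # 78 million/s
```

The resulting rate, roughly 78 million triangles per second, is consistent with the 80 million figure reported for the full 16-screen array later in the article.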

The researchers were able to render a large amount of data quickly by processing the images in parallel. The Sandia system ties together the outputs of graphics cards from a cluster of 64 computer processors. Each computer handles about 7.3 million triangles of data for a total of 470 million.
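That even split can be sketched in a few lines of Python. This is a minimal illustration with hypothetical names, not Sandia's actual code; only the node and triangle counts come from the article:

```python
# Illustrative sketch: divide a triangle dataset evenly across a cluster.
# The partition() helper is hypothetical, not Sandia's implementation.
TOTAL_TRIANGLES = 470_000_000
NODES = 64

def partition(total, nodes):
    """Return the number of triangles assigned to each node."""
    base, extra = divmod(total, nodes)
    # Spread any remainder over the first `extra` nodes.
    return [base + (1 if i < extra else 0) for i in range(nodes)]

shares = partition(TOTAL_TRIANGLES, NODES)
print(f"per-node share: ~{shares[0] / 1e6:.1f} million triangles")  # ~7.3 million
```

Each node renders only its own share, which is what keeps the per-card workload small enough to finish in seconds.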

A high-end graphics card in an individual PC can render 10 to 15 million triangles per second on a single screen, said Leishman. “Since we have 64 of these cards, the theoretical maximum rendering rate we can achieve is around 350 million triangles per second,” said Kenneth Moreland, a member of the technical staff at Sandia. The images are then combined on a 4x4 array of 16 projectors that beam the composite image onto the large screen.

The researchers were able to render 300 million triangles per second on a single screen. When more screens were added, however, the rendering rate decreased. Using all 64 of its graphics cards, the current system renders about 80 million triangles per second on the 16-screen array, Moreland said.
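Comparing those measured rates with the quoted 350-million-triangle-per-second theoretical maximum gives a rough sense of how much adding screens costs the system (a simple calculation from the article's numbers, not a figure the researchers reported):

```python
# Measured rendering rates as a fraction of the quoted theoretical peak.
theoretical = 350e6      # triangles/second, all 64 cards combined
single_screen = 300e6    # measured, single screen
tiled_array = 80e6       # measured, 16-screen array

print(f"single screen: {single_screen / theoretical:.0%} of peak")   # 86%
print(f"16-screen array: {tiled_array / theoretical:.0%} of peak")   # 23%
```

The drop from about 86 percent of peak to about 23 percent quantifies how sharply the rate fell as screens were added.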

The 10- by 13-foot display has a total of 20 million pixels, exactly 16 times as many as a high-resolution 1024x1280-pixel computer monitor and about 80 times as many as a typical television. Having more pixels when rendering very complex computed data means more detailed data can be shown, Leishman said.
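The pixel count follows directly from the tile arithmetic, since the wall is a 4x4 array of 16 monitor-resolution tiles (a quick check using the article's figures):

```python
# Pixel arithmetic behind the display's quoted resolution.
monitor_pixels = 1024 * 1280        # high-resolution monitor: 1,310,720 pixels
wall_pixels = 16 * monitor_pixels   # 4x4 array of 16 projected tiles
print(f"wall: {wall_pixels:,} pixels")  # 20,971,520 -- the article's "20 million"
```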

There are many other large display arrays in use today, mostly in control rooms, flight simulators and planetariums, but they have much lower resolutions, said Leishman. The Hayden Planetarium, for instance, uses seven projectors to put about 7.4 million pixels on its ceiling display. None can manage as many pixels or as much data as the Sandia display, he said.

The researchers estimate that a person with perfect visual acuity cannot see individual pixels on a high-end computer screen beyond 2 to 3 feet. For their screen, they estimate the greatest distance from which a person can see a clear picture is more like 10 feet. “The point isn't that the display is exceptionally fine grained, but that it's exceptionally high resolution and that we have the ability to interactively render [it],” Leishman said.

Resolution is a function of how many pixels a screen contains, while grain is a function of how closely the pixels are spaced, according to Leishman.

Rendering extremely large datasets is particularly important for the high-fidelity simulations computed under the Department of Energy’s Visual Interactive Environment for Weapons Simulations (VIEWS) program, Leishman said.

The system will find practical applications in weapons simulations, biological investigations, nanotechnology and other areas, he said. “Keeping in mind our program's engineering requirements for nuclear weapon stockpile stewardship, we think we have an important practical use right now. We expect much of the software to be made available via open source distribution within one year,” said Leishman.

The research area is interesting and large high-resolution screens could be produced on a mass scale in a few years, said Terry Winograd, a professor of computer science at Stanford University.

The screens could be used as big picture windows and also as interactive tools, he said. “Think of people who work in non-computer settings -- they put things on boards, they put things on walls, they move them around, they do things with them. I think there’s a lot of interesting potential for interacting with the wall, with the stuff that’s up there instead of interacting with a keyboard and small screen,” Winograd said.

Back-projected screens are expensive and awkward, he said. Front projection “has obvious problems like when you stand in front of it you block it.” Neither approach is generally workable in a conference room. But when large self-contained displays like Sandia’s become more practical and can essentially just be nailed to a wall, they will become more commonplace, he said.

The researchers plan to increase the system’s capacity and capability by improving both software and hardware, Leishman said. The Sandia system uses commercially available graphics cards, he said. “Everything is built from normally available commodity hardware.” This allows quick and easy integration of next generation equipment, he said.

Work on the next generation system should begin within 2 to 3 years, he said. “We will soon have 48 projectors in a 12x4 array displaying 64 million pixels,” said Leishman. That projector screen will be 38.1 feet wide and 10.2 feet high. The rendering will not be as fast because more pixels mean slower rendering, said Moreland. To overcome this, the researchers plan to upgrade to a larger cluster of more than 64 processing units, Moreland said.

Leishman and Moreland’s colleagues were Brian Wylie, Constantine Pavlakos, Vasily Lewis, and Philip Heermann. A part of their research was published in the July/August 2001 issue of the journal IEEE Computer Graphics and Applications, and a second part was presented at the 2001 IEEE Symposium on Parallel and Large-Data Visualization and Graphics. The research was funded by the Department of Energy (DOE).

Timeline:  >1 year
Funding:  Government
TRN Categories:  Data Representation and Simulation; Graphics
Story Type:   News
Related Elements:  Technical paper: "Scalable Rendering on PC Clusters," IEEE Computer Graphics and Applications, July/August 2001. Technical paper: "Sort-Last Tiled Rendering for Viewing Extremely Large Data Sets on Tiled Displays," submitted to the IEEE Symposium on Parallel and Large-Data Visualization and Graphics, 2001.


October 17, 2001

© Copyright Technology Research News, LLC 2000-2006. All rights reserved.