Viewer explodes virtual buildings

By Ted Smalley Bowen, Technology Research News

Fully immersive virtual reality programs are an impressive way to experience digital architectural models, but they don't always provide the best view, especially where action is involved.

Researchers at the University of California at Santa Barbara, Stanford University, Microsoft Corporation and the University of Virginia have devised a way to add expanded, or exploded, views to the three-dimensional architectural graphics used by real-time programs like computer games and training simulations.

An exploded view renders components of multipart objects like buildings and machines separately, opening up a building, for example, to make it possible to see the interiors of all floors at once. The view preserves relative positioning among all the model's details, including vertical supports, doors, and furniture.

The researchers' prototype software works with applications that portray action within three-dimensional spaces and can make them easier to follow, according to Christopher Niederauer, a researcher at the University of California at Santa Barbara.

The software provides a bird's-eye view, which is useful for following team interactions, said Niederauer. "Military strategists and even police could potentially use our software... to aid in planning group interactions within a building," he said. "It is a really cool way to... see everything that is going on."

The software renders architectural models from an external vantage point rather than generating a more compute-intensive immersive, or first-person, perspective that lets the user view the model as if from inside. The external viewpoint also lets the software show more of an interior model, because immersive viewpoints selectively omit structural details and objects to simulate a person's perception of space.

The software automates portions of the design-intensive process of locating each of a model's stories and generating the exploded view. It uses Chromium, software that can modify, delete or replace, on the fly, the graphics commands issued by programs that use the OpenGL graphics library. This lets the exploded-view visualizer alter three-dimensional graphics programs as they run.
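Chromium itself is a C library that intercepts real OpenGL calls; the toy Python sketch below only illustrates the idea it is built on, a stream processor sitting between an application and the renderer that can modify, delete, or replace commands in flight. The function names (`run_filtered`, `exploded_view_spu`) and the tuple-based command format are invented for illustration and are not Chromium's actual API.

```python
# Conceptual sketch of command-stream interception, in the spirit of a
# Chromium "stream processing unit" (SPU). Commands are (name, args) pairs.
def run_filtered(commands, spu):
    """Pass every command through `spu`, which may return zero, one, or
    several replacement commands."""
    output = []
    for cmd, args in commands:
        output.extend(spu(cmd, args))
    return output

def exploded_view_spu(cmd, args):
    """Example filter: delete fog commands, and replace each draw command
    with one copy per story offset (here, two hypothetical stories)."""
    if cmd == "fog":
        return []                                                # delete
    if cmd == "draw":
        return [(cmd, {**args, "offset": o}) for o in (0.0, 4.0)]  # replace
    return [(cmd, args)]                                         # pass through
```

Because the filtering happens at the command-stream level, the running application needs no modification, which is what lets the technique work with existing games and simulations.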

The software first determines the location of the model's layers -- for example each story of a multistory building -- by analyzing the architectural model. It then modifies the application's graphics output to separate each story and render the exploded view.
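The article does not spell out how the analysis locates each story. One plausible illustration, not the paper's exact method, is to accumulate the areas of horizontal faces by height: floors and ceilings concentrate horizontal geometry at a few distinct heights, which then mark the story boundaries. The function name and input format below are assumptions.

```python
from collections import Counter

def find_story_splits(horizontal_faces, decimals=1):
    """Guess the heights of floors/ceilings in a model.

    `horizontal_faces` is a list of (y_height, area) pairs for the model's
    horizontal polygons. Heights where a dominant share of horizontal area
    accumulates are returned, in ascending order."""
    bins = Counter()
    for y, area in horizontal_faces:
        bins[round(y, decimals)] += area  # quantize heights into bins
    threshold = 0.5 * max(bins.values())  # keep only dominant heights
    return sorted(h for h, a in bins.items() if a >= threshold)
```

For a two-story model, the large slabs at ground level and at the second floor would dominate the histogram, while small horizontal details such as tabletops fall below the threshold.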

The program splits stories off from the original architectural model just below the ceiling, affording a less obstructed view of the interior. And it allows the user to choose the viewpoint and the spacing between stories in the exploded view.

The exploded view is axonometric, meaning it represents three-dimensional objects with vertical and horizontal dimensions drawn to scale, but distorts curves and diagonals.
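As a generic illustration of that kind of projection (not the paper's specific camera), one common axonometric-style mapping folds the depth axis into the picture plane at a fixed angle, so horizontal and vertical measurements stay true to scale while diagonals and curves distort:

```python
import math

def axonometric(x, y, z, angle_deg=30.0):
    """Project a 3-D point to 2-D. Distances along x and y map to the
    picture plane unchanged; depth z is folded in at `angle_deg`,
    which distorts diagonals but preserves axis-aligned measurements."""
    a = math.radians(angle_deg)
    u = x + z * math.cos(a)
    v = y + z * math.sin(a)
    return (u, v)
```

Points lying in the picture plane (z = 0) are unchanged, which is why scale along the vertical and horizontal axes is preserved.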

The on-the-fly rendering of each frame of an interactive program takes a lot of computer power, but is within the capability of a desktop PC, according to Niederauer. "Games these days are optimized to only show one room at a time, but in order to show everything, we need to draw everything," he said. "We still get interactive rates when using a single computer," he added.

The researchers are currently working on improving the interface, including simplifying the characters' appearance, possibly representing them as icons in order to make them easier to track in the exploded view.

The idea of converting computer models of multistory buildings to better show both exterior and interior details is not new, but the researchers have automated the process and found new applications for the technique in computer graphics, said Michael Ashikhmin, assistant professor of computer science at the State University of New York at Stony Brook. "The low level techniques are extremely simple and well known but the [researchers] combine them in a novel and interesting way."

The software could be used to better visualize interactive multiplayer games such as Doom without any modification to the existing code, he said. "This by itself is an interesting development, especially in environments where a third person is allowed to follow the game action."

The software can be used with existing three-dimensional programs now, according to Niederauer.

Niederauer's research colleagues were Mike Houston of Stanford University, Maneesh Agrawala of Microsoft, and Greg Humphreys of the University of Virginia. They presented the work at the Association for Computing Machinery's Symposium on Interactive 3-D Graphics (ACM's I3D '03) in Monterey, California, April 27 to 30, 2003. The research was funded by the Department of Energy (DOE).

Timeline:   Now
Funding:   Government
TRN Categories:  Data Representation and Simulation; Graphics; Human-Computer Interaction
Story Type:   News
Related Elements:  Technical paper, "Non-Invasive Interactive Visualization of Dynamic Architectural Environments," presented at the ACM Symposium on Interactive 3-D Graphics (ACM's I3D '03), April 27-30, Monterey, California, and posted at


August 13/20, 2003


© Copyright Technology Research News, LLC 2000-2006. All rights reserved.