VR accommodates reality
By Eric Smalley, Technology Research News
July 30/August 6, 2003
One reason flight simulators are much more
compelling than other virtual environments is that only the scenery is
virtual. The cockpits are often exact replicas of the real things.
For many training, design verification, telepresence and phobia
treatment applications, the ideal environment is a mix of virtual and
real.
Researchers from the University of North Carolina and Disney Corporation
have advanced their method for representing real objects in virtual environments
by devising a way for real and virtual objects to interact.
The researchers tested the system by having NASA engineers simulate
a space shuttle payload assembly task. "The ideal virtual environment
system would have the participant fully convinced he was actually performing
the task being simulated," said Benjamin Lok, now an assistant professor
of computer and information science and engineering at the University
of Florida. "Parts and tools would have mass, feel real, and handle properly
with appropriate visual and haptic feedback."
The principal drawback to today's fully virtual environments is
that there is nothing to feel and nothing to constrain the user's movements. "Imagine
trying to simulate a task as basic as unscrewing an oil filter from a
car engine in a virtual environment," said Lok. The researchers' goal
is to expand the applicability of virtual environments, he said.
They have taken a step toward mixed-reality environments by allowing
real and virtual objects to coexist in a shared virtual space. The approach
is the inverse of augmented reality systems, which incorporate a small
number of virtual objects in real environments by, for example, projecting
interactive images onto a desk.
In one example of the researchers' hybrid virtual environment,
a user parts a real curtain and feels the actual fabric while she sees
virtual representations of her hands moving virtual curtains to reveal
a virtual window and a virtual scene beyond.
The heart of the researchers' system is a method for determining
when real and virtual objects collide and providing a plausible response,
said Lok.
The key is keeping the virtual representations of the real objects
as simple as possible. The system uses four cameras and object recognition
software to determine the shapes and positions of real objects in the
environment. The camera data is used to generate virtual three-dimensional
shells in the shapes of the real objects, and the shells are forbidden
zones for all of the virtual objects in the environment.
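The article does not detail the reconstruction step, but one standard approach consistent with the description is a visual-hull-style carve: keep only the voxels of a working volume that project inside every camera's silhouette of the object. The sketch below is a minimal illustration under that assumption; the silhouette masks, the projection function and the grid bounds are all supplied by the caller, and none of these names come from the researchers' system.

```python
# A minimal visual-hull-style sketch of shell reconstruction, assuming
# per-camera boolean silhouette masks and a camera projection function
# are available. An illustration of the idea, not the actual pipeline.
import numpy as np

def carve_shell(silhouettes, project, grid_min, grid_max, res=32):
    """Return voxel centers that fall inside every camera silhouette.

    silhouettes -- list of 2-D boolean masks, one per camera
    project     -- function (cam_index, pts Nx3) -> Nx2 pixel coordinates
    grid_min/max -- opposite corners of the working volume
    """
    axes = [np.linspace(grid_min[i], grid_max[i], res) for i in range(3)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)
    voxels = grid.reshape(-1, 3)
    inside = np.ones(len(voxels), dtype=bool)
    for cam, mask in enumerate(silhouettes):
        uv = project(cam, voxels)             # project voxels into this view
        h, w = mask.shape
        u = np.clip(uv[:, 0].astype(int), 0, w - 1)
        v = np.clip(uv[:, 1].astype(int), 0, h - 1)
        inside &= mask[v, u]                  # carve away non-object voxels
    return voxels[inside]                     # occupied cells: forbidden zone
```

The occupied voxels then serve as the forbidden zone the article describes: virtual objects are simply never allowed to enter them.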
Virtual objects can have virtual properties like velocity, acceleration
and deformability, or give. Calculating these properties in real time
for the representations of the real objects would be extremely difficult,
so only virtual objects move or change shape in reaction to collisions.
"This allowed real objects to be integrated into a virtual environment
without additional modeling or tracking," said Lok.
When a virtual object and a shell collide, the system determines
the point of contact and shifts the virtual object to keep the shell and
the object from overlapping.
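In code terms, that one-way response might look like the sketch below, which treats the shell as a static point cloud and the virtual object as a sphere. Both representations and the function name are simplifying assumptions made for illustration, not the researchers' implementation; the essential property is that only the virtual object's position ever changes.

```python
# A minimal sketch of the one-way collision response: the shell (the
# real object) never moves; the virtual object is pushed out along the
# deepest contact normal. Sphere-vs-point-cloud is an assumed
# simplification of the actual geometry.
import numpy as np

def resolve_collision(shell_points, center, radius):
    """Return a corrected center for a virtual sphere overlapping a shell."""
    shell_points = np.asarray(shell_points, dtype=float)
    center = np.asarray(center, dtype=float)
    dists = np.linalg.norm(shell_points - center, axis=1)
    if dists.min() >= radius:
        return center                         # no contact, nothing to do
    i = np.argmin(dists)                      # deepest penetrating point
    normal = center - shell_points[i]
    length = np.linalg.norm(normal)
    normal = normal / length if length > 1e-9 else np.array([0.0, 0.0, 1.0])
    penetration = radius - dists[i]
    return center + normal * penetration      # shift the sphere off the shell
```

Run once per frame, this keeps the virtual object resting against the shell's surface rather than passing through the real object it represents.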
The researchers tested the hybrid virtual environment with a group
of NASA engineers. The tests showed that the environment is more effective
for evaluating hardware designs and for planning assembly tasks than fully
virtual environments, said Lok.
The engineers determine the optimal design for payloads like satellites
and scientific equipment, balancing the need to take up as little room
as possible against the requirements for technicians to be able to assemble
the payload systems.
The researchers simulated a shuttle payload in a hybrid environment.
"We mocked up a common payload integration task and asked [the engineers]
to estimate how much space was required to perform the task," said Lok.
The researchers set the simulation to those spacings and had the engineers
try it out, he said.
The task consisted of fitting a real tube into a hole in a virtual
box, attaching the tube to a real mount on a real table beneath the virtual
box, feeding a real cable through the tube and plugging the end of the
cable into a real socket on the table. The virtual box was surrounded
by virtual walls that constrained the engineer's movements by flashing
red whenever any of the real objects, including the engineer's hands,
came into contact with the walls.
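A small sketch of such a contact check follows, assuming the walls are axis-aligned boxes and the real objects, including the engineer's hands, are tracked as sets of 3-D points. The box representation, margin and color values are illustrative choices, not details from the paper.

```python
# A minimal sketch of the flashing-wall constraint: a wall turns red
# whenever any tracked real-object point enters it. Axis-aligned walls
# and RGB color tuples are assumed simplifications.
import numpy as np

RED, GRAY = (1.0, 0.0, 0.0), (0.7, 0.7, 0.7)

def wall_colors(walls, real_points, margin=0.01):
    """walls: list of (lo, hi) corner pairs; real_points: Nx3 array."""
    pts = np.asarray(real_points, dtype=float)
    colors = []
    for lo, hi in walls:
        lo = np.asarray(lo) - margin          # grow the wall slightly so
        hi = np.asarray(hi) + margin          # near-misses also register
        contact = np.all((pts >= lo) & (pts <= hi), axis=1).any()
        colors.append(RED if contact else GRAY)
    return colors
```

Evaluated against the latest tracked points each frame, this gives the engineer immediate visual feedback that a hand or tool has strayed past the simulated payload boundary.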
The simulation showed that the engineers' estimates were off:
on average, the engineers underestimated the required spacing by 5.6
centimeters, according to the researchers. The engineers were surprised
that both additional space and a tool were required to complete the task,
said Lok.
Hybrid environments provide greater realism than fully virtual
environments, and because they are easier to set up than full mockups
of payloads, they can be used earlier in the design process when changes
are easier to make, Lok said.
It will be about five years before hybrid reality systems are
available for specialized applications, and 20 years for general applications,
which require quick reconstruction of real objects in virtual environments,
said Lok. Computer vision and image-based rendering researchers would
call these problems "holy grails in their fields," he said. "Perhaps in
20 years we can have very compelling hybrid realities with production
quality solutions on real-world industrial applications."
The researchers are working on generating better virtual representations
of real objects, using hybrid realities, and determining the role of real
objects in effective virtual environments, according to Lok.
Lok's research colleagues were Samir Naik of Disney Corporation,
and Mary Whitton and Frederick P. Brooks Jr. of the University of North
Carolina at Chapel Hill. The researchers presented the work at the Association
for Computing Machinery (ACM) Symposium on Interactive 3D Graphics, held
in Monterey, California, April 27 to 30, 2003.
The research was funded by L-3 Communications Corporation, the
Office of Naval Research (ONR), the National Institutes of Health (NIH),
the National Science Foundation (NSF), the University of North Carolina
at Chapel Hill and the University of North Carolina at Charlotte.
Timeline: 5 years
Funding: Government, University
TRN Categories: Data Representation and Simulation; Human-Computer
Interaction
Story Type: News
Related Elements: Technical paper, "Incorporating Dynamic
Real Objects into Immersive Virtual Environments," Association for Computing
Machinery (ACM) Symposium on Interactive 3D Graphics, April 27-30, 2003,
Monterey, California