Virtual physicals at hand

By Kimberly Patch, Technology Research News

June 21, 2000

University at Buffalo researchers have taken a large step toward being able to send over the Internet the information a physician gathers through his fingers during an exam.

Although computers have proven very useful in transmitting, manipulating and storing visual and audio information, they have lagged far behind in doing the same with tactile information because communicating how something feels takes much more compute power.

The researchers' Virtual Human Model for Medical Applications allows the tactile information a doctor gathers during an abdominal exam to be collected, stored, and reconstructed as a virtual reality exam that the examiner or another physician can revisit.

During an exam a doctor wears a glove finger fitted with pressure sensors that are connected to a Pentium III laptop. The laptop collects, stores and graphs in real time the information coming from the sensors, said lead researcher Thenkurussi Kesavadas.
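The story doesn't describe the collection software itself, so the following is only a hedged sketch of what such a real-time sampling loop might look like in Python. The sensor read is simulated, and the sampling rate, sensor count and units are invented for illustration rather than taken from the Buffalo system.

    import math
    import time

    def read_sensor_array(t, n_sensors=4):
        """Stand-in for the glove's fingertip pressure sensors.

        Returns simulated pressures (kPa); a real implementation would
        read voltages from the glove's analog-to-digital converter.
        """
        return [10.0 + 5.0 * math.sin(2.0 * math.pi * 0.5 * t + i)
                for i in range(n_sensors)]

    def collect_exam(duration_s=5.0, rate_hz=100):
        """Record timestamped pressure samples at a fixed rate so the
        exam can be stored, graphed live and replayed later."""
        samples = []
        period = 1.0 / rate_hz
        start = time.time()
        while True:
            t = time.time() - start
            if t >= duration_s:
                break
            samples.append((t, read_sensor_array(t)))
            time.sleep(period)
        return samples

    exam = collect_exam(duration_s=1.0)
    print("captured %d samples; first: %s" % (len(exam), exam[0]))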

More compute power is needed to re-create the exam using a haptics device. A preliminary model of the playback set-up includes a thimble-sized haptics device that fits on one finger and a Pentium III desktop that harbors half a gigabyte of memory. The device allows a person playing back the exam to distinguish among bone, cartilage, soft tissue, and tumor-type masses, said Kesavadas, who is an assistant professor of mechanical and aerospace engineering at the University at Buffalo and director of the university's Virtual Reality Lab. It also allows an examiner to feel surface texture, and to distinguish between pockets of air and fluid, he said. "When [an examiner using the haptics device] presses he can feel human organ softness, hard objects and also if there's any enlargement [of internal organs]," said Kesavadas.
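The story doesn't say how the playback software turns tissue type into feeling, but a common approach in haptic rendering is a simple spring, or penalty, force: the deeper the virtual fingertip sinks into a surface, the harder the device pushes back, with a stiffness constant per material. The constants below are invented to make bone feel harder than soft tissue; they are not values from the Buffalo system.

    # Spring-law haptic feedback: force grows with penetration depth,
    # scaled by a per-tissue stiffness. Stiffness values are illustrative.
    STIFFNESS_N_PER_M = {
        "bone": 5000.0,
        "cartilage": 2000.0,
        "tumor_mass": 900.0,
        "soft_tissue": 300.0,
        "fluid_pocket": 80.0,
        "air_pocket": 10.0,
    }

    def feedback_force(tissue, penetration_m):
        """Force (N) pushed back through the thimble when the virtual
        fingertip presses penetration_m into a surface of this tissue."""
        return STIFFNESS_N_PER_M[tissue] * max(0.0, penetration_m)

    # Pressing 5 mm into bone resists far more than 5 mm into soft
    # tissue, which is what lets an examiner tell materials apart by feel.
    print(feedback_force("bone", 0.005))         # 25.0 N
    print(feedback_force("soft_tissue", 0.005))  # 1.5 N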

Key to the system are the researchers' Atomic Unit Modeling algorithms, which allow a PC to analyze in real time the 3-D changes that take place in abdominal soft tissues during an examination, including the forces absorbed by tissues near those being pressed, said Kesavadas. This ability to process tactile information in real time allows a doctor to see information about the exam as it happens, and allows a virtual exam to take place at all.

The algorithms model the human torso as many tiny 3-D objects in a process similar to finite element modeling. But instead of solving a set of global equations, which takes a lot of time, "we're using an object-oriented system where each unit knows how much of the force it can absorb [and] how much of the force it needs to pass on to the nearby units," Kesavadas said.
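Kesavadas' quote gives only the outline of the algorithm, so the following Python is a schematic of that local-propagation idea, not the researchers' code: each unit absorbs a share of an applied force and splits the remainder among its neighbors until the force dies out, so no global system of equations is ever solved. The absorption fraction and cutoff are invented for illustration.

    class Unit:
        """One atomic unit of tissue in the 3-D mesh."""

        def __init__(self, absorb_fraction):
            self.absorb_fraction = absorb_fraction  # share of incoming force this unit soaks up
            self.absorbed = 0.0                     # total force taken so far
            self.neighbors = []                     # adjacent units in the mesh

        def apply_force(self, force, cutoff=0.01):
            """Absorb part of the force and pass the rest to neighbors,
            stopping once the remainder is negligible. A full 3-D mesh
            would also track visited units to avoid echoing force back."""
            taken = force * self.absorb_fraction
            self.absorbed += taken
            remainder = force - taken
            if remainder < cutoff or not self.neighbors:
                return
            share = remainder / len(self.neighbors)
            for n in self.neighbors:
                n.apply_force(share, cutoff)

    # Three units in a row: press on the first and the force decays
    # locally, unit by unit, with no global solve.
    a, b, c = Unit(0.6), Unit(0.6), Unit(0.6)
    a.neighbors, b.neighbors = [b], [c]
    a.apply_force(10.0)
    print(a.absorbed, b.absorbed, c.absorbed)  # 6.0 2.4 0.96

The appeal over a global finite element solve is that each press touches only the handful of units near the fingertip, which is what makes a real-time update on a PC plausible.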

"If it works, it's fantastic," said Russell Taylor, Research Associate Professor of Computer Science at the University of North Carolina at Chapel Hill. It's "a very difficult problem for two or three reasons -- the real-time software and also the accurate modeling of the masses. Modeling deformable surfaces of any sophistication, even just muscles under skin is hard to do in real-time," he added.

The Visible Human Data Set developed by the National Institutes of Health (NIH) is serving as raw data for the system, which uses objects no bigger than 8 mm. The researchers are gathering additional data, using the system itself to refine the model. Kesavadas said he expects a haptics set-up to be fully integrated with the data they are collecting sometime next year.
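For a sense of scale, here is a hedged back-of-the-envelope sketch of carving a torso-sized volume into units no larger than 8 mm on a side; the bounding-box dimensions are placeholders, not measurements from the Visible Human Data Set.

    import math

    UNIT_MM = 8  # maximum edge length of one atomic unit

    def grid_shape(width_mm, height_mm, depth_mm, unit_mm=UNIT_MM):
        """Units along each axis, rounded up so all tissue is covered."""
        return tuple(math.ceil(d / unit_mm)
                     for d in (width_mm, height_mm, depth_mm))

    nx, ny, nz = grid_shape(400, 600, 250)   # rough torso bounding box, mm
    print(nx, ny, nz, nx * ny * nz)          # 50 75 32 -> 120,000 units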

The Virtual Human Model for Medical Applications has several potential applications, according to Kesavadas. The first is emergency room triage, where patients are checked for problems like enlarged organs or appendicitis in order to determine whether they need a more advanced test like an MRI. Second, its data collection abilities could be used to track changes in a patient over time. "We'll be able to quantify the size of a lump -- this could be used as a methodology for [testing] the effectiveness of a medication, for instance," Kesavadas said. Third, using the haptics device, physicians will be able to compare data collected from different patients at different times.

The researchers are also looking into applying the technology to engineering and aerospace problems that involve giving people tactile feedback from places they ordinarily couldn't touch. "For instance when astronauts go up in space they have loss of sensitivity in their gloves because they're very cumbersome," said Kesavadas. The technology could be used for applications like "kick[ing] back feedback to the astronaut to know exactly what he's doing," he said.

Timeline:  >1 year
Funding:  Government and University
TRN Categories:  Human-Computer Interaction
Story Type:  News
Related Elements:  Photo



