Decision tool keeps it simple

By Chhavi Sachdev, Technology Research News
March 13, 2002
 The old days of dotting miniature tanks 
        and soldiers across a map in order to plan military strategy may be coming 
        back in a virtual fashion. Researchers at the University of Arizona have 
        devised software that packs strategic information into a three-dimensional 
        model that can be run on an ordinary laptop.
 
 The software uses symbols to represent real-life objects. Just as air 
traffic displays have a universally accepted set of symbols, the semantics 
        and syntax of which are clear to speakers of all languages, the software 
        has a set of symbols for military movements, said Jerzy Rozenblit, a professor 
        of computer engineering at the University of Arizona.
 
 The set is based on standard military symbols. A square enclosing a circle 
        with an X through it, for example, denotes a mechanized unit, or troops 
        accompanied by armored vehicles. The software depicts the square as one 
        side of a cube. The other sides display information such as remaining 
fuel, distance to destination, and strength of the force, said Rozenblit. 
        "A simple display of that kind does not clutter the display from a visual 
        and perceptual point of view," he said.
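
As a rough illustration of the kind of glyph described above, the sketch below models a single unit as a data object whose faces carry the unit symbol plus status attributes such as remaining fuel, distance to destination, and force strength. The class name, fields, and face layout are invented for illustration; the article does not describe the researchers' actual data structures.

# A minimal sketch (not the researchers' code) of a cube glyph: one face shows
# the standard unit symbol, the remaining faces show status attributes.
from dataclasses import dataclass

@dataclass
class UnitGlyph:
    symbol: str               # e.g. "mechanized" -- square enclosing a circled X
    fuel_remaining: float     # fraction of fuel left, 0.0-1.0
    distance_to_goal_km: float
    strength: float           # relative force strength, 0.0-1.0

    def faces(self) -> dict:
        """Map the unit's information onto the visible faces of the cube."""
        return {
            "front": self.symbol,
            "top": f"fuel {self.fuel_remaining:.0%}",
            "right": f"{self.distance_to_goal_km:.0f} km to objective",
            "left": f"strength {self.strength:.0%}",
        }

if __name__ == "__main__":
    unit = UnitGlyph("mechanized", fuel_remaining=0.6,
                     distance_to_goal_km=42.0, strength=0.8)
    for face, label in unit.faces().items():
        print(f"{face}: {label}")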
 
 These types of visualization concepts should help commanders make decisions 
        in complex battlefield situations, and their use is not limited to 
        conventional types of conflict, said Rozenblit.
 
 The software could be used to plot disaster relief, refugee operations, 
        food supply deliveries, or the establishment of democratic governing 
        systems in unstable political arenas, Rozenblit said. Any situation that 
        requires a lot of strategic information to be packed into a small space 
        and the various players to be moved around would benefit, he said.
 
 The software is also widely applicable because it allows for sophisticated 
        visualization without requiring a lot of computer power. "We really don't 
        deal with high resolution, very detailed graphical representations. We 
        work on an abstract level, so this can be run on a laptop machine," said 
        Rozenblit.
 
 The program has three levels, he said. The top layer displays a scenario. 
        The middle layer consists of the graphics routines and engines that drive 
        and animate the scene, Rozenblit said. The bottom layer is a simulation 
        model that executes the behavior involved with the mission. "What's really 
        involved here is an artificial intelligence component that lends decision 
        support," he said.
 
 The simulation tool includes genetic algorithms that sift through the 
        thousands of available courses of action and narrow them down to, say, 
        five of the best options, Rozenblit said. “It takes seconds to generate 
        the options,” he said.
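
To give a flavor of how a genetic algorithm can narrow thousands of candidate courses of action down to a handful of good ones, here is a minimal, self-contained sketch. The encoding (waypoint sequences), the toy fitness function, and all parameters are assumptions made for illustration; they are not taken from the researchers' tool. The surviving options could then be handed to a simulation layer so their consequences can be played out.

# Illustrative genetic algorithm: evolve candidate courses of action
# (here, sequences of map waypoints) and keep the five best found.
import random

GRID = 20          # map is a GRID x GRID square (assumed)
PLAN_LEN = 6       # waypoints per course of action (assumed)
GOAL = (18, 18)    # objective location (assumed)

def random_plan():
    return [(random.randrange(GRID), random.randrange(GRID)) for _ in range(PLAN_LEN)]

def fitness(plan):
    # Toy objective: end near the goal while keeping the total path short.
    path_len = sum(abs(a[0] - b[0]) + abs(a[1] - b[1]) for a, b in zip(plan, plan[1:]))
    end_gap = abs(plan[-1][0] - GOAL[0]) + abs(plan[-1][1] - GOAL[1])
    return -(path_len + 5 * end_gap)   # higher is better

def crossover(p1, p2):
    cut = random.randrange(1, PLAN_LEN)
    return p1[:cut] + p2[cut:]

def mutate(plan, rate=0.2):
    return [(random.randrange(GRID), random.randrange(GRID))
            if random.random() < rate else wp for wp in plan]

def best_courses_of_action(pop_size=200, generations=50, keep=5):
    population = [random_plan() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 4]        # simple truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    population.sort(key=fitness, reverse=True)
    return population[:keep]

if __name__ == "__main__":
    for i, plan in enumerate(best_courses_of_action(), 1):
        print(f"option {i}: score {fitness(plan)}  waypoints {plan}")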
 
 The options are displayed in a list for the commander, who can run them 
        in turn to see the consequences of each choice, he said. “We are now building 
        in the capability to run them automatically,” Rozenblit said.
 
 Scenarios are created by the user and can be saved in a library and reloaded. 
        The symbols can be customized, or built upon the graphical elements already 
        in the library, said Rozenblit.
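
A scenario library of the kind described could, in the simplest case, amount to serializing scenarios to disk and reading them back. The sketch below assumes one JSON file per scenario; the directory name, file layout, and field names are hypothetical.

# A hedged sketch of saving and reloading scenarios from a library on disk.
import json
from pathlib import Path

LIBRARY = Path("scenario_library")   # assumed location of saved scenarios

def save_scenario(name: str, scenario: dict) -> Path:
    LIBRARY.mkdir(exist_ok=True)
    path = LIBRARY / f"{name}.json"
    path.write_text(json.dumps(scenario, indent=2))
    return path

def load_scenario(name: str) -> dict:
    return json.loads((LIBRARY / f"{name}.json").read_text())

if __name__ == "__main__":
    demo = {"units": [{"symbol": "mechanized", "position": [3, 4], "fuel": 0.6}]}
    save_scenario("river_crossing", demo)
    print(load_scenario("river_crossing"))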
 
 Since there is no coding involved, learning to use the software is relatively 
        intuitive. It could be used not only to see possible courses of action, 
        but also to train commanders by checking whether they make the right 
        decisions, said Rozenblit.
 
 The work is a nice use of artificial intelligence that could prove very 
        useful, said Paul Juell, an associate professor of computer science at 
        North Dakota State University. "Too many people try to produce beautiful 
        pictures and forget the goal of presenting information."
 
 The system's winning point is that the information is clear despite the 
        use of spatial coordinates to present dense glyphs of information, said 
        Juell. "It is hard to present enough details to aid the decision process 
        but not to hide the underlying patterns."
 
 These overall patterns are important for decision making, Juell said. 
        "Simple global displays, such a trend line, hide the local details. The 
        display produced by this project gives the best of both worlds," he said.
 
 Because the system presents only good options and filters out the rest, 
        "the viewer [can] address the problem rather than having to deal with 
        hundreds of options," said Juell.
 
 One limitation of the system, however, is that a suitable set of features 
        will have to be found every time the system is used in a new domain, Juell 
        added.
 
 The researchers plan to test the visualization system within the next 
        six months to gauge reaction times and error rates, and see how well decisions 
        are actually made, said Rozenblit.
 
 Currently, the software is like a chess game with two players on opposite 
        sides. The researchers are planning to add the capacity for more than 
        two points of view. They also have plans to connect the software directly 
        to databases that will stream real-time information into the model, Rozenblit 
        said. Multiple points of view should be possible within a year; real-time 
        data support will take two to three years, he said.
 
 The researchers are also looking at the effectiveness of the symbols, 
        and developing a graphical means of expressing probabilities and uncertainties, 
        such as the chances of a mission's success, said Rozenblit.
 
 Rozenblit's research colleagues were Liana Sauntak and Faisal Momen at 
        the University of Arizona, Michael Barnes at the U.S. Army Research Laboratories, 
        and Ted Fichtl of The Compass Foundation. They presented the research 
        at the 2001 IEEE International Conference on Systems, Man, and Cybernetics 
        held in Tucson, Arizona from October 7 to 10, 2001. The research was funded 
        by the U.S. Army Research Laboratories.
 
 Timeline: now; 2-3 years
 Funding: Government
 TRN Categories: Data Representation and Simulation; Artificial Life and Evolutionary Computing
 Story Type: News
 Related Elements: Technical paper, "Intelligent Decision Support of Support and Stability Operations (SASO) through Symbolic Visualization," in proceedings of the 2001 IEEE International Conference on Systems, Man, and Cybernetics, Tucson, Arizona, October 2001.
 
 
 
 