The future for graphic development in military simulations is wide open, and more and more companies are integrating computer game-based technology into military graphic training simulators.
With shoot-’em-up and military-style games all the rage in the video game world for some time, and reports of U.S. soldiers stationed in Iraq and Afghanistan playing them to relax in their downtime, it only stands to reason that military training technology companies would look to take advantage of the best the graphics-intensive field has to offer.
Adoption moves at a breakneck pace in the graphics world, with new card technologies aimed at the commercial video game and video sector taking hold quickly in the market. But is the military lagging behind in adopting game-quality graphics?
First, the military and the game industry have different definitions of quality.
According to Lieutenant Colonel Wil Riggins, Product Manager for Air and Command Tactical Trainers with the Program Executive Office for Simulation, Training and Instrumentation (PEO STRI), “Quality is a subjective term. Military simulators are superior in terms of realistic behaviors, realistic models and size and accuracy of visual databases. Some game point visualizations look more realistic but military simulations must also provide other visualizations including night vision, infrared, targeting and radar that most games do not currently provide.” He went on to say that the military is constantly seeking out improved graphics that are affordable and meet PEO STRI’s demanding training requirements.
Second, the requirements for audience and situational fidelity for training capability are more demanding and/or different than the same requirements for games focused on recruiting, Riggins said.
“We use the latest generation of commercial hardware,” Riggins added. Commercial games used for training include Operation Flashpoint, Full Spectrum Warrior and Kuma Wars.
Finally, the cost of developing software makes a difference between the military and mass-consumption commercial companies.
Budget constraints require a modified approach to accomplish objectives, Riggins said.
However, some companies are making the jump from gaming to military simulation.
Bionatics, of Montpellier, France, provides 3-D plant modeling and visualization software for architecture, landscape design, 3-D animation, video games and virtual reality.
The company develops 3-D modeling software for military applications to build realistic 3-D landscapes and 3-D training databases. Especially designed for 3-D real-time environments, the software models realistic plants adapted to simulator, virtual reality and dynamic-environment refresh rates. Export formats include 3ds Max, Maya and OpenFLIGHT.
According to Stephane Gourgout, co-founder of Bionatics and executive vice president of sales and marketing, in military simulation the realism offered by a simulator is of strategic importance to optimize readiness and reduce training time. The 3-D visual database today plays a major role in this realism. However, the development of such environments has been limited because adding content reduces display performance, and producing high-fidelity content and new real-time workflows is expensive. Huge investments are being made today in updates to the visual engines and the visual training databases.
The prices of the simulators were very high, and investment priorities were put elsewhere than into the development of realistic visuals. “Things have now changed, and the evolution of the video game industry probably had a real impact on purchasers’ minds, seeing more realistic graphics in $100 video games than in multimillion-dollar simulators,” Gourgout said. “But the fast and recent evolution of graphics cards is probably the main reason, opening new 3-D visualization capabilities. New software technologies like Bionatics’ followed those evolutions, providing a complete new generation of 3-D modeling and visualization tools able to rapidly build and provide the content now required. More realism also means more details, and many new products and services were released to give easy access to high realism at reasonable cost.
“Real-time is the common root between military and video game industries, but the objectives are very different,” Gourgout continued. “Video games need more realism and immersion feeling for entertainment purposes and the military industry needs it to improve training and mission preparation’s quality and efficiency.”
He added that military simulation requires maximum realism for the feeling of immersion that is strategic to the quality of the training. “Video games require more sexy and nice-looking environments, which do not necessarily have to be realistic but rich, and make you dream,” Gourgout said.
“In both [gaming and military simulation], Bionatics has a common goal which is to develop new technologies to build and display the most realistic 3-D worlds in real-time but the production workflow of those worlds is different,” Gourgout said. “But you see more and more middleware onto the market like artificial intelligence, database making, graphic engines … crossing the line between both industries even if, from my point of view, the business and product package in each market remains very different. In terms of technologies, what we develop in one business unit serves the other ones very often.”
Bionatics works mainly with customers developing ground-based simulation and training systems, such as helicopter, train, UAV, truck and other vehicle simulators. In the military industry, customers include the U.S. Department of Defense, including U.S. SOCOM, and companies such as EADS, CAE, Thales, Boeing, Lockheed Martin and Rheinmetall.
“Bionatics has a unique technology approach based on procedural modeling, allowing the fast development of realistic and geospecific landscapes in 3-D. Our plants and trees are generated procedurally, our terrain is computed procedurally, and even the scattering of those trees on the terrain is done procedurally, allowing the fast development of very large databases with very rich content, which would have been impossible to make by hand,” Gourgout continued. “Thanks to our Blueberry 3D Editor, a 50 km x 50 km database can be built in three weeks by one person with extreme detail, where several people working several months would have been necessary with regular alternative solutions.” Such technology can be easily plugged into an existing production workflow using plug-in versions developed for MultiGen-Paradigm’s Creator Terrain Studio and VEGA Prime, or an SDK version, which will be released in the coming months.
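The procedural approach Gourgout describes can be illustrated with a minimal sketch (a hypothetical illustration, not Bionatics’ actual algorithm): because placement is driven entirely by a seed and a few parameters, an enormous database regenerates identically on demand instead of being stored or placed by hand.

```python
import random

def scatter_trees(seed, terrain_size_m, trees_per_km2, elevation):
    """Deterministically scatter trees over a square terrain.

    elevation: function (x, y) -> height in metres, supplied by the
    terrain model.  Because placement depends only on the seed and the
    parameters, the forest never has to be stored or hand-edited.
    """
    rng = random.Random(seed)  # same seed -> identical forest every run
    area_km2 = (terrain_size_m / 1000.0) ** 2
    count = int(trees_per_km2 * area_km2)
    trees = []
    for _ in range(count):
        x = rng.uniform(0, terrain_size_m)
        y = rng.uniform(0, terrain_size_m)
        z = elevation(x, y)
        if z < 2000:  # crude illustrative rule: no trees above the tree line
            trees.append((x, y, z))
    return trees

# A 50 km x 50 km database at 100 trees/km2 is described by one integer seed.
forest = scatter_trees(seed=42, terrain_size_m=50_000,
                       trees_per_km2=100, elevation=lambda x, y: 300.0)
```

The same idea extends to the terrain and plant geometry themselves: each is a function of its inputs, so detail can be generated at whatever density the runtime needs.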
Blueberry 3D consists of two modules: the Editor helps create very large and highly detailed 3-D landscapes by transforming geo-referenced terrain data into a procedural geometry model; the runtime environment loads and renders the 3-D database developed with the Blueberry 3D Editor through the customer’s graphics engine, with extremely fine levels of detail, dynamic shadows, wind, and dense, realistic forests.
Dynamic Animation Systems
Dynamic Animation Systems Inc. (DAS) develops distributed, interactive and immersive virtual reality applications for education, entertainment and defense.
Most military simulations are based on old video technology, according to Eric Young, Executive Producer of Dynamic Animation Systems. “They don’t have professional graphic artists working on the simulations. The UST2000, for example, uses 10-year-old technology.”
Video games are much more advanced than military simulations in all respects except the reality of the situation, Young said. In advanced artificial intelligence (AI), user interface and graphics, games lead the way. In realism, military simulations have the edge over games, but that realism is a major problem for the “fun factor” that keeps most military simulations out of the gaming market. Games developed by the military for recruiting are most often not used for training because of this realism factor. “‘America’s Army’ is a top-notch video game,” Young added, “but it’s used for recruitment, not training.”
Some commercial games have been used by the military for training, such as “Close Combat,” “Combat Mission,” and “V for Victory.” However, Young said, they don’t teach real group tactics, and there are not many types of infantry simulations.
Most simulations use video branching: a target comes up in the scenario and the trainee takes a shot; if it is a hit, the computer brings up branches of what will happen next. If he misses, other branches come up.
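Video branching reduces to a lookup table: each node names a pre-recorded clip and the follow-on node for each shot outcome. The sketch below uses invented scenario and clip names purely for illustration; it is not drawn from any specific trainer.

```python
# Minimal video-branching sketch.  Node and clip names are hypothetical.
SCENARIO = {
    "start":        {"clip": "target_appears.mpg",
                     "hit": "threat_down", "miss": "threat_fires"},
    "threat_down":  {"clip": "area_clear.mpg", "hit": None, "miss": None},
    "threat_fires": {"clip": "take_cover.mpg", "hit": None, "miss": None},
}

def run_branch(node, shot_hit):
    """Return the next scenario node selected by the trainee's shot result."""
    branch = SCENARIO[node]
    return branch["hit"] if shot_hit else branch["miss"]
```

For example, `run_branch("start", True)` selects the "threat_down" branch, whose clip then plays; a miss selects "threat_fires" instead. The contrast with V.I.C.E.-style free movement is that here every possible path must be filmed in advance.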
DAS’s Virtual Interactive Combat Environment (V.I.C.E.) is a low-cost yet highly realistic system that enables team-based training set in an urban environment. Parts of the V.I.C.E. technology have been in development for up to eight years; V.I.C.E. as a concept has been in development for two years. V.I.C.E. offers complete freedom of movement as in a video game, but it’s made for serious training, not entertainment. It is used for small-unit training, rather than battalion-level simulation, Young said.
V.I.C.E. tracks all of the shots and can track civilian casualties. There are four major components. First is a training station, or lane, with 3-D projection. It uses two joysticks, one near the thumb on the pistol grip and one on the forestock. The rear joystick allows the trainee to move left and right, up and down, lean around a corner, drop to a crouch or prone position and jump.
The instructor station sets up the scenarios, controls the weapons and ammunition and tweaks the controls. The instructor can modify scenarios on the fly and can do a briefing for the trainee.
The observer station is for debriefings; the observer can see anything that is going on and can place bookmarks to show the trainee. The observer can also edit functions and build scenarios.
There is an optional content creator to build a database, which requires professional artists. Most customers don’t buy this feature, but DAS has sold some for top secret content creation by professional designers with clearances.
Other tools in DAS’s arsenal include a tool to create fire, smoke, explosions, dust and other special effects; an editor to build 3-D databases; the TerraFirma Terrain Builder for creating topography; and Open Ocean and Surf Zone used in several simulations for maritime patrol and coastal erosion modeling.
“The future for graphic development in military simulations is wide open,” Young said. “More and more companies will be following DAS’s lead in our graphic solutions to military simulations. The major hold up is funding for development of military simulations versus what it costs to make a game. Most major game titles have budgets of $10 million or more. The government does not want to spend that kind of money on graphics alone, and in my opinion they should not, but as development costs go down you will see more graphic engines being brought into the military simulation market.”
Emergent Game Technologies
Larry Mellon is the vice president of engineering and chief architect of Emergent Game Technologies. Before he joined the game industry, he was an architect in several DARPA research programs (Synthetic Theatre of War, Advanced Simulation Technology Thrusts, Advanced Distributed Simulation) in creating new types of training simulations. He was also a key member of the HLA and RTI 2.0 design teams, which are now the standards for integrating military training simulations.
“Emergent Game Technologies is a leading middleware provider for the game development industry,” Mellon said. “Emergent’s Gamebryo Element, a game engine, has been used to create more than 70 games, with 70 more currently in production.” The company has also enjoyed recognition among developers creating military applications.
Game technology is being increasingly adopted for use in military training applications. “Historically, the quality of graphics used in military training applications has lagged behind graphics development in the games industry,” Mellon said. “But technology like Gamebryo has really started to level the playing field, and more resources are being allocated to developing realistic training technology—and the quality of graphics continues to improve.”
In many cases, the technology is identical—only the applications differ, Mellon continued. “That’s why we work hard to make our technology flexible. The walls between these technologies continue to come down, and we expect that to continue.”
The military’s move toward game technology is not new; it has been underway for some time. One example of a gaming training tool used by the military is “Marine Doom,” a war game used to improve military thinking and decision making. “This is one of the first cases where a game was directly used to create a training exercise,” Mellon explained. “There is also an ‘Ambush’ program at DARPA where they are doing some very interesting work with games and training. It is also interesting to note that the technology flow is not just gaming to military training, but the reverse as well. Many of the scalability techniques used in massively multiplayer games were first explored in DARPA or in the Distributed Interactive Simulation world. I’ve seen many game developer talks reference concept papers from the training community, and some aspects of Emergent’s next-generation products are drawn from my background in military training simulations.”
While the lion’s share of Emergent’s customers is in the gaming industry, one of its most prominent clients in the military space is the University of Southern California’s Institute for Creative Technologies (ICT), which uses Gamebryo for a number of innovative training applications, according to Mellon.
In ICT’s FlatWorld project, interactive digital images are projected onto collapsible stage screens to create a potentially limitless interactive training environment. FlatWorld is a joint effort between ICT and the Integrated Media Systems Center (IMSC).
ICT’s Joint Fires and Effects Trainer System (JFETS) is a prototype immersive, location-based interactive application focusing on the training and evaluation of concepts for the Universal Observer (UO). An immersive cognitive training tool, the JFETS targets the development of leadership and decision-making skills in the execution of standard call-for-fire tasks.
“As the ability to create realistic interactive graphics grows, particularly at a time when ongoing military conflicts make good training all the more important, we see the market for military simulation expanding,” Mellon concluded.
Waltham, Mass.-based Windward Mark Interactive hopes to ride the trend in the integration of computer game-based technology into military graphic training simulators with new WindLight software that helps replicate natural light in all atmospheric conditions as photo-realistically as possible.
By layering advanced graphic lighting conditions over existing hardware- or software-based image generators, the software package provides a night-and-day difference over traditional simulations at minimal additional cost.
Using physically accurate lighting models, WindLight provides dramatically improved simulator rendering with its shader-based imaging technology.
Windward Mark president Asi Lang said his team found that nobody was serving as a bridge between the video game industry, which is the apex of computer graphic simulation technology development, and the military simulation and training sector.
“This company is basically a video game company that commercializes technology as part of our R&D,” Lang said. “The military simulation world has actually grown toward video games in the last five years. Everybody wants to start learning from the video game world. But nobody is fulfilling this particular niche very well.”
Lang said that many simulator systems, such as the OpenGL- and Direct3D-based simulators on which WindLight is designed to easily run, are based on very basic designs that do not provide accurate depictions of lighting and the impact of weather on the way things look in real life. Weather and lighting conditions are often limited to just basic fog levels, rain or night-and-day simulations.
“They never do much in the way of lighting to make [the image produced] look like a photograph,” said Lang. “It creates a barrier for the pilot or whoever is using the simulator. You can run through the motions, but the immersion is never there.”
WindLight aims to address that by taking into account the physics of ambient light in the outdoors and the impact that clouds, the sun, air moisture levels and other factors have on how the human eye perceives an outdoor image.
For instance, traditional simulator images are sharply colored, with flat, unrealistic representations of the outside world; in the real world, light interplays with the landscape, creating subtle variations.
With the WindLight technology layered over them, the same images incorporate a uniform lighting simulation, including in the 3-D visuals, that reflects whatever weather and lighting conditions are chosen, using algorithms that mimic how light behaves in nature.
The program uses over 20 lighting parameters to mimic atmospheric conditions including general attenuation and color convergence, sun coloration, intensity, angle and attenuation, cloud density and shadowing, and haze coloration and density.
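Haze attenuation of this kind is conventionally modeled with exponential (Beer-Lambert) extinction: the farther away a surface is, the less of its own light survives and the more the haze color dominates. The sketch below shows that generic technique; it is not WindLight’s actual shader, and the density value is invented for illustration.

```python
import math

def apply_haze(pixel_rgb, haze_rgb, density, distance_m):
    """Blend a surface colour toward the haze colour with distance.

    Standard exponential extinction: the transmittance (fraction of
    surface light surviving) falls off as exp(-density * distance).
    All colours are 0..1 RGB tuples.
    """
    t = math.exp(-density * distance_m)
    return tuple(t * c + (1.0 - t) * h for c, h in zip(pixel_rgb, haze_rgb))

# A dark green hillside 5 km away under grey haze drifts toward the haze colour.
far_hill = apply_haze((0.1, 0.3, 0.1), (0.7, 0.75, 0.8),
                      density=2e-4, distance_m=5_000)
```

In a real renderer the same per-pixel blend runs in a shader, with the density and haze color among the tunable parameters; sun angle, cloud shadowing and the other parameters listed above modulate the inputs to this blend rather than replacing it.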
In addition, the software includes dynamic water simulations—including reflections, turbulence and refractions—and a dynamic weather framework simulating impacts on the underlying image ranging from basic rain, sleet, snow or dust storms, either man-made or naturally occurring, to intense storm conditions.
The system also works with filter effects, including forward-looking infrared and night vision goggles.
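A night-vision-style filter of the kind mentioned here is commonly approximated by collapsing each pixel to luminance, amplifying it, and tinting it green like image-intensifier phosphor output. The sketch below shows that generic approach, not Windward Mark’s implementation; the gain value is an assumption for illustration.

```python
def night_vision(pixel_rgb, gain=4.0):
    """Approximate an NVG view of one 0..1 RGB pixel.

    Collapse to luminance (Rec. 601 weights), amplify, clamp to 1.0,
    and tint green the way intensifier phosphor output looks.
    """
    r, g, b = pixel_rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    amplified = min(1.0, luma * gain)  # intensifier gain, clipped at white
    return (0.1 * amplified, amplified, 0.1 * amplified)  # green tint
```

A forward-looking infrared view would follow the same post-processing pattern but map simulated surface temperature, rather than visible luminance, to intensity.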
Beyond the excellence of the visual impact, Lang also stressed the performance, ease of use and easy integration of WindLight’s very lightweight, extensible architecture.
While better input yields better output, Lang said the program can work with, and greatly improve the graphics of, nearly any simulator running on OpenGL or Direct3D, with minimal installation time.
“It really is just a plug-in layer that companies can use on top of what they have,” Lang said. “They just insert WindLight on top of it.”
MultiGen-Paradigm Inc., a leading provider of real-time 3-D visual simulation software solutions, has licensed Windward Mark Interactive’s WindLight rendering technology for use in MultiGen’s line of simulator products.