The Failure of VizSim
© Ben Delaney 2003
www.CyberEdge.com
This story was written in March of 2003 for a newsletter whose publisher refused to print it due to its political nature. We present it here as it was offered for publication and ask that you form your own opinion of its value.
As recent events in Iraq have shown, VizSim has one major failing as a training technology. That soft spot has become increasingly obvious as US and UK troops take casualties, and as an Apache helicopter is downed by men with rifles. That failure is a failure of imagination. As we used to say in the programming business, before a bunch of internet whiz kids showed us how to make money without making products: garbage in, garbage out. Our recent imbroglios in Iraq demonstrate that that principle still holds. The best technology in the world, which the US military certainly has, cannot make up for a lack of imagination, a refusal to consider all possibilities, and a phalanx of yes-men.
From the New York Times of March 25, 2003:

To set the stage for the assault [on Baghdad], the United States military hammered Iraqi radar and tried to suppress surface-to-air missiles. But the Iraqis had a low-tech solution: they deployed a large number of irregular fighters who were equipped with machine guns and small arms.

As the helicopters took off, they flew low off the ground to make themselves less inviting targets for surface-to-air missiles. But that made them vulnerable to the small-arms fire. Thirty of 32 Apache helicopters were struck by small-arms fire.

One helicopter went down [near Karbala], and its two-man crew was captured. The Army was so concerned that the Iraqis would get their hands on the technology that they fired two ATACMS missiles today to destroy the helicopter. Because of bad weather after the action, the military had no report on whether they succeeded.

The Apaches destroyed only 10 to 15 Iraqi armored vehicles. American military commanders say they are rethinking their helicopter tactics as a result of the events of the past 24 hours.

In a video game, dying doesn't hurt. But on a real battlefield, with an angry enemy firing live rounds, death is final, and tragic.

A few days later, on Thursday, March 27, 2003, Lt. Gen. William Wallace, commander of U.S. Army forces in the Persian Gulf, made a much-quoted comment that raised hackles all over Washington, DC. Talking about the fierce, guerrilla-style resistance of Iraqi militia groups, Wallace said, "The enemy we're fighting is a bit different than the one we war-gamed against."
GIGO
Garbage In, Garbage Out. GIGO. That was a popular catchphrase when I started programming computers, way back in the last Ice Age. The hoariness of the adage does not reduce its truth. What you get out of a computer program is only as good as what you put in. Every generation, it seems, has to learn this truth for itself, as witnessed by the never-ending software flaws and screw-ups that we poor computer users are forced to tolerate. Every engineering task is limited by the quality of the assumptions and data that form its basis. If those assumptions, or the assumptions on which they are based, or the assumptions on which those assumptions are based - well, you get the picture - are bad, or if the data used to form an assumption is bad, or if the logic used to process the data is bad, well: garbage in, garbage out.
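To make the point concrete, here is a toy sketch in Python. The model, the numbers, and the "resistance_factor" knob are all hypothetical - this is not drawn from any real war-gaming system - but it shows how faithfully a program plays back whatever assumption it is handed:

    # Toy illustration of GIGO. Everything here is invented; the point is
    # that the output can only be as good as the assumption driving it.

    def days_to_objective(distance_km: float, advance_km_per_day: float,
                          resistance_factor: float) -> float:
        """Estimate days needed to reach an objective.

        resistance_factor scales the nominal advance rate: 1.0 means
        'no meaningful resistance', 3.0 means the advance takes three
        times as long.
        """
        if advance_km_per_day <= 0 or resistance_factor < 1.0:
            raise ValueError("implausible planning inputs")
        return distance_km * resistance_factor / advance_km_per_day

    # An optimistic assumption in, an optimistic answer out.
    print(days_to_objective(500, advance_km_per_day=100, resistance_factor=1.0))  # 5.0
    # The same model, fed a harsher assumption, tells a very different story.
    print(days_to_objective(500, advance_km_per_day=100, resistance_factor=3.0))  # 15.0

The arithmetic is flawless both times; only the assumption changes, and with it the answer.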
In my years as a programmer and systems analyst, I saw three primary reasons for bad assumptions leading to a GIGO situation. They are optimism, ignorance, and laziness.
Optimism is almost never appropriate when planning an engineering project, be it hardware, software, or highway construction. Optimism leads to too-tight schedules, expectations of instant success, and under-budgeting. Optimism causes one to believe that certain bad things won't happen and that other good things will happen.
Optimism is what causes managers to ignore warnings about possible trouble spots, cut the time allotted for developing error-trapping routines, provide time and budget estimates that are too small, and hire too few people to get a job done. Optimism leads to frantic pushes to make deadlines. Optimism is what caused NASA's shuttle flight managers to say, "that lightweight piece of insulation couldn't have hurt the shuttle." Optimism is what made US war planners expect the Iraqi people to welcome our troops with open arms. Optimism is good as a life philosophy, and lousy as an engineering protocol.
Ignorance, and its corollary, hubris, are often the root causes of undue optimism. Engineers who graduate from school without ever having built anything lack the gut-level knowledge of what is strong enough, reliable enough, foolproof enough. Ignorance leads to bad specifications. Ignorance leads to ignoring well-known problems, and their solutions. Ignorance wastes time by causing the wheel to be reinvented thousands of times - as we saw during the dot-com bubble, when ignorant kids developed applications and tools over and over because they didn't know the problems had already been solved. Ignorance leads to awful demos of products that might have applicability, if only the people promoting them knew the least bit about their potential customers. Ignorance creates simulations that do not simulate the real world, but look absolutely lovely on a big screen.
Laziness is considered a cardinal sin by the go-getters of commerce and industry. In many ways they are right. Laziness leads one to cut corners, and when combined, as it often is, with ignorance, can cause fatal errors. In software development, laziness is often manifested in inadequate testing, especially of unlikely error conditions. There were scores of times during my programming career that I was told that certain errors could never happen, or that certain combinations of conditions could never exist. Almost inevitably, those errors arose, and those conditions coalesced, and the system crashed as a result. My managers were too lazy to test for those possibilities, and they ultimately paid the price, in system failures, busted budgets, and poor reviews. But those were just software projects for banks and insurance companies and non-critical systems. Laziness in wartime is often fatal. Unfortunately, it is often the grunt in the trench who pays for a manager's laziness with his life.
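As an illustration - the function and tests below are hypothetical, not taken from any project I worked on - this is the kind of "impossible" condition that lazy schedules decline to test and that live systems eventually hit:

    import unittest

    def fraction_of_convoys_intercepted(intercepted: int, total: int) -> float:
        """Hypothetical reporting helper. The lazy version divides blindly,
        because 'there will always be convoys on the road'."""
        if total == 0:
            # The "impossible" case, handled explicitly instead of crashing
            # on a division by zero.
            return 0.0
        return intercepted / total

    class TestImpossibleConditions(unittest.TestCase):
        def test_the_case_that_could_never_happen(self):
            # No convoys reported at all - the condition everyone swore
            # would never occur.
            self.assertEqual(fraction_of_convoys_intercepted(0, 0), 0.0)

        def test_the_ordinary_case(self):
            self.assertAlmostEqual(fraction_of_convoys_intercepted(3, 30), 0.1)

    if __name__ == "__main__":
        unittest.main()

Writing the second test costs a few minutes; discovering the division by zero in production costs far more.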
War Game GIGO
Let's think again about General Wallace's remark. "The enemy we're fighting is a bit different than the one we war-gamed against." Taken on its face, this remark is not surprising or unusual. After all, it is impossible to model and simulate every possible scenario. Nearly every real-life situation will differ from its simulation in some aspect. But when the situation is a ground fight on foreign territory, against an enemy fighting to protect his own soil, we need to be sure that every possibility, no matter how remote, is evaluated, and that we thoughtfully and honestly assess the potential for each scenario to unfold.
However, General Wallace points out an error created by optimism, ignorance, hubris, and even possibly by laziness. As such, it serves as an excellent example of how not to do war gaming, which these days is all based on simulation and VizSim. The issue is not the quality of the simulation, or the clarity of the displays, or the integrity of the programmers. The issue is ignorance of what was likely to happen on the ground in Iraq, optimism that we would find the best possible scenario unfolding, and hubris in forming our assumptions.
Reports of the battle to reach Baghdad remind me of my American history lessons. We learned, during our study of the Revolutionary War, that the British commanders were amazed and disgusted by the Colonials' tactics. It seems that our forebears refused to stand and fight like the honorable British did. No, our forces used what we now call asymmetrical warfare tactics: hiding behind cover, ambushing British troops, striking quickly and unexpectedly, and moving on. Well, by golly, what a surprise - the overmatched Iraqi troops used these very same techniques against our forces. And they had some of the same successes the American Colonials had against the vastly superior British regulars.
Could we have foreseen this resistance? Of course we could have. Could we have built it into our war games, and trained our commanders and forces to deal with it? Of course. Did we? Apparently not well enough.
Let us also remember the potential problems of a 300-kilometer-long supply line through hostile territory. One wonders how well that was simulated, and how many different scenarios were played out regarding that contingency.
Now, I don't mean to put all the blame for bad simulation on the military. I know and admire many of the dedicated people who plan and create these training systems, and I know that they do their best to do a good job. The problem is not theirs alone, and it is not isolated in the military. The root of bad simulation is usually in the executive offices. The techniques to avoid bad simulations are well known. The will to implement them is what is lacking.
Remember Murphy
"If anything can go wrong, it will go wrong." Thus sayeth the immortal Murphy, whose law we must remember if we are not to suffer the consequences. Designers of simulations obtain no absolution from Murphy's strictures. Thorough testing involves not just looking at likely problems, but thinking of the least likely problems, the highly unlikely problems, the unusual situations, the 100-year incidents, the extremely rare incidents, in short, whatever could go wrong. Then, a good tester, tests what couldn't possibly go wrong. Nonetheless, something will probably still go wrong.
If you are simulating financial flows, or a baseball game, or the reaction of Gorgons to your new mega-blaster, oversights in testing will cause problems ranging from inconvenience to a few misplaced dollars. But if you are simulating the actions of enemy troops under attack, sloppy testing leads to sloppy training, and sloppy training leads to dead soldiers.
Of course, all of this assumes that the initial planning is good, and that all of the likely scenarios are in fact built into the simulation and tested. A rear action by angry locals is not unlikely when one is invading a sovereign state. But from General Wallace's comments, one might assume that that contingency was ignored. Blame that omission on optimism and hubris. But the testers should have picked it up.
A lack of imagination - or, in the vernacular, a failure to "think outside the box" - is inexcusable in designing a military sim. I wasn't there, but I bet that during a design review for a training simulator, a bunch of captains and majors told some general exactly what he wanted to hear. Apparently, he wanted to hear that the Iraqis would welcome us with open arms. Apparently that is not what happened. Did no one have the guts to suggest a less pleasant alternative? Or was that suggestion put back "in the box," judged too unlikely?
The bottom line remains the same in software development, or any engineering project: garbage in, garbage out. Simulation has no immunity. The only way to make good VizSim is to stay on top of the three foils - optimism, ignorance, and laziness. Are your engineers being given the direction and time to deal with these problems? If they aren't, you had better make time to fix things later, because you certainly will have to.
I just hope no one is shooting at the people who use your simulation.
1 Comment:
I don't know that the blame lies entirely within the military, when the civilian authority planning this war disregarded the assessments of top military leaders regarding what was involved. Officials with only corporate and PR backgrounds proffered a religious zeal in promoting their high-tech, lower-manpower invasion, and they defined the assumptions used, from attack strategies to interrogation techniques. Perhaps believing simulations based on such assumptions highlights the gap between simulated and actual realities. Maybe we should be developing policy simulations.
By Jeffrey Abouaf, at August 09, 2004 3:50 PM