A RISKY BUSINESS


Much like a high-stakes gambler deciding whether to raise, check or fold, the acquisition process for military systems is a series of high-stakes decisions in which we commit more resources for further development (raise) or stop funding an effort (fold). There is seldom an option to delay a decision with no cost (check). Ultimately, Soldiers' lives are at stake. Successful gamblers understand the risks they are taking and know how to consider these risks when making decisions.


Acquisition leaders emphasize that the Army needs to accept more risk when making decisions, and, for acquisition professionals, comfort with risk comes from understanding it. Understanding risk allows us to make appropriately risk-aware decisions as individuals and as an organization. We must make decisions the way successful gamblers do: with a clear understanding of risk. Throughout the acquisition process, how can decision-makers best understand the risks they are taking and make risk-aware decisions? What kind of information must decision-makers insist on being provided?


To manage risk when making a decision, Army acquisition professionals often collect data through testing or experimentation (for the rest of the article, simply called testing). We then analyze the data to create the information needed to support the decision. The cost of collecting data often limits the data we can collect; therefore, there are limits to what we can learn from it. For example, when testing $500,000 missiles, the costs of missiles, test range time and personnel can quickly become too great, so the number of missiles that can be tested is limited. When the data is analyzed appropriately, those limitations show up as uncertainty in our conclusions. Understanding that uncertainty helps us make risk-aware decisions.


We collect data for many purposes and from many sources to inform decisions throughout the acquisition process. Examples include performing basic science experiments to identify technologies worthy of further investment, running computer models to determine the best design characteristics, using flight simulators to assess pilot-vehicle interfaces and conducting range testing to support vendor selection or a fielding decision. In many cases, we test to understand how changes in the inputs to a process affect some response, the end result that is being measured. Though the example below is of a fielding decision, the points made in this article apply any time we attempt to understand how inputs affect a response.


A RISKY BET?

Consider a system that warns aircraft pilots of incoming missiles. A program manager (PM) must decide whether to field a new version of software or continue with the current version.


In this simple case there is only one input, the software version (current or new), but there could be many more. To determine whether the new software is "better" and should be fielded, the response being measured is whether the system successfully detects a simulated missile (success or failure). The PM will consider the new software worthy of fielding if its probability of successfully detecting a missile is at least 5 percent greater than for the current software. The PM has budgeted 20 test events for each software version.


Using design of experiments to plan tests allows us to apply expert assumptions to determine a sufficient but not excessive number of test events to achieve acceptable levels of risk.
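The sketch below illustrates, through a simple simulation, the kind of reasoning behind that planning step. It assumes, purely for illustration, that the current software truly detects 80 percent of missiles and the new software truly detects 85 percent (the 5 percent improvement the PM cares about), and it estimates how often a 20-versus-20 test would even show the new version ahead by that margin. The language and library (Python with NumPy) and the assumed true probabilities are not from the article; this is not the PM's actual planning method.

import numpy as np

# Hypothetical true detection probabilities: current = 0.80, new = 0.85
# (the 5 percent improvement the PM considers worth fielding).
rng = np.random.default_rng(seed=0)
n_events, n_sims = 20, 100_000
p_current, p_new = 0.80, 0.85

# Simulate many 20-event tests for each version and compute observed success rates.
current = rng.binomial(n_events, p_current, n_sims) / n_events
new = rng.binomial(n_events, p_new, n_sims) / n_events

# How often does a 20-versus-20 test show the new version ahead by 5 points or more?
print("P(new appears >= 5 points better):", np.mean(new - current >= 0.05))

Rerunning the sketch with larger values of n_events shows how additional test events reduce the chance of missing a real improvement, which is exactly the trade-off that design of experiments makes explicit.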




During the test, the new software is successful in 18 of 20 events (90 percent success rate), and the current software is successful in 16 of 20 events (80 percent success rate). How should the PM decide whether to field the new software? One common approach is to select the software with the greater success rate, in this case the new software. But is this a good decision? Although the new software performed 10 percent better, do the test results provide sufficient evidence that the new software really is better? Is it possible to get these results even if the new software is equivalent or even worse? If so, how likely is that to happen? To make a risk-aware decision, the PM must be able to answer these questions. Otherwise, the PM is acting like an unskilled gambler. These questions and others can be answered using appropriate statistical analysis methods.
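One way to see why these questions matter is a small simulation of the "equivalent software" scenario. The sketch below assumes, hypothetically, that both versions truly detect 80 percent of missiles and asks how often 20 test events per version would still show the new software ahead by 10 percentage points or more purely by chance. The language and library (Python with NumPy) are illustrative choices, not necessarily the analysis the PM's team would use.

import numpy as np

# Hypothetical scenario: both software versions truly detect 80 percent of missiles.
rng = np.random.default_rng(seed=1)
n_events, true_p, n_sims = 20, 0.80, 100_000

# Simulate many 20-event tests for each version under that assumption.
new = rng.binomial(n_events, true_p, n_sims) / n_events
current = rng.binomial(n_events, true_p, n_sims) / n_events

# How often does chance alone produce a gap as large as the one observed (10 points)?
print("P(observed gap >= 10 points with equivalent software):",
      np.mean(new - current >= 0.10))

If that probability is not small, the observed 90 percent versus 80 percent result is weak evidence on its own, which is precisely the point a formal statistical analysis makes rigorous.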


The plot in Figure 1 is a result of statistical analysis of the test data described above (more specifically, logistic regression was used for the analysis). Understanding what the plot tells us is critical to making a risk-aware decision.
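The article does not reproduce the analysis behind Figure 1, but a minimal sketch of a comparable logistic regression, fit to the 18-of-20 and 16-of-20 results, is shown below. The choice of Python's statsmodels and the way the confidence intervals are reported are assumptions for illustration, not the authors' actual analysis.

import numpy as np
import statsmodels.api as sm

# One row per test event: version indicator (1 = new, 0 = current) and success flag.
version = np.array([1] * 20 + [0] * 20)
success = np.array([1] * 18 + [0] * 2 + [1] * 16 + [0] * 4)

# Logistic regression of success on software version (intercept plus version term).
X = sm.add_constant(version)
fit = sm.GLM(success, X, family=sm.families.Binomial()).fit()

# Estimated detection probability and 95 percent confidence interval for each version.
pred = fit.get_prediction(sm.add_constant(np.array([0, 1])))
print(pred.summary_frame(alpha=0.05))  # rows: current (0), new (1)

With only 20 events per version, the confidence intervals from such a fit are wide, which is the kind of uncertainty a plot like Figure 1 helps a decision-maker see.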

