THE CALCULUS OF CAUTION


This process is cumbersome but required to maintain control. The key to maximizing value, though, is to use the fewest mitigations needed to reach a reasonable level of risk. That's where an AI-specific risk framework comes into play.


AI LAYERED DEFENSE FRAMEWORK

As part of the AI Implementation Plan for the Assistant Secretary of the Army for Acquisition, Logistics and Technology (ASA(ALT)), announced in March 2024, the Deputy Assistant Secretary of the Army for Data, Engineering and Software is building the AI Layered Defense Framework. The intent of the framework is to give program managers a toolkit to self-assess for AI-specific risks and to inform them of the relevant mitigations available to them. All AI capabilities would undergo the most basic mitigations (Layer 1), while more critical systems would receive additional layers.


There are three layers of risk into which all AI capabilities fall (a notional sketch of this tiering follows the layer descriptions):


Layer 1—AI tools that have broad value and, if compromised, have limited potential for harm or hindrance of the Army's objectives. The maximum benefit is achieved with the fewest controls. Using a navigation system to find lunch would fall in this category. We don't need to invest a large amount in mitigations here.


Layer 2—AI software that offers strategic value to a more limited number of users. This layer will have more significant mitigation strategies to balance value and security. Key risks will be mitigated.


Layer 3—AI models that provide high value and cannot be compromised. Layer 3 will employ state-of-the-art defenses for the most valuable or critical capabilities.
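As a rough illustration only, the short Python sketch below shows one way a program office might encode this kind of tiering during a self-assessment. It is not part of the framework; the scoring axes, thresholds, and names are assumptions made purely for illustration.

from enum import IntEnum

class DefenseLayer(IntEnum):
    # Hypothetical enumeration of the three layers described above.
    LAYER_1 = 1  # broad value; limited potential for harm if compromised
    LAYER_2 = 2  # strategic value to a more limited number of users
    LAYER_3 = 3  # high value; compromise cannot be tolerated

def assign_layer(mission_criticality: int, harm_if_compromised: int) -> DefenseLayer:
    """Map a capability to a defense layer from two notional 1-5 self-assessment
    scores. The cutoffs below are illustrative assumptions, not official thresholds."""
    if mission_criticality >= 4 or harm_if_compromised >= 4:
        return DefenseLayer.LAYER_3
    if mission_criticality >= 2 or harm_if_compromised >= 2:
        return DefenseLayer.LAYER_2
    return DefenseLayer.LAYER_1

# Example: a navigation aid used to find lunch scores low on both axes.
print(assign_layer(mission_criticality=1, harm_if_compromised=1))  # DefenseLayer.LAYER_1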


The AI Layered Defense Framework aims to incrementally increase security measures, from an open, accessible strategy to a highly secure approach with stringent controls, tailored to the unique sensitivity and importance of Army data and systems. The framework will serve as a thorough theoretical and practical basis for mitigating adversarial risks to our systems and warfighters. Here, risk is broadly defined as the possibility that the occurrence of an event related to AI systems will adversely affect the achievement of the Army's objectives. It is a general definition that reflects the multitude of challenges the Army faces every day and the idea that many mission objectives must be achieved even when dangers are present. The risk is not the potential for harm or injury, as those must be tolerated at some level; the risk is failing to achieve an objective.




To maintain dominance on the battlefield of tomorrow, the U.S. needs to continue to lead in developing systems at the bleeding edge of technology. This means the development and inclusion of AI capabilities.


While AI systems face the traditional cybersecurity risks associated with all computer systems, the AI Layered Defense Framework is concerned with building a comprehensive library of risks and mitigations unique to or inherent in AI systems: risk associated with the data used to train the system, the system itself,


MAKE CHANGE


Scaling and maturing artificial intelligence/machine learning is one of seven focus areas prioritized by ASA(ALT) in an effort to Disrupt the Status Quo through digital transformation. (Image courtesy of DASA(DES))

