decide that all persons in the area are legitimate combatants, and can cease fire if that changes? Is it enough to specify that anyone wearing an enemy uniform is part of the specified target group, if sensors are capable enough of differentiating uniforms and clothing? How specific does the target description need to be, given sensor and automation capabilities, to meet the standard for saying the human was in control?


ATTITUDES AND GENERATIONS CHANGE

We should also consider how policy might evolve as society's confidence in AI increases. Current policies reflect the nascent state of today's automated systems. Yet AI-based systems are improving and proliferating throughout society. Cameras no longer snap photos when we press the shutter-release button; rather, we trust the AI software to decide when everyone is smiling and record the best image. We have AI systems targeting us with individually tailored advertising. AI systems make million-dollar trades on stock exchanges throughout the world without human approval.


Our children are growing up in a world where they can ask an AI-powered device a question and not only get a correct answer, but hear the device recognize them and address them by name as it answers. In only 20 years, some of these children will be the generals on the battlefield. In less than a generation, we should expect societal attitudes toward artificial intelligence to adjust to the demonstrated reliability that comes from improvement in the technology.


At what point does the human in the loop on a weapon system stop deciding whether a weapon should be used and start clicking the "approve" button because the AI sensor system assessed the proposed target as a threat? If a family court judge rejected the results of a DNA paternity test because he didn't think the child resembled the father, there would be shock in the courtroom, followed by a quick appeal. What happens when faith in the performance of a technology is high enough that disagreeing with what the system tells you becomes unthinkable? What happens when we reach the point where we court-martial weapon operators for placing friendly units at risk when they override weapon systems? At that point, why is the human part of the process, and what role does the human serve? Societal attitudes toward autonomous systems are going to change. It is highly likely we will eventually see fully autonomous weapons on the battlefield.


CONCLUSION

The technologies that allow creation of AI weapon systems are inevitable, if not already existent. It is no longer possible to prevent research unique to AI weapons while allowing research into helpful civilian applications to continue, because the remaining research areas are all dual-use. Furthermore, rudimentary but functional autonomous weapon systems can already be created with existing technology. The horse is out of the barn.


What we need to do now is have a serious discussion about the moral and ethical implications of AI technology. But it must be a discussion that starts from the reality of the current state of the technology and the capabilities that already exist, and that recognizes that bad actors will misuse any technology in the future. We should consider not just our current morals and ethics, but also account for how society's norms will shift over time, as they always do.


What we do about the ethical and moral implications of AI will say a great deal to future generations about how we balanced rational and emotional concerns, and what kind of character and values we had.


For more information, contact the author at Gordon.cooke@westpoint.edu or visit https://westpoint.edu/military/department-of-military-instruction/simulation-center, https://www.pica.army.mil/tbrl/ or https://www.ardec.army.mil/.


DR. GORDON COOKE is director of the West Point Simulation Center and an associate professor in the Department of Military Instruction at the United States Military Academy at West Point. He holds a Ph.D. in biomechanics and an M.S. in mechanical engineering from Stevens Institute of Technology, as well as graduate certificates in ordnance engineering and biomedical engineering from Stevens. After graduating from West Point with a B.S. in mechanical engineering, he served as a combat engineer officer in the 11th Armored Cavalry Regiment, then spent 12 years as a civilian research engineer at the U.S. Army Armament Research, Development and Engineering Center (ARDEC), now known as the U.S. Army Combat Capabilities Development Command Armaments Center. During his time at ARDEC, he spent five years on the faculty of the Armaments Graduate School. Cooke was selected for Junior and Senior Science Fellowships, was awarded the Kurt H. Weil Award for master's candidates, and received the U.S. Army Research and Development Achievement Award twice. He is an Acquisition Corps member and is Level III certified in production, quality and manufacturing.

