ALL THINGS CYBER
CONCLUSION
AI has the potential to revolutionize the U.S. Army by improving efficiency, enhancing decision-making and strengthening national security. However, it is essential to recognize that AI and automation can only be effective if the underlying processes have been optimized through formal PI methodologies. By addressing the challenges and ethical considerations associated with AI implementation, the Army can harness the full potential of this transformative technology to maintain its competitive edge. Strategic investment in AI research and development, coupled with a focus on responsible AI development and deployment, will be critical for realizing the full benefits of AI-powered PI within the U.S. Army.
MAPPING A PLAN
A draft process map sits on a desk in the U.S. Army Financial Management Command headquarters in Indianapolis, Indiana, on May 13, 2019. U.S. Army Financial Management Command's Business Process Management directorate completed a three-year mission of documenting and standardizing all of the Army's business processes impacting financial statements on October 1, 2020. (Photo by Mark R.W. Orders-Woempner, U.S. Army)
Data quality and integrity are essential, and can be maintained through techniques such as data profiling, cleansing and the use of AI tools for continuous monitoring and validation.
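As a rough illustration of what such profiling and cleansing steps can look like in practice, the short Python sketch below flags duplicates, missing values and out-of-range records before data ever reaches a model. The column names and the negative-amount rule are hypothetical examples for illustration, not drawn from any Army system.

import pandas as pd

def profile(df: pd.DataFrame) -> dict:
    """Summarize basic data-quality signals for a tabular dataset."""
    report = {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
    }
    # Hypothetical business rule: obligation amounts should never be negative.
    if "obligation_amount" in df.columns:
        report["negative_amounts"] = int((df["obligation_amount"] < 0).sum())
    return report

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Drop exact duplicates and rows missing the (hypothetical) record_id key."""
    df = df.drop_duplicates()
    if "record_id" in df.columns:
        df = df.dropna(subset=["record_id"])
    return df

In a continuous-monitoring setup, a report like this would be generated on a schedule, compared against agreed thresholds and routed to a human reviewer when it falls out of tolerance.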
Interpretability and transparency are also essential. Understanding how AI systems reach their conclusions is crucial for building trust and ensuring responsible use. The problem of "black box" AI, where decision-making processes are unclear or opaque, can be especially concerning in high-stakes situations. To address this issue, the use of simpler, more interpretable models, such as decision trees, linear regression and rule-based systems, should be prioritized, along with explainable AI techniques and feature importance analysis to enhance transparency.
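As a small, hedged example of what that looks like, the sketch below fits a shallow decision tree with scikit-learn on synthetic data and prints both its decision rules and its feature importances. The feature names are invented for illustration, not taken from any fielded Army model.

import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic stand-in data; in practice X and y would come from the process under study.
rng = np.random.default_rng(0)
feature_names = ["maintenance_backlog", "parts_lead_time", "training_hours"]
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Plain-language rules and per-feature weights give reviewers a window into the model.
print(export_text(model, feature_names=feature_names))
for name, weight in zip(feature_names, model.feature_importances_):
    print(f"{name}: {weight:.2f}")

A tree this shallow can be reviewed line by line, which is precisely the property that matters in high-stakes settings.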
Bias and fairness present another major challenge, as AI systems can inherit and amplify biases present in the training data. Addressing this requires careful curation of diverse datasets, implementation of bias detection tools and the establishment of mechanisms for continuous monitoring and human oversight to safeguard against unfair or discriminatory outcomes.
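One common bias-detection check, sketched below with made-up group labels and predictions, is to compare the rate of favorable outcomes across groups; a large gap (sometimes called the demographic parity difference) is a signal to pause and investigate, not an automatic verdict.

from collections import defaultdict

def selection_rates(groups, predictions):
    """Favorable-outcome rate per group, for a binary (0/1) prediction."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, pred in zip(groups, predictions):
        totals[group] += 1
        favorable[group] += int(pred == 1)
    return {group: favorable[group] / totals[group] for group in totals}

rates = selection_rates(["a", "a", "a", "b", "b"], [1, 1, 0, 1, 0])
gap = max(rates.values()) - min(rates.values())
# Per-group rates and the gap between them; here roughly 0.67 vs. 0.50.
print(rates, round(gap, 2))

In practice, a check like this would run continuously alongside the dataset curation and human-oversight mechanisms described above.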
Finally, ethical considerations in autonomous systems must be carefully managed. As AI evolves, the possibility of autonomous weapons systems introduces serious ethical concerns regarding human control and accountability. Ensuring fairness, transparency and human oversight throughout the AI life cycle is essential, along with enforcing robust data protection protocols, security measures and continuous auditing to uphold ethical standards in all applications.
For more information, contact the author at charles.t.brandon.civ@army.mil.
CHARLES T. BRANDON III, DBA, is the Army director for business process improvement and reengineering in the Office of the Chief Information Officer at the Pentagon in Washington, D.C. He holds a DBA in quality systems management from the National Graduate School of Quality Management, an M.S. in information technology from American InterContinental University and a B.S. in economics from Alabama Agricultural and Mechanical University. The views expressed are his own and do not necessarily represent the opinions of the U.S. Army or DOD.