

BREAKING DOWN THE PROCESS

The process begins with data owners uploading a collection of documents filled with valuable troubleshooting content to a centralized repository. A software toolkit accurately extracts the content from a variety of file formats. Next, the LLM encodes the raw data into a numerical format, known as embeddings, which are then stored in a specialized database and grouped together based on semantic similarity. For example, if a document contains information for setting up a new Outlook e-mail account, it could be grouped together with other documents containing e-mail procedures.
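The article describes this encoding step conceptually rather than naming a toolchain. A minimal sketch of what it can look like, assuming the open-source sentence-transformers library and a simple cosine-similarity cutoff for grouping (both illustrative choices, not the program's actual stack), is shown below.

```python
# Minimal sketch: encode documents as embeddings and group them by
# semantic similarity. Library and threshold are illustrative choices.
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "How to set up a new Outlook e-mail account",
    "Configuring e-mail forwarding rules in Outlook",
    "Resetting a forgotten VPN password",
]

# Encode each document into a numerical vector (embedding).
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(documents)

# Group documents whose embeddings are sufficiently similar.
threshold = 0.5  # hypothetical cutoff
similarity = cosine_similarity(embeddings)
groups, assigned = [], set()
for i in range(len(documents)):
    if i in assigned:
        continue
    group = [j for j in range(len(documents))
             if j not in assigned and similarity[i][j] >= threshold]
    assigned.update(group)
    groups.append([documents[j] for j in group])

for group in groups:
    print(group)  # the two e-mail documents land in the same group
```

In a production system the grouped embeddings would be written to a dedicated vector database rather than held in memory, but the encode-then-compare pattern is the same.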


The first AI agent references the collection of documents to engage with the user, collecting relevant information about the issue at hand. The agent asks questions based on its understanding of common problems and attempts to categorize the issue. Additionally, the agent attempts to discern which steps the user has already tried or skipped. A second AI agent uses the information collected from the first agent to query the embeddings database, retrieving the content most likely to help solve the troubleshooting issue. The third AI agent, powered by another LLM, uses the retrieved information as context to generate a human-like response back to the user.
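The three roles are described here only in prose. A rough sketch of how they could be wired together follows; the ask_llm and search_embeddings helpers, and the prompts themselves, are hypothetical stand-ins rather than the actual help desk implementation.

```python
# Hypothetical sketch of the three-agent help desk flow described above.
# ask_llm() and search_embeddings() stand in for a local LLM and the
# embeddings database; neither reflects the actual implementation.

def ask_llm(prompt: str) -> str:
    """Send a prompt to a locally hosted LLM and return its reply (stub)."""
    raise NotImplementedError("wire this to a local model")

def search_embeddings(query: str, top_k: int = 3) -> list[str]:
    """Return the top-k document chunks most similar to the query (stub)."""
    raise NotImplementedError("wire this to the embeddings database")

def intake_agent(user_message: str) -> str:
    # Agent 1: question the user, categorize the issue, note steps tried.
    return ask_llm(
        "You are an IT help desk intake agent. Summarize the user's issue, "
        "its likely category, and which steps they have already tried.\n"
        f"User: {user_message}"
    )

def retrieval_agent(issue_summary: str) -> list[str]:
    # Agent 2: query the embeddings database with the structured summary.
    return search_embeddings(issue_summary)

def response_agent(issue_summary: str, context_docs: list[str]) -> str:
    # Agent 3: generate a human-like answer grounded in retrieved content.
    context = "\n".join(context_docs)
    return ask_llm(
        f"Using only this context:\n{context}\n\n"
        f"Write step-by-step troubleshooting guidance for:\n{issue_summary}"
    )

def handle_ticket(user_message: str) -> str:
    summary = intake_agent(user_message)
    docs = retrieval_agent(summary)
    return response_agent(summary, docs)
```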


AN LLM FOR EVERYONE

The background technology and techniques associated with LLMs have become widely available and accessible in recent years. Several companies and platforms have emerged to provide numerous solutions for consumers to use and experiment with different LLMs. Hugging Face is one such platform that has democratized access to open-source models. Ollama, another platform, allows users to run open-source LLMs offline on a personal computer using only local resources, enabling users to query the LLM without connecting to the Internet or an external cloud.
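As an illustration of what local, offline inference looks like in practice, the snippet below queries an Ollama server running on the same machine through its default REST endpoint. The model name is a placeholder and is assumed to have been downloaded beforehand.

```python
# Illustrative only: query an LLM served locally by Ollama, with no
# external cloud involved. Assumes Ollama is installed, running on its
# default port, and that the named model has already been pulled.
import json
import urllib.request

payload = {
    "model": "llama3",  # placeholder model name
    "prompt": "How do I set up a new Outlook e-mail account?",
    "stream": False,
}

request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    answer = json.loads(response.read())["response"]

print(answer)
```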


AI tools are not without flaws. A major drawback to LLMs is their tendency to confidently provide inaccurate answers when they lack a solution.


Furthermore, big tech companies have taken a significant interest in creating and improving their own LLMs while simultaneously providing them for public use. Some models include Microsoft’s Phi-3, Meta’s Llama, Mistral AI’s Mistral 7B and Anthropic’s Claude.


While there has been significant innovation surrounding the development and availability of LLMs, there are still circumstances in which organizations require a custom solution for a specific use case. A common use case is the ability to process and reason over a company’s internal data. One approach is fine-tuning, where additional training data is provided to a pre-trained LLM to adapt its responses to organization-specific questions. However, this method has a significant drawback: it largely resembles the initial training process, although with less data, and still requires powerful computational resources.
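For readers unfamiliar with what fine-tuning involves, a compressed sketch using the Hugging Face transformers Trainer is shown below. The base model, dataset and hyperparameters are placeholders; a real run would require far more data and GPU resources, which is exactly the drawback noted above.

```python
# Compressed sketch of fine-tuning a pre-trained causal LLM on a handful
# of organization-specific Q&A pairs. Model, data and settings are
# placeholders; real fine-tuning needs much more data and compute.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # stand-in for any pre-trained causal LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

examples = Dataset.from_dict({"text": [
    "Q: How do I request VPN access? A: Submit a ticket to the help desk.",
    "Q: Where are outage notices posted? A: On the internal status page.",
]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = examples.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-helpdesk",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```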


Another, more popular and feasible approach being adopted by many organizations is retrieval-augmented generation (RAG), a method that helps LLMs provide more contextually aware responses by sourcing information from a collection of documents; this is the method used in our help desk example. RAG comprises two components: a retriever and a generator. The retriever searches a set of documents and returns those most relevant to a user’s question. The generator then uses the relevant information obtained from the retriever as context to generate a response.
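Put together, the two components can be expressed in a few lines. The sketch below again uses sentence-transformers as an illustrative embedding model, and the generate step is stubbed out as prompt assembly; whichever LLM backs the generator would receive that prompt.

```python
# Minimal retriever + generator sketch. The embedding model is an
# illustrative choice; generate() only assembles the prompt that a
# generator LLM would receive.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "To set up Outlook, open Settings > Accounts and add your address.",
    "VPN access requires the corporate certificate to be installed.",
    "Password resets are handled through the self-service portal.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

def retrieve(question: str, top_k: int = 2) -> list[str]:
    # Retriever: rank documents by cosine similarity to the question.
    q = encoder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]

def generate(question: str, context: list[str]) -> str:
    # Generator: pass retrieved context plus the question to an LLM.
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"
    return prompt  # in practice, send this prompt to the generator LLM

query = "How do I set up Outlook?"
print(generate(query, retrieve(query)))
```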


AI IN THE ARMY

Consumers are beginning to see the effect of AI products in both commercial and common environments, but AI is also becoming widely utilized in the Army. “As an organization, we’ve been leveraging AI technology to help us write NCOERs [Noncommissioned Officer Evaluation Reports], especially when we encounter difficulties in structuring our thoughts and language,” said Sgt. Alfredo Rodriguez, radio communications team lead, 1/150th Cavalry Regiment Headquarters and Headquarters Troop. “Our AI tools enable us to quickly input our written content and receive assistance in correcting grammatical errors and improving overall structure.”


Project Manager Mission Command (PM MC) is interested in the role AI will play in the tactical environment, where connectivity is often disrupted, disconnected, intermittent or low-bandwidth (DDIL). The Data Engineering, Architecture and Analysis (DEA2) team within PM MC is developing one such AI tool for DDIL environments that is able to operate with zero network connectivity and has the added benefit of being deployable on a wide variety of devices ranging in computing

