REDEFINING REALITY

January 26, 2022 (updated March 6, 2023) | Army AL&T Magazine, Science and Technology

CLOSE QUARTERS: Current processes and network limitations force staffs to be together physically to conduct operations. The metaverse provides a potential solution that could enable operations while inherently making the command post more survivable by distributing operations. (Photo by Maj. Vonnie Wright, 1st Brigade Combat Team, 101st Airborne Division Public Affairs)

Creating a metaverse model for mission planning.

by Thom Hawkins, Lt. Col. Matt Maness, Mark Dennison and Pete Khooshabehadeh

Maj. Gen. Narrowpass impatiently watched the timer on his augmented-reality visor count down—still two more minutes. Turning toward his autonomous security bot, ASB-3, he grumbled, “I hate waiting around to talk to my commanders just because some soulless artificial intelligence has calculated the optimal windows to conduct long-haul transmissions.” ASB-3 knew it was not appropriate to respond to the comment; Narrowpass understood the consequences of transmitting outside of the communications survivability window and would not appreciate a reminder.

The mission was too complex and there were too many variables; voice and video transmissions were not going to cut it at the moment. At his request, the distributed division staff and all subordinate command post constellations had aligned their communications windows to allow for a full rehearsal within the division’s cross-reality command post.

Narrowpass initiated his connection to the tactical metaverse and was immediately greeted by his chief of staff’s avatar as well as the primary staff and all subordinate commanders. The ominous timer above the all-domain common operational picture showed fewer than 15 minutes before everyone in this virtual environment would need to drop off for a survivability move or a temporary transmission halt. “Chief, roll the terrain fly-through and cue up the friction points on the 3D model. … I need to understand where we’re going to need to intervene in the fight.” Although the military metaverse is just a concept today, Army researchers are exploring its potential for the future.

THE COMMON OPERATIONAL PICTURE

“I need to understand” is perhaps the primary driver behind technology for mission command. The fundamental purpose of developing and maintaining a common operational picture is to enhance situational awareness, enable situational understanding and promote shared understanding across all echelons. Whether executed through complex application programming interfaces that link digital systems to display information on 2D and 3D maps, or by manually tracking friendly and enemy information on paper maps, the process hasn’t evolved much in the last three decades. It requires large, cumbersome command posts, with centralized people and technology, to conduct the operations process and ultimately generate a common operational picture that commanders and staffs can use to make the most timely and accurate decisions possible.

Unfortunately, as operations have grown more complex and data more prolific, units have struggled to effectively conduct information and knowledge management. Command posts have expanded in size and scope to meet the need. Increases in the number of personnel and dependency on the network have left today’s command post vulnerable to enemy attack without sufficient mobility and survivability. The metaverse provides a potential solution that could enable the operations process while inherently making the command post more survivable by distributing operations, as well as reducing the physical and electromagnetic footprints.

MEET ME IN THE METAVERSE: In the future, Soldiers could be able to “drop in” to a virtual environment to conduct mission planning prior to execution. Although a “military metaverse” is still only a concept, researchers and scientists across the Army are exploring the potential applications. (Photo by Mission Command Battle Laboratory)

WHAT’S A METAVERSE?

Coined by Neal Stephenson in his 1992 novel “Snow Crash” to describe an online world where users interact in a virtual space, the metaverse has already become familiar through massive multiplayer online games and virtual worlds like Second Life, Roblox or Minecraft. Just as mobile devices changed how the internet was consumed over the last 10 years, a new generation of technology—in this case, virtual and augmented reality headsets—is enabling a new perspective on how we consume content. No longer limited by the confines of flat screens, these headsets allow users to perceive and interact with 3D objects and media rendered on top of or in place of the physical world. The concept has gained even more popularity with the pandemic-driven acceleration of remote work. Facebook has even pinned its future on this shift: leveraging its acquisition of virtual reality headset maker Oculus and the development of its own metaverse platform, Horizon Worlds, it renamed its parent company Meta in October 2021.

One of the most thorough explorations of the metaverse was written as a nine-part blog series by Matthew Ball, a venture fund partner and respected business writer. Ball’s primer focuses on seven aspects of the metaverse:

  • Networking.
  • Virtual platforms.
  • Hardware.
  • Computing power.
  • Interchange tools and standards.
  • Payment services.
  • Content, services and assets.

He discusses the progress in each area, as well as what it will take to fully enable and adopt the metaverse as a successor to the mobile internet.

FROM VIRTUAL TO REALITY: As large command posts disaggregate their physical footprints and rely on digital environments, concepts such as the metaverse may help staffs conduct planning for real-world operations. (Image courtesy of Program Executive Office for Intelligence, Electronic Warfare and Sensors)

NETWORKING

Bandwidth is a scarce resource on today’s battlefields and will require a technological breakthrough to fully enable the metaverse. However, many tactical scenarios could benefit from information that is not particularly dense, and therefore requires less bandwidth to transmit, such as geospatial position, a summary of unit status or the current objective. Furthermore, information that is more dense, such as a high-resolution 3D terrain model of the operational area or video of an unknown enemy vehicle to train aided target-recognition algorithms, does not need to be sent in real time over the network. This will require the Army to utilize cloud services that are not only efficient in moving and processing information but also controlled by intelligence that understands the value of information for the clients that are requesting, or are likely to request, data and services.
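To make this concrete, the sketch below shows one way such value-aware scheduling could work: each outbound message carries an estimated decision value and a size, and the scheduler fills a fixed communications window with the highest value-per-bit traffic first. This is a minimal illustration under assumed message types, values and window sizes, not a description of any fielded Army system.

```python
import heapq
from dataclasses import dataclass


@dataclass
class Message:
    value: float      # estimated decision value to likely requesters (assumed)
    size_bits: int    # cost to transmit over the tactical link
    payload: str      # what the message carries


def schedule(messages, window_bits):
    """Pick what to send in one communications window, densest value first."""
    # heapq is a min-heap, so negate the value-per-bit score; the index
    # breaks ties without comparing Message objects directly.
    heap = [(-m.value / m.size_bits, i, m) for i, m in enumerate(messages)]
    heapq.heapify(heap)
    sent, budget = [], window_bits
    while heap:
        _, _, msg = heapq.heappop(heap)
        if msg.size_bits <= budget:
            sent.append(msg)
            budget -= msg.size_bits
    return sent


if __name__ == "__main__":
    queue = [
        Message(value=9.0, size_bits=512, payload="geospatial position"),
        Message(value=7.0, size_bits=256, payload="unit status summary"),
        Message(value=3.0, size_bits=8_000_000, payload="3D terrain tile"),
    ]
    for m in schedule(queue, window_bits=4_096):
        print("send:", m.payload)
```

Under this scheme, terse position and status reports win the scarce real-time window, while the dense terrain tile waits for a cloud-connected or wideband opportunity.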

A critical problem that could mean the difference between life and death is the delay, or latency, of this information. The assumed change—or lack thereof—in the position of friendly units can trigger a cascade of decisions across the metaverse and change the perceived state of the mission. To enable better decision-making, the Army must create a hyper-efficient network where only the right, relevant information is transmitted. This notion of real-time information updates is a critical component of the immersive hardware that will be utilized in the metaverse, since the representation and actions of a Soldier’s “digital twin” must be synchronized across all other devices connected to their shared space. Unlike the commercial world, the battlefields of the metaverse will involve combatants trying to bring down the networks of their opponents, or to alter them so that the information flow degrades their decision-making, e.g., by using deepfake imagery.
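One common technique for keeping a digital twin synchronized without flooding the network is to suppress updates that fall below a significance threshold. The sketch below illustrates the idea with a hypothetical publisher class and an assumed 25-meter threshold; a fielded system would add security, sequencing and loss handling.

```python
import math
import time

POSITION_EPSILON_M = 25.0   # assumed threshold: moves under 25 m are suppressed


class TwinPublisher:
    """Publish a unit's position only when it has changed enough to matter."""

    def __init__(self, send):
        self.send = send        # callable that transmits a dict over the network
        self.last_sent = None   # (x_m, y_m) of the last update actually sent

    def update(self, x_m, y_m):
        if self.last_sent is not None:
            dx = x_m - self.last_sent[0]
            dy = y_m - self.last_sent[1]
            if math.hypot(dx, dy) < POSITION_EPSILON_M:
                return          # below threshold: peers assume continuity
        self.last_sent = (x_m, y_m)
        self.send({"x_m": x_m, "y_m": y_m, "t": time.time()})


pub = TwinPublisher(send=print)
pub.update(100.0, 200.0)   # first report always goes out
pub.update(105.0, 203.0)   # ~5.8 m move: suppressed
pub.update(160.0, 200.0)   # 60 m move: transmitted
```

The timestamp carried with each update lets receivers age out twins whose reports have gone stale, rather than silently trusting an old position.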

MICROSOFT FLIGHT SIMULATOR

The popular Microsoft Flight Simulator video game series includes a “digital twin” of the planet, combining maps and satellite imagery to render buildings and even trees with real-time weather and air traffic. This is a huge model that is impractical for the constrained bandwidth at the tactical edge, but this model and others like it can allow for hyper-realistic modeling and simulation of vehicles and weapons effects at higher, cloud-connected echelons, or on home station resources. Rendering new objects is facilitated by world-building packages such as NVIDIA’s Omniverse, which include materials, textures and movement as building blocks for construction and simulations. Even lower-resolution versions of these world-based models can be used for rehearsal-of-concept drills or mission walkthroughs, regardless of whether a unit is co-located.
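A simple way to use such world models at both cloud-connected echelons and the constrained tactical edge is to select a level of detail that fits the available link and timeline. The sketch below assumes illustrative tile sizes; none of the numbers come from Microsoft Flight Simulator, Omniverse or any real terrain service.

```python
# Assumed sizes (bytes) for one terrain tile at each level of detail.
TILE_BYTES = {"full": 50_000_000, "medium": 5_000_000, "low": 500_000}


def pick_lod(link_bps: float, deadline_s: float) -> str:
    """Choose the richest terrain tile that can arrive before it is needed."""
    budget_bytes = link_bps * deadline_s / 8
    for lod in ("full", "medium", "low"):
        if TILE_BYTES[lod] <= budget_bytes:
            return lod
    return "low"   # degrade gracefully rather than fail the walkthrough


# A 10 Mbit/s home-station link with 60 s to spare yields a 75 MB budget: "full".
print(pick_lod(10_000_000, 60))
# A 256 kbit/s tactical link with the same 60 s yields about 1.9 MB: "low".
print(pick_lod(256_000, 60))
```

The same rehearsal content thus scales from a hyper-realistic model at home station down to a low-resolution walkthrough at the disconnected edge.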

PICTURE THIS: The immersive hardware in use today almost completely obscures the user’s view of the real world; ultimately, displays will need to dynamically adjust between rendering content on top of reality or replacing everything with synthetic content. (Image by Mission Command Battle Laboratory)

VIRTUAL PLATFORMS

The stovepiped platforms that incorporate the Army’s digital training, fighting and enterprise systems will not suffice to realize the metaverse. The metaverse will require that a Soldier’s digital presence transcend different training platforms and seamlessly integrate into other warfighting tools. These tools must also enable the user to interact with battlefield data from different perspectives, whether on a traditional 2D display or within an immersive shared virtual space. This will require architectures that enable data from the real world or a simulation to seamlessly render across a variety of display media, regardless of how they are deployed. The commercial gaming world has been adapting to this challenge, enabling cross-play of the same game between different types of hardware such as PCs and gaming consoles.
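Architecturally, this amounts to one shared battlefield data model with interchangeable renderers per device class. The sketch below illustrates the pattern with hypothetical renderer classes and print statements standing in for real graphics; it is not drawn from any actual Army architecture.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class TrackedUnit:
    uid: str
    lat: float
    lon: float
    affiliation: str   # e.g., "friendly" or "hostile"


class Renderer(ABC):
    """Display-agnostic contract: every device renders the same model."""

    @abstractmethod
    def draw(self, units: list) -> None: ...


class FlatMapRenderer(Renderer):
    def draw(self, units):
        for u in units:
            print(f"[2D map] {u.affiliation} {u.uid} at {u.lat:.4f}, {u.lon:.4f}")


class ImmersiveRenderer(Renderer):
    def draw(self, units):
        for u in units:
            print(f"[headset] placing {u.affiliation} {u.uid} in shared 3D space")


def push_picture(units, displays):
    for display in displays:   # same data, any mix of device types
        display.draw(units)


units = [TrackedUnit("1-6-A", 39.4735, -76.1640, "friendly")]
push_picture(units, [FlatMapRenderer(), ImmersiveRenderer()])
```

Because the data model never changes, adding a new display type means writing one more renderer, not another stovepiped system.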

While the appearance of avatars may not be as much of a priority for our Soldiers, digital assets could be useful in other ways—for example, attaching system preferences or custom language models to a Soldier’s digital identity so they can aid human-machine teaming even when the user logs into a new system. Moreover, some games enable a subset of users to play wearing virtual reality devices from a godlike, top-down perspective, while other players embody avatars and view the world in first person from the ground. Gaming concepts like this fit neatly into the employment of this capability at various echelons, where different types of data and interaction are necessary.

From the tactical perspective, the Army must build systems that have a common look and feel, regardless of how the system is worn or interacted with. A Soldier should be able to use their head-mounted display, their handheld system and their desktop system with the same profile, and easily switch between them while retaining the same persona.
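One way to realize that single persona is a portable profile that merges shared preferences with per-device overrides, as sketched below. The field names and form factors are assumptions for illustration; a real identity system would add authentication and synchronization.

```python
import json
from dataclasses import dataclass


@dataclass
class Persona:
    user_id: str
    map_symbology: str   # preferred symbol set (hypothetical field)
    voice_model: str     # custom language model identifier (hypothetical field)
    overrides: dict      # per-device settings keyed by form factor

    def for_device(self, form_factor: str) -> dict:
        """Merge shared preferences with any device-specific overrides."""
        settings = {
            "symbology": self.map_symbology,
            "voice_model": self.voice_model,
        }
        settings.update(self.overrides.get(form_factor, {}))
        return settings


p = Persona("soldier-42", "app6", "lm-soldier-42", {"hmd": {"fov_deg": 70}})
print(json.dumps(p.for_device("hmd"), indent=2))       # headset view of the persona
print(json.dumps(p.for_device("desktop"), indent=2))   # same persona on a desktop
```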

HARDWARE

Systems like the Android Tactical Assault Kit (ATAK), software running on a handheld tablet or phone housed in a rugged case, offer warfighters a digital perspective of their operating environment. ATAK can visualize maps, both 2D and 3D, as well as a host of graphic control measures to represent the position of friendly and enemy forces. While not as ubiquitous as consumer smartphones in the civilian world, these devices represent one of the first attempts at converging the physical and digital domains into a piece of handheld kit.
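ATAK interoperates with other systems largely through Cursor-on-Target (CoT) messages, small XML events that carry a position, a type code and freshness times. The sketch below composes a minimal friendly-unit position report in Python; the uid, coordinates and error values are illustrative, not taken from this article.

```python
from datetime import datetime, timedelta, timezone
import xml.etree.ElementTree as ET


def cot_position(uid: str, lat: float, lon: float, stale_s: int = 300) -> bytes:
    """Build a minimal CoT event for a friendly ground unit's position."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%SZ")
    event = ET.Element("event", {
        "version": "2.0",
        "uid": uid,                # illustrative identifier
        "type": "a-f-G-U-C",       # atoms / friendly / ground / unit / combat
        "time": iso(now),
        "start": iso(now),
        "stale": iso(now + timedelta(seconds=stale_s)),
        "how": "m-g",              # machine-generated, e.g., from GPS
    })
    ET.SubElement(event, "point", {
        "lat": str(lat), "lon": str(lon),
        "hae": "0.0", "ce": "10.0", "le": "10.0",   # altitude and error bounds
    })
    return ET.tostring(event)


print(cot_position("UNIT-1-6", 39.47, -76.16).decode())
```

A receiving device plots the event as a friendly ground unit and drops it once the stale time passes, which keeps the shared picture from accumulating dead tracks.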

However, the current hardware in augmented reality systems limits the quality and field of view of holographic content. Virtual reality head-mounted displays provide high-quality visuals, but at the cost of almost entirely occluding the user’s view of the natural world. While the Army is beginning to assess virtual reality for use in lower-threat environments such as command posts, ultimately the future of immersive hardware will fuse into a single head-mounted display that can dynamically adjust between rendering content on top of reality or replacing everything with synthetic content. This will be necessary to fully realize the metaverse across the battlefield environments of the future.
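The mode decision itself can be simple software even if the display hardware is not. The sketch below shows one hypothetical policy for when a fused headset should overlay reality versus fully replace it; the trigger conditions are assumptions for illustration.

```python
from enum import Enum


class RenderMode(Enum):
    AUGMENT = "overlay synthetic content on the real world"
    REPLACE = "render a fully synthetic environment"


def choose_mode(threat_low: bool, task_needs_immersion: bool) -> RenderMode:
    """Occlude the real world only when the setting is safe and the task benefits."""
    if threat_low and task_needs_immersion:
        return RenderMode.REPLACE   # e.g., a rehearsal inside a command post
    return RenderMode.AUGMENT       # keep real-world awareness everywhere else


print(choose_mode(threat_low=True, task_needs_immersion=True).value)
print(choose_mode(threat_low=False, task_needs_immersion=True).value)
```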

CONCLUSION

Despite the push toward the future, we must also acknowledge the limitations we still face with current technology—for example, access issues, latency and hot mics. These problems won’t disappear simply by upgrading to the metaverse and must be addressed alongside its development. Moving to a metaverse model for planning, preparing, executing and assessing operations would allow dispersed staffs to synchronize warfighting functions more effectively within a virtual node capable of collaboration that would rival existing physical command posts. Ad hoc meetings could transcend simple phone calls and video conferences by allowing users to occupy a virtual planning space that contains all the relevant data to make a decision: an interactive 3D common operational picture displaying friendly and enemy positions, intelligence products, relative combat power, sustainment estimates and more.

Like artificial intelligence, metaverse technologies bring a new suite of tools to bear on the problems of the battlefield, both current and anticipated. Also like AI, without the standards and infrastructure to enable these tools, the results will be piecemeal and underwhelming. It’s important for the Army to lean forward and recognize the potential of the new technologies, not only for what they bring in terms of materiel, but also for their implications on how we will fight in the future.

For more information, contact Thom Hawkins at jeffrey.t.hawkins10.civ@army.mil. For more information about the U.S. Army Combat Capabilities Development Command Army Research Laboratory’s cross-reality common operating picture effort, contact Mark Dennison at mark.s.dennison.civ@army.mil.

THOM HAWKINS is a project officer for artificial intelligence and data strategy with Project Manager Mission Command, assigned to the Program Executive Office for Command, Control and Communications – Tactical, at Aberdeen Proving Ground, Maryland. He holds an M.S. in library and information science from Drexel University and a B.A. in English from Washington College. He is Level III certified in program management and Level II certified in financial management, and is a member of the Army Acquisition Corps.

LTC MATT MANESS is the Science and Technology branch chief at the Mission Command Battle Laboratory at Fort Leavenworth, Kansas. He holds an M.S. and a B.S. in systems engineering from George Washington University and the United States Military Academy at West Point, respectively. Commissioned as an armor officer in 2006, he now serves as an information systems engineer supporting the Army’s modernization enterprise with a focus on command and control information systems.

MARK DENNISON is a cross-reality researcher with U.S. Army Combat Capabilities Development Command Army Research Laboratory West (DEVCOM ARL West) at Playa Vista, California. He leads an applied research project on providing a cross-reality common operating picture under the common operating environment line of effort in the Network Cross-Functional Team. He earned a Ph.D. in psychology, an M.S. in cognitive neuroscience and a B.S. in psychology from the University of California, Irvine. 

PETE KHOOSHABEHADEH is a cognitive scientist and the regional lead of DEVCOM ARL West. He leads a group of interdisciplinary scientists and engineers who collaborate to operationalize science with trusted academic and industry partners. He earned a Ph.D. and an M.S. in psychological and brain sciences from the University of California, Santa Barbara, and a B.A. in cognitive science from the University of California, Berkeley.
