The emergence of next-generation virtual and augmented reality devices such as the Oculus Rift and Microsoft HoloLens has increased interest in using mixed reality to simulate training, enhance command and control, and improve the effectiveness of warfighters on the battlefield.
Researchers at RDECOM’s Army Research Laboratory (ARL), the US Army’s corporate research laboratory, are working with the University of Minnesota and the US Army’s Institute for Creative Technologies at the University of Southern California to understand how the viability of mixed reality might be assessed and tested.
In a recently published paper, the researchers surveyed potential methods for assessing the usefulness of immersive systems, discussed how the data might be acquired in experimental and tactical scenarios, and raised open issues in multi-user collaboration.
This paper is one of the first to survey metrics and methods that are relevant to the unique problems that warfighters may face when performing decision-making in command and control or intelligence analysis scenarios.
In addition, the researchers discuss the ARL-developed Mixed Reality Tactical Analysis Kit (MRTAK), which functions as an experimental platform to perform these assessments during collaborative mission planning and execution.
MRTAK is now being developed as the mixed reality module of project AURORA (Accelerated User Reasoning for Operations, Research, and Analysis), as AURORA-MR.
This research was recently presented at the 23rd International Command and Control Research and Technology Symposium, held in Pensacola, Florida.
Dr Mark Dennison, a research psychologist in ARL’s Battlefield Information Processing Branch, stationed at ARL West in Playa Vista, California, said: “Our survey of the existing literature determined that new methods and metrics are essential to ensure that future basic and applied research can efficiently and accurately assess performance differences between immersive technologies and traditional 2-D systems.”
According to Dennison, their work has often shown that researchers in this field have run studies in which the collected data did not support useful metrics, making it difficult or impossible for key decision-makers to determine how, when and where immersive technology provides any benefit or deficit to specific mission or task needs.
“In this paper, we suggest a paradigm shift away from simply comparing non-immersive and immersive systems on similar tasks, and toward meticulously breaking down complex decision-making into component processes that can be more accurately modeled and compared across disparate display types,” Dennison said. “For example, when studying the planning of a tactical operation, such as the breach and clear of a hostile building, the same spatial information must be present in the 2-D and VR experimental conditions to allow for precise quantitative comparisons.”
As part of this research into collaborative immersive analytics, the researchers developed and deployed AURORA-MR, which serves as a test-bed to perform tightly controlled basic and applied research of multi-user decision making with distributed immersive systems.
Currently, AURORA-MR is being used for collaborative immersive analytics research in Maryland at ARL headquarters at the Adelphi Laboratory Center and Aberdeen Proving Ground, in California at ARL West and the ICT’s Mixed Reality Lab, and at the University of Minnesota.
The system has also been demonstrated to NATO SET-256 and the Air Force’s TAP Lab, and was featured at the AUSA 2018 Global Force Innovator’s Corner.
According to the researchers, research conducted with AURORA-MR will help soldiers understand when critical battlefield information is best visualised and interacted with in an immersive system, and when it is better handled in collaboration with others using traditional systems.
“Through virtualisation of some or all elements of the Tactical Operations Center, commanders and intelligence analysts can communicate and collaborate without the constraints of a physical building and with a reduced footprint to enemy intelligence, surveillance and reconnaissance, or ISR,” Dennison said.
The design of AURORA-MR seeks to enable easy integration with other databases, sensors and machine learning so that joint research can occur more fluidly, both internally across ARL and externally with its academic and industry partners.
“Currently, we are evolving the network powering AURORA-MR, called AURORA-NET, to allow for greater control over the information that is sent and received by clients, while ensuring that the virtual environment is rendered at a comfortable frame rate to minimise the crippling effects of motion sickness on immersed users,” Dennison said. “This will enable us to conduct research on how ingestion and analysis of data from noisy systems, such as the Internet of Battlefield Things, can be augmented through distributed collaboration in mixed reality.”
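The two goals Dennison describes for AURORA-NET — controlling which information each client sends and receives, and keeping the rendered frame rate stable for immersed users — can be illustrated with a minimal sketch. Note that this is purely hypothetical: the `Relay` and `Client` classes, topic names and the per-frame message budget below are illustrative assumptions, not AURORA-NET APIs.

```python
# Hypothetical sketch of two ideas from the quote above:
#  1) server-side filtering, so each client receives only subscribed topics;
#  2) a bounded per-frame message budget, so a burst of network updates
#     cannot stall the render loop (and thus the frame rate).
# All names here are illustrative, not actual AURORA-NET interfaces.
from collections import deque

MAX_MSGS_PER_FRAME = 3  # budget: at most this many updates applied per frame


class Client:
    def __init__(self, name, topics):
        self.name = name
        self.topics = set(topics)  # only these message types are delivered
        self.inbox = deque()       # queued, not-yet-applied updates
        self.applied = []          # updates already applied to the scene

    def render_frame(self):
        # Drain at most MAX_MSGS_PER_FRAME messages; the remainder wait for
        # the next frame, keeping per-frame work bounded.
        for _ in range(min(MAX_MSGS_PER_FRAME, len(self.inbox))):
            self.applied.append(self.inbox.popleft())


class Relay:
    def __init__(self):
        self.clients = []

    def publish(self, topic, payload):
        # Server-side filtering: deliver only to subscribed clients.
        for c in self.clients:
            if topic in c.topics:
                c.inbox.append((topic, payload))


relay = Relay()
hq = Client("hq", topics={"unit_position", "sensor_track"})
analyst = Client("analyst", topics={"sensor_track"})
relay.clients += [hq, analyst]

for i in range(5):
    relay.publish("unit_position", i)  # burst of position updates
relay.publish("sensor_track", "t0")

hq.render_frame()       # applies only 3 of its 6 queued messages this frame
analyst.render_frame()  # the analyst never received the position updates
```

In a real distributed system the same decoupling is usually achieved by running network ingestion on a separate thread from rendering, but the queue-and-budget pattern is the essential idea: the renderer decides how much network state to consume each frame, rather than the network deciding for it.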