Researchers unveil tool to help developers create augmented reality task assistants

Overview of the user interface and components of ARGUS Offline. (A) The Data Manager shows the applied filters (A1) and the list of retrieved sessions (A2). (B) The Spatial View shows the world point cloud representing the physical environment, 3D points for eye and hand positions, and gaze projections and heatmaps. (B1) Render Controls let the user choose which elements of the Spatial View to display. (C) Temporal View: (C1) the Video Player shows the main camera feed at the timestamp currently selected by the user; (C2) the Temporal Controller drives the video player and updates the Model Output Viewer; (C3) the Model Output Viewer displays the output of the machine learning models (reasoning and perception) used at execution time. Credit: arXiv (2023). DOI: 10.48550/arXiv.2308.06246

Augmented reality (AR) technology has long fascinated both the scientific community and the general public, remaining a staple of modern science fiction for decades.

In the pursuit of advanced AR assistants—ones that can guide people through intricate surgeries or everyday food preparation, for example—a research team from NYU Tandon School of Engineering has introduced Augmented Reality Guidance and User-Modeling System, or ARGUS.

An interactive visual analytics tool, ARGUS is engineered to support the development of intelligent AR assistants that run on devices like the Microsoft HoloLens 2 or Magic Leap. It enables developers to collect and analyze data, model how people perform tasks, and find and fix problems in the AR assistants they are building.

Claudio Silva, NYU Tandon Institute Professor of Computer Science and Engineering and Professor of Data Science at the NYU Center for Data Science, leads the research team that will present its paper on ARGUS at IEEE VIS 2023 on October 26, 2023, in Melbourne, Australia. The paper received an Honorable Mention in the event’s Best Paper Awards.

“Imagine you’re developing an AR AI assistant to help home cooks prepare meals,” said Silva. “Using ARGUS, a developer can monitor a cook working with the ingredients, so they can assess how well the AI is performing in understanding the environment and user actions, and how well the system is providing relevant instructions and feedback to the user. It is meant to be used by developers of such AR systems.”

ARGUS works in two modes: online and offline.

The online mode is for real-time monitoring and debugging while an AR system is in use. It lets developers see what the AR system sees and how it’s interpreting the environment and user actions. They can also adjust settings and record data for later analysis.

The offline mode is for analyzing historical data generated by the AR system. It provides tools to explore and visualize this data, helping developers understand how the system behaved in the past.
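To make the two modes concrete, the following minimal sketch (in Python) shows the kind of workflow the offline mode supports: loading a recorded session and summarizing, per task step, what the perception and reasoning models reported. The record format and the field names (timestamp, step, detected_objects) are illustrative assumptions for this sketch, not the actual ARGUS data schema.

import json
from pathlib import Path


def load_session(path: Path) -> list[dict]:
    """Load a recorded session stored as one JSON object per line (JSONL)."""
    with path.open() as f:
        return [json.loads(line) for line in f if line.strip()]


def summarize(session: list[dict]) -> dict:
    """Aggregate per-step statistics a developer might inspect offline."""
    steps: dict[str, dict] = {}
    for frame in session:
        step = frame.get("step", "unknown")          # reasoning-model output
        objects = frame.get("detected_objects", [])  # perception-model output
        entry = steps.setdefault(step, {"frames": 0, "objects": set()})
        entry["frames"] += 1
        entry["objects"].update(o["label"] for o in objects)
    return {s: {"frames": v["frames"], "objects": sorted(v["objects"])}
            for s, v in steps.items()}


if __name__ == "__main__":
    # Tiny in-memory example standing in for a recorded session file.
    example = [
        {"timestamp": 0.0, "step": "gather ingredients",
         "detected_objects": [{"label": "tortilla"}, {"label": "knife"}]},
        {"timestamp": 0.5, "step": "spread filling",
         "detected_objects": [{"label": "tortilla"}, {"label": "jar"}]},
    ]
    for step, stats in summarize(example).items():
        print(step, stats)

Running the example prints, for each task step, how many frames were recorded and which object labels the perception model reported during that step.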

ARGUS’ offline mode comprises three key components: the Data Manager, which helps users organize and filter AR session data; the Spatial View, providing a 3D visualization of spatial interactions in the AR environment; and the Temporal View, which focuses on the temporal progression of actions and objects during AR sessions. These components collectively facilitate comprehensive data analysis and debugging.
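The sketch below, under the same assumptions as before, illustrates how two of those roles might look in code: a Data-Manager-style filter over stored sessions and a Temporal-View-style lookup of the model outputs nearest a chosen timestamp. The Session structure, the helper functions, and the task names are hypothetical examples, not part of the ARGUS codebase.

from dataclasses import dataclass, field


@dataclass
class Session:
    name: str
    task: str
    duration_s: float
    frames: list[dict] = field(default_factory=list)  # per-frame model outputs


def filter_sessions(sessions: list[Session], task: str,
                    min_duration_s: float = 0.0) -> list[Session]:
    """Data-Manager-style filtering: keep sessions matching the query."""
    return [s for s in sessions
            if s.task == task and s.duration_s >= min_duration_s]


def outputs_at(session: Session, t: float) -> dict:
    """Temporal-View-style lookup: the frame of model outputs nearest time t."""
    return min(session.frames, key=lambda f: abs(f["timestamp"] - t))


sessions = [
    Session("cooking_01", "prepare a wrap", 182.0,
            frames=[{"timestamp": 10.0, "step": "spread filling",
                     "objects": ["tortilla", "knife"]}]),
    Session("coffee_03", "brew coffee", 95.0,
            frames=[{"timestamp": 5.0, "step": "grind beans",
                     "objects": ["grinder"]}]),
]

matches = filter_sessions(sessions, task="prepare a wrap", min_duration_s=60.0)
print(outputs_at(matches[0], t=12.0))  # frame of model outputs nearest 12 s in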

“ARGUS is unique in its ability to provide comprehensive real-time monitoring and retrospective analysis of complex multimodal data in the development of systems,” said Silva. “Its integration of spatial and temporal visualization tools sets it apart as a solution for improving intelligent assistive AR systems, offering capabilities not found together in other tools.”

The research is published on the arXiv preprint server.

More information:
Sonia Castelo et al, ARGUS: Visualization of AI-Assisted Task Guidance in AR, arXiv (2023). DOI: 10.48550/arXiv.2308.06246

ARGUS is open source and available on GitHub under the VIDA-NYU organization.

Journal information:
arXiv

Provided by
NYU Tandon School of Engineering

Citation:
Researchers unveil tool to help developers create augmented reality task assistants (2023, September 28)
retrieved 28 September 2023
from https://techxplore.com/news/2023-09-unveil-tool-augmented-reality-task.html
