
Developing Virtual Partners to Assist Military Personnel

Wednesday, 10 March 2021 06:47
Washington DC (AFNS) Mar 04, 2021
Increasing worker knowledge, productivity, and efficiency has been a seemingly never-ending quest for the military as well as for commercial companies. Today, military personnel are expected to perform a growing number of complex tasks while interacting with increasingly sophisticated machines and platforms.

Artificial intelligence (AI)-enabled assistants have the potential to aid users as they work to expand their skillsets and increase their productivity. However, today's virtual assistants are not designed to provide advanced levels of individual support or real-time knowledge sharing.

"In the not-too-distant future, you can envision military personnel having a number of sensors on them at any given time - a microphone, a head-mounted camera - and displays like augmented reality (AR) headsets," said Dr. Bruce Draper, a program manager in DARPA's Information Innovation Office (I2O).

"These sensor platforms generate tons of data around what the user is seeing and hearing, while AR headsets provide feedback mechanisms to display and share information or instructions. What we need in the middle is an assistant that can recognize what you are doing as you start a task, has the prerequisite know-how to accomplish that task, can provide step-by-step guidance, and can alert you to any mistakes you're making."
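The recognize-guide-alert loop Draper describes can be pictured in outline. The Python sketch below is an illustrative toy, not DARPA's design; the step names, actions, and matching logic are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class TaskStep:
    name: str
    expected_action: str  # the action the assistant expects to observe

@dataclass
class GuidanceAssistant:
    """Toy task-guidance loop: issue the next instruction, compare
    observed actions against the expected step, and flag deviations."""
    steps: list
    index: int = 0

    def next_instruction(self):
        if self.index < len(self.steps):
            return f"Step {self.index + 1}: {self.steps[self.index].name}"
        return "Task complete."

    def observe(self, action):
        """Check an observed action against the current expected step."""
        if self.index >= len(self.steps):
            return "Task already complete."
        step = self.steps[self.index]
        if action == step.expected_action:
            self.index += 1
            return "OK - " + self.next_instruction()
        return f"Possible mistake: expected '{step.expected_action}', saw '{action}'."

# Example: a two-step repair checklist (hypothetical)
assistant = GuidanceAssistant([
    TaskStep("Remove access panel", "unscrew_panel"),
    TaskStep("Disconnect battery", "unplug_battery"),
])
```

In a real PTG system, the equality check on `expected_action` would be replaced by perception models running over video and audio streams; the loop structure is the only part this sketch aims to convey.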

DARPA developed the Perceptually-enabled Task Guidance (PTG) program to explore the development of methods, techniques, and technology for AI assistants capable of helping users perform complex physical tasks. The goal is to develop virtual "task guidance" assistants that can provide just-in-time visual and audio feedback to help human users expand their skillsets and minimize their errors.

To develop these technologies, PTG seeks to exploit recent advances in deep learning for video and speech analysis, automated reasoning for task and/or plan monitoring, and augmented reality for human-computer interfaces.

"Increasingly we seek to develop technologies that make AI a true, collaborative partner with humans," said Draper. "Developing virtual assistants that can provide substantial aid to human users as they complete tasks will require advances across a number of machine learning and AI technology focus areas, including knowledge acquisition and reasoning."

To accomplish its objectives, PTG is divided into two primary research areas. The first is focused on fundamental research into a set of interconnected problems: knowledge transfer, perceptual grounding, perceptual attention, and user modeling. The second is focused on integrated demonstrations of those fundamental research outputs on militarily relevant use case scenarios. Specifically, the program will explore how the task guidance assistants could aid in mechanical repair, battlefield medicine, and/or pilot guidance.

Critical to the program will be the exploration and development of novel approaches to integrated technologies that address four specific technical challenges. The first is knowledge transfer. Virtual task assistants will need to be able to automatically acquire task knowledge from instructions intended for humans, including checklists, illustrated manuals, and training videos.
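As a toy illustration of this kind of knowledge transfer, a numbered checklist written for humans can be parsed into an ordered list of machine-usable steps. The checklist text and parsing rule below are invented for the example and far simpler than what the program envisions:

```python
import re

def steps_from_checklist(text):
    """Turn a numbered, human-readable checklist into an ordered
    list of step descriptions (toy example of knowledge transfer)."""
    steps = []
    for line in text.splitlines():
        # Match lines like "1. ..." or "2) ..."
        m = re.match(r"\s*\d+[.)]\s*(.+)", line)
        if m:
            steps.append(m.group(1).strip())
    return steps

# Hypothetical excerpt from a repair manual
manual = """
1. Power down the unit.
2) Remove the four retaining screws.
3. Lift off the access cover.
"""
```

Acquiring task knowledge from illustrated manuals or training videos, as the program description calls for, is of course a far harder perception and reasoning problem than this text-only case.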

The second problem area is perceptual grounding. Assistants need to align their perceptual inputs - including objects, settings, actions, sounds, and words - with the terms used to describe and model tasks for their human users, so that observations can be mapped to the assistant's task knowledge.
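At its simplest, perceptual grounding can be pictured as a mapping from raw perception labels to the vocabulary of the task model. The detector labels and task terms below are entirely hypothetical:

```python
def ground_observation(detected_label, grounding_table):
    """Map a raw perception label (e.g. from an object detector)
    onto the term the task model uses, or None if ungrounded."""
    return grounding_table.get(detected_label)

# Assumed mapping from detector vocabulary to task-model vocabulary
GROUNDING = {
    "phillips_screwdriver": "driver",
    "hex_key_4mm": "hex_wrench",
    "battery_pack": "power_source",
}
```

Real grounding would have to handle uncertainty, novel objects, and multimodal inputs (sounds and spoken words as well as images); a lookup table only conveys the shape of the alignment problem.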

Perceptual attention is the third problem area. Assistants must be able to pay attention to perceptual inputs that are relevant to current tasks while ignoring extraneous stimuli. They also need to respond to unexpected but relevant events that may alter a user's goals or suggest a new task.
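A crude illustration of such an attention filter: keep events about task-relevant objects, drop the rest, but always pass through alert-type events even when unexpected. The event types and objects here are invented:

```python
def filter_events(events, relevant_objects, alert_types=("hazard",)):
    """Toy attention filter: retain perceptual events tied to the
    current task, discard extraneous ones, but always pass through
    unexpected events of an alert type (e.g. a hazard)."""
    kept = []
    for etype, obj in events:
        if etype in alert_types or obj in relevant_objects:
            kept.append((etype, obj))
    return kept

# Hypothetical event stream: (event_type, object)
stream = [
    ("seen", "wrench"),
    ("heard", "radio"),
    ("hazard", "smoke"),
    ("seen", "poster"),
]
```

The hard research problem is deciding relevance in the first place; a fixed allow-list like `relevant_objects` stands in for what would really be a learned, task-conditioned model.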

The final problem area is user modeling. PTG assistants must be able to determine how much information to share with a user and when to do so. This requires developing and integrating an epistemic model of what the user knows, a physical model of what the user is doing, and a model of their attentional and emotional states. Because these four problems are not independent of each other, PTG aims to pursue integrated approaches and solutions that collectively take on all four challenge areas.
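A minimal sketch of that information-sharing decision, assuming the epistemic, attentional, and task models have each been reduced to a single boolean (a drastic simplification of what PTG envisions):

```python
def should_prompt(user_knows_step, user_is_overloaded, step_is_critical):
    """Toy user-modeling policy: only prompt when the user likely
    lacks the knowledge, and defer non-critical prompts while the
    user's attention appears overloaded."""
    if user_knows_step:
        return False  # epistemic model says no help is needed
    if user_is_overloaded and not step_is_critical:
        return False  # defer: attention model says now is a bad time
    return True
```

In practice each of these inputs would itself be an uncertain estimate produced by the epistemic, physical, and attentional/emotional models the program describes, and the policy would weigh them probabilistically rather than with hard rules.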

The development of AI-enabled agents is not new territory for DARPA. In addition to investing in the advancement of AI technology for more than 50 years, DARPA funded the creation of the technologies that underlie today's virtual assistants, such as Siri.

In the early 2000s, DARPA launched the Personalized Assistant that Learns (PAL) program. Under PAL, researchers created cognitive computing systems to make military decision-making more efficient and more effective at multiple levels of command.

Interested proposers will have an opportunity to learn more about the Perceptually-enabled Task Guidance (PTG) program during a Proposers Day, which will be held on March 18, 2021, from 1:00 PM to 4:45 PM (ET) via Zoom. Advance registration is required to attend. To learn more, please visit here.

The PTG Broad Agency Announcement is forthcoming and will be published on the System for Award Management (SAM) website here.


Related Links
Defense Advanced Research Projects Agency
Space Technology News - Applications and Research
