Military researchers to use gaming concepts and artificial intelligence (AI) for nuanced communications


SOURCE: MILITARYAEROSPACE.COM
OCT 17, 2021

Diplomatic negotiation is complex and requires an in-depth understanding of the interactions of many potentially untrustworthy allies and adversaries.

ARLINGTON, Va. – U.S. military researchers will use gaming concepts and artificial intelligence to help military commanders improve their understanding of nuanced communications and their situational awareness in strategic decision-making, collaboration, and deception.

Officials of the U.S. Defense Advanced Research Projects Agency (DARPA) in Arlington, Va., issued a solicitation on Wednesday (DARPA-PA-21-04-03) for the Stabilizing Hostilities through Arbitration and Diplomatic Engagement (SHADE) project.

Using a gaming environment based on the classic board game Diplomacy, SHADE will train and evaluate automated and artificial intelligence (AI)-assisted techniques that can identify and address the complications of multi-party negotiation with deception, collusion, profiling, and other real-world features.

Diplomatic negotiation is complex and requires a detailed, in-depth understanding of the positions and interactions of many potentially untrustworthy allies and adversaries. Successful diplomats must identify deceptions, explore valid courses of action, and assess the consequences of any diplomatic actions, DARPA researchers explain.

Diplomacy is a multi-player game reportedly favored by John F. Kennedy, Henry Kissinger, and Walter Cronkite that revolves around obtaining and defending territories on a map. There are no dice, playing cards, or other game elements that produce random effects. Instead, the game requires players to collaborate, collude, and betray to win. Success depends on pure analysis of adversarial intent based on discussions, movement of forces, and signaling.

SHADE will build on this understanding to inform strategic decisions in a collaborative and competitive gaming environment that can be tuned to explore varying levels of cooperation, competition, deception, and betrayal.
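By way of illustration only (the solicitation does not prescribe any particular interface or parameter names), a tunable environment of this kind could expose its levels of cooperation, competition, deception, and betrayal as explicit configuration values that experimenters adjust between runs:

```python
from dataclasses import dataclass

@dataclass
class NegotiationEnvConfig:
    """Hypothetical knobs for a tunable, Diplomacy-style negotiation
    environment; these names are illustrative, not part of SHADE."""
    cooperation: float = 0.5   # how readily simulated players honor joint plans
    competition: float = 0.5   # how aggressively they contest territory
    deception: float = 0.2     # chance of signaling one move while playing another
    betrayal: float = 0.1      # chance of abandoning an agreed alliance

# Example: a setting tuned toward heavy deception for stress-testing agents.
adversarial_setting = NegotiationEnvConfig(cooperation=0.3, deception=0.8, betrayal=0.5)
```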

Progress in this area would help build AI-enabled decision-support systems to assist with diplomatic negotiations, systems that distill intelligence from disparate communications of varying trustworthiness, and systems that predict stabilizing versus destabilizing actions in the presence of deceptive actors with competing priorities.

Because the project demands substantial computing power, a common isolated testing environment called The Gym will enable selected proposers to deploy agents and to test, train, and generate models. Proposers are encouraged to collaborate frequently.
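The solicitation does not publish an interface for The Gym, but conceptually an entry would package a negotiation-and-ordering agent that the shared environment can call each game turn. The following minimal Python sketch, with assumed method names, shows the kind of contract such an agent might fulfil:

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class DiplomacyAgent(ABC):
    """Hypothetical minimal interface for an agent deployed into a shared
    testbed; SHADE does not define this API, so the names are assumptions."""

    @abstractmethod
    def negotiate(self, power: str, inbox: List[str]) -> List[str]:
        """Read the press (messages) received this turn and return replies."""

    @abstractmethod
    def orders(self, power: str, game_state: Dict[str, List[str]]) -> List[str]:
        """Return movement orders for the controlled power, e.g. 'A PAR - BUR'."""

class AlwaysHoldAgent(DiplomacyAgent):
    """Trivial baseline: sends no messages and holds every unit in place."""

    def negotiate(self, power: str, inbox: List[str]) -> List[str]:
        return []

    def orders(self, power: str, game_state: Dict[str, List[str]]) -> List[str]:
        # game_state is assumed to map each power to the units it controls.
        return [f"{unit} H" for unit in game_state.get(power, [])]
```

A real entry would replace the hold-everything baseline with learned negotiation and planning models trained and evaluated inside the shared environment.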

Proposed work must develop a formal contract negotiation language and a formal knowledge base that captures information about each agent's intent (used to analyze deception and coercion), as well as whether another agent plans to accept, decline, or propose a new offer.
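What such a language and knowledge base look like is left to proposers. Purely as a sketch of the concepts the solicitation names (offers, accept/decline/counter responses, and intent records that support deception analysis), a toy representation might resemble the following; every name here is hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Optional

class Response(Enum):
    ACCEPT = auto()
    DECLINE = auto()
    COUNTER = auto()   # respond by proposing a new offer

@dataclass
class Offer:
    """A single proposal in a toy contract-negotiation language."""
    proposer: str                 # e.g. "FRANCE"
    recipient: str                # e.g. "GERMANY"
    promised_moves: List[str]     # moves the proposer commits to
    requested_moves: List[str]    # moves asked of the recipient

@dataclass
class IntentRecord:
    """Knowledge-base entry pairing what an agent agreed to with what it
    actually did, the raw material for flagging possible deception."""
    agent: str
    offer: Offer
    response: Response
    counter_offer: Optional[Offer] = None
    observed_moves: List[str] = field(default_factory=list)

    def appears_deceptive(self) -> bool:
        # Simple heuristic: the agent accepted an offer, but its observed
        # moves do not include the moves it agreed to make.
        if self.response is not Response.ACCEPT:
            return False
        return not set(self.offer.requested_moves).issubset(self.observed_moves)
```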

SHADE's second phase will refine play against other agents and human players in a Diplomacy tournament. When possible, proposers should use existing tools, datasets, and open-source technologies, and contribute back to the open-source community. The project will award contracts collectively worth about one million dollars.

Interested companies should upload eight-page unclassified proposals no later than 4 Nov. 2021 to the DARPA BAA website at https://baa.darpa.mil.
