CSC News
Making AI ‘Intentional’: a Case Study and New Programming Framework
Matt Shipman | News Services | 919.515.6386
Dr. Chris Martens | 919.515.5925
Computer science researchers have developed an artificial intelligence (AI) game-playing program that exhibits social reasoning and “intentionality” – meaning it can infer how other players are likely to respond to new information, and what those players likely want from the AI when they share information with it. The program serves as a proof of concept for a new programming framework the researchers have developed for creating more intentional AI programs.
“People are already interacting with AI programs, whether they know it or not, and that’s likely to increase over time,” says Chris Martens, an assistant professor of computer science at NC State and senior author of two papers on the work. “We think it’s important for AI programs to be able to communicate effectively with users. Being able to gauge people’s intentions is a critical part of how people communicate with each other, and we think our work here is a step toward incorporating that kind of intentionality into AI in a meaningful way.”
In their first paper, the researchers developed several versions of an AI program that plays the card game Hanabi, a cooperative game in which players must share information in order to succeed.
One version of the AI was programmed to “think” about how players would interpret the intent of its actions; a second version was programmed with “full” intent, accounting both for how players would interpret its intent and for how it should interpret the intent of other players; and the remaining versions were not programmed to address intent at all.
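To get a concrete feel for that distinction, the toy Python sketch below may help. It is illustrative only, not the researchers' implementation: the cards, hints and probabilities are invented, and it models just the “interpreting the intent of other players” half of the full-intent idea, by asking which card best explains why a partner would have chosen a given hint.

    # Illustrative only -- not the researchers' code; all card and hint
    # names here are invented. A "literal" reading ignores intent, while
    # an "intentional" reading asks why the partner chose this hint.

    # Hints that would truthfully describe each hypothetical card.
    TRUE_HINTS = {
        "card A": {"red"},
        "card B": {"red", "two"},
        "card C": {"blue", "two"},
    }

    def literal_reading(hint):
        """No intent model: every card the hint literally fits."""
        return [card for card, hints in TRUE_HINTS.items() if hint in hints]

    def intentional_reading(hint):
        """Prefer the card for which the partner was most likely to choose
        *this* hint over the other hints available to them."""
        def chance_of_hint(card):
            options = TRUE_HINTS[card]
            return 1.0 / len(options) if hint in options else 0.0
        scores = {card: chance_of_hint(card) for card in TRUE_HINTS}
        best = max(scores.values())
        return [card for card, s in scores.items() if s == best and s > 0]

    if __name__ == "__main__":
        print(literal_reading("red"))      # ['card A', 'card B'] -- ambiguous
        print(intentional_reading("red"))  # ['card A'] -- "red" is the only
                                           # hint that fits card A, so it best
                                           # explains the partner's choice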
The researchers conducted tests in which 224 human players played with the various AI programs. At the end of each game, participants were asked both how much they enjoyed playing with the AI and the extent to which they thought the AI’s gameplay decisions were intentional.
“We found that the full intentional AI was deemed both more fun to play with and more intentional than any of the other programs,” Martens says.
In their second paper, the researchers unveil a new programming framework – called Ostari – that developers can use to craft their own intentional AI programs.
“What sets Ostari apart from previous efforts to address intentionality in AI is its usability,” Martens says. “Previous models have been implemented as extensions to complex, expert-oriented languages, and we aim to bring these tools into a recognizable programming style where anyone comfortable with a mainstream programming language can understand the code line by line.”
“Ostari could be used to create game prototypes, but we don’t think it’s limited to games,” Martens says. “Ideally it could be used in conjunction with other programming languages for any application in which you need to model situations where information can be exchanged.”
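Ostari’s own syntax is not shown in this release, so the short Python sketch below is only a rough, invented illustration of the kind of situation Martens describes – agents whose beliefs change as information is exchanged. None of these names come from Ostari.

    # Not Ostari code or its API: a minimal, invented model of agents whose
    # beliefs change when information is shared between them.
    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        name: str
        beliefs: set = field(default_factory=set)  # facts this agent believes

        def tell(self, other, fact):
            """Share a fact: the listener comes to believe it, and the
            speaker now believes that the listener believes it."""
            other.beliefs.add(fact)
            self.beliefs.add(f"{other.name} believes: {fact}")

    if __name__ == "__main__":
        alice = Agent("alice", {"the red card is playable"})
        bob = Agent("bob")

        alice.tell(bob, "the red card is playable")

        print(sorted(bob.beliefs))
        # ['the red card is playable']
        print(sorted(alice.beliefs))
        # ['bob believes: the red card is playable', 'the red card is playable']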
The first paper, “An Intentional AI for Hanabi,” was presented at the IEEE Conference on Computational Intelligence in Games (CIG), which was held Aug. 22-25 in New York City. Lead author of the paper is Markus Eger, a Ph.D. student at NC State. The paper was co-authored by Marcela Alfaro Córdoba, a former Ph.D. student at NC State.
The second paper, “Practical Specification of Belief Manipulation in Games,” was presented at the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE), held Oct. 5-9 in Snowbird, Utah. Eger is lead author of the paper.
~shipman~