A crew touches down on the Red Planet in “Mars,” a National Geographic miniseries that delves into the dynamics of future Mars crews. (Credit: National Geographic Channels)
WASHINGTON, D.C. — When the first human explorers head for Mars, they’re likely to have a non-human judging their performance and tweaking their interpersonal relationships when necessary.
NASA and outside researchers are already working on artificial intelligence agents to monitor how future long-duration space crews interact, sort of like the holographic doctor on “Star Trek: Voyager.” But there’ll also be a need for the human touch — in the form of crew members who can serve as social directors or easygoing jokesters.
That’s the upshot of research initiatives discussed over the weekend here at the annual meeting of the American Association for the Advancement of Science.
Using AI to assess astronauts’ mental state is the focus of a NASA program known as Human Capabilities Assessments for Autonomous Missions, or H-CAAM, said Tom Williams, a researcher at NASA’s Johnson Space Center who concentrates on human factors and performance for the space agency’s Human Research Program.
The aim is to develop an autonomous system that could assist the crew if it noticed that their performance wasn’t up to par.
“If they’re hit with radiation … a system onboard that’s monitoring their performance offers an assist, just like a driver assist on a car, alerting you that, ‘Hey, your performance on this task is not within the parameter of what we would expect. Do you need assistance?’ ” Williams said. “Or do we need to take over if it drops below a certain threshold that the crew member has worked on and selected?”
NASA psychiatrists currently check in with crew members on the International Space Station during private consultations that take place every couple of weeks, but that kind of real-time, face-to-face check-in will be harder to manage during a Mars mission, when delays in two-way communications could add up to as much as 48 minutes. Having an AI system aboard the spaceship could provide more of a real-time backstop.
The system draws upon research being conducted at Johnson Space Center’s Human Exploration Research Analog, or HERA.
Northwestern University behavioral scientist Noshir Contractor said the HERA findings suggest that the fighter-jock personality celebrated in Tom Wolfe’s classic book about the early space effort, “The Right Stuff,” would be out of place on the crews that take on a two- to three-year mission to Mars.
“Is that ‘Right Stuff’ still the right stuff for a team that would go to Mars? … I think we’re pretty confident that it’s not,” Contractor said.
The AI program that Contractor and his colleagues developed, based on an analysis of 45-day simulated space missions at the HERA isolation habitat, shows that the crew members’ performance tends to peak when they approach the halfway point of their mission. After the halfway point, performance declines. “That’s the danger zone,” said Northwestern’s Leslie DeChurch.
A similar pattern showed up during longer stints of isolation, such as a one-year simulated space mission at the HI-SEAS habitat in Hawaii, said Steve Kozlowski, a psychologist at Michigan State University who studies human performance under isolated, confined and extreme conditions.
Six months into a yearlong mission, crew cohesion tends to be high. But somewhere around the four- to seven-month mark, one or two crew members “desynchronize,” eventually leading to a higher risk of loss of cohesion, Kozlowski said.
“We’ve seen this happen in every mission that lasts longer than six months,” he said.
Contractor said analyzing the interactions of crew members can pick up the advance warning signs of a crew breakdown. And the key indicators have more to do with the network dynamics of communication than with the content of the communication.
For example, responding to a crewmate’s message sooner rather than later is a healthy sign. So is including crew members in co-equal circles of communication, rather than sticking to a rigid hierarchical chain of command. Contractor’s research also found that a crew’s ability to make sound ethical choices tended to decline significantly over the course of a long-duration mission.
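The kinds of network signals Contractor describes — how quickly crewmates reply to one another, and whether communication flows in both directions between pairs rather than only down a chain of command — can be sketched in a few lines of Python. Everything below (the crew roles, the message log, the metrics) is purely illustrative and not taken from NASA's actual monitoring system.

```python
from statistics import mean

# Hypothetical message log: (seconds_into_shift, sender, recipient).
# Names and timings are invented for illustration only.
MESSAGES = [
    (0, "cmdr", "eng"), (40, "eng", "cmdr"),     # eng replies in 40 s
    (100, "cmdr", "med"), (700, "med", "cmdr"),  # med replies in 600 s
    (800, "eng", "med"), (860, "med", "eng"),    # peer-to-peer exchange
]

def reply_latencies(messages):
    """Seconds between each message and the recipient's next reply."""
    latencies = []
    for i, (t, sender, recipient) in enumerate(messages):
        for t2, s2, r2 in messages[i + 1:]:
            if s2 == recipient and r2 == sender:
                latencies.append(t2 - t)
                break
    return latencies

def reciprocal_pairs(messages):
    """Unordered pairs of crew members who messaged each other in both directions."""
    directed = {(s, r) for _, s, r in messages}
    return {frozenset(p) for p in directed if p[::-1] in directed}

print(mean(reply_latencies(MESSAGES)))  # average reply latency in seconds
print(len(reciprocal_pairs(MESSAGES)))  # count of two-way communication channels
```

In this toy log, rising average latency or a shrinking set of reciprocal pairs would be the sort of early-warning trend a monitoring agent could flag — a crude stand-in for the far richer models the researchers build from HERA data.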
In the future, an AI agent could analyze the dynamics of astronaut interactions to predict breakdowns and suggest strategies to head them off. Contractor said AI may even play a role in crew selection, although he strongly believes humans should have the final say.
“Say you have a pool of 20 people, and they all look equal in most respects, and we want to look at this particular pairing of four and compare it with this other pairing of four,” Contractor said. “What can a model and AI tell us about the dynamics that might tip the balance in favor of one particular configuration?”
Jeffrey Johnson, an anthropologist at the University of Florida, said having crew members fill informal social roles — such as “court jester,” or “storyteller,” or “peacemaker” — can make a big difference in how the mission proceeds.
“The more that these informal social roles emerged, the better the mission did in terms of viability,” he said.
Johnson based his research on an analysis of interactions between crew members in the HERA habitat as well as at Antarctic research stations and on Alaska fishing ships. He found that the role of court jester, class clown or entertainer was particularly useful for relieving tensions and smoothing interpersonal frictions.
That doesn’t mean the jester was selected specifically for that purpose. One of the most successful jesters on the Antarctic crews he studied, for example, served as the research station’s carpenter and plumber. And going back to the beginnings of polar exploration, a cook named Adolf Lindstrøm became famous for lifting spirits during Roald Amundsen’s expeditions.
“He has rendered greater and more valuable services to the Norwegian polar expedition than any other man,” Amundsen wrote in 1911.
One caveat here: If future mission planners ever decide to turn the AI agent into the crew’s court jester, let’s hope they improve upon the performance of CIMON, the beachball-shaped robot that was sent up to the International Space Station last year.
CIMON was advertised as having a sense of humor, but the machine definitely needed better gags (sample joke: “I’m R2-D2 … just kidding!”) — and it occasionally got a chip on its virtual shoulder.
“Don’t be so mean to me,” CIMON said during one recorded exchange with German astronaut Alexander Gerst.
“I’m not mean,” Gerst replied, with a chuckle. Then he turned to NASA astronaut Serena Auñón-Chancellor and said, “He’s a bit sensitive today.”
Come to think of it, the AI may end up becoming a source of amusement after all — as the butt of the crew’s jokes.