Captain Kirk, Spock, and the rest of the Star Trek gang were in constant dialogue with the onboard Enterprise computer, asking it questions about the starship and their alien environments.
With NASA reviving its human space exploration program through Artemis, it seems only natural that the real astronauts of the 2020s who will crew the forthcoming missions would do the same. After all, boldly going where no one has gone before could be lonely, and having an A.I. sidekick might help on those long voyages.
When Lockheed Martin, the company that built the new Orion spacecraft for NASA, first dreamed up the talking computer, engineers figured they’d just throw an Amazon Echo Dot on the dashboard with a laptop and call it a day. But it wasn’t nearly that simple, said Rob Chambers, Lockheed’s director of commercial civil space strategy.
Beyond technical constraints, they had to overcome the menacing representations of an inflight space computer, in the vein of Stanley Kubrick’s 2001: A Space Odyssey. Unlike the collegial computer in Star Trek, “HAL” starts to glitch, takes control of the spacecraft, and then fights the crew’s attempts to shut it down.
That’s not merely a concern raised through science fiction. This summer A.I. developer Blake Lemoine, formerly of Google, went public with his belief that a chatbot he helped build had become sentient. The story sparked a global conversation about whether some artificial intelligence is — or could be — conscious.
William Shatner as Capt. James T. Kirk on Star Trek talks to the Starship Enterprise computer.
Credit: Photo by CBS Photo Archive / Getty Images
Such claims reinforce fears long embedded in popular culture: that one day the advanced technology enabling humans to achieve extraordinary things could become too smart, perhaps leading to machines that are self-aware and want to hurt people.
“We don’t want the HAL 9000, ‘I’m sorry, Dave. I can’t open the pod bay doors,'” Chambers told Mashable. “That’s the first thing that everybody said when we first suggested this.”
Rather, Lockheed Martin and its collaborators believe having a voice-activated virtual assistant and video calls in the spacecraft would be more convenient for astronauts, affording them access to information away from the crew console. That flexibility might even keep them safer, engineers say.
An experiment to test the technology is riding along with Artemis on its first spaceflight. The project, named Callisto after one of Artemis' favorite hunting companions in Greek mythology, is programmed to give the crew live answers about the spacecraft's flight status and other data, such as water supply and battery levels. The companies, not NASA, are paying for the technology.
A custom Alexa system built specifically for the spacecraft will have access to some 120,000 data readouts — more than astronauts have had before, with some bonus information previously only available within Houston’s mission control.
Howard Hu, NASA’s Orion deputy program manager, and Brian Jones, Lockheed Martin’s chief engineer for the Callisto project, observe signals from the Orion spacecraft at NASA’s Kennedy Space Center in Florida during a connectivity test.
No astronaut will actually be onboard Orion for this first mission — unless the dummy in the cockpit counts. But the inaugural 42-day spaceflight, testing various orbits and atmospheric reentry, will clear the way for NASA to send a crew on subsequent missions. Whether a virtual assistant is integrated into the spacecraft for those expeditions depends on a successful demonstration during Artemis I.
The experiment also will test video-conferencing software provided by Cisco Webex, which will run on an iPad inside the capsule. Cameras mounted all over Orion will monitor how the system is working.
For the most part, the virtual assistant will answer queries, like "Alexa, how fast is Orion traveling?" and "Alexa, what's the temperature in the cabin?" The only thing the system can actually control is the lights, said Justin Nikolaus, an Alexa voice designer on the project.
“As far as control of the vehicle, we don’t have access to any critical components or mission critical software onboard,” Nikolaus told Mashable. “We’re safely sandboxed in Orion.”
The space-faring Alexa might not seem so advanced. But engineers had to figure out how to get the device to recognize a voice in a tin can. The acoustics of Orion, with mostly metal surfaces, were unlike anything developers had encountered before. What they learned from the project is now being applied to other challenging sound environments on Earth, like detecting speech in a moving car with the windows rolled down, Nikolaus said.
The most significant change from off-the-shelf Amazon devices is that the system will debut a new technology the company calls “local voice control,” which allows Alexa to work without an internet connection. Back on Earth, Alexa operates on the cloud, which runs on the internet and uses computer servers warehoused in data centers.
In deep space, when Orion is hundreds of thousands of miles away, the time delays to reach the cloud will be, shall we say, astronomical. Looking toward the future, the lag for messages traveling back and forth could stretch from seconds to the better part of an hour for a spacecraft on its way to Mars, roughly 96 million miles from Earth.
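The physics behind that lag is straightforward: radio signals travel at the speed of light, so the minimum delay is just distance divided by that speed. A rough back-of-the-envelope calculation (the distances here are approximate figures, not mission data) shows why a cloud round trip stops being practical beyond the Moon:

```python
# Back-of-the-envelope light-travel-time delay.
# Distances are rough, illustrative figures, not mission data.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second, approximately

def one_way_delay_seconds(distance_miles: float) -> float:
    """Minimum one-way signal delay at the speed of light."""
    return distance_miles / SPEED_OF_LIGHT_MPS

# The Moon, roughly 239,000 miles away: just over a second each way.
moon_delay = one_way_delay_seconds(239_000)

# Mars at the article's quoted 96 million miles: several minutes each
# way, so every cloud query would stall for a quarter hour or more
# once processing and relay overhead are added.
mars_delay = one_way_delay_seconds(96_000_000)

print(f"Moon one-way delay: {moon_delay:.1f} s")
print(f"Mars one-way delay: {mars_delay / 60:.1f} min")
```

Even before congestion or relay overhead, a Mars-bound crew asking the cloud a question would wait well over a quarter hour for a round trip, which is why the processing has to live on the spacecraft itself.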
That’s why engineers built a spacecraft computer to handle the data processing, Chambers said.
“It’s not canned things. It’s actual real-time processing,” he said. “All that smarts has to be on the spacecraft because we didn’t want to suffer the time lag of going back up to the spacecraft, back down to Earth, back up, and back down again.”
NASA added a new 111-foot beam waveguide antenna to the Deep Space Network at the ground station in Madrid in February 2022.
Credit: NASA / JPL-Caltech
For the questions that Alexa can’t handle offline, Callisto will tap into the Deep Space Network, the radio dish system NASA uses to communicate with its farthest spacecraft, and route the signals to the cloud on Earth. This could allow Callisto to support a wider range of requests, like reading the news or reporting sports scores.
Or ordering more toilet paper and trash bags. Seriously.
The designers built in the capability for astronauts to buy things from Amazon. Overnight delivery to the moon wouldn’t be an option, but sending flowers to a spouse on Earth for a special occasion would.
Cisco also will use the Deep Space Network to provide video-conferencing calls. Engineers say astronauts would be able to use this tool for "whiteboarding" meetings with their Houston colleagues. Imagine how handy that would have been for the Apollo 13 crew as NASA tried to talk them through making a square air filter fit into a round hole with no visual aids.
Broadcasting pictures in high resolution across the solar system isn’t easy, especially with such limited data capacity. One of the reasons Lockheed Martin chose Cisco as a collaborator was for the company’s expertise in video compression, Chambers said. As video travels through space, the data can get garbled. Cisco worked on error-correction technology to smooth out the transmissions.
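Cisco's actual schemes aren't detailed here, but the core idea behind error correction is adding redundancy so the receiver can repair bits garbled in transit. The simplest possible illustration is a repetition code, where each bit is sent three times and the receiver takes a majority vote (production codecs use far denser, more efficient codes; this toy sketch only shows the principle):

```python
# Toy forward error correction: triple repetition with majority vote.
# Real deep-space links use far more efficient codes; this just shows
# how redundancy lets a receiver repair bits garbled in transit.

def encode(bits: list[int]) -> list[int]:
    """Send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received: list[int]) -> list[int]:
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] = 0              # noise flips one bit in transit
sent[6] = 0              # ...and another, in a different group

assert decode(sent) == message  # both errors are corrected
```

The trade-off is bandwidth: this toy code triples the data sent, which is exactly the kind of overhead that matters when, as Chambers puts it, you're pushing modern video over a link with dial-up-era constraints.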
“One of my colleagues at Cisco refers to this as trying to do 4K, high bandwidth, gigabit-type ethernet, using a 1980s dial-up modem,” he said. “Obviously, the Deep Space Network is very, very capable, but we’re trying to do modern video-conferencing.”
To make the custom virtual assistant, the collaborators spent time interviewing astronauts. One of the things they asked for was a dictation service, Nikolaus said. Often their notepads and pens float away. It’s hard to use a computer in a weightless environment, too.
“If you go to a keyboard and you’re not used to microgravity and you start typing, your force on the keyboard pushes your body away from it,” Nikolaus said.
But: Alexa, can you fly me to the moon?
Yes, if what you want is a little Frank Sinatra crooning through the cabin.
Alexa, can you open or close the pod bay doors?
Fortunately, no. The system can’t do anything to put the astronauts in danger, Chambers said.
“We think about that a lot, not necessarily that they’ll become sentient and, you know, Rise of the Machines, and [become] our software overlords,” he said.
But software is complex. Strange behaviors can emerge from unexpected combinations of activities, he said: "What we do is we architect the system such that it is actually not possible for this device to talk to this other device."
So if all goes according to plan, perhaps the most havoc the real HAL could cause is to prank an astronaut’s family with an unwanted Amazon Fresh pizza delivery.
UPDATE: Aug. 23, 2022, 2:10 p.m. UTC An earlier version of this story incorrectly said mission control would use the Cisco Webex software to talk to the onboard Alexa. A company spokeswoman says it won’t.