
NASA’s Perseverance Rover Lands on Mars: Here’s How It Is Using AI 

Image credit: NASA/JPL

After its ambitious Feb. 18 landing on Mars, NASA’s Perseverance rover is already getting critical help from multiple onboard AI systems designed to guide its two-year exploratory mission on the Red Planet.

Perseverance successfully completed its 293-million-mile voyage from Earth to Mars, landing safely on the Martian surface at 3:55 p.m. EST Thursday. That triumph came after a dramatic descent in which the spacecraft slowed from 12,100 mph at atmospheric entry to zero about seven minutes later, as the rover was gently lowered onto the surface.

Now it’s time for NASA engineers and mission controllers to check over the rover’s systems to be sure that everything is in good condition after its voyage and then dive into the planned science experiments that will search for traces of microscopic life on Mars dating back billions of years.

To make this mission and its experiments possible, Perseverance carries more AI capabilities and technologies than any Mars rover before it, including the Curiosity rover, which arrived on Aug. 6, 2012, and is still in operation, Raymond Francis, a science operations engineer at NASA’s Jet Propulsion Laboratory (JPL), told EnterpriseAI.

Raymond Francis of NASA’s JPL

“The systems on Perseverance are upgraded by comparison to what we did on the Mars Science Laboratory (MSL) on the Curiosity mission,” said Francis. “Some of it is an evolution of what we had on Curiosity and the part that I work most closely with is a stepwise upgrade from Curiosity.”

AI for Landing

One of the major AI system enhancements aboard Perseverance has already proven itself during the rover’s precarious, high-speed descent to a landing site inside the 28-mile-wide Jezero Crater, said Francis. The touchdown was dangerous because the site holds an ancient river delta, steep cliffs, sand dunes, boulder fields and smaller impact craters.

But that’s where the new AI-enabled Terrain Relative Navigation (TRN) system came in to help, said Francis. “Perseverance has a camera on board which allows it to take an image or several images as it's descending towards the landing site,” he said. “And it has a map on board, which it can correlate with those images and recognize where it's coming down. It then computes where the image was just taken and where it's going to touch down.”
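
What Francis describes amounts to classic image-to-map registration. The toy sketch below recovers a descent image’s position in an onboard map by brute-force normalized cross-correlation; it is purely illustrative, with invented data and function names, and is nothing like the actual flight software, which is far more capable.

```python
# Toy map-relative localization in the spirit of Terrain Relative Navigation:
# slide a descent-camera image over an onboard orbital map and pick the
# offset with the highest normalized cross-correlation.
import numpy as np

def locate(descent_img: np.ndarray, onboard_map: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) in onboard_map that best matches descent_img."""
    ih, iw = descent_img.shape
    mh, mw = onboard_map.shape
    patch = (descent_img - descent_img.mean()) / (descent_img.std() + 1e-9)
    best, best_rc = -np.inf, (0, 0)
    for r in range(mh - ih + 1):
        for c in range(mw - iw + 1):
            window = onboard_map[r:r + ih, c:c + iw]
            w = (window - window.mean()) / (window.std() + 1e-9)
            score = float((patch * w).mean())  # normalized cross-correlation
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

# Tiny demo: embed a "descent image" patch in a synthetic map and recover it.
rng = np.random.default_rng(0)
terrain = rng.random((64, 64))
patch = terrain[20:36, 33:49]
print(locate(patch, terrain))  # -> (20, 33)
```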

Image credit: NASA/JPL

Those autonomous capabilities were critical for the landing because of the sheer distance between Earth and Mars: commands sent up from mission engineers take roughly three to 22 minutes to reach the spacecraft, depending on how far apart the planets are, said Francis. Getting the rover to a safe spot in the hazardous crater landing zone therefore required the TRN system; relying on delayed manual control from flight engineers would have been impossible.

“If it recognizes it's coming down on a place that's not safe, it will autonomously steer during its supersonic descending-to-zero speed descent to Mars,” he said.
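
The divert decision itself can be pictured as choosing the least hazardous spot reachable within a limited envelope. The following minimal sketch shows that idea with an invented hazard grid and a made-up divert radius; it is not the rover’s actual safe-target-selection logic.

```python
# Toy divert-target selection: given an onboard hazard map and the estimated
# touchdown cell, pick the safest cell reachable within a limited divert
# radius. All values and names here are invented for illustration.
import numpy as np

def pick_divert_target(hazard: np.ndarray, est: tuple[int, int],
                       max_divert_cells: float) -> tuple[int, int]:
    """hazard: 2-D array, higher = more dangerous. Returns the target cell."""
    rows, cols = np.indices(hazard.shape)
    dist = np.hypot(rows - est[0], cols - est[1])
    reachable = dist <= max_divert_cells          # fuel-limited divert envelope
    masked = np.where(reachable, hazard, np.inf)  # ignore unreachable cells
    r, c = np.unravel_index(np.argmin(masked), hazard.shape)
    return int(r), int(c)

hazard_map = np.array([[9, 9, 9, 9],
                       [9, 2, 8, 9],   # 2 = smooth terrain, 9 = boulders/cliffs
                       [9, 8, 8, 1],
                       [9, 9, 9, 9]], dtype=float)
print(pick_divert_target(hazard_map, est=(2, 2), max_divert_cells=2))  # -> (2, 3)
```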

That’s essentially what NASA astronaut Neil Armstrong had to do during the Apollo 11 Moon landing on July 20, 1969. On that mission, Armstrong took over manual control to land Eagle, the first crewed lunar lander, after the automated guidance was carrying the craft toward a dangerous, boulder-strewn landing site in the Sea of Tranquility.

“But because we don't have a human there, we had to get a computer that could do it right,” said Francis. “This allows us to go to Jezero Crater, which would not have been safe for Curiosity. It is safe for Perseverance because we have the intelligence in the landing system. We did not have this during the Curiosity landing.”

AI for Targeting Instruments

AI will also be used on Perseverance through its Autonomous Exploration for Gathering Increased Science (AEGIS) system, intelligent targeting software that lets the rover autonomously select and aim its laser spectrometer instrument, SuperCam, at promising rock targets without waiting for commands from Earth. An earlier version of AEGIS runs on Curiosity with that rover’s ChemCam instrument; the newer version is enhanced to work with Perseverance’s updated SuperCam.

“We will begin using it on Perseverance a little bit after landing,” said Francis, who is also the lead system engineer for AEGIS and its use with the ChemCam and SuperCam cameras.

Image credit: NASA/JPL

“What we do with ChemCam on Curiosity and SuperCam on Perseverance is point a laser spectrometer,” he explained. “Both of them carry a laser-induced breakdown spectrometer … where we fire up a powerful laser, usually at a rock somewhere within about seven meters of the rover, and vaporize part of the surface. We then look at that rock plasma we just generated to determine the elemental composition of the rock we're studying.”
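
As a rough illustration of that spectroscopy step, the sketch below matches peaks in a synthetic plasma spectrum against a few well-known element emission lines. Real ChemCam and SuperCam processing relies on calibrated line libraries and multivariate models; everything here, including the synthetic spectrum, is simplified for illustration.

```python
# Toy laser-induced breakdown spectroscopy (LIBS) analysis: find emission
# peaks in a spectrum and match them against known element lines.
import numpy as np

# Approximate strong emission lines (nm) for a few common elements.
LINES = {"Na": [589.0, 589.6], "Ca": [393.4, 396.8],
         "Mg": [285.2], "Si": [288.2], "H": [656.3]}

def identify(wavelengths: np.ndarray, intensity: np.ndarray,
             tol_nm: float = 0.5, thresh: float = 0.3) -> set[str]:
    """Return elements with a known line near a peak above thresh * max."""
    peaks = wavelengths[intensity > thresh * intensity.max()]
    found = set()
    for element, lines in LINES.items():
        if any(np.min(np.abs(peaks - line)) < tol_nm
               for line in lines if peaks.size):
            found.add(element)
    return found

# Synthetic spectrum with peaks at the Na doublet and the Ca 393.4 nm line.
wl = np.linspace(250, 700, 4500)
spec = np.exp(-((wl - 589.0) / 0.2) ** 2) + np.exp(-((wl - 393.4) / 0.2) ** 2)
print(identify(wl, spec))  # -> {'Na', 'Ca'} (set order may vary)
```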

The experiments aim to help scientists understand what the rocks are made of and, together with other measurements, determine how they were formed, where they came from and other details.

“Normally we let the humans on Earth pick the rocks for the research,” said Francis. “The science team likes to look at the pictures [from Mars taken by the rover] and choose their favorite rocks. But there are times when, for example, the rover has just driven into a new place, and no one has seen this place yet because the images haven't gotten back to Earth yet. So, we can either let the rover pick the best rocks around, using some onboard intelligence, or we just wait for the humans on Earth to have the chance. Time on Mars is valuable, so we often let the system choose.”
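
Conceptually, that onboard choice boils down to scoring candidate targets and keeping the best one within the laser’s roughly seven-meter working range that Francis mentions. The sketch below is a hypothetical heuristic in that spirit, not AEGIS’s actual detection or ranking code; the Rock record and the scoring weights are invented.

```python
# Toy autonomous target selection: rank candidate rocks detected in an image
# by simple properties and keep the best one inside the laser's range.
from dataclasses import dataclass

@dataclass
class Rock:
    size_cm: float      # apparent size from image segmentation
    contrast: float     # 0..1, how distinct the rock is from background
    range_m: float      # stereo-derived distance from the rover

def score(rock: Rock) -> float:
    # Illustrative heuristic: prefer large, high-contrast rocks in range.
    if rock.range_m > 7.0:              # beyond laser range, not usable
        return float("-inf")
    return 0.5 * rock.contrast + 0.5 * min(rock.size_cm / 30.0, 1.0)

candidates = [Rock(12, 0.9, 3.2), Rock(40, 0.4, 5.1), Rock(55, 0.95, 9.8)]
print(max(candidates, key=score))  # -> Rock(size_cm=40, contrast=0.4, range_m=5.1)
```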

Improved Autonomous Navigation Using AI

NASA’s ongoing Curiosity rover mission already includes AI-driven autonomous navigation, but the systems on Perseverance are greatly improved, said Francis.

“We're going to be able to drive faster and farther,” he said. “We have a better computer onboard that we hope to be able to use to run faster computations. On Curiosity, we had to drive a short distance using auto-nav, take stereo pictures, compute the stereo pictures, determine which things were obstacles and what was the safe path, and then drive along that path. And it was really short drives, only a meter or two.”

Now those processes have been sped up considerably, he added.

“With Perseverance, we've streamlined that algorithm, we've improved its overall capabilities, so we can actually drive continuously,” he said. “We can take the pictures and crunch the numbers while we're driving, so we can auto-nav faster and farther.”
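
One way to picture the difference Francis describes is as overlapping perception with motion. In this hypothetical timing sketch, planning for each next segment runs in the background while the current segment is driven; the durations and function names are invented stand-ins, not real rover timings.

```python
# Toy comparison of stop-and-go auto-nav vs. continuous ("think while
# driving") auto-nav, using sleeps as stand-ins for real computation.
import time
from concurrent.futures import ThreadPoolExecutor

def image_and_plan(segment: int) -> str:
    time.sleep(0.2)                 # stand-in for stereo imaging + hazard analysis
    return f"path-{segment}"

def drive(path: str) -> None:
    time.sleep(0.3)                 # stand-in for executing one drive arc
    print("drove", path)

def stop_and_go(n: int) -> None:
    # Curiosity-style: halt, image, plan, then drive each segment in series.
    for i in range(n):
        drive(image_and_plan(i))

def continuous(n: int) -> None:
    # Perseverance-style: plan the next segment while driving the current one.
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(image_and_plan, 0)
        for i in range(n):
            path = pending.result()
            if i + 1 < n:
                pending = pool.submit(image_and_plan, i + 1)
            drive(path)             # next plan computes during this drive

t0 = time.perf_counter(); stop_and_go(4)
print(f"stop-and-go: {time.perf_counter() - t0:.1f} s")   # ~2.0 s
t0 = time.perf_counter(); continuous(4)
print(f"continuous:  {time.perf_counter() - t0:.1f} s")   # ~1.4 s
```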

Image credit: NASA/JPL

All of these enhanced AI capabilities, and new ones to come, will also make future crewed and robotic missions to Mars and deep space easier, said Francis.

“I think that you're already seeing on the International Space Station, that they're using autonomy and intelligence for things like scheduling, which doesn't sound glamorous,” he said. “But, when you have a spacecraft like the ISS, it's extremely complicated. You have a lot of things with lots of different dependencies – this has to be done before that, this has to be done in parallel with this, this is required at certain times. It's very complicated to do those scheduling things, especially when something changes.”

That’s where AI helps by enabling intelligence in scheduling activities for the ISS, said Francis. “We already do similar things on robotics missions and I suspect that that will be more important on future missions, especially as they get more and more complex.”
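
A bare-bones way to see what dependency-aware scheduling involves: order activities so every prerequisite finishes first, then assign start times. The sketch below uses invented ISS-style activities and a simple serial schedule; real schedulers also juggle parallel activities, crew availability and resource limits.

```python
# Minimal dependency-aware scheduling: topologically order activities so
# prerequisites always come first, then assign start times greedily.
from graphlib import TopologicalSorter

durations = {"power_up": 10, "calibrate": 15, "experiment": 40, "downlink": 5}
deps = {"calibrate": {"power_up"},      # calibrate requires power_up first
        "experiment": {"calibrate"},
        "downlink": {"experiment"}}

start, clock = {}, 0
for task in TopologicalSorter(deps).static_order():
    start[task] = clock
    clock += durations[task]

for task, t in start.items():
    print(f"t={t:3d} min  start {task}")
```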

These requirements will also come into play as spacecraft travel farther from Earth, where it becomes far more practical and productive to let robotic craft make decisions without waiting for commands from Earth-bound scientists.

“On Mars, you can get away with it; it's inefficient, but you can do things without that autonomy,” said Francis. “But the further you go in the solar system, the more the one-way light time limitations mean that you either have to script everything ahead, or you have to let the spacecraft have some autonomous agency. And for things like a fly-by of a distant planet, farther out in the solar system, if you want the spacecraft to be able to respond to unexpected things you have to give it some decision agency. I think it will become more and more important.”
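
Some rough arithmetic shows why: one-way light time grows from minutes at Mars to hours in the outer solar system. The distances below are approximate representative values, not mission figures, and actual delays vary with orbital geometry.

```python
# Back-of-the-envelope one-way light times for increasingly distant targets.
C_KM_S = 299_792.458  # speed of light, km/s

distances_km = {
    "Mars (closest)":    54.6e6,
    "Mars (farthest)":   401e6,
    "Jupiter (typical)": 750e6,
    "Neptune (typical)": 4.3e9,
}

for body, d in distances_km.items():
    minutes = d / C_KM_S / 60
    print(f"{body:18s} ~{minutes:6.1f} min one-way")
# Mars: ~3 to ~22 minutes; Neptune: roughly four hours each way.
```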

Space missions to the outer planets or other harsh environments will increasingly depend on AI-based autonomy, he said.

“Part of the challenge is making people comfortable with it, making people believe that we can trust the autonomous system to make good decisions about the spacecraft or to select the right science data,” said Francis. “AEGIS is an example where the science team has become very comfortable with it and they use it regularly because it gives them good science data. And so we have to show that it will be either productive for science or that it will be safe and effective for the spacecraft. That's sort of the next frontier.”

Also happening in space this week: the Saturday, Feb. 20 launch of powerful AI, edge and cloud computing tools from HPE and Microsoft Azure to the International Space Station. The new tools will be part of ongoing technology experiments aimed at preparing NASA for future crewed exploratory missions to Mars.

The new equipment and software, including HPE’s specialized second-generation Spaceborne Computer-2 (SBC-2), will mark the first time broad AI and edge computing capabilities are available to researchers on the space station.

The new hardware, software and services are scheduled for launch to the ISS at 12:36 p.m. on Feb. 20 aboard Northrop Grumman’s 15th (NG-15) Commercial Resupply Services cargo mission. The NG-15 mission’s launch from the Wallops Flight Facility at Wallops Island, Virginia, is contracted by NASA to bring needed supplies.

EnterpriseAI