Members of the public can now help teach an artificial intelligence algorithm to recognize scientific features in images taken by NASA’s Perseverance rover.
Artificial intelligence, or AI, has enormous potential to change the way NASA’s spacecraft study the universe. But because machine learning algorithms learn from examples labeled by humans, a recent project asks members of the public to label features of scientific interest in imagery taken by NASA’s Perseverance Mars rover.
Called AI4Mars, the project is the continuation of one launched last year that relied on imagery from NASA’s Curiosity rover. Participants in the earlier stage of that project labeled nearly half a million images, using a tool to outline features like sand and rock that rover drivers at NASA’s Jet Propulsion Laboratory typically watch out for when planning routes on the Red Planet. The end result was an algorithm, called SPOC (Soil Property and Object Classification), that could identify these features correctly nearly 98% of the time.
SPOC is still in development, and researchers hope it can someday be sent to Mars aboard a future spacecraft that could perform even more autonomous driving than Perseverance’s AutoNav technology allows.
Images from Perseverance will further improve SPOC by expanding the kinds of identifying labels that can be applied to features on the Martian surface. AI4Mars now provides labels to identify more refined details, allowing people to choose options like float rocks (“islands” of rocks) or nodules (BB-size balls of minerals, often formed by water, that have been cemented together).
With AI4Mars, users outline rock and landscape features in images from NASA’s Perseverance Mars rover. The project helps train an artificial intelligence algorithm for improved rover capabilities on Mars. Credits: NASA/JPL-Caltech
The goal is to hone an algorithm that could help a future rover pick out needles from the haystack of data sent from Mars. Equipped with 19 cameras, Perseverance sends anywhere from dozens to hundreds of images to Earth each day for scientists and engineers to comb through for specific geological features. But time is tight: After those images travel millions of miles from Mars to Earth, the team members have a matter of hours to develop the next set of instructions, based on what they see in those images, to send to Perseverance.
“It’s not possible for any one scientist to look at all the downlinked images with scrutiny in such a short amount of time, every single day,” said Vivian Sun, a JPL scientist who helps coordinate Perseverance’s daily operations and consulted on the AI4Mars project. “It would save us time if there was an algorithm that could say, ‘I think I saw rock veins or nodules over here,’ and then the science team can look at those areas with more detail.”
Especially during this developmental stage, SPOC requires lots of validation from scientists to ensure it’s labeling accurately. But even when it improves, the algorithm is not intended to replace more complex analyses by human scientists.
It’s All About the Data
Key to any successful algorithm is a good dataset, said Hiro Ono, the JPL AI researcher who led the development of AI4Mars. The more individual pieces of data available, the more an algorithm learns.
“Machine learning is very different from normal software,” Ono said. “This isn’t like making something from scratch. Think of it as starting with a new brain. More of the effort here is getting a good dataset to teach that brain and massaging the data so it will be better learned.”
AI researchers can train their Earth-bound algorithms on tens of thousands of images of, say, houses, flowers, or kittens. But no such data archive existed for the Martian surface before the AI4Mars project. The team would be content with 20,000 or so images in their repository, each with a variety of features labeled.
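The idea of teaching an algorithm from human-labeled examples can be illustrated with a deliberately tiny sketch. Everything below is hypothetical: the feature names, the toy numbers, and the nearest-centroid classifier are stand-ins for illustration only, not the actual SPOC model or AI4Mars data.

```python
# Hypothetical sketch: how human-labeled examples teach a classifier.
# Each "image patch" is reduced to two made-up features (brightness, roughness);
# the labels stand in for the sand/rock annotations volunteers provide.

from collections import defaultdict

labeled_patches = [
    # (brightness, roughness), label — toy values, not real rover data
    ((0.8, 0.1), "sand"),
    ((0.7, 0.2), "sand"),
    ((0.3, 0.9), "rock"),
    ((0.2, 0.8), "rock"),
]

def train(examples):
    """Average the feature vectors for each label (a centroid per class)."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (b, r), label in examples:
        s = sums[label]
        s[0] += b
        s[1] += r
        s[2] += 1
    return {label: (s[0] / s[2], s[1] / s[2]) for label, s in sums.items()}

def classify(model, patch):
    """Assign the label whose centroid is closest to the patch."""
    b, r = patch
    return min(model, key=lambda lbl: (model[lbl][0] - b) ** 2
                                      + (model[lbl][1] - r) ** 2)

model = train(labeled_patches)
print(classify(model, (0.75, 0.15)))  # a bright, smooth patch → "sand"
```

The more labeled patches feed into `train`, the better the class averages represent each terrain type, which is why the project's target of roughly 20,000 labeled images matters.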
The Mars-data repository could serve several purposes, noted JPL’s Annie Didier, who worked on the Perseverance version of AI4Mars. “With this algorithm, the rover could automatically select science targets to drive to,” she said. The algorithm could also let the rover store a variety of images onboard, then send back only those showing the specific features scientists are interested in, she said.
That capability is still on the horizon, but scientists may not have to wait that long for the algorithm to benefit them. Before it ever makes it to space, the algorithm could be used to scan NASA’s vast public archive of Mars data, allowing researchers to find surface features in those images more easily.
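Triaging an archive with a trained classifier amounts to filtering images by their predicted features. The sketch below is purely illustrative: the file names, feature labels, and identity "predictor" are invented stand-ins, where a real pipeline would run a model like SPOC on each image.

```python
# Hypothetical sketch: triaging an image archive so researchers see only
# images predicted to contain features of interest.

def triage(archive, predict, wanted):
    """Return names of images whose predicted feature set intersects `wanted`."""
    return [name for name, image in archive if predict(image) & wanted]

# Stand-in archive: each entry pairs a made-up file name with the feature
# set a classifier might predict for that image.
fake_archive = [
    ("sol0100_navcam.png", {"sand"}),
    ("sol0101_mastcam.png", {"rock", "nodules"}),
    ("sol0102_navcam.png", {"sand", "float rocks"}),
]

# Identity predictor for the sketch; a real system would classify raw pixels.
hits = triage(fake_archive, predict=lambda feats: feats,
              wanted={"nodules", "float rocks"})
print(hits)  # → ['sol0101_mastcam.png', 'sol0102_navcam.png']
```

A filter like this is how an algorithm could, as Sun puts it, flag “rock veins or nodules over here” so the science team can focus its limited planning hours.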
Ono noted it’s important to the AI4Mars team to make their own dataset publicly available so that the entire data science community can benefit.
“If someone outside JPL creates an algorithm that works better than ours using our dataset, that’s great, too,” he said. “It just makes it easier to make more discoveries.”
Parts of Perseverance are visible beside an area outlined in AI4Mars. The project already used images from NASA’s Curiosity Mars rover and help from the public to train an artificial intelligence algorithm; now the project is using images from Perseverance. Credits: NASA/JPL-Caltech
More About the Mission
A key objective for Perseverance’s mission on Mars is astrobiology, including the search for signs of ancient microbial life. The rover will characterize the planet’s geology and past climate, pave the way for human exploration of the Red Planet, and be the first mission to collect and cache Martian rock and regolith (broken rock and dust).
Subsequent NASA missions, in cooperation with ESA (European Space Agency), would send spacecraft to Mars to collect these sealed samples from the surface and return them to Earth for in-depth analysis.
The Mars 2020 Perseverance mission is part of NASA’s Moon to Mars exploration approach, which includes Artemis missions to the Moon that will help prepare for human exploration of the Red Planet.
JPL, which is managed for NASA by Caltech in Pasadena, California, built and manages operations of the Perseverance rover.
For more about Perseverance: