On Earth, landmarks such as trees or buildings give travelers useful reference points for judging distance, a cue that simply does not exist on the Moon.
Teaching AI to be ‘GPS-like’
An astronaut on the Moon would also have a hard time navigating vast unexplored regions because there is no atmosphere to scatter the light. The harsh, unscattered sunlight can distort an explorer's perception of distance and depth.
Now, Alvin Yew, a research engineer at NASA's Goddard Space Flight Center in Greenbelt, Maryland, is working to create an artificial intelligence system that will guide rovers on the lunar surface.
Much like the way GPS pinpoints locations on Earth, Yew trained an AI system to recognize features of the lunar horizon as they would appear to a rover on the surface.
The technology was developed using data from NASA's Lunar Reconnaissance Orbiter. More specifically, it uses the Lunar Orbiter Laser Altimeter (LOLA), which measures the slope, roughness, and elevation of the lunar surface. Simply put, LOLA produces a high-resolution topographic map of the Moon.
From that map, digital panoramas of the horizon can be generated and compared against images taken by a rover or an astronaut, matching known boulders, hills, and even craters to pinpoint the exact location.
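To make the idea concrete, here is a minimal sketch of horizon-based localization in that spirit: a synthetic horizon profile is rendered from an elevation grid at several candidate positions and compared with an observed profile. All function names, parameters, and the toy data are assumptions for illustration, not NASA's actual method or code.

```python
import numpy as np

def render_horizon(dem, x, y, cell_size=60.0, n_azimuths=360, max_range=50):
    """Render the apparent horizon elevation angle (radians) in every
    azimuth direction, as seen from grid cell (x, y) of the elevation map."""
    h0 = dem[y, x]
    profile = np.zeros(n_azimuths)
    for i, az in enumerate(np.linspace(0.0, 2.0 * np.pi, n_azimuths, endpoint=False)):
        best = -np.inf
        for r in range(1, max_range):
            xi = int(round(x + r * np.cos(az)))
            yi = int(round(y + r * np.sin(az)))
            if not (0 <= xi < dem.shape[1] and 0 <= yi < dem.shape[0]):
                break
            # Elevation angle of this terrain sample above the viewpoint.
            angle = np.arctan2(dem[yi, xi] - h0, r * cell_size)
            best = max(best, angle)
        profile[i] = best
    return profile

def locate(observed, dem, candidates):
    """Return the candidate (x, y) whose rendered horizon best matches
    the observed profile (smallest sum of squared differences)."""
    errors = [np.sum((render_horizon(dem, x, y) - observed) ** 2)
              for x, y in candidates]
    return candidates[int(np.argmin(errors))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dem = rng.normal(0.0, 30.0, size=(200, 200))   # toy stand-in for LOLA elevation data
    true_pos = (120, 80)
    observed = render_horizon(dem, *true_pos)      # stand-in for a rover's horizon measurement
    candidates = [(x, y) for x in range(60, 160, 20) for y in range(60, 160, 20)]
    print("Estimated position:", locate(observed, dem, candidates))
```

In practice the matching would use real LOLA-derived elevation models and images rather than a random grid, but the sketch shows the core idea: the shape of the horizon itself acts as the fingerprint of a location.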