Lunar 'sandbox' helps robots see in harsh moon lighting

The moon has no shades of gray.

Everything is more extreme on the moon. On top of temperatures that range from -300°F to +224°F, future astronauts and probes must deal with lighting conditions generously described as "harsh." To help, researchers at Ames Research Center in Silicon Valley created a lunar testbed, complete with craters, fluffy dust and solar simulator lights. The goal is to develop sensors that can "see" in such conditions to help probes and, eventually, humans navigate the surface safely.

With no atmosphere to scatter and reflect lighting, "what you get on the Moon are dark shadows and very bright regions that are directly illuminated by the Sun -- the Italian painters in the Baroque period called it chiaroscuro," says NASA Ames computer scientist Uland Wong. That's a conundrum for surface probes, "because cameras don't have the sensitivity to see the details that you need to detect a rock or crater," he adds.

Pictures snapped by astronauts give us a decent idea of what the moon's lighting is like. However, those shots were taken at well-lit spots in the early afternoon, when the illumination is best. To support future colonies, scientists are more interested in the polar regions, where it may be possible to drill for water and other essential resources. There, the sun is always on the horizon, producing long shadows that could conceal jagged rocks and other dangers.

Computer simulations are fine, but nothing beats the real thing, as filmmakers have recently realized. The NASA Ames researchers built a 12-foot square sandbox with eight tons (!) of simulated lunar soil called JSC-1A. Craters, surface ripples and other obstacles were then added, and topped with a final, fine layer of soil, much like the real, pervasive moondust that exasperated Apollo astronauts. The whole thing was then lit with solar simulator lights to create "low-angle, high-contrast illumination," NASA writes.

Guided by supercomputers, the team filmed everything with stereoscopic cameras, using multiple setups and lighting angles. They used that to build the POLAR (Polar Optical Lunar Analog Reconstruction) dataset, which can be used by robotic vision designers on future probes, whether on the moon or other bodies in the solar system.
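The core idea behind stereoscopic vision, the kind of imaging the POLAR dataset is built to evaluate, is that a rover can recover depth by finding how far each surface feature shifts between a left and right camera view. As a rough illustration only (this is a minimal sum-of-absolute-differences block matcher written for this article, not NASA's actual pipeline), here is how that disparity search works:

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, block=5):
    """Estimate per-pixel disparity between a rectified stereo pair
    by sliding a small block along the epipolar (horizontal) line
    and picking the shift with the lowest sum of absolute differences.
    Larger disparity = closer surface feature."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
            best_cost, best_d = None, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(int)
                cost = np.abs(patch - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic sanity check: a random texture shifted by a known amount.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (40, 60), dtype=np.uint8)
right = np.roll(left, -4, axis=1)  # every feature shifts 4 px -> disparity 4
disp = sad_disparity(left, right, max_disp=8)
```

The hard part on the moon, and the reason the testbed exists, is that block matching like this needs visible texture in both images: in a pitch-black shadow or a blown-out highlight there is nothing for the matcher to lock onto, which is exactly the failure mode the POLAR imagery lets designers study.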

So far, the results show that stereo imaging might work best on rovers working at the moon's poles. "One of the mission concepts that's in development right now, Resource Prospector ... might be the first mission to land a robot and navigate in the polar regions of the Moon," Wong said. "And in order to do that, we have to figure out how to navigate where nobody's ever been."