The Earth's magnetic field is what protects us from space radiation. Some of that radiation arrives as cosmic rays from beyond our solar system, but our sun also emits highly charged particles that would be incredibly harmful if we weren't shielded from them. (That's one of the serious concerns about sending astronauts to Mars and beyond: How do we protect them from radiation once they're outside the reach of the Earth's magnetic field?) As far as scientists know, a magnetic field is a key ingredient for a world to support life, but they had assumed that a body as small as the moon couldn't sustain one for very long. This new study may prove that assumption wrong.
To conduct their experiments, the team used moon rocks collected by astronauts during the Apollo 15 mission. Because materials lose their magnetism when heated past a critical temperature, Sonia Tikoo, the lead author of the article published in Science Advances, heated a lunar rock to 1,436 degrees Fahrenheit (780 degrees Celsius) to demagnetize it. She was then able to work backward to its original magnetization, which turned out to be stronger than expected.
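The logic behind this kind of measurement can be sketched in a few lines. Below is a minimal, illustrative Python example of the first-order idea behind paleointensity experiments: a rock's natural remanent magnetization (NRM) is roughly proportional to the field it originally cooled in, so by erasing it with heat and imparting a new magnetization in a known lab field, the ratio of the two recovers the ancient field strength. This is a simplified textbook model, not Tikoo's actual laboratory procedure; the function name and all the numbers are hypothetical.

```python
# Illustrative first-order paleointensity model (Thellier-style idea).
# All names and sample values here are hypothetical, not from the study.
#
# Heating the rock past its Curie temperature -- the article's 1,436 F
# is (1436 - 32) * 5/9 = 780 C -- erases the natural remanent
# magnetization (NRM). Cooling it again in a known lab field imparts a
# thermoremanent magnetization (TRM). To first order:
#
#   B_ancient ~= (NRM / TRM_lab) * B_lab

def estimate_ancient_field(nrm, trm_lab, b_lab_microtesla):
    """Estimate the ancient field strength from the NRM/TRM ratio."""
    return (nrm / trm_lab) * b_lab_microtesla

# Hypothetical measurement values, for illustration only:
b_ancient = estimate_ancient_field(nrm=2.0e-6,          # measured NRM (A m^2)
                                   trm_lab=8.0e-6,      # lab-induced TRM (A m^2)
                                   b_lab_microtesla=20.0)  # known lab field
print(f"Estimated ancient field: {b_ancient:.1f} microtesla")  # → 5.0 microtesla
```

In practice, paleomagnetists perform this comparison stepwise over many temperature intervals and apply corrections, but the proportionality above is the core of how a rock's "frozen-in" magnetization reveals the field it formed in.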
Scientists have dated the lunar rock to between 1 billion and 2.5 billion years old. This means that the moon's magnetic field, which was once as strong as the Earth's, lasted far longer than scientists thought: it was still active as recently as roughly 2 billion years ago. The moon has no magnetic field today, but it's unclear when the field shut down. By comparison, Mars lost the bulk of its magnetic field about 4 billion years ago.
Not only does this study teach us more about our own satellite, it also suggests we should consider exomoons, not just exoplanets, as potential hosts for life. "The question becomes what size planets and moons should we be considering as possibly habitable worlds," said Tikoo. The answer is clear: We have a lot more work to do.