Adobe explains how space images are 'Photoshopped'
It's not exactly a secret that the glorious space images NASA and other space agencies release go through extensive "Photoshopping," just like magazines and billboard ads. Now, Adobe has explained what exactly happens during the post-processing stage in its latest blog post. First of all, the person doing the retouching usually can't be an ordinary graphic designer: it typically has to be an astronomer. That's because they need to be able to interpret raw data and distinguish image artifacts that must be erased from faint planets, cloudy nebulae and other celestial bodies.
One of these astronomers is Robert Hurt from Caltech, who explains that his work begins with "raw grayscale data from different parts of the infrared spectrum." He amps up the contrast to bring out the most interesting features and removes any artifact that shouldn't be there. After that, he translates "infrared colors," which are invisible to the naked eye, into colors we can actually see. Sometimes he also layers photos and data from different telescopes (such as Hubble and Spitzer) to create a more vivid and accurate image. Take, for example, his work on the Orion Nebula below:
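The workflow Hurt describes, stretching the contrast of grayscale infrared bands and then assigning each band to a visible color channel, can be sketched in a few lines. This is a minimal illustration, not his actual tooling: the band arrays are synthetic stand-ins, and the percentile stretch and shortest-to-blue / longest-to-red mapping are common conventions assumed here, not details from the post.

```python
import numpy as np

def stretch(band, lo_pct=1.0, hi_pct=99.0):
    """Percentile contrast stretch: clip outliers, rescale to the 0..1 range."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

def false_color(short_ir, mid_ir, long_ir):
    """Map three invisible infrared bands onto visible RGB channels.
    Assumed convention: longest wavelength -> red, shortest -> blue."""
    return np.dstack([stretch(long_ir), stretch(mid_ir), stretch(short_ir)])

# Synthetic stand-in data for three infrared wavelength bands
rng = np.random.default_rng(0)
short_ir, mid_ir, long_ir = (rng.random((64, 64)) for _ in range(3))
rgb = false_color(short_ir, mid_ir, long_ir)  # 64x64x3 displayable image
```

The stretch step mirrors the "amp up the contrast" stage; the channel assignment is the "translating infrared colors" stage, which keeps the relative wavelength ordering so the result is informative rather than arbitrary.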
The Andromeda galaxy panorama at the top of this post went through the same process, and so did the numerous Mars photos captured by Curiosity. The rover's cameras are optimized to shoot the landscape and apparently can't capture Martian skies. Someone like Hurt smooths out the jagged edges of the stitched photos the rover sends back and adds the reddish sky we typically associate with the Red Planet.
[Image credit: NASA, ESA, J. Dalcanton, B.F. Williams, and L.C. Johnson (University of Washington), the PHAT team, and R. Gendler / NASA, JPL-Caltech, UCLA]