How Photoshop Helps NASA Reveal the Unseeable

Want to know what 100 million stars look like?

If so, check out this image of Andromeda, a galaxy located a mere 2.5 million light-years away. Composed of 7,398 exposures, the 1.5-billion-pixel image is the highest-resolution image the Hubble Space Telescope has yet produced, and it was sent back to Earth for public viewing.

AndromedaM31_HST

Part of the Andromeda Galaxy, M31, as seen by the Hubble Space Telescope. Credit: NASA, ESA, J. Dalcanton, B.F. Williams, and L.C. Johnson (University of Washington), the PHAT team, and R. Gendler.

Preparing images like these isn’t always as easy as snapping a selfie and uploading it to social media. As Martian rovers send images back to Earth and space telescopes grow increasingly powerful, scientists at NASA’s Jet Propulsion Laboratory (JPL) often need to combine their knowledge of astronomy with advanced image-editing skills to transform unwieldy raw data into the breathtaking cosmic images we see. Indeed, the world’s first digital image-processing tools were invented at JPL fifty years ago, and to produce the stunning images the public enjoys today, scientists there continue to use a variety of image-editing tools, including Adobe Photoshop.

Assisting the Robots of Mars

Consider, for instance, the rough and rocky terrain the Mars rovers traverse while snapping photos: there’s often a lot left for specialists to polish. Because Mars rovers are rarely aligned with a flat horizon, and their cameras are geared primarily to focus on the landscape and geology, photographic exposures aren’t adjusted to compensate for variations in the brightness of the atmosphere. To fix this, image specialists at JPL often use Photoshop to crop out an otherwise jagged or washed-out sky and replace it with color gradients.
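
That gradient-replacement step can be imitated outside Photoshop. Here is a minimal sketch of a vertical brightness ramp standing in for a synthetic sky; the values and dimensions are illustrative, not anything JPL uses:

```python
def sky_gradient(height, width, top, bottom):
    """Build a grayscale vertical gradient (nested lists of 0-255
    values) blending from `top` at the first row to `bottom` at
    the last row -- a stand-in for Photoshop's gradient fill."""
    rows = []
    for i in range(height):
        # Interpolation fraction: 0.0 at the top row, 1.0 at the bottom.
        t = i / (height - 1) if height > 1 else 0.0
        value = round(top + (bottom - top) * t)
        rows.append([value] * width)
    return rows

# A 3-row ramp from a dark zenith (20) to a bright horizon (220).
print(sky_gradient(3, 2, 20, 220))
# [[20, 20], [120, 120], [220, 220]]
```

In a real composite, a ramp like this would be pasted behind the rover's terrain mask where the jagged sky was cropped out.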

MarsOpportunity_sol3948-49bw-838x216MarsOpportunity_sol3948-49_3D-838x215

Before and after: an original composite and a 3D-glasses-required, enhanced view of ‘Marathon Valley’ taken by the Mars Exploration Rover Opportunity on March 3-4, 2015. Credit: NASA/JPL-Caltech.

As many visitors to NASA’s websites will have noticed, the Mars rover cameras also provide images in stereo: one image from the right and one from the left, the same way our eyes operate, which enables editing specialists back home to create some truly lifelike 3D color images in Photoshop. But because images are often taken in a series, stitching them together can be challenging. Obvious seams can detract from the visual impact of a photo, so NASA’s image-processing experts have developed methods using camera models and tie-points to spatially align data across these seams. To complete the mosaics, specialists then use Photoshop to correct brightness variations caused by Martian dust and its interaction with sunlight on the camera lenses.
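
The stereo step can be sketched in a few lines. A classic red-cyan anaglyph simply takes the red channel from the left-eye image and the green and blue channels from the right-eye image; this toy version on grayscale pixel lists is a simplification, not the color-matched processing JPL actually performs:

```python
def make_anaglyph(left, right):
    """Combine two same-sized grayscale images (nested lists of
    0-255 values) into red-cyan anaglyph pixels: red comes from
    the left eye's view, green and blue from the right eye's."""
    return [
        [(l, r, r) for l, r in zip(left_row, right_row)]
        for left_row, right_row in zip(left, right)
    ]

# A 1x2 stereo pair: through red-cyan glasses, each eye sees
# mostly "its" image, producing the illusion of depth.
left_eye = [[120, 130]]
right_eye = [[110, 140]]
print(make_anaglyph(left_eye, right_eye))
# [[(120, 110, 110), (130, 140, 140)]]
```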

JPL points out that changes made using Photoshop are enhancements to improve the aesthetic appeal of the images—the majority of which are already stunning in their original form—and that the scientific integrity of the images is their top priority. And for purists, or those wishing to make their own space-image masterpieces, NASA makes all images from the rovers available to the public exactly as they came down from Mars in their raw, unprocessed form.

Revealing a Universe Beyond Visible Light

One man equally adept at astronomy and Photoshop is visualization scientist Robert Hurt, who works at Caltech’s Infrared Processing and Analysis Center. Among the other NASA projects he handles in his role at Caltech, he devotes much of his time to processing infrared images from missions including the Wide-field Infrared Survey Explorer (WISE) and the Spitzer Space Telescope. “Spitzer is the infrared component of NASA’s Great Observatories program,” he tells us proudly, and its images provide “a chance to help complete our view of the universe.”

 

RobertHurt

Robert Hurt, an astronomer and visualization scientist at Caltech’s Infrared Processing and Analysis Center, uses Photoshop at work on April 10, 2015. Credit: Robert Hurt.

With all of the potentially beautiful images that can be captured by these trans-galactic cameras, Hurt says he usually feels a sense of both artistic and scientific responsibility.

“I oversee the visual side of our science communications,” he says. “A big part of that is rendering the data images that are collected by the telescopes, followed by a final polish using Photoshop to create a publication-quality, photographic representation of what we see that is useful to the public and astronomers alike.”

Unlike most other Photoshop users, who tend to correct images of physical objects, human models, or earthbound natural settings, Hurt’s responsibility is compounded by the fact that he uses Photoshop to let us see what is literally unseeable—and not just because the astronomical objects are unfathomably far away.

OrionM42_WISE

The Orion Nebula, M42, as imaged by NASA’s Wide-field Infrared Survey Explorer, or WISE, in January 2013. Infrared wavelength representations (blue = hotter, green/red = cooler), from left to right: Red, green, cyan, blue, raw 3-channel, level adjustment, 4th cyan channel, and final Photoshop composite. Credit: NASA/JPL-Caltech/UCLA.

“I basically take raw grayscale data from different parts of the infrared spectrum, and then remap them into visible colors—typically with red, green, and blue Photoshop layers—to create images that are accurately representative of the infrared colors that human eyes cannot see,” he explains. “I think of it as a visual translation process.”
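
Hurt’s “visual translation” can be illustrated with a toy version: three grayscale infrared bands are each assigned to one channel of the output image. The band names below are hypothetical stand-ins; the real pipeline works from calibrated telescope data in Photoshop layers:

```python
def false_color(band_r, band_g, band_b):
    """Map three same-sized grayscale bands (nested lists of 0-255
    values) onto the red, green, and blue channels of one image,
    producing an (r, g, b) tuple per pixel."""
    return [
        list(zip(row_r, row_g, row_b))
        for row_r, row_g, row_b in zip(band_r, band_g, band_b)
    ]

# Toy 1x2 image built from three hypothetical infrared bands;
# each band drives one visible output channel.
band_a = [[200, 10]]
band_b = [[50, 50]]
band_c = [[10, 200]]
print(false_color(band_a, band_b, band_c))
# [[(200, 50, 10), (10, 50, 200)]]
```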

To achieve this, Hurt must understand not only the astronomy involved in any given image, but also how to use Photoshop’s myriad tools to best represent the celestial sights.

“My general workflow for this,” he says, “is to first take the original observational data from the telescope, which is kind of an HDR representation of the sky. Each observation involved in compiling that HDR image will then have its own layer grouping in Photoshop, to which I’ll make layer adjustments and curve adjustments to bring out the contrast and important details in the data. Sometimes I’ll bring in Hubble’s visible-light photos of the same astronomical region, too, and layer the Spitzer data on top of those to create images highlighting the interesting contrasts between different parts of the spectrum that the general public can enjoy and understand.”
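
The “levels” part of that workflow—stretching a chosen brightness range to fill the displayable scale so faint detail becomes visible—can be sketched as follows. The thresholds here are arbitrary examples, not values from Hurt’s data:

```python
def stretch_levels(pixels, black_point, white_point):
    """Linearly remap values in [black_point, white_point] onto
    0-255, clipping anything outside that range -- a bare-bones
    version of a Photoshop levels adjustment."""
    scale = 255.0 / (white_point - black_point)
    return [
        [min(255, max(0, round((p - black_point) * scale))) for p in row]
        for row in pixels
    ]

# Faint detail between 50 and 150 counts now spans the full
# displayable range; everything darker or brighter is clipped.
print(stretch_levels([[40, 50, 150, 200]], 50, 150))
# [[0, 0, 255, 255]]
```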

SombreroSpitzerHST

This striking composite image (right) of the Sombrero Galaxy, M104, was produced by combining images from NASA’s Spitzer (left) and Hubble Space Telescopes (middle).

Credit: Infrared: NASA/JPL-Caltech/R. Kennicutt (University of Arizona) and the SINGS Team.

Despite the otherworldly images involved, Hurt is quick to add that his work is frequently no different from the “Photoshopping” that goes on in any other field. “The optics of the camera can create artifacts that to a naive viewer might look like something real from the universe,” he explains. “But these are things we want to clean out of the image, because we don’t want people to think there’s some weird planet thing floating out there when there isn’t.” However, he points out that the job usually isn’t easy, because he always runs the risk of airbrushing out real scientific data.

“I always save my layers, generating massive multi-gigabyte files, because I want to be able to backtrack in case I accidentally delete something real.” He pauses, adding, “Telling the difference between a digital blemish and an astrophysical object is where being an astronomer comes in.”