The Tech Behind the Phoenix Mars Lander's Onboard Cameras

A 3-D stereoscopic imager and a robotic arm camera with an LED flash make up Phoenix's Red Planet gear bag

By John Mahoney
Posted 06.03.2008 at 5:23 pm



Say Cheese, Martians!: The Phoenix Lander's main camera can capture 3-D stereoscopic images. Photo by NASA/JPL/University of Arizona


For the past two weeks, NASA's Phoenix Mars Lander has been broadcasting a wealth of incredible images from its landing site in the Martian arctic. I've been refreshing the mission's raw photo stream obsessively—no little green men yet, just gorgeous panoramas and detailed closeups of the most foreign of all foreign lands. Being a bit of a camera geek, I was quite curious as to what kind of hardware was behind the action, and naturally, Phoenix has some pretty sweet gear on board to make it all possible.

Phoenix's primary camera is the Surface Stereo Imager (or SSI, the adorable Johnny 5-looking piece shown above). It's a new and improved version of a similar camera used on the Mars Pathfinder mission. It consists of a housing mounted a little over six feet above the surface (simulating human height), containing two stereoscopic lenses (simulating, you guessed it, human eyes), each of which can capture a 1024x1024 (1-megapixel) image on a monochrome CCD sensor. The SSI can be rotated a full 360 degrees around the craft and tilted vertically for images of the sky and soil.

The camera's stereoscopic setup isn't just for spitting out fuzzy images to look at with 3D glasses (although those are awesome)—the stereoscopic view can also be used to generate an accurate three-dimensional topographic map of the landing site, which the team will then use to pick digging sites and to steer the robotic arm clear of any surface projections.
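If you're curious how a depth map falls out of two side-by-side views, here's a minimal sketch using OpenCV's block-matching stereo module in Python. This is emphatically not the mission's actual pipeline; the filenames, focal length and lens baseline are all placeholder assumptions.

```python
# Minimal stereo-depth sketch (not the Phoenix pipeline).
# Filenames, focal length and baseline are placeholder assumptions.
import cv2
import numpy as np

# Load the left- and right-eye captures as grayscale, mirroring the
# SSI's monochrome sensors.
left = cv2.imread("ssi_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("ssi_right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds how far each small patch shifts between the two
# views; closer objects shift more (larger disparity).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# compute() returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Depth falls out of similar triangles:
# depth = focal_length_px * baseline / disparity.
FOCAL_PX = 700.0    # assumed lens focal length, in pixels
BASELINE_M = 0.15   # assumed separation between the two "eyes," in meters
depth_m = FOCAL_PX * BASELINE_M / np.maximum(disparity, 0.1)
```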

On the spec sheet, the SSI may look painfully outdated compared with even low-end consumer digicams. One measly megapixel? A monochrome sensor? But these "limitations" are by design. Resolution-wise, the cameraphone in your pocket can probably capture more megapixels than Phoenix's main imager. But your cameraphone doesn't have to beam its images millions of miles over a finicky, slow data connection, does it? The Phoenix team chose a balance between resolution and file size; when a higher-res image or a panorama is needed, the SSI collects multiple captures, which are then composited into mosaics back on Earth.
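For a taste of what that ground-side compositing involves, here's a toy panorama stitch using OpenCV's built-in Stitcher class. The mission's actual mosaic software is far more rigorous and carefully calibrated; the tile filenames here are placeholders.

```python
# Toy panorama stitch (a stand-in for the mission's mosaic software).
# Tile filenames are placeholders.
import cv2

# Load a handful of overlapping captures.
tiles = [cv2.imread(f"ssi_tile_{i}.png") for i in range(6)]

# OpenCV's Stitcher finds matching features across the overlaps,
# warps the tiles into alignment and blends the seams.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(tiles)
if status == cv2.Stitcher_OK:
    cv2.imwrite("ssi_panorama.png", panorama)
```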

The monochrome CCD sensor, on the other hand, actually allows for a more accurate color image in the end by giving the Phoenix team control over each wavelength of light individually. In your digital camera here on Earth, a filter overlay divides the image sensor pixel-by-pixel into receptors for red, green and blue light, which the image processor then combines to create a full-color RGB image. Onboard Phoenix, the same basic concept is handled by a 12-position filter wheel that can isolate light of various wavelengths for capture by the entire sensor. After the single-wavelength captures are composited together back on Earth, a more accurate color representation is possible. The filters can also be used to gauge Martian atmospheric conditions by estimating density, water vapor levels and the presence of airborne dust.
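The compositing step itself is conceptually simple. Here's a bare-bones sketch of stacking three single-filter monochrome frames into one RGB image with NumPy; the filenames are placeholders, and the real pipeline includes radiometric calibration well beyond this.

```python
# Bare-bones filter-wheel compositing sketch (filenames are placeholders;
# real processing includes calibration steps not shown here).
import numpy as np
from PIL import Image

def load_frame(name):
    # Each frame is the full monochrome sensor exposed through one filter.
    return np.asarray(Image.open(name).convert("L"), dtype=np.float32)

r = load_frame("ssi_red_filter.png")
g = load_frame("ssi_green_filter.png")
b = load_frame("ssi_blue_filter.png")

# Stack the three single-wavelength planes as the channels of one RGB image.
rgb = np.dstack([r, g, b]).clip(0, 255).astype(np.uint8)
Image.fromarray(rgb).save("ssi_color_composite.png")
```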


Phoenix's Robot Arm Camera (RAC): Red, green and blue LEDs above and below the lens help isolate light wavelengths for accurate color images. Photo by NASA/JPL/University of Arizona


Phoenix's second main camera, the Robotic Arm Camera (RAC), uses an even more novel approach to capturing color images on a monochrome CCD. Since the RAC is mounted on the robotic arm itself, it has to be light enough not to impede the arm's motion. To save weight, the color filtering is handled by two banks of red, green and blue LEDs. For each image, the RAC fires off four individual frames—one with each color LED illuminated individually, and a fourth with only natural light, which is then subtracted from the three LED images so that only pure color remains. These are then combined to form full-color images.
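In code, that subtract-and-stack trick looks something like the sketch below. Again, this is a rough illustration with placeholder filenames, not the RAC team's actual software.

```python
# Rough sketch of the RAC's four-frame color trick (placeholder filenames).
import numpy as np
from PIL import Image

def load_frame(name):
    return np.asarray(Image.open(name).convert("L"), dtype=np.float32)

ambient = load_frame("rac_ambient.png")  # fourth frame: natural light only

# One frame per LED color; subtracting the ambient frame leaves only
# the light each LED contributed.
channels = [
    (load_frame(f"rac_{color}_led.png") - ambient).clip(0, 255)
    for color in ("red", "green", "blue")
]

# Stack the three pure-color planes into a full-color image.
rgb = np.dstack(channels).astype(np.uint8)
Image.fromarray(rgb).save("rac_color.png")
```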

The RAC's CCD sensor captures at an even lower pixel resolution—just 512x256, or about an eighth of a megapixel. A close-focusing macro mode, however, can focus down to 13 millimeters, resulting in a real-world resolution of 23 microns per pixel—basically enabling the RAC to act as a microscope. The RAC's main duties are to photograph soil samples both before and after they're scooped up by the robotic arm. It can also reach areas the SSI can't, such as the white patches on the surface beneath the lander that could be salt deposits or water ice.
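Those numbers imply a tiny field of view, which a quick back-of-the-envelope calculation bears out (assuming the 23-micron figure holds across the sensor's full 512-pixel width at closest focus):

```python
# Back-of-the-envelope field-of-view check for the RAC's macro mode
# (assumes 23 microns/pixel across the full 512-pixel sensor width).
pixels_across = 512
microns_per_pixel = 23
fov_mm = pixels_across * microns_per_pixel / 1000
print(f"Field of view at closest focus: about {fov_mm:.1f} mm")  # ~11.8 mm
```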

Phoenix's First Soil Sample: As photographed by the RAC camera. Photo by NASA/JPL/University of Arizona

Using these two cameras together, Phoenix can capture everything from stunning 360-degree Martian vistas to extreme close-ups of soil samples, all with fairly straightforward tech that's smartly optimized specifically for the task at hand. Now if we could just get a Phoenix Flickr account going...

Special thanks to Horst Uwe Keller of the RAC team and Keri Bean of the SSI team for taking a moment away from their downlink screens to answer a few questions, and to everyone at the University of Arizona and the Max Planck Institute.
