Though we accept photographs as subjective but ultimately faithful visual reproductions of reality, in many instances they don't correspond to our experience. Pupils don't regularly glint red, and people don't transform into the streaked, evanescent smears we so often see in photos. Yet we have no trouble accepting these inconsistencies, knowing that taking a picture of a scene intrudes into the "reality" of that scene; the resulting photograph is thus both a depiction of a real image and the story of its capture. And so we explain away the red eyes as a product of the camera's flash, and the smear as the result of someone moving too quickly for the camera's exposure time to freeze. In these instances we take intuitive steps to interpret the photos and reconcile them with reality. Something similar happens in understanding the photo-like reproductions of physical data produced by radar imaging, but the interpretive steps are far more complex.

Bruce Campbell, of the Center for Earth and Planetary Studies (CEPS) at the Smithsonian National Air and Space Museum, is at the forefront of Earth-based radar mapping of the Moon. With the help of Lynn Carter and myself, both at CEPS, he has been working for the last five years to make radar maps of the entire near side of the Moon and of its poles. The maps are being constructed piecemeal: about 120 egg-shaped patches, each covering a unique portion of the lunar surface and overlapping slightly with its neighbors, will be mosaicked together into a 72,000-by-48,000-pixel cylindrical map and two 23,000-by-23,000-pixel polar maps. Each patch is mapped at 80 m/pixel, the highest resolution yet achieved in Earth-based radar maps of the Moon's surface. As of today, the project is over half complete; around 60 of these patches have been processed and incorporated into the working mosaics.
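A quick back-of-envelope check shows how these mosaic dimensions relate to the lunar surface. The pixel counts and the 80 m/pixel scale come from the project; the Moon's mean radius is an assumed reference value, not a figure from this article.

```python
import math

# Figures from the mapping project.
MOSAIC_WIDTH_PX = 72000   # width of the cylindrical near-side mosaic
METERS_PER_PX = 80        # map scale

# Assumed reference value (not from the article): the Moon's mean radius.
LUNAR_RADIUS_KM = 1737.4

# East-west ground distance spanned by the mosaic at the equator.
span_km = MOSAIC_WIDTH_PX * METERS_PER_PX / 1000   # 5760 km

# Half the lunar circumference: the 180 degrees of longitude of the
# nominal near side, measured along the equator.
near_side_km = math.pi * LUNAR_RADIUS_KM           # about 5458 km

print(span_km, round(near_side_km))
```

The mosaic is slightly wider than the nominal near side, which makes sense: the Moon's librations let Earth-based observers peek a little beyond 180 degrees of longitude over time, and the overlapping patches need some margin.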
With all of the high-resolution visible-light images of the lunar surface obtained since the 1960s, what additional information does radar imaging provide? Radar employs radio waves, the longest waves in the electromagnetic spectrum, with wavelengths up to the meter scale. Longer waves can penetrate surfaces that would reflect shorter waves, like those in the visible portion of the spectrum. Thus the 12.6-cm waves used in this mapping probe a meter or more into the regolith (the layer of dust and rock blanketing almost the entire Moon) and are reflected, or backscattered, by surface and subsurface rocks, providing data that can be processed into a realistic image. Knowledge of surface and subsurface roughness helps elucidate the Moon's history and provides information vital for future missions and geologic studies. Recently, for example, data from this project aided NASA's LCROSS mission by providing high-resolution views of possible impact sites in polar craters.
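To place the 12.6-cm wavelength quoted above on the electromagnetic spectrum, a one-line calculation converts it to a frequency via f = c / λ:

```python
# Convert the 12.6 cm radar wavelength to a frequency: f = c / lambda.
C = 299_792_458        # speed of light, m/s
WAVELENGTH_M = 0.126   # 12.6 cm, from the article

freq_hz = C / WAVELENGTH_M
print(f"{freq_hz / 1e9:.2f} GHz")  # prints "2.38 GHz"
```

That 2.38 GHz frequency sits in the microwave S band, thousands of times lower in frequency (and longer in wavelength) than visible light, which is what lets the signal pass through a meter or so of fine regolith instead of scattering off its surface.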
The process by which this backscattered energy is collected and translated into an accurate picture of the Moon's surface is complicated. A radar signal is transmitted toward the Moon from the 305-m telescope in Arecibo, Puerto Rico. For 29 minutes, this pulsed beam tracks a designated coordinate on the surface while the 100-m telescope in Green Bank, West Virginia, samples the reflected signal, recording it as a string of voltage values. From these values, a matrix of time-delay and frequency measurements is derived, forming a crude image. The surreal image produced at this stage betrays the method of acquisition. The diffraction caused by the beam's transmission from a circular aperture can be seen in the dark "nulls" and progressively dimmer rings surrounding the illuminated central oval. Moreover, the Moon and Earth are both moving during those 29 minutes, a fact visible in the image's warped appearance. At such high resolution, the mind cannot decode the convoluted effects of this motion in detail and thereby intuit the actual spatial layout of this portion of the surface. Specialized processing software relates segments of the illuminated area to a latitude-longitude grid, in a "focusing" step based on quantitative models approximating the movement of the Moon relative to the receiving telescope. The resulting focused image is trimmed to the boundary of the central beam (the bright inner egg) and added to the mosaic.
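The two axes of that crude time-delay/frequency matrix can be made concrete with a toy calculation. The mean Earth-Moon distance and the sample line-of-sight velocity below are illustrative assumptions, not values from the mapping project; they simply show the scale of the quantities the focusing software works with.

```python
# Toy illustration of the two coordinates of a delay-Doppler image:
# each patch of the Moon is located by its round-trip signal delay
# and its Doppler shift.
C = 299_792_458        # speed of light, m/s
WAVELENGTH_M = 0.126   # radar wavelength from the article

# Assumed mean Earth-Moon distance (illustrative, not from the project).
earth_moon_m = 384_400e3
delay_s = 2 * earth_moon_m / C        # round trip, about 2.56 s

# Doppler shift of a patch approaching the telescope at 1 m/s along
# the line of sight; for radar the shift is f_d = 2 * v / lambda.
v_los = 1.0
doppler_hz = 2 * v_los / WAVELENGTH_M  # about 15.9 Hz per m/s

print(round(delay_s, 2), round(doppler_hz, 1))
```

Points farther from the telescope echo back later, and points carried toward or away from Earth by the Moon's apparent rotation are shifted in frequency; the focusing software's job is to invert this delay-and-Doppler mapping, using models of the relative motion, back onto a latitude-longitude grid.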
As we intuitively do with camera images, we can account for the effects of the data collection method on the unfocused radar image to obtain a high-resolution picture of the surface of the Moon consistent with our conception of reality.
By Ross Anderson, Physical Scientist, Smithsonian Center for Earth and Planetary Studies