Imaging spectroscopy, also known as hyperspectral remote sensing, allows a sensor on a moving platform to gather reflected radiation from a ground target so that a detector system consisting of CCD devices can record more than 200 spectral channels simultaneously over the range from 0.38 to 2.5 µm. With sampling at a 0.01 µm interval, the data can be plotted as quasi-continuous narrow bands that approximate a spectral signature, rather than as histogram-like broader bands. With such detail, the ability to detect and identify individual materials or classes improves greatly. The AVIRIS instrument developed at JPL is described. Examples are shown that confirm that the hyperspectral approach is now the state of the art in remote sensing from air and space.
Another major advance, now coming into its own as a powerful and versatile means for continuous sampling of broad intervals of the spectrum, is hyperspectral imaging (see the second half of Section 13, starting on page 13-5, for more principles and details). Heretofore, because of the high speeds of air and space vehicle motion, insufficient time was available for a spectrometer to dwell on a small area of Earth's surface or an atmospheric target. Thus, data were necessarily acquired for broad bands in which spectral radiation is integrated within the sampled areas to cover ranges such as 0.1 µm, as with Landsat. In hyperspectral data, that interval narrows to 10 nanometers (1 micrometer [µm] contains 1000 nanometers; 1 nm = 10^-9 m). Thus, we can subdivide the interval between 0.38 and 2.55 µm into 217 intervals, each approximately 10 nanometers (0.01 µm) in width. These are, in effect, narrow bands. The detectors for VNIR intervals are silicon microchips, while those for the Short Wave InfraRed (SWIR, between 1.0 and 2.5 µm) intervals consist of indium antimonide (InSb). If a radiance value is obtained for each such interval and then plotted as intensity versus wavelength, the result is a sufficient number of points through which we can draw a meaningful spectral curve.
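This subdivision can be sketched with simple arithmetic; the range and interval values below are those quoted above, and the calculation simply derives the band count and band centers from them:

```python
# Dividing the 0.38-2.55 micrometer range into contiguous 10 nm channels,
# as described above. All values come from the text.
RANGE_START_UM = 0.38
RANGE_END_UM = 2.55
BAND_WIDTH_UM = 0.01   # 10 nanometers

n_bands = round((RANGE_END_UM - RANGE_START_UM) / BAND_WIDTH_UM)

# Center wavelength of each narrow band, for plotting radiance vs. wavelength.
band_centers_um = [RANGE_START_UM + BAND_WIDTH_UM * (i + 0.5)
                   for i in range(n_bands)]

print(n_bands)                       # 217 narrow bands
print(round(band_centers_um[0], 3))  # 0.385, center of the first band
```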
The Jet Propulsion Laboratory (JPL) has produced two hyperspectral sensors: one known as AIS (Airborne Imaging Spectrometer), first flown in 1982, and the other known as AVIRIS (Airborne Visible/InfraRed Imaging Spectrometer), which has operated since 1987. AVIRIS consists of four spectrometers with a total of 224 individual CCD detectors (channels), each with a spectral resolution of 10 nanometers and a spatial resolution of 20 meters. Dispersion of the spectrum onto this detector array is accomplished with a diffraction grating. The total interval reaches from 380 to 2500 nanometers (about the same broad interval covered by the Landsat TM with just seven bands). It builds an image, pushbroom-like, as a succession of lines, each containing 664 pixels. From a high-altitude aircraft platform such as NASA's ER-2 (a modified U-2), a typical swath width is 11 km.
From the data acquired, we can calculate a spectral curve for any pixel or for a group of pixels that may correspond to an extended ground feature. Depending on the size of the feature or class, the resulting plot will be either a definitive curve for a "pure" feature or a composite curve containing contributions from the several features present (the "mixed pixel" effect discussed in Section 13). In principle, the intensity variations for any 10-nm interval in the array extended along the flight line can be depicted in gray levels to construct an image. In practice, to obtain strong enough signals, data from several adjacent intervals are combined. Some of these ideas are elaborated in the block drawing shown here.
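The two operations just described, pulling the spectral curve of a single pixel and combining adjacent narrow intervals to strengthen the signal, can be sketched as follows. The data cube here is synthetic stand-in data; only the dimensions (664 pixels per line, 224 channels) come from the text:

```python
import random

# Tiny synthetic "image cube" (lines x pixels x channels) standing in for
# AVIRIS data; only the pixel and channel counts are taken from the text.
LINES, PIXELS, BANDS = 4, 664, 224
random.seed(0)
cube = [[[random.random() for _ in range(BANDS)]
         for _ in range(PIXELS)]
        for _ in range(LINES)]

# The spectral curve for one pixel: one radiance value per 10 nm channel,
# ready to be plotted as intensity versus wavelength.
spectrum = cube[2][100]

# Combine several adjacent narrow intervals (here, groups of 4) into one
# stronger value, as is done in practice to build a gray-level image.
def combine_bands(pixel_spectrum, group=4):
    return [sum(pixel_spectrum[i:i + group]) / group
            for i in range(0, len(pixel_spectrum), group)]

combined = combine_bands(spectrum)
print(len(spectrum), len(combined))  # 224 56
```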
Below is a hyperspectral image of some circular fields (see Section 3) in the San Juan Valley of Colorado. The colored fields are identified by vegetation or crop type, as determined from ground data and from the spectral curves plotted beneath the image for the crops indicated (these curves were not obtained with a field spectrometer but directly from the AVIRIS data).
In Section 13, other AVIRIS images, used for mineral exploration near Cuprite, Nevada, and other mining districts, are displayed (see page 13-10), following an extended narrative on the principles of spectroscopy and further consideration of the hyperspectral approach. A preview of the remarkable results achievable with this technology is given by this trio of images of the Cuprite district. The left image renders the mapped area in near-natural color; the center image uses narrow bands at wavelengths in which certain minerals reflect energy related to vibrational modes of absorption; in the right image, the modes are electronic absorption (see page 13-7). Shown here without the mineral identification key, the reds, yellows, purples, greens, and other colors each relate to specific minerals.
Hyperspectral data are usually superior to broader-band multispectral data for most analyses, simply because they provide so much more detail about the spectral properties of the features to be identified. In essence, hyperspectral sensing yields continuous spectral signatures rather than the band-histogram plots produced by systems like the Thematic Mapper, which "lump" varying wavelengths into single-value intervals. Plans are to fly hyperspectral sensors on future spacecraft (see Section 21, page 21-1). The U.S. Navy is presently developing a more sophisticated sensor called HRST, and industry is also designing and building hyperspectral instruments, such as ESSI's Probe 1.
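The "lumping" contrast can be shown numerically. In this sketch, a hypothetical 0.1 µm broad band is simulated by integrating ten synthetic 10 nm narrow-band values into one number; the radiance values are invented for illustration and do not represent any real material:

```python
# Ten synthetic narrow-band (10 nm) radiances across a 0.1 um interval.
# A hyperspectral sensor retains all ten values; a broad-band sensor such
# as the Thematic Mapper integrates them into a single value, losing the
# shape of the curve (e.g., the absorption dip at index 5 below).
narrow = [0.42, 0.44, 0.47, 0.49, 0.50, 0.31, 0.48, 0.51, 0.52, 0.53]

broad = sum(narrow) / len(narrow)   # the single "lumped" broad-band value

print(round(broad, 3))              # 0.467 -- the dip is invisible here
print(min(narrow), narrow.index(min(narrow)))  # 0.31 at index 5: detectable
```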
I-25: In your own words, using a single sentence, state the major advantage of hyperspectral sensors over broad band sensors. ANSWER
As of 2000, there are plans to put several hyperspectral sensors onto space platforms. One such instrument, called Hyperion, is part of EO-1, the first satellite in NASA's New Millennium series, launched in December 2000. It was inserted into an orbit that places it just about 50 km (30 miles) behind Landsat 7, which allows similar images acquired at almost the same time to be compared for performance evaluation. Operated by Goddard Space Flight Center, this satellite is a test bed for new ideas in instrumentation that can be made smaller and lighter, so that launch costs can be lowered. Here is an artist's illustration of EO-1, with its 3 main sensors:
Hyperion consists of CCD detectors and other components that divide the spectral range from 0.4 to 2.5 µm into 220 channels. Each resulting image covers 7.5 by 100 km on the ground, at a resolution of 30 meters. This next image shows a color composite made with 3 narrow channels, all in the visible, in a scene of Maryland and Virginia along the Potomac River:
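Back-of-envelope arithmetic on the figures quoted above gives a feel for the data volume of one Hyperion scene:

```python
# Scene dimensions and channel count from the text.
SWATH_KM, LENGTH_KM = 7.5, 100.0
PIXEL_M = 30.0
CHANNELS = 220

pixels_across = round(SWATH_KM * 1000 / PIXEL_M)   # 250 pixels per line
lines_along = round(LENGTH_KM * 1000 / PIXEL_M)    # ~3333 lines per scene
samples = pixels_across * lines_along * CHANNELS

print(pixels_across, lines_along)
print(f"{samples:,} radiance samples per scene")   # over 180 million
```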
Hyperion images commonly are presented as long strips corresponding to down track scene acquisition. This image shows sedimentary rocks in a fold belt in the Mount Fritton area of the Flinders Range in South Australia: The Atmospheric Corrector takes measurements that help to remove adverse effects from the atmosphere on image/data quality. A third sensor, the ALI (Advanced Land Imager) has 9 spectral bands and provides both multispectral images (30 m resolution) and panchromatic images (10 m). Here is an ALI image of the central part of Washington, D.C.
Primary Author: Nicholas M. Short, Sr. email: email@example.com