Seminar Report on Remote Sensing



Contents:

1. Introduction to Remote Sensing
   - Basic Principle
   - Active & Passive Systems
2. Types of Remote Sensing
   - Satellite Remote Sensing
   - Optical & IR Remote Sensing
   - Microwave Remote Sensing
3. Platforms for Remote Sensing
4. Sources & Sensors
   - Energy Sources
   - Sensors
5. Process of Remote Sensing
   - Interaction Process
   - Interaction of EM Radiation with the Earth's Surface
   - Effect of Atmosphere
6. Electromagnetic Radiation
   - Electromagnetic Waves
7. Electromagnetic Spectrum
   - Characteristics
8. Reflectance
9. Color Composite Images
   - True Color Images
   - False Color Images
   - Natural Color Images
10. Synthetic Aperture Radar
11. Limitations of Remote Sensing
12. Applications of Remote Sensing
13. Bibliography

INTRODUCTION


Remote sensors measure electromagnetic (EM) radiation that has interacted with the Earth’s surface. Interactions with matter can change the direction, intensity, wavelength content, and polarization of EM radiation. The nature of these changes is dependent on the chemical make-up and physical structure of the material exposed to the EM radiation. Changes in EM radiation resulting from its interactions with the Earth’s surface therefore provide major clues to the characteristics of the surface materials.

Basic Principle:

Electromagnetic radiation that is transmitted passes through a material (or through the boundary between two materials) with little change in intensity. Materials can also absorb EM radiation. Usually absorption is wavelength-specific: that is, more energy is absorbed at some wavelengths than at others. EM radiation that is absorbed is transformed into heat energy, which raises the material's temperature.

Some of that heat energy may then be emitted as EM radiation at a wavelength dependent on the material's temperature: the lower the temperature, the longer the wavelength of the emitted radiation. As a result of solar heating, the Earth's surface emits energy in the form of longer-wavelength infrared radiation. For this reason the portion of the infrared spectrum with wavelengths greater than 3 micrometers is commonly called the thermal infrared region.

Electromagnetic radiation encountering a boundary such as the Earth's surface can also be reflected. If the surface is smooth at a scale comparable to the wavelength of the incident energy, specular reflection occurs: most of the energy is reflected in a single direction, at an angle equal to the angle of incidence. Rougher surfaces cause scattering, or diffuse reflection, in all directions.


We perceive the surrounding world through our five senses. Some senses (touch and taste) require contact of our sensing organs with the objects. However, we acquire much information about our surroundings through the senses of sight and hearing, which do not require close contact between the sensing organs and the external objects. In other words, we are performing remote sensing all the time.


Generally, remote sensing refers to the activities of recording/observing/perceiving (sensing) objects or events at faraway (remote) places. In remote sensing, the sensors are not in direct contact with the objects or events being observed. The information needs a physical carrier to travel from the objects/events to the sensors through an intervening medium. Electromagnetic radiation is normally used as the information carrier in remote sensing. The output of a remote sensing system is usually an image representing the scene being observed. A further step of image analysis and interpretation is required in order to extract useful information from the image. The human visual system is an example of a remote sensing system in this general sense.

In a more restricted sense, remote sensing usually refers to the technology of acquiring information about the earth's surface (land and ocean) and atmosphere using sensors onboard airborne (aircraft, balloons) or spaceborne (satellites, space shuttles) platforms.

Active & Passive Systems

Source of Electromagnetic Radiation: One of the most important distinctions among remote sensing systems involves the source of the radiation used. The easiest example is that of a camera. When a camera relies on sunlight or even the ambient light in a room, it is said to be a passive system. When it uses a flash, it is an active system. That is, an active system provides its own radiation. Ordinary radar is an active system, while imaging near infrared systems are passive systems. Passive systems are used when there is sufficient illumination of the object of interest to allow detection. Active systems are used when there is insufficient radiation and it must be provided. A second reason for using passive systems is in situations where the radiation given off is not used for imaging alone, but also quantitatively describes properties of the object. Thermal infrared is an example: the radiation measured is related to the temperature of the object.


Transmission through the Atmosphere: In all these systems it is necessary for the radiation to pass through the atmosphere (once for passive systems, twice for active systems). Therefore, it is sometimes necessary to keep in mind the interaction between the atmosphere and the radiation. Perhaps the best example of this is the scattering of blue light by the atmosphere. Blue light is scattered out of the beam from the sun and then scattered toward us from all directions. If blue light were not scattered, the sun would look white instead of yellowish and the sky would be transparent. (We would see stars in the daytime; shadows would be very pronounced.) Ultraviolet light is somewhat "bluer" than blue light and is scattered even more strongly in the atmosphere. Furthermore, ultraviolet light will expose photographic film. On a bright day this scattered ultraviolet light will fog a photograph of distant objects. To avoid this, we use a filter which passes visible light but not ultraviolet light (called a UV filter). The utilization of almost every remote sensing system requires some consideration of the transmission and scattering properties of the atmosphere at the wavelengths used. These problems will be discussed where appropriate.

Interaction with the Earth's Surface: A major aspect of interpretation of remotely sensed data is the nature of the interaction of radiation with the earth's surface. Each kind of surface material has its own signature. For instance, a water surface absorbs the near infrared and reflects a fair amount of green light. Snow reflects both. While it is possible for the observer to catalogue these signatures, occasionally he will encounter an object whose signature is puzzling. In those cases it may be necessary to play "detective" and consider the aspects of the surface which may be producing the signature observed. For instance, the unusual occurrence of a rainstorm upon snow-covered sea ice may create an area with unusual absorption in the near infrared. It is not likely that this signature would be listed in any reference manual.

The nature of the interaction of radiation with the earth's surface can be quite different for active and passive systems. Passive systems depend on illumination from a natural source, usually the sun, or radiation emitted from the object. In this case, the angle of illumination is different from the "look" angle. However, usually there is sufficient illumination that there are few total shadows. Actually, we are quite used to this situation since we experience it daily. Most active systems depend on radiation emitted and reflected directly back to the source. This can create effects we do not experience on a daily basis. Consider how things look to you when using a flashlight on a dark night; shadows are troublesome. Yet, this is how the earth looks to airborne imaging radar.

Types of Remote Sensing:

Satellite Remote Sensing

These remote sensing satellites are equipped with sensors looking down to the earth. They are the "eyes in the sky" constantly observing the earth as they go round in predictable orbits.

Several remote sensing satellites are currently available, providing imagery suitable for various types of applications. Each of these satellite-sensor platforms is characterized by the wavelength bands employed in image acquisition, the spatial resolution of the sensor, the coverage area, and the temporal coverage, i.e. how frequently a given location on the earth's surface can be imaged by the imaging system.

In terms of the spatial resolution, the satellite imaging systems can be classified into:

• Low resolution systems (approx. 1 km or more)
• Medium resolution systems (approx. 100 m to 1 km)
• High resolution systems (approx. 5 m to 100 m)
• Very high resolution systems (approx. 5 m or less)

In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:

• Optical imaging systems (including visible, near infrared, and shortwave infrared systems)
• Thermal imaging systems
• Synthetic aperture radar (SAR) imaging systems

Optical/thermal imaging systems can be classified according to the number of spectral bands used:

• Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems
• Multispectral (several spectral bands) systems
• Superspectral (tens of spectral bands) systems
• Hyperspectral (hundreds of spectral bands) systems

Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands and polarization modes used in data acquisition, e.g.:

• Single frequency (L-band, C-band, or X-band)
• Multiple frequency (combination of two or more frequency bands)
• Single polarization (VV, HH, or HV)
• Multiple polarization (combination of two or more polarization modes)

Optical and Infrared Remote Sensing:

In Optical Remote Sensing, optical sensors detect solar radiation reflected or scattered from the earth, forming images resembling photographs taken by a camera high up in space. The wavelength region usually extends from the visible and near infrared (commonly abbreviated as VNIR) to the short-wave infrared (SWIR).


Different materials such as water, soil, vegetation, buildings and roads reflect visible and infrared light in different ways. They have different colors and brightness when seen under the sun. The interpretation of optical images requires knowledge of the spectral reflectance signatures of the various materials (natural or man-made) covering the surface of the earth.

There are also infrared sensors measuring the thermal infrared radiation emitted from the earth, from which the land or sea surface temperature can be derived.

Microwave Remote Sensing

There are some remote sensing satellites which carry passive or active microwave sensors. The active sensors emit pulses of microwave radiation to illuminate the areas to be imaged. Images of the earth surface are formed by measuring the microwave energy scattered by the ground or sea back to the sensors. These satellites carry their own "flashlight" emitting microwaves to illuminate their targets. The images can thus be acquired day and night. Microwaves have an additional advantage as they can penetrate clouds. Images can be acquired even when there are clouds covering the earth surface.

A microwave imaging system which can produce high resolution images of the Earth is the synthetic aperture radar (SAR). The intensity in a SAR image depends on the amount of microwave energy backscattered by the target and received by the SAR antenna. Since the physical mechanisms responsible for this backscatter are different for microwaves than for visible/infrared radiation, the interpretation of SAR images requires knowledge of how microwaves interact with the targets.

Electromagnetic radiation in the microwave wavelength region is used in remote sensing to provide useful information about the Earth's atmosphere, land and ocean.

A microwave radiometer is a passive device which records the natural microwave emission from the earth. It can be used to measure the total water content of the atmosphere within its field of view.

A radar altimeter sends out pulses of microwave signals and records the signals scattered back from the earth's surface. The height of the surface can be measured from the time delay of the return signals.

A wind scatterometer can be used to measure wind speed and direction over the ocean surface. It sends out pulses of microwaves along several directions and records the magnitude of the signals backscattered from the ocean surface. The magnitude of the backscattered signal is related to the ocean surface roughness, which in turn depends on the sea surface wind condition, so the wind speed and direction can be derived. Synthetic aperture radars, described below, are carried on airborne and spaceborne platforms to generate high resolution images of the earth's surface using microwave energy.


Platforms for Remote Sensing:

Remote sensing sensors are carried on airborne platforms such as aircraft and balloons, or on spaceborne platforms such as satellites and space shuttles.


Sources & Sensors:

Energy Sources:

• Visible Light is only one form of electromagnetic energy.

• Radio waves, heat, ultra-violet rays and X-rays are other familiar forms.

• All of this energy is inherently similar, and radiates in accordance with basic wave theory.

SOURCES OF ELECTROMAGNETIC RADIATION:

Black Body Radiation. All objects with a temperature above absolute zero emit electromagnetic radiation. The amount of radiation in each wavelength region depends on the temperature of the object in a complicated way, but the total radiation is proportional to the fourth power of the object's absolute (Kelvin) temperature (T^4). Hence an object at 373 K (boiling water) emits about 3.5 times as much radiant energy as water at 273 K (0 °C), although its absolute temperature is only 36% greater. The exact relationship between temperature and radiated energy per wavelength for a perfect radiator is called the black body curve; curves for objects at 2500, 2750 and 3000 K illustrate this relationship.

The Sun and Earth as Black Body Radiators. The sun's black body curve peaks at a wavelength of 0.5 µm or 0.0005 millimeters, the wavelength of blue-green light. Therefore, the highest radiation level available for remote sensing detectors is at this wavelength. This is also close to the center of the wavelength range of human eyesight. Hence, human eyes are adapted to making the most of the available radiant energy from the sun. However, the sun's black body curve extends from the visible wavelengths to the infrared and even to the microwave region and beyond.


The earth's absolute temperature is around 290 K (17 °C). The black body curve for this temperature peaks around 9.7 µm. This wavelength is well within the thermal infrared region of the spectrum. However, the earth radiates less energy than the sun at all wavelengths, even at the peak of the earth's own black body curve. For this reason, daytime thermal infrared measurements can be highly distorted by reflected or backscattered solar energy.
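These figures can be checked with the standard black body relations: the Stefan-Boltzmann law for total emission and Wien's displacement law for the peak wavelength. The sketch below is illustrative only; the constants are standard physical values, the water temperatures are the examples quoted above, and a solar temperature of about 5800 K is assumed (it is not quoted in the text).

```python
# Sketch: checking the black body figures quoted above using the
# Stefan-Boltzmann law (total emission ~ T^4) and Wien's displacement
# law (peak wavelength = b / T). Constants are standard physical values.

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.897771955e-3  # Wien's displacement constant, m K

def total_emission(temp_k: float) -> float:
    """Total radiant exitance of a black body, W/m^2."""
    return SIGMA * temp_k ** 4

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of peak emission in micrometers (Wien's law)."""
    return WIEN_B / temp_k * 1e6

# Boiling water vs. freezing water: (373/273)^4, i.e. ~3.5x the energy.
print(total_emission(373.0) / total_emission(273.0))   # ~3.48

# The sun (~5800 K, assumed) peaks near 0.5 um; the earth (~290 K)
# peaks near 10 um, in line with the ~9.7 um figure quoted above.
print(peak_wavelength_um(5800.0))  # ~0.50
print(peak_wavelength_um(290.0))   # ~9.99
```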

SENSORS:

Reflected solar radiation sensors: These sensor systems detect solar radiation that has been diffusely reflected (scattered) upward from surface features. The wavelength ranges that provide useful information include the ultraviolet, visible, near infrared and middle infrared ranges. Reflected solar sensing systems discriminate materials that have differing patterns of wavelength-specific absorption, which relate to the chemical make-up and physical structure of the material. Because they depend on sunlight as a source, these systems can only provide useful images during daylight hours, and changing atmospheric conditions and changes in illumination with time of day and season can pose interpretive problems. Reflected solar remote sensing systems are the most common type used to monitor Earth resources.

Thermal infrared sensors: Sensors that detect the thermal infrared radiation emitted by surface features can reveal information about the thermal properties of these materials. Like reflected solar sensors, these are passive systems that rely on solar radiation as the ultimate energy source. Because the temperature of surface features changes during the day, thermal infrared sensing systems are sensitive to the time of day at which the images are acquired.

Imaging radar sensors: Rather than relying on a natural source, these "active" systems "illuminate" the surface with broadcast microwave radiation, then measure the energy that is diffusely reflected back to the sensor. The returning energy provides information about the surface roughness and water content of surface materials and the shape of the land surface. Long-wavelength microwaves suffer little scattering in the atmosphere, even penetrating thick cloud cover. Imaging radar is therefore particularly useful in cloud-prone tropical regions.

Sensor types:

Optical:
• Visible: reflectance
• Near Infrared: reflectance
• Thermal Infrared: thermal radiation

Microwave:
• Passive (radiometer): microwave radiation
• Active (SAR, scatterometer, altimeter): backscatter

Laser:
• Active: intensity, time

FUSING DATA FROM DIFFERENT SENSORS:

Materials commonly found at the Earth's surface, such as soil, rocks, water, vegetation, and manmade features, possess many distinct physical properties that control their interactions with electromagnetic radiation. In the preceding pages we have discussed remote sensing systems that use three separate parts of the radiation spectrum: reflected solar radiation (visible and infrared), emitted thermal infrared, and imaging radar. Because the interactions of EM radiation with surface features in these spectral regions are different, each of the corresponding sensor systems measures a different set of physical properties. Although each type of system by itself can reveal a wealth of information about the identity and condition of surface materials, we can learn even more by combining image data from different sensors. Interpretation of the merged data set can employ rigorous quantitative analysis, or more qualitative visual analysis.

Process of Remote Sensing:

Energy Source or Illumination (A) - the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.

Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.


Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.

Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.

Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).

Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.

Application (G) - the final element of the remote sensing process is achieved when we apply the information that we have been able to extract from the imagery about the target, in order to better understand it, reveal some new information, or assist in solving a particular problem.

Interaction Process in Remote Sensing:

As sunlight initially enters the atmosphere, it encounters gas molecules, suspended dust particles, and aerosols. These materials tend to scatter a portion of the incoming radiation in all directions, with shorter wavelengths experiencing the strongest effect. (The preferential scattering of blue light in comparison to green and red light accounts for the blue color of the daytime sky. Clouds appear opaque because of intense scattering of visible light by tiny water droplets.) Although most of the remaining light is transmitted to the surface, some atmospheric gases are very effective at absorbing particular wavelengths. (The absorption of dangerous ultraviolet radiation by ozone is a well-known example.)


Interaction of Electromagnetic Radiation With The Earth's Surface:

Definition of Terms

Reflection: True reflection occurs when radiation which has been incident on a surface at some angle, θ, leaves the surface at that same angle as measured from a normal to the surface.

Scattering: Scattering occurs when light which has been incident on a surface (or within a volume such as a cloud) leaves at a wide range of angles. Very often the intensity of the scattered radiation varies with scattering angle. In some cases, such as volume scattering within our atmosphere, some wavelengths are scattered more than others. Some wavelengths will be scattered from a surface that will reflect other wavelengths. The general rule is that if the surface is smooth at a scale comparable to the radiation's wavelength, the radiation will be reflected; if the surface roughness elements are comparable to or larger than the wavelength, it will be scattered. For instance, a side-scanning radar image of a typical ice scene will show smooth ice as almost black, since nearly all the incident radiation is reflected away at the ice surface. There is often a tendency to interpret this as open water. On the other hand, even a small ridge will produce a bright return signal because the incident radiation is scattered in all directions by the many small surfaces of the ice composing the ridge.

Absorption: Except for unusual cases, some of an incident electromagnetic signal is absorbed by the material of the surface it strikes or the medium it passes through. In the case of both active and passive systems, absorption of the electromagnetic signal by the atmosphere plays a major role in determining which wavelengths are used. Water vapor in the atmosphere absorbs many of the microwave wavelengths leaving a few "windows" through which we can transmit and receive this information. Water strongly absorbs radiation in the near infrared portion of the spectrum used by Landsat band 7 images and NOAA series spacecraft (near IR band images). This absorption is so strong that wet snow and ice can be interpreted as water unless data from other wavelengths are available. Microwave radiation is also strongly absorbed by a thin film of water.

Effects of Atmosphere

In satellite remote sensing of the earth, the sensors are looking through a layer of atmosphere separating the sensors from the Earth's surface being observed. Hence, it is essential to understand the effects of atmosphere on the electromagnetic radiation travelling from the Earth to the sensor through the atmosphere.

The atmospheric constituents cause wavelength dependent absorption and scattering of radiation. These effects degrade the quality of images. Some of the atmospheric effects can be corrected before the images are subjected to further analysis and interpretation.

A consequence of atmospheric absorption is that certain wavelength bands in the electromagnetic spectrum are strongly absorbed and effectively blocked by the atmosphere. The wavelength regions in the electromagnetic spectrum usable for remote sensing are determined by their ability to penetrate the atmosphere. These regions are known as the atmospheric transmission windows. Remote sensing systems are often designed to operate within one or more of the atmospheric windows. These windows exist in the microwave region, some wavelength bands in the infrared, the entire visible region and part of the near ultraviolet region. Although the atmosphere is practically transparent to x-rays and gamma rays, these radiations are not normally used in remote sensing of the earth.

Electromagnetic Radiation

Experiments with electricity and magnetism in the 1800s developed a body of knowledge which led James Clerk Maxwell to predict, in 1865 on purely theoretical grounds, that it might be possible for electric and magnetic fields to combine, forming self-sustaining waves which could travel great distances. These waves would have many of the behavior characteristics of waves on water (reflection, refraction, diffraction, etc.) and would travel at the speed of light. These properties gave rise to the possibility that light was an electromagnetic wave, but at that time there was no proof that electromagnetic waves really existed. In 1888, Heinrich Hertz built an apparatus to send and receive Maxwell's waves. In this case the waves were around 5 meters long. The apparatus worked and, in addition, proved that the waves could be polarized, which turns out to be an important property from a remote sensing point of view. After this, it was learned that light, x-rays, infrared, ultraviolet, radio, microwaves, and gamma rays were all electromagnetic waves. The only property dividing them was their wavelength ranges. The names for these divisions arise from the interaction properties each wavelength range exhibits. (For instance, we see light, radio waves are useful for communication, x-rays pass through objects, etc.)

Long before the wave description of light was developed by Maxwell, Sir Isaac Newton had also considered it and discarded the idea. The waves Newton considered were not electromagnetic, but compressional waves in the space-filling ether. Newton's reasoning was sound and based on the fact that the ether waves could not have some of the properties of light which had been observed. (Electromagnetic waves can have these properties.) Instead, he reasoned that light was composed of a vast flow of very tiny particles or corpuscles called photons. He was able to show that the flood of photons could have the known properties of light. The discovery of electromagnetic waves created the suspicion that the photon concept was incorrect. However, in 1905, Albert Einstein was able to show that no matter how light travels from place to place, it is emitted and absorbed in small packets of energy (photons again). As a result, electromagnetic waves are dichotomous. They are emitted and absorbed as particles, but travel as waves. Scientific thought concerning this paradox continues to the present. Each representation has been found to have its particular utility.

Use of Electromagnetic Radiation For Remote Sensing

Aerial photography was the earliest form of remote sensing other than the telescope. For a long time, this technique relied on the portion of electromagnetic radiation used by our eyes (the visible spectrum). Early aerial photography was usually obtained on black and white film which responded to light over a broad range of visible light. Later, it was learned that by placing a filter in front of the lens which would pass only a particular color of light, a black and white record could be made of the objects reflecting light in that range. For instance, an aerial photograph of a developed area with a red-passing filter would show bare ground and many man-made surfaces which reflect a significant amount of red light. Hence, this photograph would be useful for identifying man-made features. This technique is used by the Landsat series of satellites today. Later, as color photography became available, color film was used in aerial photography. Again, filters could be used to enhance particular features.

Near Infrared Aerial Photography: During the Second World War there was a need to detect camouflaged objects. Although a great deal of aerial photography was obtained, it was often difficult to detect objects which had been painted green or had been covered with cut tree branches. Some experimental film was developed which responded to light in the near infrared portion of the spectrum, light just a little more red than the red light detected by the human eye. One of the anticipated uses for this film involved the monitoring of healthy vegetation, whose chlorophyll reflects the near infrared extremely well. This film was simply a black and white film with extended sensitivity which would record the near infrared if the visible light was filtered out. Later, a color film was developed which responded to the near infrared as well as visible colors (except blue). This was called color infrared film.

Growth of Remote Sensing: Encouraged by these results, efforts were made to utilize other electromagnetic wavelengths such as heat infrared, microwave, and radar for remote sensing purposes. Here the topic becomes complex because the radiation does not behave exactly as light does and it is not quite as simple to understand as the near infrared.

Imaging Satellite Systems: Another important factor in the development of remote sensing, particularly for ice surveillance, was the development of satellite systems which routinely return images to earth. The first of these systems operated in the visible portion of the spectrum because existing television technology was most easily applied there. Quickly, however, systems were developed to make use of other portions of the electromagnetic spectrum. At this time, satellite remote sensing systems based on radar were being developed.

Electromagnetic Waves

Page 17: seminar report on remote sensing

Electromagnetic waves are energy transported through space in the form of periodic disturbances of electric and magnetic fields. All electromagnetic waves travel through space at the same speed, c = 2.99792458 × 10^8 m/s, commonly known as the speed of light. An electromagnetic wave is characterized by a frequency and a wavelength. These two quantities are related to the speed of light by the equation:

Speed of light = frequency × wavelength

The frequency (and hence, the wavelength) of an electromagnetic wave depends on its source. There is a wide range of frequency encountered in our physical world, ranging from the low frequency of the electric waves generated by the power transmission lines to the very high frequency of the gamma rays originating from the atomic nuclei. This wide frequency range of electromagnetic waves constitutes the Electromagnetic Spectrum.
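As a small worked example of this relation, the sketch below converts between frequency and wavelength using c; the example frequencies are chosen for illustration only.

```python
# Sketch: converting between frequency and wavelength via c = f * lambda.
C = 2.99792458e8  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for a given frequency in hertz."""
    return C / freq_hz

def frequency_hz(wl_m: float) -> float:
    """Frequency in hertz for a given wavelength in meters."""
    return C / wl_m

print(wavelength_m(5.3e9))   # a 5.3 GHz (C-band) radar -> ~0.057 m
print(frequency_hz(500e-9))  # 500 nm green light -> ~6.0e14 Hz
```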

The Electromagnetic Spectrum:

The electromagnetic spectrum can be divided into several wavelength (frequency) regions, among which only a narrow band from about 400 to 700 nm is visible to the human eye. Note that there is no sharp boundary between these regions; the boundaries are approximate and adjacent regions overlap.


Wavelength units: 1 mm = 1000 µm; 1 µm = 1000 nm.

Radio Waves: 10 cm to 10 km wavelength.

Microwaves: 1 mm to 1 m wavelength. The microwaves are further divided into different frequency (wavelength) bands (1 GHz = 10^9 Hz):

o P band: 0.3 - 1 GHz (30 - 100 cm)
o L band: 1 - 2 GHz (15 - 30 cm)
o S band: 2 - 4 GHz (7.5 - 15 cm)
o C band: 4 - 8 GHz (3.8 - 7.5 cm)
o X band: 8 - 12.5 GHz (2.4 - 3.8 cm)
o Ku band: 12.5 - 18 GHz (1.7 - 2.4 cm)
o K band: 18 - 26.5 GHz (1.1 - 1.7 cm)
o Ka band: 26.5 - 40 GHz (0.75 - 1.1 cm)
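These band edges translate directly into a lookup table; a minimal sketch (the example frequencies below are assumptions for illustration):

```python
# Sketch: mapping a radar frequency to the microwave band letters
# tabulated above. Band edges are copied from the table.
RADAR_BANDS = [
    ("P", 0.3, 1.0), ("L", 1.0, 2.0), ("S", 2.0, 4.0), ("C", 4.0, 8.0),
    ("X", 8.0, 12.5), ("Ku", 12.5, 18.0), ("K", 18.0, 26.5), ("Ka", 26.5, 40.0),
]

def radar_band(freq_ghz: float) -> str:
    """Return the band letter for a frequency in GHz."""
    for name, low, high in RADAR_BANDS:
        if low <= freq_ghz < high:
            return name
    return "outside tabulated range"

print(radar_band(5.3))   # -> "C" (a typical spaceborne SAR frequency)
print(radar_band(1.3))   # -> "L"
```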

Infrared: 0.7 to 300 µm wavelength. This region is further divided into the following bands:

o Near Infrared (NIR): 0.7 to 1.5 µm
o Short Wavelength Infrared (SWIR): 1.5 to 3 µm
o Mid Wavelength Infrared (MWIR): 3 to 8 µm
o Long Wavelength Infrared (LWIR): 8 to 15 µm
o Far Infrared (FIR): longer than 15 µm

The NIR and SWIR are also known as the Reflected Infrared, referring to the main infrared component of the solar radiation reflected from the earth's surface. The MWIR and LWIR are the Thermal Infrared.

Visible Light: This narrow band of electromagnetic radiation extends from about 400 nm (violet) to about 700 nm (red). The various color components of the visible spectrum fall roughly within the following wavelength regions:

o Red: 610 - 700 nm
o Orange: 590 - 610 nm
o Yellow: 570 - 590 nm
o Green: 500 - 570 nm
o Blue: 450 - 500 nm
o Indigo: 430 - 450 nm
o Violet: 400 - 430 nm

Ultraviolet: 3 to 400 nm wavelength.

X-Rays and Gamma Rays: wavelengths shorter than the ultraviolet.

CHARACTERISTICS OF THE ELECTROMAGNETIC SPECTRUM

Ultraviolet (UV): This wavelength region has not been used to monitor sea ice and it is not likely to be used in the future. Because of the high degree of atmospheric scattering in this wavelength region, there is a tendency for imagery to appear "fuzzy". The radiation source is the sun and the systems used are, therefore, passive.

Visible Light: This wavelength region, principally the green and red portion, is used by Landsat and NOAA weather satellites to produce map-like imagery. The green portion is particularly sensitive to ice regardless of whether it is newly formed and thin or old and flooded. This portion of the spectrum is cloud-limited. As with the UV, the radiation source is the sun.


Near Infrared: This wavelength region is often detected along with visible light. Landsat and NOAA weather satellites produce images in this wavelength region. The imagery is of great utility to remote sensing of sea ice because it is highly sensitive to water/ice boundaries and water upon ice. It generally presents greater contrast between ice types, and between ice and water, than do the visible wavelengths.

Thermal Infrared: This wavelength region is truly representative of heat. However, interpretation of thermal infrared imagery can be somewhat difficult. For many purposes, the best imagery is obtained just before dawn so that solar heating effects are at a minimum. Since the thermal infrared is absorbed by clouds and fog, it is useful to have a visual image as well as a thermal image to help identify them and the areas modified by their influences.

Microwave: Data is obtained in this region by both active and passive methods. The earth's surface does emit microwave radiation in very small amounts as a manifestation of its temperature. It is, therefore, necessary to use very sensitive microwave receivers (radiometers) to measure this radiation. This wavelength region is not affected by ordinary cloudiness, but the shorter wavelengths can be absorbed by the raindrops in severe storms. The active systems using this wavelength region come under the heading of radar. Side-scanning radar systems are operated routinely aboard Canadian ice surveillance aircraft. Imaging radar has also been used experimentally aboard spacecraft and it is likely that data from an operational satellite imaging radar system will be available relatively soon. The active systems send out and receive back a much stronger signal than the passive microwave systems. Hence, the "background" radiation of the earth does not confuse the signal received.

Reflectance

When solar radiation hits a target surface, it may be transmitted, absorbed or reflected. Different materials reflect and absorb differently at different wavelengths. The reflectance spectrum of a material is a plot of the fraction of radiation reflected as a function of the incident wavelength and serves as a unique signature for the material. In principle, a material can be identified from its spectral reflectance signature if the sensing system has sufficient spectral resolution to distinguish its spectrum from those of other materials. This premise provides the basis for multispectral remote sensing.
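A minimal sketch of this premise: classify a pixel by finding the reference signature closest to its measured reflectances. The band set and reflectance values below are illustrative placeholders, not measured data.

```python
# Sketch: minimum-distance identification of a material from its
# multispectral reflectance signature. Values are illustrative only.
import math

REFERENCE_SPECTRA = {  # reflectance in [blue, green, red, NIR] bands
    "clear water": [0.08, 0.06, 0.03, 0.01],
    "vegetation":  [0.05, 0.10, 0.05, 0.50],
    "bare soil":   [0.10, 0.15, 0.20, 0.25],
}

def classify(pixel):
    """Return the reference material with the smallest spectral distance."""
    return min(REFERENCE_SPECTRA,
               key=lambda name: math.dist(pixel, REFERENCE_SPECTRA[name]))

print(classify([0.06, 0.11, 0.06, 0.45]))  # -> "vegetation"
```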

The following graph shows the typical reflectance spectra of five materials: clear water, turbid water, bare soil and two types of vegetation.


Reflectance Spectrum of Five Types of Landcover

The reflectance of clear water is generally low. However, the reflectance is maximum at the blue end of the spectrum and decreases as wavelength increases. Hence, clear water appears dark-bluish. Turbid water has some sediment suspension which increases the reflectance in the red end of the spectrum, accounting for its brownish appearance. The reflectance of bare soil generally depends on its composition. In the example shown, the reflectance increases monotonically with increasing wavelength. Hence, it should appear yellowish-red to the eye.

Vegetation has a unique spectral signature which enables it to be distinguished readily from other types of land cover in an optical/near-infrared image. The reflectance is low in both the blue and red regions of the spectrum, due to absorption by chlorophyll for photosynthesis. It has a peak in the green region, which gives rise to the green color of vegetation. In the near infrared (NIR) region, the reflectance is much higher than in the visible band due to the cellular structure of the leaves. Hence, vegetation can be identified by its high NIR but generally low visible reflectance. This property was used in early reconnaissance missions in wartime for "camouflage detection".
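This report does not name it, but the standard way to exploit the high-NIR, low-red contrast described here is a band-ratio index such as the Normalized Difference Vegetation Index (NDVI). A minimal sketch with illustrative reflectance values:

```python
# Sketch: NDVI = (NIR - Red) / (NIR + Red). Dense healthy vegetation
# approaches +1; water and bare surfaces fall near or below zero.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    total = nir + red
    # Guard against division by zero in completely dark pixels.
    return np.divide(nir - red, total,
                     out=np.zeros_like(total), where=total != 0)

red_band = np.array([[0.05, 0.20], [0.03, 0.10]])  # illustrative values
nir_band = np.array([[0.50, 0.25], [0.02, 0.40]])
print(ndvi(nir_band, red_band))  # vegetation pixel at [0, 0] -> ~0.82
```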


Typical Reflectance Spectrum of Vegetation. The labeled arrows indicate the common wavelength bands used in optical remote sensing of vegetation: A: blue band; B: green band; C: red band; D: near IR band; E: short-wave IR band.

The shape of the reflectance spectrum can be used for identification of vegetation type. For example, the reflectance spectra of vegetation types 1 and 2 in the figure above can be distinguished although they exhibit the general characteristics of high NIR but low visible reflectance. Vegetation 1 has higher reflectance in the visible region but lower reflectance in the NIR region. For the same vegetation type, the reflectance spectrum also depends on other factors such as the leaf moisture content and health of the plants.

The reflectance of vegetation in the SWIR region (e.g. band 5 of Landsat TM and band 4 of the SPOT 4 sensor) is more varied, depending on the type of plant and the plant's water content. Water has strong absorption bands around 1.45, 1.95 and 2.50 µm. Outside these absorption bands in the SWIR region, the reflectance of leaves generally increases as leaf liquid water content decreases. This property can be used for identifying tree types and plant conditions from remote sensing images. The SWIR band can be used for detecting plant drought stress and delineating burnt areas and fire-affected vegetation. The SWIR band is also sensitive to the thermal radiation emitted by intense fires, and hence can be used to detect active fires, especially at night when the background interference from SWIR in reflected sunlight is absent.

Color Composite Images


In displaying a color composite image, three primary colors (red, green and blue) are used. When these three colors are combined in various proportions, they produce different colors in the visible spectrum. Associating each spectral band (not necessarily a visible band) to a separate primary color results in a color composite image.

Many colors can be formed by combining the three primary colors (Red, Green, Blue) in various proportions.

True Color Composite

If a multispectral image consists of the three visual primary color bands (red, green, blue), the three bands may be combined to produce a "true color" image. For example, the bands 3 (red band), 2 (green band) and 1 (blue band) of a LANDSAT TM image or an IKONOS multispectral image can be assigned respectively to the R, G, and B colors for display. In this way, the colors of the resulting color composite image resemble closely what would be observed by the human eyes.

A 1-m resolution true-color IKONOS image.

False Color Composite


The display color assignment for any band of a multispectral image can be done in an entirely arbitrary manner. In this case, the color of a target in the displayed image does not have any resemblance to its actual color. The resulting product is known as a false color composite image. There are many possible schemes for producing false color composite images, and some schemes are more suitable than others for detecting certain objects in the image.

A very common false color composite scheme for displaying a SPOT multispectral image is shown below:

R = XS3 (NIR band)
G = XS2 (red band)
B = XS1 (green band)

This false color composite scheme allows vegetation to be detected readily in the image. In this type of false color composite image, vegetation appears in different shades of red depending on the type and condition of the vegetation, since it has a high reflectance in the NIR band (as shown in the graph of spectral reflectance signatures).

Clear water appears dark-bluish (higher green band reflectance), while turbid water appears cyan (higher red reflectance due to sediments) compared to clear water. Bare soils, roads and buildings may appear in various shades of blue, yellow or grey, depending on their composition.

False color composite multispectral SPOT image. Red: XS3; Green: XS2; Blue: XS1.
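Digitally, such a composite is simply a stacking of the three bands into the display channels. A minimal sketch, assuming the bands are available as 2-D arrays (the data here are placeholders):

```python
# Sketch: standard false color composite -- NIR, red, and green bands
# assigned to the R, G, B display channels respectively.
import numpy as np

def false_color_composite(xs3, xs2, xs1):
    """Stack XS3 (NIR), XS2 (red), XS1 (green) into an RGB array in 0..1."""
    rgb = np.dstack([xs3, xs2, xs1]).astype(np.float64)
    return np.clip(rgb / rgb.max(), 0.0, 1.0)

# Placeholder band data standing in for a real SPOT scene.
xs1, xs2, xs3 = (np.random.rand(4, 4) for _ in range(3))
image = false_color_composite(xs3, xs2, xs1)
print(image.shape)  # (4, 4, 3)
```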

Another common false color composite scheme for displaying an optical image with a short-wave infrared (SWIR) band is shown below:

R = SWIR band (SPOT4 band 4, Landsat TM band 5)
G = NIR band (SPOT4 band 3, Landsat TM band 4)
B = Red band (SPOT4 band 2, Landsat TM band 3)

An example of this false color composite display is shown below for a SPOT 4 image.


False color composite of a SPOT 4 multispectral image including the SWIR band. Red: SWIR band; Green: NIR band; Blue: Red band. In this display scheme, vegetation appears in shades of green. Bare soils and clearcut areas appear purplish or magenta. The patch of bright red area on the left is the location of active fires. A smoke plume originating from the active fire site appears faint bluish in color.

False color composite of a SPOT 4 multispectral image without the SWIR band. Red: NIR band; Green: Red band; Blue: Green band. Vegetation appears in shades of red. The smoke plume appears bright bluish white.


Natural Color Composite

For optical images lacking one or more of the three visual primary color bands (i.e. red, green and blue), the spectral bands (some of which may not be in the visible region) may be combined in such a way that the appearance of the displayed image resembles a visible color photograph, i.e. vegetation in green, water in blue, soil in brown or grey, etc. Many people refer to this composite as a "true color" composite. However, this term is misleading since in many instances the colors are only simulated to look similar to the "true" colors of the targets. The term "natural color" is preferred.

The SPOT HRV multispectral sensor does not have a blue band. The three bands, XS1, XS2 and XS3 correspond to the green, red, and NIR bands respectively. But a reasonably good natural color composite can be produced by the following combination of the spectral bands:

R = XS2
G = (3 XS1 + XS3)/4
B = (3 XS1 - XS3)/4

where R, G and B are the display color channels.
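A sketch of this combination in code, assuming XS1, XS2 and XS3 are reflectance arrays; clipping negative values in the blue channel is a display choice assumed here, not specified above:

```python
# Sketch: SPOT HRV natural color composite from green (XS1), red (XS2)
# and NIR (XS3) bands, following the combination given above.
import numpy as np

def natural_color_composite(xs1, xs2, xs3):
    r = xs2                         # red band drives the R channel
    g = (3.0 * xs1 + xs3) / 4.0     # 0.75*green + 0.25*NIR
    b = (3.0 * xs1 - xs3) / 4.0     # 0.75*green - 0.25*NIR
    b = np.clip(b, 0.0, None)       # assumed: clamp negative blue values
    rgb = np.dstack([r, g, b])
    return np.clip(rgb / rgb.max(), 0.0, 1.0)
```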

Natural color composite multispectral SPOT image. Red: XS2; Green: 0.75 XS1 + 0.25 XS3; Blue: 0.75 XS1 - 0.25 XS3.


Synthetic Aperture Radar (SAR)

In synthetic aperture radar (SAR) imaging, microwave pulses are transmitted by an antenna towards the earth surface. The microwave energy scattered back to the spacecraft is measured. The SAR makes use of the radar principle to form an image by utilising the time delay of the backscattered signals.

A radar pulse is transmitted from the antenna to the ground

The radar pulse is scattered by the ground targets back to the antenna.

In real aperture radar imaging, the ground resolution is limited by the size of the microwave beam sent out from the antenna. Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna, i.e. the longer the antenna, the narrower the beam.

It is not feasible for a spacecraft to carry the very long antenna which would be required for high resolution imaging of the earth's surface. To overcome this limitation, SAR capitalizes on the motion of the spacecraft to emulate a large antenna (about 4 km for the ERS SAR) from the small antenna (10 m on the ERS satellite) it actually carries on board.
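The arithmetic behind this can be sketched with a diffraction-limited beamwidth of roughly λ/D. Only the 10 m antenna length is quoted above; the C-band wavelength (~5.7 cm) and an ERS-like slant range (~850 km) are assumed values for illustration:

```python
# Sketch: why SAR is needed. A real aperture's azimuth footprint at
# satellite range is kilometers wide; synthesizing a long aperture from
# the platform's motion brings azimuth resolution down to roughly D/2.
wavelength = 0.057      # m, C-band (assumed)
antenna_length = 10.0   # m, ERS SAR antenna (from the text)
slant_range = 850e3     # m, ERS-like slant range (assumed)

beamwidth = wavelength / antenna_length        # ~0.0057 rad
real_aperture_res = beamwidth * slant_range    # azimuth footprint on ground
synthetic_res = antenna_length / 2.0           # classic SAR azimuth limit

print(real_aperture_res)  # ~4800 m: consistent with the ~4 km aperture above
print(synthetic_res)      # 5.0 m
```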

Limitations of Remote Sensing Systems


Scale Limitations: There can be a tremendous amount of information available to a remote sensing system. Imaging systems in particular often have more data available than they can collect and transmit back to earth or that can be processed when returned to earth. As a consequence, there are often compromises struck among data resolution, area covered, and frequency of coverage. These compromises can be a cause of some frustration to the image user.

Picture Element Size: There is one feature common to all imaging remote sensing systems which has a direct influence on the analysis of the imagery. This is the size of the picture element, the pixel. Perhaps the best example of pixels comes from photography. Photographic film contains many small silver halide crystals. Exposure to light causes the transparent silver halide crystals to become opaque silver crystals. The objects we see on photographic film are actually composed of many of these silver crystals. These crystals impose the limitation on the size of a clear enlargement which can be made from the film. The reason that enlargements look "grainy" is that the individual crystals on the original film are becoming separately visible. Obviously, it takes many silver crystals to define an object. For instance, a hundred crystals may be necessary just to make an object identifiable as a human being.

Satellite remote sensing systems divide the earth's surface into an array of rectangular pixels and transmit back to earth a digital signal defining the amount of electromagnetic radiation received at the satellite from that pixel. The size of the pixel on the ground defines the limiting resolution of that particular remote sensing system. It should be clear that in order to "see" something on an image, its minimum size must be comparable to the size of the pixel. This concept will be discussed in more detail in a subsequent section.
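As a quick illustration of the compromise described above, the pixel count needed to cover a scene grows with the square of the linear resolution, which drives the data volume the system must transmit. The scene and pixel sizes below are arbitrary examples:

```python
# Sketch: pixel count (and hence raw data volume) versus pixel size
# for a square scene. Halving the pixel size quadruples the count.
def pixel_count(scene_km: float, pixel_m: float) -> int:
    per_side = int(scene_km * 1000 / pixel_m)
    return per_side ** 2

print(pixel_count(100, 30))  # 100 km scene, 30 m pixels -> ~11 million
print(pixel_count(100, 10))  # same scene, 10 m pixels  -> 100 million
```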

Measurable Levels of Radiation: The interpretation of remotely sensed data is ultimately limited by the amount of radiation received by the recording system. Although some "image enhancement" and even pattern recognition logic can be applied, no magic is possible. This seems like an obvious statement, but very often one finds persons engaged in analysis of remotely sensed data, attempting to identify features or surface cover types when the information needed for their distinction was simply not available to be recorded. Either the reflectance or emission of the object or surface type was not distinctive compared with its surroundings or it was too subtle to be recorded. Occasionally, the analyst must ask himself if there really was a distinctive signal difference available to define the target of interest in the first place. If a negative answer is obtained, a great deal of time may be saved. Later sections will give background material upon which to make such judgements.

Other Limiting Factors: There are many factors which can limit the ability to receive the desired signal even if an adequate signal did originate from the earth's surface. In the case of visual signals, clouds are an obvious example. Clouds can be present over a specific study site a good deal of the time. Maps of the average percent cloud cover over the North Polar Region during January and July show, for example, that in January the lowest percent cloud cover (35%) is over the central arctic basin.

Applications 


Natural resource management is a broad field covering many different application areas, ranging from monitoring fish stocks to assessing the effects of natural disasters (hazard assessment). Remote sensing can be used for applications in several different areas, including:

• Geology and mineral exploration
• Hazard assessment
• Oceanography
• Agriculture and forestry
• Land degradation
• Environmental monitoring

Each sensor was designed with a specific purpose. With optical sensors, the design focuses on the spectral bands to be collected. With radar imaging, the incidence angle and microwave band used play an important role in defining which applications the sensor is best suited for.

Each application has its own demands for spectral resolution, spatial resolution, and temporal resolution.

Briefly, spectral resolution refers to the width or range of each spectral band being recorded. As an example, panchromatic imagery (sensing a broad range of all visible wavelengths) will not be as sensitive to vegetation stress as a narrow band in the red wavelengths, where chlorophyll strongly absorbs electromagnetic energy.

Spatial resolution refers to the discernible detail in the image. Detailed mapping of wetlands requires far finer spatial resolution than does the regional mapping of physiographic areas.

Temporal resolution refers to the time interval between images. There are applications requiring data repeatedly and often, such as oil spill, forest fire, and sea ice motion monitoring. Some applications only require seasonal imaging (crop identification, forest insect infestation, and wetland monitoring), and some need imaging only once (geological structural mapping). Obviously, the most time-critical applications also demand fast turnaround for image processing and delivery - getting useful imagery quickly into the user's hands.

Let us consider a concrete application: the use of remote sensing in forest inventory. Forest inventory is a broad application area covering the gathering of information on species distribution, age, height, density and site quality.

For species identification, we could use imaging systems or aerial photos. For the age and height of the trees, radar could be used in combination with the species information assessed at a first stage. Density is assessed mainly by optical interpretation of aerial photos and/or high-resolution panchromatic images.


Site quality is one of the more difficult things to assess. It is based on topographical position, soil type, and drainage and moisture regime. The topographical position can be estimated using laser or radar. However, the soil type and the drainage and moisture regime are more profitably collected using ground data.

The Use of Remote Sensing in Crop Monitoring (a real case)

The countries of the European Communities (EC) are using remote sensing to help fulfill the requirements and mandate of the EC Agricultural Policy, which is common to all members. The requirements are to delineate, identify, and measure the extent of important crops throughout Europe, and to provide a forecast of production early in the season. Standardized procedures for collecting these data are based on remote sensing technology, developed and defined through the MARS project (Monitoring Agriculture by Remote Sensing).

The project uses many types of remotely sensed data, from low resolution NOAA-AVHRR to high-resolution radar, and numerous sources of ancillary data. These data are used to classify crop type over a regional scale to conduct regional inventories, assess vegetation condition, estimate potential yield, and finally to predict similar statistics for other areas and compare results. Multisource data such as VIR and radar were introduced into the project to increase classification accuracy. Radar provides very different information than the VIR sensors, particularly on vegetation structure, which proves valuable when attempting to differentiate between crop types.

One of the key applications within this project is the operational use of high resolution optical and radar data to confirm conditions claimed by a farmer when he requests aid or compensation. The use of remote sensing identifies potential areas of non-compliance or suspicious circumstances, which can then be investigated by other, more direct methods.

As part of the Integrated Administration and Control System (IACS), remote sensing data supports the development and management of databases, which include cadastral information, declared land use, and parcel measurement. This information is considered when applications are received for area subsidies.

This is an example of a truly successful operational crop identification and monitoring application of remote sensing.

Bibliography


www.eletronics4u.com

http://www.star.ait.ac.th

http://www.acrors.ait.ac.th

www.gac.ait.ac.th

www.thesis.com