
‘Digitally’ ‘Reading’ a Satellite Image

Satellite images contain a wealth of information about the land area they cover. They are like mines rich in ore: you have to explore, process and refine them into finished products to make them usable and to extract the desired information.

Information extraction from satellite images requires a basic knowledge of image interpretation, skill in image processing and compatible software that can convert data into information. The elements of visual image interpretation have already been discussed in detail in the section Reading a Satellite Image.

With the advancement of technology, digital image processing has also progressed greatly and is being used very effectively for information extraction from satellite images. What we often miss in visual image interpretation, owing to the limitations of our eyes, can be brought out using digital image processing methods.

In digital image processing, a number of algorithms are used to process satellite images. These algorithms digitally manipulate raw images and convert them into the desired information. They are mostly used to emphasize and extract features of interest. For example, vegetation indices are applied to derive valuable information about vegetation.

Digital image processing, in simple words, is playing with the digital numbers (DN) of pixels. But we have to master this play if we want desirable results. It is as easy, and as difficult, as playing with simple numbers: image processing techniques are based on the everyday operators of addition, subtraction, multiplication and division. A good knowledge of statistics is also required, because statistics is applied very frequently in many image processing techniques. For example, Principal Component Analysis (PCA) of satellite images is a statistics-based process. PCA is widely used to extract useful information from multiple bands while filtering noise from the data.

Vegetation Index

Vegetation indices are very frequently and commonly used for vegetation-related studies. These indices are (mostly) based on 'ratioing' of the infrared and red bands. This is because vegetation reflects a large amount of electromagnetic radiation (EMR) in the infrared region while absorbing strongly in the red region.

Some of the vegetation indices are the ratio vegetation index (RVI), the normalized difference vegetation index (NDVI) and the transformed vegetation index (TVI). NDVI is particularly helpful in analysing vegetation health and vegetation cover density.
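
The ratioing idea can be sketched in a few lines of code; this is a minimal illustration using NumPy, and the band arrays and DN values are hypothetical:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values range from -1 to +1; dense healthy vegetation gives high
    positive values, while water and bare soil give low or negative ones.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1e-10   # avoid division by zero
    return (nir - red) / denom

# Hypothetical 2x2 patch of DN values from the NIR and red bands
nir_band = np.array([[200, 180], [50, 40]])
red_band = np.array([[40, 60], [45, 38]])
print(ndvi(nir_band, red_band))
```

The top row (high NIR, low red) comes out with high NDVI, as vegetated pixels would; the bottom row stays near zero.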

Principal Component Analysis (PCA)

PCA techniques are used to 'compile' information from a large number of bands into a smaller number of bands. Suppose we have to extract information from five bands of a Landsat ETM+ image. When we perform PCA, it analyses all five bands and removes redundant information to produce output in the form of PCA images. The first PCA image contains most of the information, and the information content keeps decreasing in the second, third and subsequent images. In other words, PCA compresses the information of multiple bands into one or two images. Needless to say, this technique enhances variance in the satellite images.
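
A minimal sketch of band-wise PCA, assuming the image is held as a NumPy array with one layer per band; the five-band random image here is only a stand-in for real Landsat data:

```python
import numpy as np

def pca_bands(image, n_components=2):
    """PCA of a multiband image of shape (bands, rows, cols).

    Returns the first n_components principal-component images;
    PC1 carries the largest share of the total variance.
    """
    bands, rows, cols = image.shape
    X = image.reshape(bands, -1).astype(float)    # one row per band
    X -= X.mean(axis=1, keepdims=True)            # centre each band
    cov = np.cov(X)                               # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]             # largest variance first
    pcs = eigvecs[:, order[:n_components]].T @ X  # project onto the PCs
    return pcs.reshape(n_components, rows, cols)

# Hypothetical 5-band, 4x4 image (e.g. five Landsat ETM+ bands)
rng = np.random.default_rng(0)
img = rng.random((5, 4, 4))
pc = pca_bands(img, n_components=2)
print(pc.shape)   # two PC images, each 4x4
```

By construction the first PC image has at least as much variance as the second, which is exactly the "information content keeps decreasing" behaviour described above.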

Thermal Pollution & Remote Sensing

Thermal water pollution is a potential hazard to aquatic life. Many industries (such as thermal power stations) release hot water in their effluent directly into rivers or water bodies. This is against environmental law in many countries, but detection is difficult: hot water is no different in colour from cold water, so our eyes cannot recognize it. Moreover, such illegal releases mostly take place at night. So how do we detect them? The answer is thermal remote sensing.

Thermal pollution can easily be detected by thermal remote sensing (or thermal imaging) devices. These operate in the thermal infrared bands of EMR and can acquire data to produce night-time images. Thermal imaging captures the EMR emitted by an object rather than reflected from it. When hot water is released into a stream it emits thermal radiation, which is sensed by thermal imaging devices. Hot water emits more thermal radiation than cold water, so it appears brighter while cold water appears darker. Hot-water plumes become very clear and conspicuous in thermal images, and we can detect both the source of the release and the extent of the spread of the hot water.
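
A toy illustration of the principle, assuming a thermal-band image is available as an array of DN values; the grid and the simple mean-plus-one-standard-deviation threshold are illustrative, not an operational method:

```python
import numpy as np

# Hypothetical thermal-band DN grid: hotter water emits more thermal
# radiation, so it appears as higher (brighter) values.
thermal = np.array([[20, 21, 22, 20],
                    [21, 45, 48, 22],
                    [20, 44, 47, 21],
                    [19, 20, 21, 20]])

threshold = thermal.mean() + thermal.std()   # simple brightness cut-off
plume = thermal > threshold                  # mask of hot-water pixels
print(plume.sum(), "hot pixels")
```

The bright block of values in the middle stands out against the cooler background, just as a hot-water plume stands out in a real thermal image.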

Urban Heat Detection

Urban areas behave as heat islands due to their very high emission of thermal radiation. Their temperature remains higher than that of surrounding non-urban areas, basically due to high energy consumption, the very large surface area covered by asphalt and concrete, less exposed soil and low vegetation cover.

Energy is mostly consumed in the form of electricity, and in the process a large amount of heat is released. Concrete and asphalt surfaces (buildings, roads, pavements etc.) absorb solar radiation and later release it as thermal emission. Vehicular and industrial pollution in cities also contributes to heat emission, and thus to increasing urban ambient temperature.

Thermal imaging devices help in detecting the relative warmness or coolness of urban heat islands. High-energy-consumption areas and hot spots can be mapped using thermal remote sensing images. Since vegetation reduces ambient temperature significantly, thermal imaging can also help identify urban areas where plantation is required to cool down the hot spots.

Before Selecting a Satellite Image!!

The choice of satellite imagery depends very much on the purpose of the user, in other words on what kind of information we want. Certain things should always be kept in mind while procuring a satellite image:
  • Time of the year
  • Cloud coverage
  • Spatial resolution
  • Spectral resolution
  • Financial constraints

Time of the Year

Time is a very important factor in deciding whether to pick a satellite image or not. For example, if you want to study deciduous vegetation you must not go for an image acquired by the sensor in autumn. Or, if you want to map water holes and have procured summer-time imagery, the results will be disastrous.

One should not conclude that autumn-time images are useless; on the contrary, they are very useful if evergreen vegetation is to be studied and mapped. In fact, such images are helpful in differentiating deciduous from evergreen vegetation types.

Cloud Cover

Cloud cover is also an important factor, as clouds may mask ground features partially or completely. That is why rainy-season images are not preferred (unless one wants to do a study specific to that season). If you are lucky you may get a cloud-free image in the rainy season too, and winter and summer images may at times contain clouds. So always view the preview of images before procuring them, and avoid images with more than 10% cloud cover.

Clouds are the main constraint in various types of land use/land cover studies of North-East India and the Andaman & Nicobar Islands, as these areas remain covered with clouds most of the time.

Spatial & Spectral Resolution

These resolutions matter in relation to the kind of study you want to do. If you want satellite images for cartographic purposes, then high-resolution panchromatic images are good (e.g. CARTOSAT-1 and 2 images, IKONOS panchromatic images). On the other hand, forest mapping requires multispectral images, which may be lower in spatial resolution, such as Landsat ETM+ or IRS-1D LISS III images. Where built-up areas and vegetation types are to be mapped together, multispectral images with high spatial resolution are used (e.g. IRS-P6 LISS IV MX images).

Financial Constraints

If you have selected a cloud-free satellite image of the right season with the right resolution, then you must also look at your pocket: do you have sufficient money? Going for high-resolution images is definitely going to cost much more, so where a Landsat ETM+ or IRS-1D image will do, don't go for IKONOS multispectral images. Processed images also cost more; if you have good image processing software and a skilled team, don't buy processed images, as you can do the processing in your own lab. If you can spend a little extra, procuring georeferenced images from your vendor (such as NRSA or Space Imaging) can be a good idea, as it will save time and you will also get data with acceptable accuracy.

A Note on Digital Satellite Images

Satellite images are used in two forms: analogue and digital. Analogue images are popularly known as hard-copy images.

Digital images are made up of tiny picture elements known as 'pixels'. These images are like matrices divided into rows and columns. As the name suggests, digital images have something to do with digits (numbers): they are numerical representations of the features observed by a remote sensing device, which means every pixel must have a number, or value, to represent itself. These values are called pixel values or digital numbers (DN). The digital number of a pixel depends upon the reflectance of the feature recorded by the sensor. A pixel does not necessarily hold the reflectance of a single object; it may be an aggregate of the many features 'falling' within that pixel. Suppose we are interpreting a satellite image acquired by the IRS-1D LISS III sensor. A single pixel represents 23.5 m x 23.5 m of ground area; within this area, a pixel may contain the composite reflectance of a road, some roadside trees and a nearby building.
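
As a tiny illustration, a digital image can be modelled as a matrix of DN values addressed by row and column; the numbers below are invented:

```python
import numpy as np

# A toy 3x4 'digital image': each entry is one pixel's digital number (DN).
# Row/column position gives the pixel's location in the image; the DN
# encodes the reflectance the sensor recorded for that ground area.
dn = np.array([[ 12,  15, 210, 205],
               [ 10,  18, 198, 201],
               [ 90,  95, 100, 110]], dtype=np.uint8)

print(dn.shape)   # 3 rows x 4 columns
print(dn[0, 2])   # DN of the pixel in row 0, column 2
```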

A pixel of a satellite image has spatial and spectral properties. The spatial property represents the spatial resolution of the satellite sensor, while the spectral property governs the appearance of objects in an image.

On the basis of spectral properties, satellite images are of two types: panchromatic and multispectral.

In panchromatic images, pixels appear in different shades of grey, which is why they are often referred to as black & white images. Panchromatic images have a single spectral band covering almost the whole visible range of the electromagnetic spectrum, e.g. CARTOSAT-1 images. The number of grey shades a pixel can display depends on the number of bits per pixel: a 1-bit image has only two grey levels, while an 8-bit image has 256.
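
The relationship between bits per pixel and grey levels is simply two raised to the number of bits, which a short snippet makes explicit:

```python
def grey_levels(bits_per_pixel):
    """Number of grey shades a pixel can display = 2 ** bits."""
    return 2 ** bits_per_pixel

for bits in (1, 8, 10):
    print(bits, "bits ->", grey_levels(bits), "grey levels")
# 1 bit -> 2 levels, 8 bits -> 256 levels,
# 10 bits (e.g. CARTOSAT-1's radiometric resolution) -> 1024 levels
```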

Multispectral images contain multiple layers of spectral bands and are displayed as a combination of red (R), green (G) and blue (B) colours. Hence these are colour images, e.g. IRS-P6 LISS IV MX images.

CARTOSAT: A Milestone achieved by India

CARTOSAT is one of the most advanced remote sensing systems launched by the Indian Space Research Organisation (ISRO) so far. In the history of remote sensing, it is a milestone achieved by India.

The main purpose of this mission is to provide large-scale, high-resolution satellite imagery which can be used efficiently for cartographic purposes. These images are particularly useful to urban planners for urban mapping and planning. To date, two satellites have been launched in the CARTOSAT series: CARTOSAT-1 and CARTOSAT-2.

CARTOSAT-1

It is the first in the CARTOSAT series and was launched in May 2005 by the satellite launch vehicle PSLV-C6. It operates in panchromatic mode (black & white) in the EMR range of 0.50 to 0.85 micron. It carries two cameras to generate stereoscopic images. It provides a spatial resolution of 2.5 m x 2.5 m and a radiometric resolution of 10 bits. Its revisit period is 5 days.

The stereoscopic capability of CARTOSAT-1 gives it an edge over many other remote sensing systems internationally. We can generate Digital Elevation Models (DEMs) and 3-D maps using its stereo images. Apart from cartographic applications, CARTOSAT-1 images are useful in resource management and in pre- and post-disaster planning and management.

CARTOSAT-2

The second satellite of the CARTOSAT series was launched in January 2007. It also operates in panchromatic mode, with a spectral band of 0.50-0.85 micron. It is a very high resolution sensor system with less than one metre spatial resolution, and its revisit period is 4 days. It will be highly useful for urban and rural planning and for cadastral-level studies. It has already started acquiring images successfully, and ISRO recently released some sample images acquired by CARTOSAT-2.


Popular Remote Sensing Systems

LANDSAT

Landsat satellite sensors are among the most popular remote sensing systems, and the imagery acquired from them is widely used across the globe.
NASA's Landsat satellite programme started in 1972. It was formerly known as the ERTS (Earth Resources Technology Satellite) programme. The first satellite in the series, Landsat-1 (formerly ERTS-1), was launched on July 23, 1972. Since then, five different types of sensors have been included in various combinations in the Landsat missions from Landsat-1 through Landsat-7: the Return Beam Vidicon (RBV), the Multispectral Scanner (MSS), the Thematic Mapper (TM), the Enhanced Thematic Mapper (ETM) and the Enhanced Thematic Mapper Plus (ETM+). Landsat ETM (Landsat-6) was launched in 1993 but could not achieve orbit. Six years later, in 1999, Landsat ETM+ (Landsat-7) was launched; it is the most recent in the series.
Landsat ETM+ contains four bands in the visible-near infrared (VIS-NIR) region with 30 m x 30 m spatial resolution, two bands in the short wave infrared (SWIR) region with the same resolution, one band in the thermal infrared (TIR) region with a spatial resolution of 60 m x 60 m and one panchromatic band with 15 m x 15 m resolution. Its revisit period is 16 days.

SPOT

SPOT (Système Pour l'Observation de la Terre) was developed by the French Centre National d'Études Spatiales in collaboration with Belgium and Sweden. The first satellite of the SPOT mission, SPOT-1, was launched in 1986. It was followed by SPOT-2 (1990), SPOT-3 (1993), SPOT-4 (1998) and SPOT-5 (2002).
There are two imaging systems on SPOT-5: HRVIR and Vegetation. HRVIR records data in three bands in the VIS-NIR region with 10 m x 10 m spatial resolution, one band in the SWIR region with 20 m x 20 m spatial resolution, and one panchromatic band with 5 m x 5 m resolution. The Vegetation instrument is designed primarily for vegetation monitoring and related studies. It acquires images in three bands in the VIS-NIR region and one band in the SWIR region, all with 1000 m x 1000 m spatial resolution.

Advanced Very High Resolution Radiometer (AVHRR)

Several generations of satellites have flown in the NOAA-AVHRR series; NOAA-15 is the most recent. The AVHRR (Advanced Very High Resolution Radiometer) sensor contains five spectral channels: two in the VIS-NIR region and three in the TIR. One thermal band, covering the wavelength range 3.55-3.93 µm, is meant for fire detection. The spatial resolution of AVHRR is 1100 m x 1100 m. With its daily coverage, NOAA-AVHRR mainly serves global vegetation mapping, monitoring of land cover changes and agriculture-related studies.

Indian Remote Sensing (IRS) Satellites

The Indian Remote Sensing programme began with the launch of IRS-1A in 1988. It was followed by IRS-1B (1991), IRS-1C (1995) and IRS-1D (1997). IRS-1D carries three sensors: LISS III, with three bands of 23.5 m x 23.5 m spatial resolution in the VIS-NIR range and one band in the SWIR region with 70.5 m x 70.5 m resolution; a panchromatic sensor with 5.8 m x 5.8 m resolution; and a Wide Field Sensor (WiFS) with 188 m x 188 m resolution. WiFS is extensively used for vegetation-related studies.

ISRO's IRS-P6 (RESOURCESAT-1), launched in 2003, is a very advanced remote sensing system. It carries the high-resolution LISS IV camera (three spectral bands in the VIS-NIR region) with a spatial resolution of 5.8 m x 5.8 m, which is capable of providing stereoscopic imagery. The IRS-P6 LISS III camera acquires images in the VIS-NIR (three spectral bands) and SWIR (one spectral band) with a spatial resolution of 23.5 m x 23.5 m. The IRS-P6 AWiFS (Advanced Wide Field Sensor) operates in the VIS-NIR (three spectral bands) and SWIR (one spectral band) with a spatial resolution of 56 m x 56 m.

Chronology of Remote Sensing

1800: Discovery of Infrared by Sir William Herschel.

1826: First photographic image taken by Joseph Nicéphore Niépce.

1839: Beginning of practice of Photography.

1855: Additive Colour Theory postulated by James Clerk Maxwell.

1858: First Aerial Photograph from a balloon, taken by G. F. Tournachon.

1873: Theory of Electromagnetic Energy developed by J. C. Maxwell.

1903: Airplane invented by Wright brothers.

1909: Photography from airplanes.

1910s: Aerial Photo Reconnaissance: World War I.

1920s: Civilian use of aerial photography and Photogrammetry.

1934: American Society of Photogrammetry founded.

1935: Radar invented by Robert Watson-Watt.

1939-45: Advances in Photo Reconnaissance and applications of non-visible portion of EMR: World War II.

1942: Kodak patents first false colour infrared film.

1956: Colwell’s research on disease detection with IR photography.

1960: Term “Remote Sensing” coined by Office of Naval Research personnel.

1972: ERTS-1 launched (renamed Landsat-1).

1975: ERTS-2 launched (renamed Landsat-2).

1978: Landsat-3 launched.

1980s: Development of Hyperspectral sensors.

1982: Landsat-4 TM & MSS launched.

1984: Landsat-5 TM launched.

1986: SPOT-1 launched.

1995: IRS 1C launched.

1999: Landsat-7 ETM+ launched.

1999: IKONOS launched.

1999: NASA’s Terra EOS launched.

2002: ENVISAT launched.

2003: ISRO's RESOURCESAT-1 (IRS P6) launched.

2005: ISRO's CARTOSAT-1 launched.

2007: ISRO's CARTOSAT-2 launched.

Understanding Hyperspectral Remote Sensing

Remote sensing techniques are changing very fast and undergoing many advancements, with newer and more capable techniques being introduced. Hyperspectral remote sensing is one such technique, proving its worth in fields such as environmental studies, agriculture and geology.

Hyperspectral remote sensing involves the acquisition of digital images in many narrow, contiguous spectral bands throughout the visible, near infrared (NIR), mid-infrared (MIR) and thermal infrared (TIR) regions of the electromagnetic spectrum.

Higher spectral resolution makes hyperspectral remote sensing instruments capable of detailed identification of materials, geological features and vegetation at a finer level than is possible with conventional multispectral remote sensors.

Multispectral satellite sensors contain fewer bands with broad spectral bandwidths, so they are not capable of detecting fine details of the earth features being sensed. A broad bandwidth cannot separate objects with very little difference in their spectral reflectance. In vegetation studies, for instance, many plant species and vegetation classes possess almost similar spectral properties and hence appear to fall in the same class or belong to the same species. Such misinterpretation of the data leads to erroneous results, limiting the ability of multispectral sensors to work at the micro level. These sensors also cannot detect very small changes in the moisture and chlorophyll content of leaves.

Hyperspectral imaging (HSI) instruments, on the other hand, possess a large number of narrow bands, making it possible to differentiate between objects and features that look similar to multispectral sensors. HSI affords the detection of small alterations in the moisture content of leaves, the moisture status of soil, nutrient stress and other environmental stresses in plants. The narrow bandwidth allows HSI to discriminate between plant species and vegetation types with very small differences in spectral reflectance.

Hyperspectral remote sensing is becoming popular among botanists, plant biochemists, environmentalists, agriculturists and geologists. It is proving to be a good tool for studying plant physiology, canopy biochemistry, plant productivity and biomass, for detecting plant health and for vegetation mapping.

This recent technique generates a large volume of data and hence requires a lot of storage space. Hyperspectral image processing differs from multispectral processing and requires special tools for processing and analysis. Considerable expertise and skill are also needed to interpret data acquired from HSI instruments correctly and to get the desired results.

Some of the Hyperspectral Remote Sensing systems are as follows:

Airborne Visible Infrared Imaging Spectrometer (AVIRIS)
AVIRIS acquires images in 224 spectral bands, each 9.6 nm wide, covering the 400 nm to 2500 nm region of the electromagnetic spectrum.

Compact Airborne Spectrographic Imager (CASI)
This imaging spectrometer collects data in 288 bands between 400 nm and 1000 nm. The spectral interval of each band is 1.8 nm.

Hyperspectral Mapping (HYMAP) System
It is an across-track hyperspectral imaging instrument. It collects data in 128 bands in the range of 400-2500 nm.

Moderate Resolution Imaging Spectroradiometer (MODIS)
This hyperspectral imaging sensor is one of the sensors on the TERRA satellite. It acquires data in 36 spectral bands, and its spatial resolution ranges from 250 m to 1 km (to be precise: bands 1 & 2: 250 m x 250 m; bands 3 to 7: 500 m x 500 m; bands 8 to 36: 1 km x 1 km).

Why Remote Sensing?

Remote sensing is unique in that it can be used to collect data directly, unlike techniques such as thematic cartography, geographic information systems or statistics, which must rely on data that are already available. Remote sensing data may then be transformed into information using analogue or digital image processing techniques, provided appropriate logic and methods are used.

Remote sensing derived information is critical to the successful modelling and monitoring of numerous natural processes (e.g. watershed runoff) and cultural processes (e.g. land use conversion at the urban fringe). In fact, for the successful execution of many models that rely on spatially distributed information, remote sensing is a must.

The principal advantages of remote sensing are the speed at which data can be acquired over large areas of the earth's surface, and the related fact that comparatively inaccessible areas can be investigated in this way.

The major advantages of this technique over ground-based methods are summarized as follows:


Synoptic View

The remote sensing process facilitates the study of various earth surface features in their spatial relation to each other and helps to delineate the required features and phenomena.

Repetitive Coverage

Remote sensing satellites provide repetitive coverage of the earth, and this temporal information is very useful for studying landscape dynamics, phenological variations of vegetation and change detection.

Accessibility

Remote sensing makes it possible to gather information about areas where ground surveys are not possible, such as mountainous regions and foreign territories.

Time Conservation

Since information about a large area can be gathered quickly, the technique saves time and human effort. It also reduces the time spent on fieldwork.

Cost Effective

Remote sensing, especially when conducted from space, is an intrinsically expensive activity. Nevertheless, cost-benefit analysis demonstrates its financial effectiveness, and much speculative or developmental remote sensing activity can be justified in this way. It is a cost-effective technique because repeated fieldwork is not required and a large number of users can share and use the same data.

Remote Sensing: Basic Principle

The basic principle of remote sensing is that different objects, based on their structural, chemical and physical properties, return (reflect or emit) different amounts of energy in different wavelength ranges (commonly referred to as bands) of the electromagnetic spectrum incident upon them.

Most remote sensing systems utilize the sun's energy, which is the predominant source. This radiation travels through the atmosphere and is selectively scattered and/or absorbed depending upon the composition of the atmosphere and the wavelengths involved. Upon reaching the earth's surface, the radiation interacts with the target objects.

Everything in nature has its own unique pattern of reflected, emitted or absorbed radiation. A sensor records the energy reflected or emitted from the surface. This recorded energy is transmitted to the users and processed to form an image, which is then analyzed to extract information about the target.

Finally, the extracted information is applied to assist decision making for solving a particular problem.

Remote Sensing: An Overview

Many interrelated processes are involved in the acquisition of remotely sensed images, so an isolated focus on any single component produces a fragmented picture. It is necessary to identify these components to gain a proper understanding of remote sensing.

There are four essential components:

Physical Features
The first component comprises the physical features, such as buildings, vegetation, soil, water and rocks. Knowledge of these features resides within specific disciplines such as geology, forestry, soil science, geography and urban planning.

Sensor Data
Sensor data are formed as a remote sensing device views the physical features, recording the electromagnetic signals emitted or reflected from the landscape. The effective use of sensor data requires analysis and interpretation to convert the data into useful information.

Extracted Information
These interpretations create extracted information, which consists of transformations of the sensor data designed to reveal specific kinds of information. The same sensor data can be examined from alternative perspectives to yield different interpretations; therefore a single image can be interpreted to provide information about, for instance, vegetation, soils, rocks or water, depending on the specific data, the information required and the purpose of the analysis.

Application
The fourth component is the application, in which the analyzed remote sensing data are combined with other data to address a specific practical problem, such as land use planning, mineral exploration or vegetation mapping.

Stages in Remote Sensing

The process of remote sensing involves a number of stages, from energy emission at the source to data analysis and information extraction. These stages are described in the following steps:

Source of Energy
A source of energy (electromagnetic radiation) is a prerequisite for the process of remote sensing. The energy source may be indirect (e.g. the sun) or direct (e.g. radar). Indirect sources vary with time and location, while we have control over direct sources. These sources emit electromagnetic radiation (EMR) in wavelength regions that can be sensed by the sensors.

Interaction of EMR with the Atmosphere
The EMR interacts with the atmosphere while travelling from the source to the earth features and from the earth features to the sensor. Along this path the EMR changes its properties through loss of energy and alteration in wavelength, which ultimately affects its sensing by the sensor. This interaction often introduces atmospheric noise (discussed in a separate topic).

EMR Interaction with Earth Features
The EMR incident on earth features interacts with them in various ways: it is reflected, absorbed, transmitted and emitted by the features and ground objects. The amounts reflected, absorbed, transmitted and emitted depend upon the properties of the material and of the EMR itself.

Detection of EMR by the Remote Sensing Sensor
The remote sensing device records the EMR reaching the sensor after its interaction with earth features. The kind of EMR that can be sensed depends upon the amount of EMR and the sensor's capabilities.

Data Transmission and Processing
The EMR recorded by the remote sensing device is transmitted to earth receiving and data processing stations, where it is transformed into interpretable output: digital or analogue images.

Image Processing and Analysis
Digital satellite images are processed using specialized satellite image processing software. Image processing and further analysis of the satellite data lead to extraction of the information required by the users.


Application
The extracted information is utilized to make decisions for solving particular problems. Remote sensing is thus a multi-disciplinary science combining various disciplines such as optics, photography, computing, electronics, telecommunications and satellite launching.

Types of Remote Sensing

Based on Source of Energy
The sun provides a very convenient source of energy for remote sensing. The sun's energy is either reflected, as in the visible wavelengths, or absorbed and then re-emitted, as in the thermal infrared wavelengths.

Remote sensing systems that measure this naturally available energy are called passive sensors. Reflected sensing can take place only while the sun is illuminating the earth; there is no reflected solar energy at night. Naturally emitted energy, however, can be detected day and night, provided the amount of energy is large enough to be recorded.

Remote sensing systems that provide their own source of energy for illumination are known as active sensors. These sensors have the advantage of obtaining data at any time of day or season. Solar energy and radiant heat are examples of passive energy sources; Synthetic Aperture Radar (SAR) is an example of an active sensor.

Based on Range of Electromagnetic Spectrum

Optical Remote Sensing
Optical remote sensing devices operate in the visible, near infrared, middle infrared and short wave infrared portions of the electromagnetic spectrum, and are sensitive to wavelengths ranging from 300 nm to 3000 nm. Most remote sensors record EMR in this range; e.g. the bands of the IRS-P6 LISS IV sensor lie in the optical range.

Thermal Remote Sensing
Sensors operating in the thermal range of the electromagnetic spectrum record the energy emitted from earth features in the wavelength ranges of 3000 nm to 5000 nm and 8000 nm to 14000 nm. The former range relates to high-temperature phenomena such as forest fires, the latter to general earth features at lower temperatures. Hence thermal remote sensing is very useful for fire detection and thermal pollution studies; e.g. the last five bands of ASTER and band 6 of Landsat ETM+ operate in the thermal range.

Microwave Remote Sensing
A microwave remote sensor records backscattered microwaves in the wavelength range of 1 mm to 1 m of the electromagnetic spectrum. Most microwave sensors are active sensors having their own sources of energy, e.g. RADARSAT.
These sensors have an edge over other types, as they are independent of weather and solar radiation.

Spectral Reflectance

The reflectance characteristics of earth surface features may be quantified by measuring the portion of incident energy that is reflected. This is measured as a function of wavelength (λ) and is called spectral reflectance, ρ(λ):

ρ(λ) = ER(λ) / EI(λ)

where ER is the reflected energy and EI is the incident energy.
A graph of the spectral reflectance of an object as a function of wavelength is termed a spectral reflectance curve.
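
The ratio translates directly into code; the energy values below are hypothetical, in arbitrary units:

```python
def spectral_reflectance(reflected, incident):
    """Spectral reflectance at one wavelength: the fraction of
    incident energy that is reflected, E_R / E_I."""
    if incident <= 0:
        raise ValueError("incident energy must be positive")
    return reflected / incident

# Hypothetical measurements at a single wavelength
print(spectral_reflectance(30.0, 100.0))  # 0.3, i.e. 30% reflected
```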

Spectral Reflectance of Vegetation
The spectral characteristics of vegetation vary with wavelength. Chlorophyll, a compound in leaves, strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. The internal structure of healthy leaves acts as a diffuse reflector of near-infrared wavelengths; measuring and monitoring near-infrared reflectance is one way scientists determine how healthy particular vegetation may be.
Leaves appear greenest to us in summer and turn red or yellow in autumn as their chlorophyll content decreases.


Spectral Reflectance of Water
The majority of the radiation incident upon water is not reflected but is either absorbed or transmitted. Longer visible wavelengths and near-infrared radiation are absorbed more strongly by water than the shorter visible wavelengths. Thus water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker when viewed at red or near-infrared wavelengths. The factors affecting the variability in reflectance of a water body are the depth of the water, the materials within it and the roughness of its surface.

Spectral Reflectance of Soil
The majority of radiation incident on a soil surface is either reflected or absorbed, and little is transmitted. The characteristics of soil that determine its reflectance properties are its moisture content, texture, structure and iron-oxide content. The soil curve shows fewer peak-and-valley variations. The presence of moisture in soil decreases its reflectance.


By measuring the energy reflected by targets on the earth's surface over a variety of wavelengths, a spectral signature for each object can be built. By comparing the response patterns of different features, we may be able to distinguish between them.
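Comparing response patterns can be sketched very simply: represent each signature as a list of reflectances, one per band, and measure how close two patterns are. The signature values and the Euclidean-distance measure below are illustrative choices, not a prescribed method:

```python
import math

# Sketch: a spectral signature as a list of reflectances (one per band)
# and a simple Euclidean distance for comparing response patterns.
def signature_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Illustrative signatures over (blue, green, red, NIR) bands:
vegetation = [0.05, 0.10, 0.06, 0.50]
water      = [0.10, 0.08, 0.03, 0.01]
unknown    = [0.06, 0.11, 0.05, 0.46]

# The unknown pixel's pattern is far closer to vegetation than to water:
print(signature_distance(unknown, vegetation) <
      signature_distance(unknown, water))   # True
```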


Satellite Sensor Resolutions

Spatial resolution
Spatial resolution is a measure of the smallest object that can be detected by a satellite sensor. It represents the area covered by a pixel on the ground and is usually measured in meters.
For example, the CARTOSAT-1 sensor has a spatial resolution of 2.5x2.5 m, the IRS P6 LISS IV sensor has a spatial resolution of 5.6x5.6 m for its multispectral bands, and LISS III has a spatial resolution of 23.5x23.5 m in its first three bands. The smaller the spatial resolution, the greater the resolving power of the sensor system.
That is why one can detect even a car in a satellite image acquired by IKONOS (spatial resolution 1x1 m) but can hardly make out even a village in a satellite image acquired by AVHRR (spatial resolution 1.1x1.1 km).
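The car-versus-village contrast is easy to quantify: count how many pixels cover one square kilometre at each of the resolutions mentioned above. A small sketch:

```python
# Sketch: number of pixels covering one square kilometre at a given
# spatial resolution. Smaller pixels mean far more detail per unit area.
def pixels_per_km2(resolution_m):
    return (1000.0 / resolution_m) ** 2

print(round(pixels_per_km2(1.0)))     # IKONOS (1 m): 1,000,000 pixels
print(round(pixels_per_km2(23.5)))    # LISS III (23.5 m): ~1,811 pixels
print(round(pixels_per_km2(1100.0)))  # AVHRR (1.1 km): under one pixel
```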

Spectral resolution
Spectral resolution refers to the specific wavelength intervals in the electromagnetic spectrum for which a satellite sensor can record data. It can also be defined as the number and dimension of the specific wavelength intervals in the electromagnetic spectrum to which a remote sensing instrument is sensitive. For example, band 1 of the Landsat TM sensor records energy between 0.45 and 0.52 µm in the visible part of the spectrum.

Spectral channels covering wide intervals of the electromagnetic spectrum are referred to as coarse spectral resolution, and those covering narrow intervals as fine spectral resolution. For instance, the SPOT panchromatic sensor is considered to have coarse spectral resolution because it records EMR between 0.51 and 0.73 µm. On the other hand, band 2 of the ASTER sensor has fine spectral resolution because it records EMR between 0.63 and 0.69 µm.

Radiometric resolution
Radiometric resolution is defined as the sensitivity of a remote sensing detector to differences in signal strength as it records the radiant flux reflected or emitted from the terrain. It refers to the dynamic range, or number of possible data-file values, in each band. This is expressed as the number of bits into which the recorded energy is divided. For instance, ASTER records data in 8 bits for its first nine bands, meaning the data-file values range from 0 to 255 for each pixel, while the radiometric resolution of LISS III is 7 bits, so the data-file values for each pixel range from 0 to 127.

Temporal Resolution
The temporal resolution of a satellite system refers to how frequently it records imagery of a particular area. For example, CARTOSAT-1 can acquire images of the same area of the globe every 5 days, while LISS III does so every 24 days.
The temporal resolution of a satellite sensor is very helpful in change detection. For instance, agricultural crops have unique crop calendars in each geographic region. To measure specific agricultural variables it is necessary to acquire remotely sensed data at critical dates in the phenological cycle. Analysis of multiple-date imagery provides information on how the variables are changing through time. Multi-date satellite images are also used to detect change in forest cover.
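In its simplest form, change detection differences two co-registered images of the same area from different dates and flags pixels whose digital numbers changed beyond a threshold. A toy sketch with illustrative 3x3 "images" (real workflows add radiometric correction and careful threshold selection):

```python
# Sketch: simplest change detection -- difference two co-registered images
# of the same area acquired on different dates, and flag pixels whose DN
# changed by more than a threshold. The 3x3 DN values are illustrative.
date1 = [[10, 12, 11], [40, 42, 41], [10, 11, 12]]
date2 = [[11, 12, 10], [90, 88, 91], [10, 12, 11]]  # middle row changed

threshold = 20
changed = [[abs(a - b) > threshold for a, b in zip(r1, r2)]
           for r1, r2 in zip(date1, date2)]
print(changed)  # only the middle row is flagged as change
```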
