Satellite images are used in two forms – analogue and digital. Analogue images are popularly known as hard-copy images.
Digital images are made up of tiny picture elements known as ‘pixels’. These images are like matrices, divided into rows and columns. As the name suggests, digital images have something to do with digits (or numbers): they are numerical representations of the features observed by a remote sensing device. This means every pixel must carry a number, or value, to represent itself. These values are called pixel values or digital numbers (DN). The digital number of a pixel depends upon the reflectance of the feature recorded by the sensor. A pixel does not necessarily record the reflectance of a single object; it may be an aggregate of many features ‘falling’ within that pixel. Suppose we are interpreting a satellite image acquired by the IRS-1D LISS III sensor. A single pixel represents 23.5 m x 23.5 m of ground area; within this area the pixel may contain the composite reflectance of a road, some roadside trees and a building close to them.
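The idea of an image as a matrix of digital numbers can be sketched in a few lines of Python with NumPy. The values below are made up for illustration, not real sensor data:

```python
import numpy as np

# A minimal sketch: a digital image is just a matrix of digital numbers
# (DN), one value per pixel, arranged in rows and columns.
image = np.array([
    [34,  36, 120],   # row 0: low DNs, one brighter pixel
    [35, 180, 178],   # row 1
    [33, 182, 181],   # row 2: higher DNs (brighter features)
], dtype=np.uint8)

print(image.shape)   # (3, 3) -> 3 rows x 3 columns
print(image[1, 2])   # DN of the pixel in row 1, column 2 -> 178
```

Each entry of the matrix is one pixel; indexing by row and column retrieves that pixel's digital number.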
A pixel of a satellite image has spatial and spectral properties. The spatial property represents the spatial resolution of the satellite sensor, while the spectral property governs the appearance of objects in an image.
On the basis of spectral properties, satellite images are of two types – panchromatic and multispectral.
In panchromatic images pixels appear in different shades of grey, which is why these are often referred to as black-and-white images. Panchromatic images have a single spectral band covering almost the whole visible range of the electromagnetic spectrum, e.g. CARTOSAT-1 images. The number of grey shades a pixel can display depends on the number of bits per pixel: a 1-bit image has only two grey levels, while an 8-bit image has 256 grey levels.
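The relationship between bits per pixel and grey levels is simply 2 raised to the number of bits. A quick sketch:

```python
# An n-bit pixel can take 2**n distinct values (grey levels).
for bits in (1, 8):
    levels = 2 ** bits
    print(f"{bits}-bit image: {levels} grey levels")
# 1-bit image: 2 grey levels
# 8-bit image: 256 grey levels
```

So doubling the bit depth does not double the grey levels; each extra bit multiplies the number of levels by two.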
Multispectral images contain multiple layers of spectral bands and are displayed as a combination of red (R), green (G) and blue (B) colours. Hence these are colour images, e.g. IRS-P6 LISS IV MX images.
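Displaying a multispectral image amounts to stacking three of its band layers into the red, green and blue channels of the display. A sketch with random arrays standing in for real bands (the band names and the NIR-R, red-G, green-B assignment, a common false-colour composite, are illustrative assumptions, not tied to any particular sensor):

```python
import numpy as np

rows, cols = 4, 4
# Placeholder band layers; real bands would come from the sensor.
nir   = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)
red   = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)
green = np.random.randint(0, 256, (rows, cols), dtype=np.uint8)

# Assign one band to each display channel: NIR -> R, red -> G, green -> B.
rgb = np.dstack([nir, red, green])
print(rgb.shape)  # (4, 4, 3): rows x columns x three display channels
```

Changing which band feeds which channel changes the colours objects take on, which is why the same multispectral scene can be rendered in many different band combinations.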