Module 2, Remote Sensing 1 1st semester Lecture Notes of Prof. Dr. M. Hahn
Remote Sensing
(Part 1)
Table of Contents
1. Basic Principles of Remote Sensing
1.1. Definitions, overall Remote Sensing process
1.2. Electromagnetic radiation
1.3. The electromagnetic spectrum
1.4. Interaction of electromagnetic radiation with the atmosphere
1.5. Interaction of electromagnetic radiation with Earth-surface material
1.6. Energy sources and sensing
1.7. Satellite images and visualization
3. Classification
3.1. Concept of supervised and unsupervised classification
3.2. Scatterplot and decision making
3.3. Supervised classification
3.4. Unsupervised classification
The science of Remote Sensing consists of the interpretation of measurements of electromagnetic energy reflected or emitted by a target from a vantage point that is distant from the target. Earth observation (EO) by Remote Sensing is the interpretation and understanding of such measurements. (P.M. Mather, Computer Processing of Remotely-Sensed Images, Wiley)

Aircraft and satellites are the common platforms from which Remote Sensing observations are made. (F.F. Sabins, Remote Sensing: Principles and Interpretation, Freeman)
Energy Source The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest. Radiation and the Atmosphere As the energy travels from its source to the target, it will interact with the atmosphere it passes through.
This interaction may take place a second time as the energy travels from the target to the sensor.
Interaction with the Target The energy (electromagnetic radiation) interacts with the target depending on the properties of both the target and the radiation.
Recording of Energy by the Sensor After the energy has been scattered by, or emitted from the target, a sensor is required to collect and record the electromagnetic radiation.
Transmission, Reception, and Processing The energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
Interpretation and Analysis The processed image is interpreted, visually and/or digitally (image analysis), to extract information about the target which was illuminated.
1.2.1 Basic terminology Energy - the capacity to do work, expressed in J (Joules) Radiant energy - the energy associated with electromagnetic radiation (EMR)
Radiant flux - the rate of transfer of radiant energy from one place to another (from the Latin word meaning "flow"); measured in W (Watts)

Radiant flux density - the radiant flux that is incident upon or, conversely, emitted by a surface of unit area; measured in W/m². This quantity is needed to understand the interaction between electromagnetic radiation and surfaces.

If radiant energy falls upon a surface, the radiant flux density is called irradiance.

Radiance - the radiant flux density of an area on the Earth's surface as viewed through a unit solid (3D) angle; the solid angle is measured in steradians, the 3D equivalent of the radian.
[Figure: radiant flux leaving a source area on a surface, with the surface normal and a solid (3D) angle indicated]

A proportion of the radiant flux may be measured per unit solid viewing angle; this proportion is the radiance.
Reflectance - the ratio between the radiant flux reflected by a surface and the irradiance (the radiant flux incident on it).

Diffuse reflectance: the reflected energy is scattered more or less uniformly in all directions (see above).
Specular reflectance: the angle of incidence equals the angle of reflectance.
[Figure: diffuse scattering vs. specular reflection, with the angle of incidence and the angle of reflection indicated]
Remarks:
1. When remotely-sensed images collected over a time period (= multi-temporal images) are to be compared, it is common to convert the radiance values recorded by the sensor into reflectance values, which are not affected by the variable irradiance.
2. The terms defined above usually refer to particular wavebands rather than to the whole electromagnetic spectrum. The terms are then preceded by the adjective "spectral": spectral reflectance, spectral irradiance, etc.
For about 250 years there was controversy in physics over the nature of EMR: wave theory versus corpuscular (particle) theory.

The wave-like characteristics of EMR allow the distinction of radiation with regard to wavelength, e.g. microwave or infrared radiation. The particle description is needed in order to understand the interactions between EMR and the Earth's atmosphere and surface.

Today's view, from quantum mechanics: EMR is both a wave and a stream of particles.
1.2.3 Wave characteristics of electromagnetic radiation
EMR travels at the velocity c (the speed of light), equal to 3×10^8 m/s, in a sinusoidal, harmonic fashion. EMR consists of an electrical (E) field and a magnetic (M) field.
Wavelength (λ) ---- the length of one wave cycle = the distance between successive wave crests

Frequency (f) ---- the number of cycles (or crests) of a wave passing a fixed point per unit of time (= 1 sec)

Wavelength and frequency are related according to c = λ · f
  c --- speed of light (essentially constant) = 3×10^8 m/s (in a vacuum)
  λ --- unit is [m] (or mm, µm, nm)
  f --- unit is [Hz = cycles/sec]; its inverse is the period T = 1/f
Examples (cf. figure):
i) 2.5 cycles per second → f = 2.5 Hz, period = 0.4 sec per cycle
ii) 4 Hz
iii) 1.5 Hz

Example 1: Given f = 4 Hz, calculate λ:
with c = λ·f = 3×10^8 m/sec = 300 000 km/sec and f = 4 Hz
λ = c/f = (3×10^8 m/sec) · (1/4) sec = 75 000 km
Example 2: Given the wavelength in micrometres, λ = 1 µm (Near infrared) = 10^-6 m, calculate f:
f = c/λ = (3×10^8 m/sec) / (10^-6 m) = 3×10^14 Hz
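Both worked examples follow from rearranging c = λ·f. A minimal sketch in Python (the function names are illustrative, not from the notes):

```python
# Relation c = lambda * f, rearranged to compute one quantity from the other.
C = 3e8  # speed of light in m/s (in a vacuum)

def wavelength_from_frequency(f_hz):
    # lambda = c / f, result in metres
    return C / f_hz

def frequency_from_wavelength(lam_m):
    # f = c / lambda, result in Hz
    return C / lam_m
```

With f = 4 Hz this reproduces Example 1 (7.5×10^7 m = 75 000 km), and with λ = 10^-6 m it reproduces Example 2 (3×10^14 Hz).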
In the particle description, electromagnetic energy travels in quanta (discrete units) of energy.
The energy of a quantum is given as Q = h · f
  Q = energy of a quantum (Joules, J)
  h = Planck's constant = 6.626×10^-34 J·sec
  f = frequency (Hz = 1/sec)
Substituting f = c/λ into Q = h·f yields

  Q = h·c/λ,    with h·c = constant.
Conclusion: The shorter the wavelength λ, the higher the energy content.
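The conclusion can be checked numerically with Q = h·c/λ (an illustrative sketch; the helper name is not from the notes):

```python
H = 6.626e-34  # Planck's constant in J*s
C = 3e8        # speed of light in m/s

def quantum_energy(lam_m):
    # energy of one quantum in joules: Q = h*f = h*c/lambda
    return H * C / lam_m

# e.g. a 0.5 um (visible) photon carries twice the energy of a 1 um (NIR) photon
```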
The names assigned to regions of the spectrum make a discussion of the spectrum more convenient. In each of the regions, wavelengths behave similarly or are generated by similar mechanisms. However, the division between adjacent regions, such as UV and visible or microwave and thermal infrared, is not hard; the regions blur into each other.
a) The visible spectrum (visible light) is so called because it is detected by the eye, whereas other forms of EMR are invisible to the unaided eye. The spectral range of visible light is 0.4 - 0.7 µm. Wavebands within this range are perceived as particular colours:

  violet 0.40 - 0.46 µm, blue 0.46 - 0.50 µm, green 0.50 - 0.58 µm, yellow 0.58 - 0.60 µm, orange 0.60 - 0.62 µm, red 0.62 - 0.70 µm
Blue, green and red are the primary colours or wavelengths of the visible spectrum. All other colours can be formed by combining R-G-B.
b) The infrared spectrum (infrared = beyond the red) covers the wavelength range from approximately 0.7 µm to 100 µm.

Regions:
  Near IR:          0.7 - 1.3 µm    reflective
  Mid IR:           1.3 - 3.0 µm    reflective
  Thermal/Far IR:   3.0 - 100 µm    emissive, radiative, thermal; emitted from the Earth's surface in the form of heat
c) The microwave spectrum ranges from the submillimetre region to 1 (to 3) metres; it is further subdivided into bands: K, X, C, S, L, P.

Some microwave sensors can detect the small amounts of radiation at these wavelengths that are emitted by the Earth (→ passive sensors). But the important RS microwave sensors are all active systems: they generate, transmit and record the reflected radiation.
Remote Sensing depends on the interaction of EMR with the Earth's atmosphere (particles and gases in the atmosphere):
- Absorption (cf. figure above) obviously does not happen everywhere in the spectrum, and not to the same degree; spectral transmission is high in the visible region and in the other atmospheric windows.
- The energy level of the sun has its peak in the visible area.
- All passive RS sensor systems have to take these two aspects (transmission and energy) into account.
- The heat energy emitted by the Earth corresponds to a window around 10 µm (maximum energy) in the thermal IR.
The total amount of radiation that strikes an object (the incident radiation) is equal to the sum of the reflected, absorbed and transmitted radiation:

  incident radiation = reflected radiation + absorbed radiation + transmitted radiation
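The balance can be expressed as a one-line check (an illustrative sketch that solves the balance for the reflected part; the function name is not from the notes):

```python
def reflected_energy(incident, absorbed, transmitted):
    # energy balance: E_incident = E_reflected + E_absorbed + E_transmitted
    return incident - absorbed - transmitted
```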
1.4.1 Scattering
How much scattering takes place depends on:
- the wavelength of the radiation
- the abundance of the particles or gases
- the distance the radiation travels through the atmosphere

Three types of scattering take place:

a) Rayleigh scattering occurs when particles are very small compared to the wavelength of the EMR (particles: small specks of dust, or nitrogen and oxygen molecules). Shorter wavelengths of energy are scattered much more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism in the upper atmosphere.
Remarks:
Blue sky phenomenon (during the day) ---- stronger scattering of the blue wavelengths of the sunlight; the blue light seems to reach our eyes from all directions.
Sunrise and sunset ---- the scattering of the shorter wavelengths is more complete (longer distance through the atmosphere); the longer wavelengths (orange, red) penetrate.

b) Mie scattering occurs when particles are just about the same size as the wavelength of the radiation (particles: dust, pollen, smoke, water vapour, salt particles from oceanic evaporation). Mie scattering tends to affect longer wavelengths than those affected by Rayleigh scattering.
Mie scattering occurs mostly in the lower portions of the atmosphere, where larger particles are more abundant, and dominates when cloud conditions are overcast.

(a) and (b) are selective scattering processes, i.e. scattering affects specific wavelengths of energy.

c) Nonselective scattering occurs when the particles are much larger than the wavelength of the radiation (above 10 µm); particles: water droplets, ice fragments, large dust particles. All wavelengths are scattered about equally. This causes fog and clouds to appear white to our eyes: all wavelengths are scattered (by the water droplets) in approximately equal quantities (→ white light).
1.4.2 Absorption
The three main constituents (gases) which absorb radiation are:
- ozone
- carbon dioxide
- water vapour

Ozone absorbs the harmful UV radiation from the sun (a protective layer in the atmosphere; avoids skin burn). Carbon dioxide, the greenhouse gas, tends to absorb radiation in the far infrared portion of the spectrum (→ thermal heating). Water vapour in the atmosphere greatly varies from location to location and at different times of the year (little water vapour above deserts, but high humidity in the tropics).
These gases absorb EM energy in very specific regions of the spectrum (called absorption bands). Those areas of the spectrum which are not severely affected by absorption, and thus show high transmission, are called atmospheric windows (= areas which are useful for RS purposes; cf. figure in section 1.3).
1.5 Interaction of electromagnetic radiation with Earth-surface material

Electromagnetic energy that is not absorbed or scattered in the atmosphere can reach and interact with the Earth's surface.

The total incident energy will interact with the target in one or more of three ways (absorption, transmission, reflection). The proportions of each will depend on:
- the wavelength of the EMR
- the material and condition of the target

Target dependency: the interaction will vary from time to time during the year. Example: vegetation, from leafing stage to maturity.
A: absorption:   radiation is absorbed by the target
T: transmission: radiation passes through the target
R: reflection:   radiation bounces off the target and is redirected; it travels upwards through the atmosphere (interaction with the atmosphere), enters the field of view of the sensor, and is detected and recorded by the sensor

Most interest in RS is in measuring radiation reflected from targets. The amount and distribution of reflected energy are used in RS to infer the nature of the reflecting surface.

Background: a basic assumption made in remote sensing is that specific targets (soils, rocks, vegetation, water, ...) have an individual and characteristic spectral response.
[Figure: specular reflection (angle of incidence i = angle of reflection r) and diffuse reflection]

- Specular and diffuse reflection represent the two extreme ends of the way in which energy is reflected.
- Most Earth-surface features are located somewhere in between.
- But: in the visible part of the spectrum, most terrestrial targets act as diffuse reflectors, while calm water acts as a specular reflector.

  diffuse reflector  -- rough surface
  specular reflector -- smooth surface

Rough/smooth is defined by the height variations of the particles that make up the surface in comparison to the wavelength of the incoming radiation.

Example: fine-grained sand would appear fairly smooth to microwaves (long wavelength) but quite rough to visible light (short wavelength).
[Figure: spectral signature of vegetation — reflectance (%) plotted against wavelength, 0.4 - 0.8 µm]
A spectral reflection curve describes the spectral response of a target for a certain region, e.g. 0.4 - 2.5 µm.

Note: A satellite sensor operating in the visible and NIR region does not observe and detect all reflected energy, only that within its FOV. To make full use of such measurements, the distribution of radiance over all possible observation and illumination angles, called the bi-directional reflectance distribution function (BRDF), must be taken into consideration.
Spectral response curves for about 2000 materials can be found in the JPL ASTER library: e.g. 9 different ones for water/snow/ice, 1350 minerals, etc.
Reminder (colour scale): violet 0.40, blue 0.46 - 0.50, green, yellow 0.58, orange 0.60, red 0.62 - 0.70 µm

Examples:
1. Leaves: lower reflection (= higher absorption) in B, R; higher reflection in G; very high reflection in IR (not plotted).
2. Water: lower reflection (= higher absorption) in R, NIR → darker if viewed in R, NIR; higher reflection in B, G → water looks blue or green-blue.
[Figure: spectral response of leaves in summer and in autumn]

Healthy leaves: the internal structure of leaves acts as an excellent diffuse reflector of NIR wavelengths → extremely bright in the NIR (but not visible to our eyes).

Target interaction with water: longer-wavelength radiation (R, NIR) is absorbed more by water than shorter wavelengths; water looks blue or blue-green due to the stronger reflectance of the shorter wavelengths.
If sediment is present in the upper layers of a water body, a slight shift to longer wavelengths (green, yellow) occurs because of the better reflectivity of the sediment → brighter appearance of the water.
Chlorophyll (algae) absorbs blue and reflects green → the water will appear in a more green colour.
The topography of the water surface (rough, smooth, floating material, etc.) can also lead to complications for water-related interpretations, due to problems of specular reflection or other influences on colour and brightness.
The spectral response can thus be quite variable, even for the same target type, and can also vary with time and location → it is important to know where to look spectrally.
1.6 Energy sources and sensing

The sun is the most obvious source of the electromagnetic energy measured in remote sensing. The sun's energy is either reflected (visible, reflected IR) or absorbed and re-emitted (thermal IR).

EM energy that is naturally available comes from a passive source. The RS instruments which detect the naturally available energy are called passive sensors.

Passive sensors can only be used to detect energy when natural energy is available:
- reflected energy: requires illumination of the Earth (daytime)
- re-emitted energy: can be detected day or night, as long as the amount of energy is large enough

Passive source: solar energy — visible, IR (including thermal), UV, X- and gamma-rays.
Recording with passive sensors: reflected energy is mainly recorded by instruments which travel in sun-synchronous orbits (the satellite travels southwards over the illuminated side of the Earth and crosses the Equator at the same local Sun time on each orbit). Data are recorded only on the way from the North pole to the South pole, because the other half of the orbit passes over the unilluminated side of the Earth. The reflected radiation is detected and recorded by the sensor.

Man-made energy sources (active sensors):
An active system requires the generation of a fairly large amount of energy to adequately illuminate targets.
Active sensors can be used to examine wavelengths that are not sufficiently provided by the Sun, such as microwaves.
Microwave imaging radar (synthetic aperture radar, SAR) and laser scanner (airborne platform) are examples of active sensors.
1.7 Satellite images and visualization

Example: SPOT 1, 2 (two HRV instruments), launched 1986 and 1990
  XS bands:       0.50 - 0.59 µm, 0.61 - 0.68 µm, 0.79 - 0.89 µm
  M (= pan) band: 0.51 - 0.73 µm
SPOT 4 (1998): additionally 1.58 - 1.75 µm (mid IR)
Data formats for digital satellite imagery: unfortunately, no world-wide standard for the storage and transfer of remotely-sensed data has been agreed upon; specific procedures for reading satellite image data are required. CEOS (Committee on Earth Observation Satellites) tries to standardise.
Display of satellite images:
  Display of one band:    grey-scale image
  Display of three bands: pseudo-colour image
Inspection of three images of Honolulu (cf. the corresponding pictures):
  Band 2:  0.45 - 0.52 µm  (blue-green region, ca. 0.46 - 0.58 µm)
  Band 7:  0.76 - 0.90 µm  (NIR region, ca. 0.7 - 1.0/1.3 µm)
  Band 11: 10.5 - 14.0 µm  (thermal IR region, ca. 3 - 15 µm)
Observation: For different regions of Honolulu, different brightness levels can be observed in different wavebands → the value of obtaining multiple images at different wavelengths.

Discussion:
1) Region (a) in band 2 (blue wavelengths): one can see through the shallow water along the coast line.
2) Region (b) in band 2 and band 7: in blue, the rain forest appears fairly dark; in NIR, the rain forest (vegetation) appears quite bright → reflective nature of chlorophyll.
3) Region (c) in band 11 (thermal IR): the dark patches are clouds, which in the thermal IR are cold. In blue and NIR (shorter wavelengths), the high reflectivity of the water droplets leads to bright patches.
4) Region (d) in band 11: bright areas in the thermal band that are dark in the other bands. Areas (d) include parts of the airport runways, which face the sun, are warmer than the average scene and so appear bright.
Remotely-sensed images contain flaws and deficiencies. Removal of flaws and correction of deficiencies are termed preprocessing. Some corrections are carried out at the receiving ground station; nevertheless, there is often a need for preprocessing on the user's side.

Preprocessing may include:
- corrections for geometric, radiometric and atmospheric deficiencies
- removal of flaws (data errors)

a) Partially or entirely missing scan lines are normally seen as horizontal black (0) or white (255) lines in the images.
b) Horizontal banding pattern: electro-mechanical scanners (Landsat's MSS and TM) have several (a small number of) detectors that are used in the scanning process. Imbalance between the six (MSS) detectors shows up as stripes (banding pattern) in the image.

Missing scan lines and banding patterns can be considered cosmetic defects that interfere with the visual appreciation of the patterns on the image. They can be even more problematic for the statistical/pattern analysis of images.

Background: spatial autocorrelation = points that are close geographically tend to have similar values. Therefore, neighbouring pixels of objects will be strongly correlated.
Processing for replacement of missing scan lines:

Option 1: Replace a missing pixel value by the value of the corresponding pixel on the immediately preceding scan line.

Option 2: Replace the missing value by the average of the neighbouring pixels on the scan lines above and below the defective line:
  I(i, k) = ( I(i, k-1) + I(i, k+1) ) / 2
  Note: I(i, k) is read as "pixel i on scan line k".

Option 3: Use a neighbouring band of the multi-spectral imagery. For instance, the Landsat (1 to 3) MSS bands 4 (green) and 5 (red) are normally highly correlated. In general, bands in the same region of the spectrum are highly correlated and can be used to correct missing scan lines:
  I(i, k, b) = b/r * ( I(i, k, r) - ( I(i, k+1, r) + I(i, k-1, r) )/2 ) + ( I(i, k+1, b) + I(i, k-1, b) )/2
  where b and r denote the defective band and the correlated reference band.
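Option 2 can be sketched in a few lines of Python (a minimal illustration; the function name and the list-of-scan-lines image layout are assumptions, not from the notes):

```python
def repair_scan_line(image, k):
    # Option 2: I(i, k) = (I(i, k-1) + I(i, k+1)) / 2 for every pixel i
    # on the defective scan line k; `image` is a list of scan lines.
    image[k] = [(a + b) / 2 for a, b in zip(image[k - 1], image[k + 1])]
    return image
```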
Detection of missing scan lines is a tedious task if such lines are located interactively (by visual examination). The auto-correlation property can be used for semi-automatic localization, e.g. by comparison of the average grey values of neighbouring scan lines. In case of large differences, search along these scan lines for unexpected sequences (runs of 0's or 255's). Mark the suspected sequence and display it for inspection by an operator.
b) De-striping methods
A horizontal banding pattern is sometimes seen in Landsat MSS and TM data (electro-mechanical scanners). This pattern is more apparent when seen against a dark, low-radiance background such as water areas.

The MSS has six detectors for each band (MSS: 4 spectral bands), which is why the banding pattern is known as sixth-line banding in Landsat MSS images. The TM has 16 detectors per band and produces seven bands of imagery.

The underlying idea of de-striping is based upon the assumption that each detector sees a similar distribution of all the land cover categories. This implies that the histograms generated for a given band from the pixel values produced by all n detectors should be identical, i.e. the mean and standard deviation of the data from each detector should be the same.

To get rid of the striping effects, the means and standard deviations are calculated from lines 1, 7, 13, 19, ... (histogram 1), from lines 2, 8, 14, 20, ... (histogram 2), and so on. All n histograms are then equalized by forcing their means and standard deviations to be equal to the mean and standard deviation of all of the pixels in the image.
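The equalization step can be sketched as follows (an illustrative sketch; it assumes the image is a list of scan lines and that each detector records every n-th line):

```python
import statistics

def destripe(lines, n_detectors):
    # Force the mean and standard deviation of the lines recorded by each
    # detector to match the mean/std of all pixels in the image.
    all_vals = [v for row in lines for v in row]
    target_m = statistics.mean(all_vals)
    target_s = statistics.pstdev(all_vals)
    out = [row[:] for row in lines]
    for d in range(n_detectors):
        # pixels recorded by detector d: lines d, d+n, d+2n, ...
        vals = [v for k in range(d, len(lines), n_detectors) for v in lines[k]]
        m, s = statistics.mean(vals), statistics.pstdev(vals)
        for k in range(d, len(lines), n_detectors):
            out[k] = [(v - m) / s * target_s + target_m for v in lines[k]]
    return out
```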
Registration, also called rubber-sheeting, is typically defined by a polynomial transformation of an image to a set of control points.

For the presentation of RS images in a map-like form, rectification (geometric correction) has to be carried out. Rectified images can be overlaid with maps, or used to locate features of interest on both the map and the image. Rectification may also be used to bring adjacent images, or images from different sensors, into registration. Rectification procedures of photogrammetry range from simple plane rectification to the more complex process of generating digital orthophotos.
Geometric distortions include distortions of the optical system, non-linearity of the scanning mechanism and non-uniform sampling rates.

Panoramic distortion is a function of the angular field of view (AFOV) and affects instruments with a wide AFOV (such as AVHRR) more than those with a narrow AFOV (Landsat MSS + TM, SPOT HRV).

Earth rotation: during the movement of a satellite southwards above the Earth's surface, the Earth moves eastwards; thus the effect of Earth rotation is to skew the image.

[Figure: potential scan lines at time t2 without Earth rotation vs. actual scan lines at time t2 with Earth rotation, relative to the satellite's ground track]
Platform instabilities include variations in altitude and attitude. The information needed to correct for these variations is generally not available (only modern satellites carry GPS, INS, star sensors, ...) or not precise enough for the correction of the image data. Therefore a transformation based on nominal orbital parameters must be replaced by a transformation using ground control points. Instead of attempting to define the sources of error and their effects, an alternative method is to look at the problem from the opposite end: the differences between the positions of points recorded on the image and on a map can be used to estimate the distortions present in the image.
Processing:
a) Relate the image and map coordinate systems by an empirical transformation; commonly polynomials of second or third order are used for the map-to-image (image-to-map) coordinate transformation.
b) Locate suitable ground control points (gcps) in the image and, using GPS or a map, in map coordinates. Note: gcp chips (19×19 pixels) of existing image maps may also be used.
c) Estimate the transformation parameters by least squares.
d) Determine the pixel values of the rectified image by resampling (grey-scale interpolation).
Third-order polynomial for mapping (x, y) map coordinates to (r, c) image coordinates (and vice versa):

  x = a00 + a10·c + a01·r + a20·c² + a11·c·r + a02·r² + a30·c³ + a21·c²·r + a12·c·r² + a03·r³
  y = b00 + b10·c + b01·r + b20·c² + b11·c·r + b02·r² + b30·c³ + b21·c²·r + b12·c·r² + b03·r³

The unknowns are the parameters aij, bij of the transformation.
  First-order polynomial:  6 unknowns  (→ 3 or more gcps)
  Second-order polynomial: 12 unknowns (→ 6 or more gcps)
  Third-order polynomial:  20 unknowns (→ 10 or more gcps)
Experience: Around (<) 10 gcps give acceptable results for a first-order fit within a small image area of up to 1024² pixels. More gcps will be needed in areas of moderate relief (where a second-order polynomial may be required).
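The least-squares estimation of step c) can be sketched for a first-order polynomial (an illustrative sketch using numpy; the function name and the (c, r, x, y) tuple layout for gcps are assumptions):

```python
import numpy as np

def fit_first_order(gcps):
    # Least-squares estimate of the first-order polynomial
    #   x = a00 + a10*c + a01*r,   y = b00 + b10*c + b01*r
    # from ground control points given as (c, r, x, y) tuples.
    A = np.array([[1.0, c, r] for c, r, _x, _y in gcps])
    ax, *_ = np.linalg.lstsq(A, np.array([g[2] for g in gcps]), rcond=None)
    ay, *_ = np.linalg.lstsq(A, np.array([g[3] for g in gcps]), rcond=None)
    return ax, ay
```

With 3 or more well-distributed gcps the 6 unknowns are determined; redundant gcps are averaged out by the least-squares fit.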
Scattering redirects some of the incoming EM energy, and some of the EM energy reflected at that point, within the atmosphere into the field of view of the sensor.

[Figure: radiation paths between the ground and the sensor, including atmospheric path radiance]

A relationship for estimating atmospheric effects on multi-spectral images in the 0.4 - 2.4 µm reflective solar region is

  LS = Htotal · ρ · T + LP      (2.1)

where LS is the radiance measured at the sensor, Htotal is the total downwelling irradiance, ρ is the reflectance of the target (ratio: downwelling/upwelling, i.e. irradiance/radiant emittance), T is the transmittance given by the transmission curves as a function of wavelength, and LP is the atmospheric path radiance.
Model (2.1) is a simplified model which does not explicitly take account of the following aspects:
- the reflectance of a surface will vary with the view angle as well as with the solar illumination angle (particularly important for wide-FOV and off-nadir viewing)
- the slope of the ground and the disposition of topographic features.

More complex models have been developed, but their operational use is limited by the need to supply data relating to the condition of the atmosphere at the time of imaging. The cost of such data-collection activities is considerable; hence reliance is placed upon the use of standard atmospheres such as "mid-latitude summer". In this case only a very small number of parameters, e.g. the horizontal visibility in kilometres, have to be supplied.
Atmospheric correction might be beneficial in three cases:
1) If ratios of the values in two bands of a multi-spectral image are computed, e.g. the normalised difference vegetation index, used to study vegetation patterns:
     NDVI = (NIR - R) / (NIR + R)
   A simple technique for compensation of the atmospheric path radiance might be sufficient.
2) If the upwelling radiance from a surface is related to some property of that surface in terms of a physically based model, the atmospheric component must be estimated and removed.
3) If results found at one time are to be compared with results achieved at a later date: the state of the atmosphere will undoubtedly vary from time 1 to time 2.
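The NDVI ratio of case 1) can be computed directly (a trivial sketch; the function name is illustrative):

```python
def ndvi(nir, red):
    # NDVI = (NIR - R) / (NIR + R); high values indicate the combination of
    # strong NIR reflection and strong red absorption typical of vegetation
    return (nir - red) / (nir + red)
```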
Methodologies for the calibration of the Landsat TM optical bands, the SPOT HRV, other optical sensors and radar have been proposed.
The relationship between radiance and pixel value (PV) can be defined for each spectral band as
  Ln* = a0 + a1 · PV
where a0 and a1 are offset and gain coefficients and Ln* is the apparent radiance at the sensor (measured in units of mW/(cm2 · sr · µm)).
SPOT provides the gain values ai in the header of the XS image. The apparent radiance of a given pixel is calculated from
  L = PV / ai
Given the value of radiance L, it is usual to convert to the apparent reflectance ρ by
  ρ = (π · L · d²) / (ES · cos(θS))
where
  d  = relative Earth-Sun distance
  ES = exoatmospheric solar irradiance
  θS = solar zenith angle
(Reference: Floyd F. Sabins, Remote Sensing: Principles and Interpretation, 3rd edition, W.H. Freeman and Company, 1997.)
Note: ρ is not corrected for atmospheric effects.
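The radiance-to-reflectance conversion can be sketched as follows (illustrative helper name; taking the solar zenith angle in degrees is an assumption):

```python
import math

def apparent_reflectance(L, d, Es, theta_s_deg):
    # rho = (pi * L * d^2) / (Es * cos(theta_s)); not corrected for the atmosphere
    return (math.pi * L * d ** 2) / (Es * math.cos(math.radians(theta_s_deg)))
```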
Chapter 3 Classification
3.1 Supervised and unsupervised classification
Image classification can be divided into two categories: supervised and unsupervised classification.
Supervised classification refers to the process of measuring characteristic features of the entities (or objects) one wants to classify, using training sets of known objects or object classes, and then using these features to determine the class membership of all other pixels in an image.
Unsupervised classification is a clustering process which aims at the identification of a number of distinct, naturally occurring groups and the allocation of pixels to these groups (or classes). In this respect it can be considered as a segmentation technique which does not require training data.
In both cases the properties (features) of the pixel to be classified are used to label that pixel. In the simplest case, a pixel is characterised by a vector of grey values, one in each spectral band. This feature vector (also called pattern) represents the spectral properties of that pixel. Further features may be added to this vector.
Note: If classification does not lead to proper results:
i. try to find a more sophisticated classification scheme (not recommended)
ii. try to find better or more features, e.g. from other sensors (laser, radar, in addition to optical sensors) or existing databases (DTMs, ...)
In a two-dimensional scatterplot the two features are plotted against each other for each pixel, and the vector (feature 1, feature 2) determines the position of that pixel in the two-dimensional Euclidean space.
The position of a pixel in the scatterplot is directly related to the values of its two features. Obviously, points belonging to the same class tend to cluster, and points belonging to different classes tend to be separated. This is the underlying assumption of classification in feature space.
Example 2:
                   R     NIR    Mid IR
  Water body:      low   low    low
  Shadow region:   low   low    any grey value possible
Adding a third feature leads to a three-dimensional scattergram. The problem with N-dimensional feature spaces is that they cannot be visualised properly. → use several 2D scatterplots?
  3D: 1+2     = 3 two-dimensional scatterplots
  4D: 1+2+3   = 6 two-dimensional scatterplots
  5D: 1+2+3+4 = 10 two-dimensional scatterplots
Decision making: Given a scatterplot (cf. figure 3.2), one can recognise:
- two distinct clusters
- the compactness of each cluster
- the distance in feature space (example: d12, d23)
- a linear decision boundary (boundary between two clusters/classes)
Training samples (learning phase). Task: determine the statistical characteristics of each class (the number of classes must be known).
If X is a feature vector, then

  X = (feature value 1, feature value 2, ...)
Determine:
- the mean vector X̄
- the extreme values (min, max) for each feature in each class
- variances and covariances → variance-covariance matrix
K-means or centroid method: calculate the mean/centre X̄ of each training class; calculate the Euclidean distance from each unknown pixel to the centre of each class. The pixel is given the label of the centre to which its distance is smallest (nearest-centre decision).
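The nearest-centre decision can be sketched as follows (illustrative; it assumes the class centres are given as a dict from class label to mean feature vector):

```python
def nearest_centre(pixel, centres):
    # Assign the pixel the label of the class centroid with the smallest
    # Euclidean distance (comparing squared distances is sufficient).
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centres, key=lambda label: dist2(pixel, centres[label]))
```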
[Figure: class centroids X̄1 ... X̄4 in feature space, with the nearest-centre decision boundaries between them]
In n-dimensional feature space the decision boundary corresponds to a hyperplane which is perpendicular to the line connecting the two centroids X̄1 and X̄2.
3. Maximum likelihood method
If one of two neighbouring clusters is much smaller than the other one, it makes sense to move the boundary between them closer to the centroid of the smaller one. Similarly, if the clusters are elongated in a certain direction, the boundary should be tilted toward the direction of their elongation. This is achieved with a model of the probability distribution of each class (Gaussian normal distribution).
If a means of discovering which points belong together can be found, automatic classification will be successful. This process is referred to as unsupervised learning.
Principles of automatic cluster formation:
1. Each point is initially considered a separate embryonic cluster. In an iterative process, the two points (clusters) that are closer to each other than any other two are merged. The iteration stops either when the expected number of clusters has been found, or when the next point to be added to a cluster is more than some threshold distance away. → K-means clustering
2. Initially the whole collection of points is considered to be one huge cluster. Iteratively, existing clusters are split along lines of weakness into two clusters. The splitting is repeated until some limits (e.g. the maximum number of expected clusters) are passed. Splitting can be combined with merging to improve the results. → ISODATA algorithm (Iterative Self-Organising Data Analysis Technique)
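Principle 1 (iterative merging) can be sketched as follows (a naive O(n³) illustration of the idea, not the algorithm used in practice; all names are illustrative):

```python
def merge_closest(points, n_clusters):
    # Start with one embryonic cluster per point and repeatedly merge the
    # two clusters whose centroids are closest, until the expected number
    # of clusters remains.
    clusters = [[p] for p in points]

    def centroid(c):
        return tuple(sum(vals) / len(c) for vals in zip(*c))

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    while len(clusters) > n_clusters:
        # find the closest pair of clusters (by squared centroid distance)
        _, i, j = min(
            (dist2(centroid(clusters[i]), centroid(clusters[j])), i, j)
            for i in range(len(clusters))
            for j in range(i + 1, len(clusters))
        )
        clusters[i].extend(clusters.pop(j))
    return clusters
```

A distance-threshold stopping rule, as mentioned above, could be added as a second loop condition.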