
Digital Image Processing
Lecture #2

Contents
This lecture will cover:
- Components of an image processing system
- The human eye
- How is an image formed?
- How is an image captured?
- Sampling and quantization
- Pixel relationships

Components of an Image Processing System
A general-purpose image processing system, applied to a problem domain, consists of:
- Image sensors: sensitive to the energy radiated by the object we wish to image; they produce an electrical output proportional to the light intensity.
- Specialized image processing hardware: an ALU plus a digitizer to convert the output of the sensor into digital form.
- Computer
- Image processing software
- Mass storage:
  1. Short-term storage: used during processing
  2. On-line storage: relatively fast recall
  3. Archival storage: infrequent use
- Image displays: mainly colour TV monitors, driven by image and graphics display cards.
- Hardcopy devices: for recording images; these include laser printers, film cameras, heat-sensitive devices, inkjet units, and optical and CD-ROM disks.
- Network

Human Eye
The wall of the eye has three layers:
1. Cornea and sclera: the cornea is tough and transparent, and light enters the eye through it; the sclera is an opaque membrane continuous with the cornea.
2. Choroid: contains blood vessels; it is divided into the iris and the ciliary body. The iris contracts and expands to control the amount of light entering the eye.
3. Retina: the imaging plane of the eye.
The lens is made of concentric layers of fibrous cells.

Formation of the Image in the Eye
Light enters the eye through the cornea. The lens is flattened or thickened to focus the image on the retina.

Image Formation
- A lens focuses light from the object onto a sensor array, which captures the image.
- The image of the object is projected onto the discrete sensor array.
- Sampled image: each sensor registers the average colour over its area.
- The result is continuous colours measured at discrete locations: a discrete, real-valued image.
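
As an illustration of how a sensor array "registers the average colour", here is a minimal NumPy sketch (not part of the original slides; the array sizes, block size, and function name are assumptions chosen for the example). Each sensor cell averages the fine-grained image over its footprint, yielding a discrete, real-valued image.

    import numpy as np

    def sample_with_sensor_array(continuous_img, cell_size):
        """Simulate a sensor array: each cell records the average
        intensity of the incoming image over its footprint."""
        h, w = continuous_img.shape
        rows, cols = h // cell_size, w // cell_size
        sampled = np.empty((rows, cols), dtype=float)
        for i in range(rows):
            for j in range(cols):
                block = continuous_img[i*cell_size:(i+1)*cell_size,
                                       j*cell_size:(j+1)*cell_size]
                sampled[i, j] = block.mean()   # average over the cell
        return sampled

    # Example: a 512x512 "continuous" image sampled by a 64x64 sensor array
    fine = np.random.rand(512, 512)
    coarse = sample_with_sensor_array(fine, cell_size=8)
    print(coarse.shape)   # (64, 64) discrete, real-valued image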

Sensor Assembly for Image Acquisition
Selectivity can be increased, for example by placing a filter in front of the sensing material. The three principal arrangements of the sensors are the single sensor, the line sensor, and the array sensor; image acquisition with an array sensor captures a full 2-D image at once.

Image Sampling and Quantization
The image function f(x,y) is proportional to the energy radiated by the illuminating source, so 0 < f(x,y) < ∞. It is the product of two components:

f(x,y) = i(x,y) r(x,y)

- i(x,y): illumination component, determined by the source; 0 < i(x,y) < ∞
- r(x,y): reflectance component, a characteristic of the object; 0 < r(x,y) < 1

The gray level l = f(x,y) at a point lies in the range Lmin ≤ l ≤ Lmax.
IMPORTANT: Lmin > 0 and Lmax < ∞.
The interval [Lmin, Lmax] is called the gray scale and is commonly shifted to [0, L-1].
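
A small illustrative sketch (NumPy assumed; the illumination and reflectance values below are made up for the example) of the model f(x,y) = i(x,y) r(x,y), followed by mapping [Lmin, Lmax] onto the gray scale [0, L-1] with L = 2^k levels:

    import numpy as np

    k = 8
    L = 2 ** k                      # number of discrete gray levels

    # Illumination i(x,y): 0 < i < inf (determined by the source)
    i = np.full((4, 4), 900.0)      # arbitrary example values
    # Reflectance r(x,y): 0 < r < 1 (characteristic of the object)
    r = np.random.uniform(0.05, 0.95, size=(4, 4))

    f = i * r                       # continuous image: f(x,y) = i(x,y) r(x,y)

    # Shift [Lmin, Lmax] onto the gray scale [0, L-1] and quantize
    Lmin, Lmax = f.min(), f.max()
    gray = np.round((f - Lmin) / (Lmax - Lmin) * (L - 1)).astype(np.uint8)
    print(gray)                     # integers in [0, 255] for k = 8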


Digital Image Representation

f(x,y) = [ f(0,0)      f(0,1)      ...   f(0,N-1)
           f(1,0)      f(1,1)      ...   f(1,N-1)
           ...         ...         ...   ...
           f(M-1,0)    f(M-1,1)    ...   f(M-1,N-1) ]

The discrete intensity interval is [0, L-1], with L = 2^k.
The number of bits b required to store an M × N digitized image is b = M × N × k.
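
The storage formula can be checked directly; a short sketch (the image size and k below are only example values):

    def storage_bits(M, N, k):
        """Bits needed to store an M x N image with L = 2**k gray levels."""
        return M * N * k

    # Example: a 1024 x 1024 image with 256 gray levels (k = 8)
    b = storage_bits(1024, 1024, 8)
    print(b, "bits =", b // 8, "bytes")   # 8388608 bits = 1048576 bytes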

Pixel Relationships
Image coordinates start at the top-left corner: (0,0), with (0,1) to its right and (1,0) below it.
For a pixel p at coordinates (x,y):
- 4-neighbors N4(p): (x+1,y), (x-1,y), (x,y+1), (x,y-1)
- Diagonal neighbors ND(p): (x+1,y+1), (x+1,y-1), (x-1,y+1), (x-1,y-1)
- 8-neighbors N8(p): N4(p) ∪ ND(p), i.e. all pixels from (x-1,y-1) to (x+1,y+1) except p itself
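
A minimal sketch of the three neighbour sets defined above (the function names are my own, not from the slides):

    def n4(x, y):
        """4-neighbors of (x, y)."""
        return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

    def nd(x, y):
        """Diagonal neighbors of (x, y)."""
        return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

    def n8(x, y):
        """8-neighbors: union of N4 and ND."""
        return n4(x, y) | nd(x, y)

    print(sorted(n4(2, 2)))   # [(1, 2), (2, 1), (2, 3), (3, 2)]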

Pixel Adjacency
- Consider pixels p(x,y) and q(s,t), and let V = {1} be the set of intensity values used to define adjacency.
- 4-adjacency: p and q, with values from V, are 4-adjacent if q is in N4(p).
- 8-adjacency: p and q, with values from V, are 8-adjacent if q is in N8(p).
- m-adjacency (mixed adjacency): p and q, with values from V, are m-adjacent if:
  - q is a 4-neighbor of p, or
  - q is a diagonal neighbor of p and there is no common 4-neighbor of p and q with a value from V, that is, N4(p) ∩ N4(q) contains no pixel whose value is in V.
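
A sketch of the three adjacency tests (helper names are illustrative; it reuses n4, nd, and n8 from the previous sketch and assumes the image is a NumPy array):

    import numpy as np

    def in_V(img, p, V):
        """True if p is inside the image and its value is in V."""
        x, y = p
        return 0 <= x < img.shape[0] and 0 <= y < img.shape[1] and img[x, y] in V

    def adjacent4(img, p, q, V):
        """4-adjacent: both values in V and q is a 4-neighbor of p."""
        return in_V(img, p, V) and in_V(img, q, V) and q in n4(*p)

    def adjacent8(img, p, q, V):
        """8-adjacent: both values in V and q is an 8-neighbor of p."""
        return in_V(img, p, V) and in_V(img, q, V) and q in n8(*p)

    def adjacent_m(img, p, q, V):
        """m-adjacent: q in N4(p), or q in ND(p) with no common
        4-neighbor of p and q whose value is in V."""
        if not (in_V(img, p, V) and in_V(img, q, V)):
            return False
        if q in n4(*p):
            return True
        common = {n for n in n4(*p) & n4(*q) if in_V(img, n, V)}
        return q in nd(*p) and not common

    # Example with V = {1}: the two pixels are 8-adjacent but not
    # m-adjacent, because they share the 4-neighbor (0,1) with value 1.
    img = np.array([[0, 1, 1],
                    [0, 1, 0],
                    [0, 0, 1]])
    V = {1}
    print(adjacent8(img, (0, 2), (1, 1), V))   # True
    print(adjacent_m(img, (0, 2), (1, 1), V))  # False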

In the Next Lecture
- Spatial and gray-scale resolution
- Connectivity, regions, and boundaries
- Image enhancement

Reading assignment: Chapter 2
