CHAPTER 6
6.1 INTRODUCTION
Neural networks have high fault tolerance and the potential for adaptive
training. A Full Counter Propagation Neural Network (Full CPNN) is used
for the restoration of degraded images. The quality of the restored image
is almost the same as that of the original image. This chapter is organized as
follows. In section 6.1, the features of CPN are discussed. In section 6.2, the
architecture of the Full CPNN is presented. In section 6.3, the training phase of
the Full CPNN is discussed. Experimental results which confirm the
performance of the CPNN for image restoration are presented in section 6.4.
Finally, section 6.5 concludes the chapter.
CPN is classified into two types: i) the full counter propagation network
and ii) the forward-only counter propagation network. The advantages of CPN
are that it is simple and forms a good statistical model of its input vector
environment. The CPN trains rapidly; if appropriately applied, it can save a
large amount of computing time. It is also useful for rapid prototyping of
systems.
The outstar model performs the second phase of training. The network is
fully interconnected.
[Figure: block diagram — the original image and the degraded image are
presented to the Full Counter Propagation Neural Network, which produces
the restored image]
[Figure: architecture diagram — input layers X (x1 ... xn) and Y (y1 ... ym),
cluster layer Z (Z1 ... Znn), output layers X* (x*1 ... x*n) and Y*
(y*1 ... y*m); weight matrices W and U connect the input layers to the
cluster layer, and V and T connect the cluster layer to the output layers]
Figure 6.2 The Architecture of the Full CPNN for image restoration
In this phase, only the winning unit J remains active in the cluster layer. The
weights from the winning cluster unit J to the output units are adjusted so that
the vector of activations of the output layer, y*, is an approximation of the
input vector y, and x* is an approximation of the input vector x. This phase is
called outstar model training. The weight update is done by the Grossberg
learning rule, which is used only for outstar learning. In outstar learning, no
competition is assumed among the units, and learning occurs for all units
in a particular layer. The winning cluster unit is determined by computing:
Z_j = \sum_{i=1}^{n} (x_i - w_{i,j})^2 + \sum_{k=1}^{m} (y_k - u_{k,j})^2    (6.5)
Find the minimum Z_j and thus the winning neuron J, which receives the
minimum value.
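As a sketch of this winner-selection step, Eq. (6.5) can be computed as follows. The function and variable names, and the assumed weight-matrix shapes, are illustrative and not taken from the thesis:

```python
import numpy as np

def winning_unit(x, y, W, U):
    """Select the winning cluster unit J for a training pair (x, y).

    Assumed (hypothetical) shapes: W is (n, p) instar weights from the
    X input layer and U is (m, p) instar weights from the Y input layer,
    where p is the number of cluster units.
    """
    # Z_j = sum_i (x_i - w_ij)^2 + sum_k (y_k - u_kj)^2   (Eq. 6.5)
    Z = ((x[:, None] - W) ** 2).sum(axis=0) + ((y[:, None] - U) ** 2).sum(axis=0)
    J = int(np.argmin(Z))  # winning neuron: minimum distance
    return J, Z
```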
The learning rates are reduced exponentially with the iteration number k:

\alpha(k) = \alpha(0) \exp(-k/k_0)    (6.6)

\beta(k) = \beta(0) \exp(-k/k_0)    (6.7)
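A minimal sketch of this exponential decay schedule (Eqs. 6.6 and 6.7), assuming the initial rate and the decay constant k0 are chosen by the user:

```python
import math

def decayed_rate(rate0, k, k0):
    """Learning rate at iteration k, per Eqs. (6.6)-(6.7):
    rate(k) = rate(0) * exp(-k / k0)."""
    return rate0 * math.exp(-k / k0)
```

The same schedule is applied to both the instar and the outstar learning rates, each starting from its own initial value.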
Restoration Procedure
Step 2: Find Z_j:

Z_j = \sum_{i=1}^{n} (x_i - w_{ij})^2    (6.8)

x^*_i = v_{Ji},  i = 1, ..., n    (6.9)

y^*_k = t_{Jk},  k = 1, ..., m    (6.10)
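The restoration steps above can be sketched as follows. The matrix names V and T for the outstar weights follow the architecture figure; the shapes and the function name are assumptions for illustration:

```python
import numpy as np

def restore_block(x, W, V, T):
    """Restore one input block x of a degraded image.

    Assumed shapes: W is (n, p) instar weights, V is (p, n) and T is
    (p, m) outstar weights, for p cluster units.
    """
    Z = ((x[:, None] - W) ** 2).sum(axis=0)  # Eq. (6.8)
    J = int(np.argmin(Z))                    # winning cluster unit
    x_star = V[J]                            # Eq. (6.9): x*_i = v_Ji
    y_star = T[J]                            # Eq. (6.10): y*_k = t_Jk
    return x_star, y_star
```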
Figure 6.3 Degraded Lena image with impulse noise; Figure 6.4 Restored
Lena image (PSNR = 49 dB)
PSNR = 49.57 dB
Figure 6.8 shows the degraded medium-information image with 0.8 noise
probability, and the restored image is shown in Figure 6.9 (PSNR = 49.34 dB).
[Plot: MSE versus number of iterations (5-30)]
For the three types of images, the MSE decreases as the number of
iterations increases.
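The MSE and PSNR figures quoted in this section can be computed with the standard definitions, sketched below under the assumption of 8-bit images (peak value 255):

```python
import numpy as np

def mse(original, restored):
    """Mean squared error between two images."""
    diff = original.astype(np.float64) - restored.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original, restored, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    m = mse(original, restored)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```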
The same network is also used for the restoration of colour images. A
colour image is separated into its R, G and B channels; each channel can
then be regarded as a gray-scale image and processed by the counter
propagation neural network used for gray images. Finally, the channels are
combined to obtain the restored colour image. In the first experiment, the
image is degraded using impulse noise. Figure 6.11 shows the image degraded
by impulse noise, and the corresponding restored image is shown in
Figure 6.12.
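The per-channel scheme described above can be sketched as follows, where restore_gray stands in for the gray-image restoration routine (a hypothetical interface, not the thesis's own code):

```python
import numpy as np

def restore_colour(img, restore_gray):
    """Restore an (H, W, 3) RGB image channel by channel.

    Each channel is treated as a gray-scale image, restored with the
    supplied routine, and the channels are recombined.
    """
    channels = [restore_gray(img[:, :, c]) for c in range(3)]
    return np.stack(channels, axis=-1)
```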
PSNR = 41.254 dB
A blurred sailboat image is shown in Figure 6.13, and the image restored
using the Full CPNN is shown in Figure 6.14.
PSNR = 41.59 dB

PSNR = 41.89 dB

PSNR = 41.79 dB
Table 6.1 gives the processing time taken by the proposed CPN and
the resulting PSNR for different colour images such as Lena, Mandrill and
Parrot corrupted with impulse noise of p = 0.6.
Table 6.1 Processing time in seconds for different colour images for
impulse noise

Image      Processing time (s)   PSNR (dB)
Mandrill   26                    41.48
Parrot     30                    41.72
6.5 CONCLUSION