
Technical report, IDE1032, June 2010

Localization of designed texture

On glass bottles

Master Thesis in Intelligent Systems

Abhishek Sharma Pallav Nandi Chaudhari

850326-1250 850908-T314



School of Information Science, Computer and Electrical Engineering
Halmstad University

Preface

This treatise is a master's thesis for the Master of Science program in Intelligent Systems at the IDE department of Halmstad University, Sweden. The thesis deals with the localization of designed texture on glass bottles at the hot end. This is a research issue from the company Gedevelop, located in Helsingborg, Sweden. The work was carried out under the guidance of Mr. Kenneth Nilsson. The data for the experiments were provided by Gedevelop. The project is implemented in the Matlab programming language on a personal computer system, and experimental results of the proposed method are presented.



Abstract

Hot end inspection of glass bottles necessitates the localization of the designed textures on the bottle. In this thesis we propose a method to localize designed textures on a test bottle by matching its image to a certain number of reference images and then mapping the texture onto the test bottle. The purpose of the algorithm is to localize the designed texture on the bottle so that the designed texture areas and the other areas in the image of the bottle can be distinguished. The localization is done by finding the amount of rotation caused by the motion of the conveyor belt. The average orientation angle feature is used in the matching. Reference databases are created which contain the necessary information about the different textured regions. A selected portion of the image is used to match the bottle at the hot end against the same area of the reference bottles. The designed texture is then marked according to the best match. Experiments on all the methods involved, and the results and conclusions based upon them, are presented subsequently.



Contents

1. Introduction
2. Background
   2.1 Real world scenario and its impact
   2.2 Texture recognition and localization
      2.2.1 Direct template matching
      2.2.2 Feature based matching
   2.3 Proposed methods
      2.3.1 Reference model
      2.3.2 Computations on test image
      2.3.3 Limitations
      2.3.4 Hot spot marking
3. Methods
   3.1 Feature extraction
      3.1.1 Dominant local directions
   3.2 Skew detection and correction and target area extraction
      3.2.1 Skew detection
      3.2.2 Target area extraction
   3.3 Similarity measure
   3.4 Hot spot marking
4. Experiments and results
   4.1 Software environment
   4.2 Data
   4.3 Feature extraction
   4.4 Registration
   4.5 Matching
   4.6 Hot spot marking
5. Conclusions

References

1 Introduction

Fault detection is an essential part of the glass manufacturing process. Different types of faults occur in glass containers during the manufacturing process [9]. The glass containers are always inspected by automatic systems; sometimes human inspection is also done [14]. Most of the inspection is done at the cold end of the manufacturing process. The cold end is a relatively time consuming phase, and fault detection is an independent part of it. The company Gedevelop [11] is looking for a method which can detect faults in the hot end phase, because hot end inspection may give some advantages over regular cold end inspection, such as an early warning of problems. Gedevelop is a system development company which develops and maintains systems for glass manufacturing plants. They have a camera system which can take pictures of bottles at the hot end, and they are interested in finding manufacturing defects. For the detection of faults by image analysis, the system must be able to differentiate between the designed texture and textures caused by defects. One precise way of doing this is to localize the designed texture on the glass containers. Once it is localized, it is easier to find the areas on the bottle where inspection for defects should be made.

The hot end phase puts several constraints on the task of fault detection. The environment at the hot end is not a friendly one in which to perform an experiment. Traditional fault detection equipment cannot work there; no kind of handling of the bottles is possible. One possible way of inspection is a vision based system. There are also some limitations imposed by the situation: multiple cameras cannot be used. The glass containers come very fast on the conveyor belt from the molding machine, and there is a very limited gap between two bottles on the belt. Space for fitting multiple cameras is very limited, so normally only one camera can look at a bottle at a certain time.

In vision based systems, one possible way to detect different kinds of faults, e.g. cracks, foreign inclusions, etc., is texture analysis. In that case, there must be a classification method which can identify the designed texture on the bottles, so that the designed texture and textures caused by faults can be differentiated.

In the identification of designed texture on glass bottles at the hot end, there are

certain factors which, to a certain extent, provide the opportunity for it to be modeled in an

easier way, because of its conceptual relevance to some very efficient texture recognition

algorithms, such as optical character recognition [2, 12]. On the other hand, the real world

situation makes it difficult by imposing some constraints upon the image capturing task.

Practically, when the bottle comes onto the conveyor belt from the forming machine, the

camera can look only at one side of the bottle, which implies that the whole texture area on

the bottle will not be visible to the camera. Due to the high speed of the belt, the bottle may

be rotated to a certain degree. Because of this rotation, the same texture area will not

always be captured by the camera. Keeping in mind all these points, here, we propose a

method to localize the designed texture by finding the rotation angle of the bottle by

matching it to a certain number of references and then mapping the texture to the test bottle.

The method is passive in nature, because it does not directly extract the designed texture.

In our proposed method, the image pixels are represented by their average directions

extracted from linear symmetry features [4] in a local neighborhood. Rather than using the matching procedure to find the texture directly, we use matching to find the amount of

rotation. After finding the amount of rotation with respect to a reference image, the

designed texture is marked according to the matching reference.


This treatise is composed as follows:

In chapter 2, the background, an overview of the real world situation in the present context, and the motivation for the proposed procedures are explained. In chapter 3, the methods involved are discussed in detail, whereas in chapters 4 and 5 the experiments, their results, and the future scope are described.


2 Background

Texture recognition is an image processing technique which has a wide variety of applications in different scientific research areas, e.g. optical character recognition [2], face recognition [15], fingerprint recognition [1][8], biometrical identification of laboratory animals [13], medical image analysis [6], robot navigation [16], etc. For human eyes it is easy to distinguish between different textures with natural visual capabilities. Developing a machine with such a competent visual system is difficult. Usually, geometrical or mathematical representation of arbitrary uneven shapes is difficult to implement, so in algorithm design uneven shapes are always approximated by their closest mathematically well defined geometrical shapes. In certain situations, e.g. biometric person authentication, this approach is not good enough. In simple terms, it can be said that there is no universal generalization for categorizing textures. Different researchers have come up with different commendable ideas which fit into their relevant application areas. The development of a texture recognition algorithm is mostly dependent on the application scenario.

The problem of localizing designed texture on glass bottles at the hot end by image processing has a certain number of important facts associated with it, which arise out of the real world situation in which the image is taken.

The inspection of glass containers at the hot end of the manufacturing process is a common goal of glass industries all over the world nowadays. Research is being conducted on this problem using different technologies. However, still today the research in this field is not in the public domain, as the results are kept inside the different companies [10] working on it. So it is difficult to get an idea of the relevant ongoing work in this particular field. E.g., one renowned European company1 worked on this problem and put some information about it on their website. In short, they disclosed only that they are using the heat radiation property of hot bottles and capturing images of the hot bottles with an infrared camera. Information about further processing of these images is not given in detail.

2.1 Real world scenario and its impact

The hot bottles from the molding machine are put onto a conveyor belt and transported, one after another, at a certain speed. The camera is placed at a fixed position, oriented orthogonally to the direction of motion of the conveyor belt, such that it can take pictures of the bottles passing by. According to Gedevelop, the bottle can be rotated around the vertical line passing through its physical center of gravity by up to +/-15 degrees due to the speed. This is illustrated in figure 2.1, which represents the top view of a bottle on the conveyor belt. Here, the conveyor belt is moving along XX' and the camera is looking along the YY' direction. The maximum range of the angle φ is +/-15 degrees. Due to the same speed factor, the bottle can also be inclined at a certain angle to the right or left. It is also not always the case that the optical axis of the camera passes through the center of the bottle. These facts have a deep impact on the chosen localization strategy in the present scenario.

1 The name cannot be disclosed.


Figure 2.1 The top view of a bottle on the conveyor belt, showing the rotation in the interval +/-φ, where φ = 15 degrees

Because of the rotation, the pictures taken by the camera will not always contain the same textured region. This indicates that, although textures can be extracted directly in different ways, their localization will not be easy, because the image changes due to rotation. If the same picture were taken for every bottle passing in front of the camera, then the pixel coordinates of the designed texture could easily be defined. As the bottle rotates, the same designed texture has different pixel locations in different images. The localization of designed texture is difficult to do directly, because the designed texture and texture from faults must be differentiated. So the problem is twofold. First, the texture has to be extracted. Secondly, the designed texture has to be located according to the change due to rotation. This can be done by finding the rotation angle of the bottle on the conveyor belt. The rotation angle can be found by matching the image of the test bottle on the conveyor belt with reference images of the bottle rotated at different angles.


2.2 Texture recognition and localization


In texture recognition and localization, the methods can in general be divided into two categories. One is direct template matching and the other is feature based matching. The main difference between the two is that, while the former directly uses the pixel values of the images (e.g. for gray images the gray values are used), the latter uses different useful features (e.g. the linear symmetry feature) for texture description and matching.

2.2.1 Direct template matching

In simple applications, template matching is useful because it is easy to implement and less complicated from a mathematical point of view. However, in complex situations it is not so useful. In the present case, the designed texture occupies a small area compared to the whole image. As a result, if the matching is done using the whole images, there will be aberrant results, as non textured regions will dominate the calculations. It is thus necessary to match only those parts of the images which contain the same designed texture. Extraction of such an image area can be done more robustly from feature images than from gray images. Using gray scale values as features is more sensitive to noise, as gray scale values change with illumination.

2.2.2 Feature based matching

Feature based matching is nowadays applied in a wide variety of online and offline applications. E.g., many optical character recognition algorithms use linear symmetry features to localize certain characters in a text [2]. In fingerprint analysis, the test fingerprint is matched with the reference fingerprints using different features: the two fingerprints are aligned using core and delta points [1] and then matched using the extracted features.


2.3 Proposed methods

The matching in the methods mentioned in section 2.2 is done using the Schwartz inequality [5]. If we try to find and localize the different designed textures on the test bottle directly, by matching the designed textures in the reference images with the test bottle, the amount of computation will be high, as each and every designed texture of each reference image has to be matched with a single test bottle. We propose a step-by-step method, which is relatively fast because only a limited number of reference templates are necessary. Our proposed method is passive in nature.

As indicated by Gedevelop, the rotation is a constraint which imposes a number of limitations on the task of direct investigation of designed textures, so it is important to find the nature of the image at different rotation angles. This can be done by taking reference images of the bottle at different angles. These reference images give information about the shifts in texture positions due to rotation. The relative position of designed textures in the image at a certain rotation angle can also be found. If the test bottle on the conveyor belt is matched with the reference images, then an approximation of the rotation of the test bottle with respect to the direction of motion of the conveyor belt, and the location of the designed texture corresponding to that rotation angle, can be found. In this way, the designed texture can be localized indirectly.

The steps involve some well known methods, e.g. linear symmetry calculation, cross correlation etc., combined in such a way that the designed texture can be localized successfully. The steps show how we look at the problem and its solution.


2.3.1 Reference model

As indicated by Gedevelop, there can be a +/-15 degree rotation of the bottles on the conveyor belt, so the first step is to create reference models by taking pictures of the bottles at different rotation angles. The reference model creation involves the following sub steps:

(a) Calculate the linear symmetry feature for each image as described in [4] and [5]. The linear symmetry feature is defined by the second order complex moments I20 and I11.

(b) Correct the inclination by detecting the skew angle based on the second order complex moment component I20.

(c) Find the vertical boundaries of the bottle by using the second order complex moment I11 [2] and clip the image according to those contours.

(d) Using a mathematical formula, clip out the target area from the image obtained in step (c). The region should be such that the pixels associated with textures are denser compared to other parts. Extract the average orientation for each pixel of this clipped region, i.e. the angle of the I20 feature.

(e) Tabulate the result of step (d).

(f) The pixel coordinates of the designed texture are assumed to be known for each reference image.

Steps a, b, c, d and e have to be performed for each image taken for reference creation.
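The sub steps above can be outlined as follows. This is a minimal plain-Python sketch (the thesis implementation is in Matlab): `extract_orientation` is a hypothetical stand-in for steps (a)-(d), and the texture coordinates of step (f) are supplied by hand.

```python
def build_reference_model(images, angles, extract_orientation, texture_coords):
    """Outline of steps (a)-(f): for every reference image, store the
    average-orientation feature of its clipped target area together with
    the known pixel coordinates of the designed texture, keyed by the
    rotation angle at which the picture was taken."""
    model = {}
    for img, angle in zip(images, angles):
        model[angle] = {
            # steps (a)-(d), collapsed into one hypothetical callable
            "orientation_map": extract_orientation(img),
            # step (f): texture coordinates assumed known per reference
            "texture_pixels": texture_coords[angle],
        }
    return model  # step (e): the tabulated reference database

# Toy usage with stand-in data.
refs = build_reference_model(
    images=[[[0, 1], [1, 0]]],
    angles=[-15],
    extract_orientation=lambda im: [[0.0] * len(im[0]) for _ in im],
    texture_coords={-15: [(0, 1)]},
)
```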

2.3.2 Computations on test image

The test image is the image of the bottle on the conveyor belt, on which the designed texture has to be located. The necessary computations on the test image can be listed as:


1) Perform steps a, b, c and d of section 2.3.1 on the test image.

2) The extracted average directions of each pixel of the region obtained by step 2.3.1(d) have to be matched with each reference image. The maximum value of the match indicates the approximate rotation angle of the test bottle. This is a passive way of calculating the amount of rotation.

3) Select the reference bottle for which the maximum matching value has been found.
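The matching step can be sketched as follows (our plain-Python illustration, not the thesis's Matlab code): flat lists of angles stand in for the clipped average-direction images, and the double angle representation makes directions that differ by π equivalent.

```python
import math

def match_rotation(test_dirs, reference_maps):
    """Steps 2)-3) sketched: compare the test bottle's average-direction
    values with those of each reference and return the reference angle
    with the highest similarity score. Directions (defined modulo pi)
    are compared in the double angle representation."""
    def score(a, b):
        # mean cosine of the doubled angle differences; 1 = identical
        return sum(math.cos(2.0 * (x - y)) for x, y in zip(a, b)) / len(a)
    return max(reference_maps,
               key=lambda ang: score(test_dirs, reference_maps[ang]))

# Hypothetical references at 0 and 10 degrees of rotation.
refs = {0: [0.00, 0.10, 0.20], 10: [0.30, 0.40, 0.50]}
test = [0.31, 0.41, 0.52]
angle = match_rotation(test, refs)   # 10
```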

2.3.3 Limitations

For a full investigation of the problem, it is necessary to have reference images of a particular type of bottle rotated at different angles in the interval [-15 degrees, 15 degrees]. Due to the limited data provided by Gedevelop, we have had certain restrictions in our experiments which prevented a full investigation of the problem. E.g., we do not know the actual rotation angle of the bottles in the images, so we have approximated it by visual inspection. We categorized the images into three classes. In class 1, the images which are close enough in rotation are assumed to be in the 0-5 degrees rotation range; in class 2, the images which seem not so close are assumed to be in the 6-15 degrees rotation range; and in class 3, the images which are far apart in rotation are assumed to have more than 15 degrees of rotational difference. Our work was to investigate the correctness of the matching procedure, i.e. to get high matching scores for images which are close in rotation angle and low matching scores for a high difference in rotation angle. No study of the uncertainty in the localization of the designed texture could be performed. For real time implementations, those are necessary points.


2.3.4 Hot spot marking

Hot spot marking is usually done in geographical maps to show the location of a certain place by putting a mark on that location. We take the same approach to locate the designed texture on the test image. Knowing the rotation angle, i.e. the appropriate reference bottle, the pixel coordinates belonging to the designed texture can be marked on the test bottle.
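A minimal sketch of the marking step (our illustration): the texture pixel coordinates known for the matched reference (step (f) of the reference model) are simply written onto a copy of the test image.

```python
def mark_hot_spots(test_img, texture_pixels, mark=255):
    """Set the designed-texture pixel positions, known from the matched
    reference, to a marker value in a copy of the test image so the
    texture region is visibly outlined."""
    marked = [row[:] for row in test_img]   # copy; leave the input intact
    for (r, c) in texture_pixels:
        marked[r][c] = mark
    return marked

# Toy usage: mark two hypothetical texture pixels.
img = [[0] * 4 for _ in range(3)]
out = mark_hot_spots(img, [(1, 1), (1, 2)])
```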


3 Methods

3.1 Feature Extraction

The average direction in each local neighborhood of the image has to be extracted. The average direction is calculated from the linear symmetry features of the image. The linear symmetry feature is nowadays the backbone of various texture recognition algorithms and biometrical analyses. It was first intensively investigated by J. Bigun and G. H. Granlund in 1987 [4]. In an image, usually the gray values do not change in one particular direction; rather, the change is random. Near edges and textures, however, the gray values are constant along a line. This fundamental fact gives rise to the linear symmetry approach, where the ideal local orientation tensor, or linear symmetry tensor, is calculated in a local neighborhood. The linear symmetry calculation is a feature extraction method which extracts and represents the image using the linear symmetry features.

The orientation tensor, or linear symmetry tensor, is the representation of the local orientation of an image in a neighborhood; details can be found in [4][5]. This orientation tensor is represented in complex form for ease of calculation. Linear symmetry needs the doubling of the orientation angle. The reason lies in the subtle difference between the direction of a line and its orientation. The direction of a line spans the 0 to 2π angle range, whereas the orientation is described by the direction orthogonal to a line along which the gray values do not change. This means there can be situations where orientation vectors are bilaterally symmetric around that line, so the doubling of the orientation angle is necessary.
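The effect of the double angle representation can be illustrated with a small sketch (plain Python, our illustration): two nearly identical line orientations, approached from opposite sides of π, average to a wrong value with naive angle averaging but to the correct one in the double angle representation.

```python
import cmath
import math

def average_orientation(angles_rad):
    """Average line orientations (defined modulo pi) via the double angle
    representation: map each angle t to exp(i*2t), sum the unit vectors,
    and halve the argument of the sum."""
    s = sum(cmath.exp(2j * t) for t in angles_rad)
    return cmath.phase(s) / 2.0

# Two nearly identical orientations: 1 degree and 179 degrees
# (the same line, approached from either side of pi).
a = math.radians(1.0)
b = math.radians(179.0)

naive = math.degrees((a + b) / 2.0)                  # 90 degrees: wrong
doubled = math.degrees(average_orientation([a, b]))  # ~0 mod 180: correct
```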


3.1.1 Dominant Local Directions

In mathematics, a moment is a description of the structure formed by a set of points. In the image processing and computer vision fields, the moment does almost the same thing: it is a weighted average of the image pixel intensities. Image moments are extensively used to deduce certain properties, e.g. centroid, orientation etc. The complex moments are sometimes preferable to real moments [5] because of the ease provided by complex representations in algebraic calculations. More about the use of complex moments can be found in [7]. Similar to real moments, complex moments are weighted averages of pixel values represented in complex form in a neighborhood. The complex moments of a function z(x, y) are calculated as follows:

$$I_{pq} = \iint (x + iy)^p \, (x - iy)^q \, z(x, y) \, dx \, dy \qquad (3.1)$$

The order of the complex moment is (p+q) [5].

The second order complex moments I20 and I11, which can be used to estimate the linear symmetry features of an image in a local neighborhood, are defined in the frequency domain as

$$I_{20} = \iint (w_x + iw_y)^2 \, |F(w_x, w_y)|^2 \, dw_x \, dw_y \qquad (3.2)$$

$$I_{11} = \iint (w_x + iw_y)(w_x - iw_y) \, |F(w_x, w_y)|^2 \, dw_x \, dw_y \qquad (3.3)$$

In the spatial domain they can be computed as

$$I_{20} = z(x, y) * g(x, y) \qquad (3.4)$$

$$I_{11} = |z(x, y)| * g(x, y) \qquad (3.5)$$


Here g(x, y) is a Gaussian window function and z(x, y) is the squared complex valued gradient image, produced by squaring the complex first order derivative:

$$z(x, y) = \left( \frac{\partial f}{\partial x} + i \frac{\partial f}{\partial y} \right)^2 \qquad (3.6)$$

The amplitude of I20 is the magnitude of the fitting orientation, and I11 represents the sum of the magnitudes of all the gradient vectors. This will be clearer from the following figure.

Figure 3.1 Pictorial view of I20. Vector 𝐴𝐷 is I20 and 𝐴𝐵, 𝐵𝐶, 𝐶𝐷 are the

squared complex valued gradient vectors

In figure 3.1, vector AD is I20: the absolute value of I20 is the magnitude of AD and its angle is 2θ. I11 is |AB| + |BC| + |CD|. One point to note is that the vectors shown in figure 3.1 are not linearly symmetric. However, it is clear that if linear symmetry exists, the absolute values of I20 and I11 will be equal, and if the gradient directions vary as in figure 3.1, the amplitude of I20 will be smaller than I11. So abs(I20)/I11 can be thought of as a certainty measure [5] of linear symmetry. In the case of perfect linear symmetry the value of abs(I20)/I11 is 1, whereas if there is a lack of linear symmetry the value is smaller. In practical situations there can never be perfect linear symmetry, which indicates


that the ratio can never be exactly 1; it will only be near 1. So a threshold value for this ratio has to be selected, which indicates approximate linear symmetry. Only those pixel positions where this ratio is higher than the threshold are considered.

As cited in [5], the dominant orientation in a local neighborhood can be extracted from I20

as

$$\theta = \frac{1}{2} \arg(I_{20}) \qquad (3.7)$$

The factor 1/2 in equation (3.7) is due to the double angle representation.
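As an illustration of equations (3.4)-(3.7), the following sketch computes I20, I11, the certainty measure abs(I20)/I11 and the dominant orientation on a small synthetic image. This is a minimal plain-Python approximation (the thesis implementation is in Matlab): central differences stand in for the derivative filters, and a 3x3 box average stands in for the Gaussian window g(x, y).

```python
import cmath
import math

def linear_symmetry(img):
    """Per-pixel I20 and I11 for a gray image given as a list of lists.
    z = (df/dx + i*df/dy)^2 from central differences; I20 and I11 are
    local averages of z and |z| (a box window stands in for g(x, y)
    in equations (3.4)-(3.5))."""
    h, w = len(img), len(img[0])
    z = [[0j] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            fx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            fy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            z[y][x] = (fx + 1j * fy) ** 2
    I20 = [[0j] * w for _ in range(h)]
    I11 = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = [z[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            I20[y][x] = sum(win) / 9.0
            I11[y][x] = sum(abs(v) for v in win) / 9.0
    return I20, I11

# Synthetic image f(x, y) = x*cos(phi) + y*sin(phi): the gray values are
# constant along lines orthogonal to phi, so the dominant orientation
# extracted from I20 should be phi.
phi = math.radians(30.0)
img = [[x * math.cos(phi) + y * math.sin(phi) for x in range(16)]
       for y in range(16)]
I20, I11 = linear_symmetry(img)

theta = cmath.phase(I20[8][8]) / 2.0      # equation (3.7)
certainty = abs(I20[8][8]) / I11[8][8]    # ~1 for perfect linear symmetry
```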

3.2 Skew Detection and Correction and Target Area Extraction
As indicated in section 2.3, it is important to correct the skew, find the left and right most vertical contours of the bottle in the image, and find a significant texture area. By doing these steps, the inclination is corrected, the contours of the bottles are aligned and, lastly, the computational cost of finding the amount of rotation is kept small, as only the selected texture portion of the bottle is used for matching.

3.2.1 Skew detection and Correction

Skew detection and correction is a mandatory part of optical character recognition. Although our present task is in a totally different environment than OCR, skew detection is still necessary. In the real life scenario, the test bottle may be skewed a little because, after forming, the hot bottles are put onto a very fast moving conveyor belt, so a bottle may be tilted at the time the camera takes its picture. For skew detection, we use the method by Lalith Premaratne, Yaregal Assabie and Josef Bigun, as described in [2].

As cited in [2], the directional features of the image can be used to detect its skew. The second order complex moment component I20, described in section 3.1.1, which represents the orientation angle information in a small neighborhood, is used to detect the skew angle of the image. The summation of the I20 values over the entire image gives an approximate value of the skew angle. The accuracy is improved by removing the contribution from the background, considering only those values of I20 which have high confidence according to [5].
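A minimal sketch of this global skew estimate (our plain-Python illustration, with an assumed confidence threshold; the thesis implementation follows [2]):

```python
import cmath

def estimate_skew(I20, I11, conf_threshold=0.8):
    """Global skew estimate: sum the I20 values over the image, keeping
    only pixels whose certainty abs(I20)/I11 exceeds a threshold (this
    removes the background contribution), and halve the argument of the
    sum (double angle representation). The threshold value here is an
    illustrative choice, not the thesis's."""
    total = 0j
    for row20, row11 in zip(I20, I11):
        for v20, v11 in zip(row20, row11):
            if v11 > 0 and abs(v20) / v11 >= conf_threshold:
                total += v20
    return cmath.phase(total) / 2.0

# Synthetic example: confident foreground pixels share a double angle of
# 2*0.1 rad; the low-certainty background row is excluded by the threshold.
I20 = [[cmath.exp(2j * 0.1)] * 4, [0.05 + 0j] * 4]
I11 = [[1.0] * 4, [1.0] * 4]
skew = estimate_skew(I20, I11)   # ~0.1 rad
```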

3.2.2 Target area extraction

Vertical boundary

Once the skew has been corrected, the vertical boundary detection is done. Here, vertical boundary refers to the left and right most vertical contours of the bottle in the image. The second order complex moment component I11, which represents the sum of the magnitudes of the orientation vectors in a local neighborhood, is used for this task. This is done by adding the I11 values of all the rows, column by column, in the skew corrected image. The motivation for using I11 values is that they are high at edges and very low elsewhere. The longest vertical edges are the vertical boundaries of the bottle, so column wise addition of the I11 values gives high sums at the vertical boundaries of the bottle.


Figure 3.2 The boundaries of the bottle encircled with green ellipses are to

be found

Suppose A is an image of size m x n; then its I11 matrix will also be of size m x n. Column wise addition of the I11 matrix produces an array of length n, where each element is the sum of all the row values of the corresponding column, from which we can deduce the approximate column indices of the major vertical borders. Mathematically, it looks like

$$f(l) = \sum_{k=1}^{m} A(k, l) \quad \text{for } l = 1, 2, \ldots, n \qquad (3.8)$$

From f(l) of equation (3.8), the column indices of the left and right boundary can be extracted as

$$f_l = \arg\max_{1 \le l \le n/2} f(l) \qquad (3.9)$$

$$f_r = \arg\max_{n/2 < l \le n} f(l) \qquad (3.10)$$


Figure 3.3 The two high peaks rounded with red ellipses indicate the vertical contours of the bottle.

When equation (3.8) is applied to the I11 matrix of figure 3.2, the resultant f(l), l = 1, ..., n, looks like figure 3.3. The two desired boundaries, indicated in figure 3.3 by red ellipses, can now be extracted by equations (3.9) and (3.10), which give the column indices of the left and right boundary, respectively. When the column indices of the desired boundaries are found, an image portion can be extracted whose left and right boundaries are the marked boundaries of figure 3.3. Such an extracted image is shown in figure 3.4.
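Equations (3.8)-(3.10) can be sketched in a few lines (plain Python, our illustration):

```python
def vertical_boundaries(I11):
    """Equations (3.8)-(3.10): compute the column-wise sums of the I11
    matrix, then take the peak in the left half as the left boundary
    column and the peak in the right half as the right boundary column."""
    m, n = len(I11), len(I11[0])
    f = [sum(I11[k][l] for k in range(m)) for l in range(n)]   # eq. (3.8)
    half = n // 2
    fl = max(range(half), key=lambda l: f[l])                  # eq. (3.9)
    fr = max(range(half, n), key=lambda l: f[l])               # eq. (3.10)
    return fl, fr

# Toy I11 image with strong vertical edges at columns 1 and 6.
I11 = [[0, 9, 1, 1, 1, 1, 9, 0] for _ in range(5)]
left, right = vertical_boundaries(I11)   # (1, 6)
```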



Figure 3.4 An image clipped out from the original image using the indices

found by equation 3.9 and 3.10

Horizontal boundary

The computational cost of using the whole image of the bottle to find the rotation angle by matching it with reference images is high. The matching phase, described in section 3.3, would then include many pixel positions which are textureless and thus lacking in feature descriptors. The pixel values at those positions are almost invariant across all the images, so practically they contribute nothing to the matching process. Only the pixel values which change with rotation contribute to this calculation; these pixel positions are the textured regions. The region can be truncated further if we can extract some


textured area which is denser compared to other portions of the image and is changing with

rotation. This kind of texture can be selected by looking at the image of a bottle.

To clip out a significant texture area from the image, a method was developed, motivated by some obstacle avoidance algorithms for mobile robot navigation, namely 'the certainty grid representation for obstacle representation', 'the VFF concept' and 'vector field histogram' methods. A brief introduction to these methods is given in [3]. In these methods, a robot's working area is divided into a certain number of grid cells and it is assumed that each cell exerts some kind of force on the robot. The total force is the summation of the individual forces applied by each cell. So, logically, it can be said that each cell is given some weight, depending upon certain conditions, and the result is an average of the individual cell components. This resultant element gives the obstacle density at the point where the robot was located at that particular time.

We propose a similar concept, with the exception that, rather than a weight, we treat each pixel value as a mass. The reason for looking at it as a mass is that a designed texture can be anywhere on the bottle, so it is not possible to weight one pixel value higher or lower than another. Rather, we need a method that weights each pixel position equally and, at the same time, can extract the designed texture area.

The first step is to convert the cropped I11 image, obtained by equations (3.9) and (3.10), to a binary image. The I11 image highlights the edges of the image, but it does not give equal value to each edge pixel. To obtain equal pixel values at the edges, and equal pixel values elsewhere, the I11 image is converted to a binary image. This is done by experimentally setting a threshold value for the I11 images: all pixel values higher than this threshold are set to '1' and all other pixel values are set to '0'. Figure 3.5 is obtained by converting an I11 image into a binary image.
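This thresholding step can be sketched as follows (illustrative Python/NumPy rather than the MATLAB used in the thesis; the function name and the example threshold are our assumptions, as the thesis sets the threshold experimentally):

```python
import numpy as np

def i11_to_binary(i11, threshold):
    """Convert an I11 (edge-magnitude) image to a binary image.

    Every pixel above the experimentally chosen threshold is set to 1
    and every other pixel to 0, so that all edge pixels get equal value.
    """
    return (i11 > threshold).astype(np.uint8)

# Example with a small array and an assumed threshold of 0.4
i11 = np.array([[0.1, 0.9], [0.5, 0.2]])
binary = i11_to_binary(i11, 0.4)
```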


The next step is to define a filter of size wh x wv, where each coefficient has equal weight. However, what should the size of the window be? The width of the window is equal to the number of columns in the vertically limited image matrix, i.e. the width of the bottle. The height of the window should be determined carefully by looking at the image. It should be large enough to cover the desired texture area but, on the other hand, not so large that it includes unnecessary pixel positions while covering the desired texture area. For example, in figure 3.5 it can be noticed that the dense texture lies in the middle of the image, where some text is inscribed on the bottle. In the image, this region changes with rotation, whereas other portions of the image show no visual effect of a change of rotation angle. So only this portion can be used to identify the rotation angle. The height of the window should be such that it covers the entire texture.

If we convolve this filter with the binary image, it gives its maximum value when the window covers the targeted texture area, since near the textured region the white pixels are denser than at other locations; at that region the pixel density is the maximum. If necessary, a further reduction can be achieved by considering only the central part of the densest area: if we divide the densest area vertically into three equal parts, we select the second (middle) part. This selection must be the same for all reference and test images. In this way, a significant texture area can be extracted from the images, which differs due to rotation but lies within the same pixel coordinates.
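Since the window width equals the image width, the convolution reduces to a one-dimensional sliding sum over per-row counts of white pixels. A possible sketch in Python/NumPy (the thesis implementation is in MATLAB; the function name and the exact band handling are our assumptions):

```python
import numpy as np

def densest_band(binary, wv):
    """Locate the horizontal band of height wv with the most white pixels.

    Because the window is as wide as the image, convolving the uniform
    wh x wv filter with the binary image amounts to a 1-D sliding sum of
    per-row white-pixel counts.
    """
    row_counts = binary.sum(axis=1).astype(float)
    sums = np.convolve(row_counts, np.ones(wv), mode='valid')
    top = int(np.argmax(sums))            # top row of the densest band
    band = binary[top:top + wv, :]
    # keep only the central third of the band, as described above
    return band[wv // 3: 2 * wv // 3, :]
```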


Figure 3.5 I11 image converted to binary image.

3.3 Similarity Measure

Two images, I and J, are matched by calculating the similarity measure between the two

images. This is particularly necessary to find the rotation angle of the bottle on the conveyor

belt.


The similarity measure gives a high value for similar images and a relatively low value for dissimilar images. Selected image pixels (see 3.1.1) are represented by the average direction of the second order complex moment I20. After extracting the texture regions as described in the previous section, they are matched as

S = \frac{\sum_{i,j} I(i,j) \cdot J(i,j)}{\|I\| \, \|J\|}        (3.11)

In equation (3.11), the values of S lie in the interval [0, 1]. A value near 1 indicates a high degree of similarity, whereas a value near 0 indicates low similarity.
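Equation (3.11) can be sketched as follows (illustrative Python/NumPy; the thesis implementation is in MATLAB, and the handling of a zero denominator is our own addition):

```python
import numpy as np

def similarity(I, J):
    """Normalized similarity score of two equally sized images (eq. 3.11).

    For non-negative feature images the score lies in [0, 1]; identical
    images (up to a positive scale factor) score 1.
    """
    den = np.linalg.norm(I) * np.linalg.norm(J)
    return float(np.sum(I * J) / den) if den > 0 else 0.0
```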

3.4 Hot spot marking

Hot spot marking is achieved by a one-to-one mapping of the pixel coordinates of an image J to another image I. Here, I is the test image and J is a texture region in the corresponding reference image. The images I and J are of the same size. The appropriate region on I is marked according to the location of J on the reference image with which I has maximum similarity. Suppose the location of a point of J on the reference image is (a, b); then the hot spot on I is the (a, b) coordinate point of the I matrix. It is a simple one-to-one mapping, where each pixel on the test image is stained according to the location of the texture on the corresponding reference image.
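The one-to-one mapping can be sketched as (illustrative Python/NumPy; the marking value of 255 for white is our assumption):

```python
import numpy as np

def mark_hot_spots(test_image, texture_coords, value=255):
    """Mark reference texture coordinates on the test image.

    texture_coords is an iterable of (a, b) pixel positions taken from
    the best-matching reference image; since test and reference images
    are the same size, each (a, b) maps directly onto the test image.
    """
    marked = test_image.copy()
    for a, b in texture_coords:
        marked[a, b] = value   # stain the hot spot pixel
    return marked

img = np.zeros((5, 5), dtype=np.uint8)
out = mark_hot_spots(img, [(1, 2), (3, 4)])
```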


4 Experiments and results

The experiments performed and their results are presented in this chapter. The results of the preprocessing of images, the design of reference models, the matching of the test images with the reference images, the interpretation of the matching results and finally the hot spot marking are described in turn.

4.1 Software environment

All our experiments were performed at the simulation level, and for such experimentation MATLAB offers a compact, easy to use and easily programmable environment. The images provided by the company Gedevelop are in .bmp format, which is easy to handle in MATLAB. The image data are in uint8 format, while some experiments require support for the 'double' data type, and it is well known that many kinds of image processing techniques can be carried out easily in MATLAB. Along with the image processing, there are some mathematical implementations; these are also easier in MATLAB than on other well known programming platforms. We therefore chose MATLAB for the experiments and implementations.

4.2 Data

Gedevelop provided 34 and 39 images in .bmp format of two different kinds of bottles, rotated at different angles, denoted B1 and B2 respectively. Out of the 34 images in B1, we created two sub-databases, denoted B1A and B1B. B1A contains 24 images and B1B contains 10 images. B1A contains those images in which the designed texture is on the front side of the bottle from the camera viewpoint, as in figure 4.1, while B1B contains those images in which the designed texture is on the back side from the camera viewpoint, as in figure 4.2. The motivation for dividing B1 into two groups is that the designed textures lie on opposite sides; matching across the two groups can be ignored, as it would require more than 90 degrees of rotation, which is practically inconceivable. Consequently, we wanted to see the effect of matching these two types of independent situations. Figure 4.3 shows an image of B2. One point to notice is that the textures on the images of B1 are better defined than those of B2.

The images provided by the company are used to create a reference database. These images are 1380 x 1012 pixels in size. After the automatic clipping of a significant texture area, performed according to the steps described in section 3.2.2, we obtained images of size 100 x 200 pixels. Figure 4.4 is one such automatically clipped image from B1A.

Second order complex moment I11 values are used for the automatic extraction of the significant texture area. A binary image of the boundary-detected image was created by setting a threshold value: the I11 values greater than the threshold were set to 1 and the other pixel values were set to 0. One such binary image is shown in figure 3.5. The reference models include:

1. Original image taken by the camera.

2. The linear symmetry matrix of the corresponding image.

3. The dominant direction information of each pixel of I20 image.

4. The vertical boundary limited image.

5. The dimensions of the densest texture region.

6. The pixel coordinates of the designed textures.

The results of the individual steps, as well as of the overall experiments, are explained in the following sections.


Figure 4.1 An image of database B1A having texture on front side of the bottle


Figure 4.2 An image of database B1B having texture on back side of the bottle

Figure 4.3 An image of database B2


4.3 Feature Extraction

Initially, the second order complex moments I20 and I11 are calculated. The first step towards the calculation of I20 is to filter the original image with two Gaussian derivative filters: one in the x-direction and the other in the y-direction. By combining the two derivative values, the complex valued gradient image is produced according to equation (3.6). Next, the pixel values of the complex valued gradient image are averaged in a local neighborhood using a Gaussian averaging filter. I11 is computed by taking the absolute value of each pixel of the complex gradient image and then averaging the pixel values in the local neighborhood.

In the MATLAB implementation, four Gaussian filters were produced: two in the x-direction and two in the y-direction. One set of filters has standard deviation 0.8 and the other set has standard deviation 1.2. The filters with standard deviation 1.2 are used to average the complex gradient pixels in the local neighborhood, and the two Gaussian derivative filters are produced with standard deviation 0.8.
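The filtering steps above can be sketched as follows (illustrative Python with SciPy rather than the MATLAB used in the thesis; we use the standard tensor definition I11 = <|grad f|^2>, and the function name is ours):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def complex_moments(img, sigma_d=0.8, sigma_a=1.2):
    """Compute the second order complex moments I20 and I11.

    sigma_d: std of the Gaussian derivative filters (0.8 in the thesis),
    sigma_a: std of the Gaussian averaging filters (1.2 in the thesis).
    """
    img = img.astype(float)
    dx = gaussian_filter(img, sigma_d, order=(0, 1))  # derivative in x
    dy = gaussian_filter(img, sigma_d, order=(1, 0))  # derivative in y
    grad = dx + 1j * dy                               # complex gradient
    # I20: locally averaged squared complex gradient
    I20 = gaussian_filter((grad ** 2).real, sigma_a) \
        + 1j * gaussian_filter((grad ** 2).imag, sigma_a)
    # I11: locally averaged gradient magnitude (squared)
    I11 = gaussian_filter(np.abs(grad) ** 2, sigma_a)
    return I20, I11
```

For a purely horizontal intensity ramp the gradient is real, so the average direction angle of I20 is zero; in general I11 >= |I20|, with equality only in perfectly linearly symmetric neighborhoods.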

The I20 values calculated in this way are used to correct skew and to compute average directions for matching. The I11 values are used to detect the vertical boundary, as described in sections 3.2.2 and 3.2.3. In figures 4.4 and 4.5, two target-area-extracted images are shown.


Figure 4.4 Result of target area extraction on an image of B1A

Figure 4.5 Result of target area extraction on an image of B1B


Figure 4.6 Orientation tensor in color of figure 4.4.


Figure 4.7 Orientation tensor in color of figure 4.5.

4.4 Registration

We use pixel by pixel matching of the average direction values of the clipped images. Usually, in this kind of matching process, aligning the textures is standard practice. Here it is not, because we are trying to find the rotation angle; if the textures were aligned, the matching would give an aberrant result. Alignment of the contour of the whole bottle image is, however, necessary, so that we get a common arbitrary origin from which the bottle can be observed. The bottle may be inclined due to the high speed of the conveyor belt, and it may also be translated a little to the right or left. Thus, for accuracy, the image of the whole bottle should be aligned relative to its contour.


The problem of inclination is handled by the skew correction. Translational misalignment is automatically corrected by the boundary extraction described in section 3.2.2: when clipping the texture region out of the vertical boundary detected images, we always start the calculation from the upper left corner of the image, which is vertically above the leftmost vertical boundary of the bottle. The translational mismatch is thus compensated, and it can be said that the bottles become aligned along the vertical axis passing through their center of gravity. Thus we can match two complex images by their overlapping area; this kind of matching is done in [13].

The purposes of the methods described in section 3.4 are twofold: they serve both automatic clipping and registration.

4.5 Matching

Matching is done by computing the normalized similarity score according to equation (3.11). Each image of B1 is matched with all 33 other images of the database, i.e. all except itself. Likewise, each image of B1A is matched with the other 23 images of its database, and similarly for B1B and B2. The image itself is excluded from each matching step, since matching it against itself would give the value 1, which is meaningless in a practical situation. In total there are 34, 24, 10 and 39 such excluded self-matches for B1, B1A, B1B and B2 respectively, where the similarity score is 1 according to equation (3.11). We have 1122 (= 34 x 33), 552 (= 24 x 23), 90 (= 10 x 9) and 1482 (= 39 x 38) scores for B1, B1A, B1B and B2 respectively, where each image was matched with the other images in its database. The similarity score distributions for B1, B1A and B1B are shown in figures 4.8, 4.9 and 4.10, while figure 4.11 depicts the similarity score distribution for B2; in all cases the matching is based on average direction. By looking at the nature of figures 4.8, 4.9, 4.10 and 4.11,


we found that for the B1 images the matching results are more prominent, since for the closest images the matching scores are close to one (above 0.8); for the B2 images they are somewhat lower. In figures 4.8, 4.9, 4.10 and 4.11, the region marked with a green circle shows the similarity scores for images in class 1, the blue-circled region contains the scores for images in class 2, and the red region indicates the least similarity, for images in class 3. Class 1, class 2 and class 3 are defined in section 2.3.3.
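The leave-one-out matching described above can be sketched as (illustrative Python/NumPy; the function names are ours):

```python
import numpy as np

def all_pair_scores(features):
    """Match every image against every other image in a database.

    features: list of equally sized 2-D feature arrays. A database of n
    images yields n*(n-1) scores, e.g. 34*33 = 1122 for B1, because each
    image is excluded from its own matching step.
    """
    def score(I, J):
        den = np.linalg.norm(I) * np.linalg.norm(J)
        return float(np.sum(I * J) / den) if den > 0 else 0.0
    return {(i, j): score(fi, fj)
            for i, fi in enumerate(features)
            for j, fj in enumerate(features) if i != j}
```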

Figure 4.8 Matching result of B1 images matched using average directions


Figure 4.9 Similarity score distribution for B1A using average directions


Figure 4.10 Similarity score distribution for B1B using average directions

Figure 4.11 Similarity score distribution for B2 using average directions


4.6 Hot Spot Marking

In figure 4.12, an image is shown on which the designed textures are marked, according to the method described in section 3.4. The pixel coordinates of the designed textures are indicated in white colour.

Figure 4.12 Texture marked on a B1A image

In a real time application, an exact match for the test bottle will not be found; it will be approximated by the closest reference image. A typical situation that may arise is that, because of this approximation, the marking does not cover the entire texture region. In that case there is uncertainty, and for this reason it is necessary to calculate the uncertainty in the position of the designed texture. The marking should then be done according to that uncertainty, so that it covers the entire texture region on the test bottle; otherwise it will cause problems for the further intended processing. We could not investigate this because of limitations in the data.


5 Conclusions

In a nutshell, the ideas proposed here can be used to identify the rotation angle of a bottle on the conveyor belt in a passive way, so that the designed texture on the test bottle can be localized from the designed texture coordinates on the reference bottle with which it has maximum similarity. We have proposed the use of the average direction of the linear symmetry feature for texture recognition, as it is relatively insensitive to the errors caused by gray value changes from image to image due to environmental changes. We have then proposed a way to mark the location of the designed texture on the test bottle directly. The test results for finding the rotation angle are positive, and overall the proposed method works satisfactorily on the test data.

Based on the shortcomings of the proposed method, future work includes:

1) Fixing a threshold value for the matching results, above which the similarity can be taken for granted. This could be done by creating a reference model with images at 1 degree or 0.5 degree rotation intervals.

2) The uncertainty of the localization results has to be calculated.

3) The automatic clipping of a significant texture area has to be made more robust, so that it can be applied in situations where the texture is less dense in a particular region.

4) The algorithm can be made faster by translating it into a machine friendly programming language.

5) The uncertainty in localization must be used in the hot spot marking by putting an uncertainty box around each position.


6) It should be modeled in such a way that it can be applied in a real time environment.


References

1. Kenneth Nilsson and Josef Bigun, Registration of Fingerprints by Complex Filtering and by 1D Projections of Orientation Images. Audio- and Video-based Person Authentication (AVBPA 2005).

2. Lalith Premaratne, Yaregal Assabie and Josef Bigun, Recognition of Modification-based Scripts Using Direction Tensors. 4th Indian Conf. on Computer Vision, Graphics and Image Processing (ICVGIP 2004), pp. 587-592.

3. J. Borenstein and Y. Koren, The Vector Field Histogram - Fast Obstacle Avoidance for Mobile Robots. IEEE Transactions on Robotics and Automation, Vol. 7, No. 3, pp. 278-288, June 1991.

4. J. Bigun and G. Granlund, Optimal Orientation Detection of Linear Symmetry. First Int. Conf. on Computer Vision (ICCV), London, IEEE Computer Society Press, pp. 433-438, 1987. (Tech. Report LiTH-ISY-I-0828, Computer Vision Laboratory, Linkoping University, Sweden, 1986; Thesis Report, Linkoping Studies in Science and Technology No. 85, 1986.)

5. Josef Bigun, Vision with Direction: A Systematic Introduction to Image Processing and Computer Vision. Springer.

6. R. S. Ledley, Automatic pattern recognition for clinical medicine. Proceedings of the IEEE, Vol. 57, No. 11, pp. 2017-2035.


7. A. Khontanzad and Y. H. Hong, Invariant image recognition by Zernike moments. IEEE Trans. on Pattern Analysis and Machine Intelligence, 12(5):489-497, 1990.

8. Johan Thärnå, Kenneth Nilsson and Josef Bigun, Orientation Scanning to Improve Lossless Compression of Fingerprint Images. In J. Kittler and M. S. Nixon, editors, Audio- and Video-based Person Authentication (AVBPA 2003), LNCS-2688, pp. 343-350. Springer, 2003.

9. Irish food packaging information service. http://www.ucc.ie/fcis/PKglass.htm

10. Hui-Min Ma, Guang-Da Su, Jun-Yan Wang and Zheng Ni (Dept. of Electronic Engineering, Tsinghua Univ., Beijing, China), A glass bottle defect detection system without touching. IEEE 2002 International Conference on Machine Learning and Cybernetics, Vol. 2, pp. 628-632, 19th February 2003.

11. http://www.gedevelop.com/ accessed 20th January 2010.

12. H. L. Premaratne and J. Bigun, A segmentation-free approach to recognise printed Sinhala script using linear symmetry. Pattern Recognition, Vol. 37, pp. 2081-2089, 2004.

13. Kenneth Nilsson, Thorsteinn Rognvaldsson, Jens Cameron and Christina Jacobson, Biometric Identification of Mice. The 18th International Conference on Pattern Recognition, Hong Kong, 20-24 August 2006.

14. http://en.wikipedia.org/wiki/Glass_production accessed 19th May 2010.


15. Hartwig Fronthaler and Klaus Kollreider, Real-time Face Tracking: Enhancing the Authenticity of Recognition. Master's thesis in Computer Systems Engineering, IDE School of Information Science, Computer and Electrical Engineering, Halmstad University.

16. Roland Siegwart and Illah R. Nourbakhsh, Introduction to Autonomous Mobile Robots. MIT Press.
