
A Novel and Efficient Feedback Method for Pupil and Iris Localization

Muhammad Talal Ibrahim (1), Tariq Mehmood (2), M. Aurangzeb Khan (2), and Ling Guan (1)

(1) Ryerson Multimedia Research Lab, Ryerson University, Toronto, Canada
(2) Dept. of Electrical Engineering, COMSATS Institute of Information Technology, Islamabad, Pakistan

muhammadtalal.ibrahi@ryerson.ca, tariq mehmood@comsats.edu.pk, aurangzeb niazi@comsats.edu.pk, lguan@ee.ryerson.ca

Abstract. This paper presents a novel method for automatic pupil and iris localization. The proposed algorithm is based on an automatic adaptive thresholding method that iteratively looks for the region with the highest chance of enclosing the pupil. Once the pupil is localized, the next step is to find the boundary of the iris based on the first derivative of each row of the area within the pupil. We have tested our proposed algorithm on two public databases, namely CASIA v1.0 and MMU v1.0, and experimental results show that the proposed method has satisfying performance and good robustness against reflections in the pupil.

1 Introduction

Due to the uniqueness of iris patterns, they are considered the most reliable physiological characteristic of a human and thus the most suitable for security purposes [1]. For a reliable iris identification system, the iris should be segmented properly from an eye image. Iris segmentation deals with the isolation of the iris from other parts of an eye, such as the pupil, sclera, surrounding skin, reflections, eyelashes and eyebrows. It has been observed that the most computationally intensive task in iris recognition, especially in iris segmentation, is iris localization, which means exactly determining the inner and the outer boundaries of the iris. During the past few years, extensive research has been carried out to accurately localize the iris in an image. Generally, the methods proposed for iris localization can be divided into two major categories. The first category is based on circular edge detectors such as the circular Hough transform [2,3,4], and the second category is based on the histogram [5,6]. Circular Hough transform based methods try to deduce the radius and center coordinates of the pupil and iris regions, but the problems with the Hough transform are the need for thresholding and the intensive computation of its brute-force approach. The integro-differential operator [7] can be seen as a variation of the Hough transform that overcomes the thresholding problem, as it works with raw derivative information, but it fails on noisy images (such as those with reflections in the pupil).
M. Kamel and A. Campilho (Eds.): ICIAR 2011, Part II, LNCS 6754, pp. 79-88, 2011. © Springer-Verlag Berlin Heidelberg 2011


In histogram based methods, the pupil is considered the darkest region in an eye image and thresholding is used for locating it. Xue Bai et al. [8] use a histogram based method for the extraction of the pupil. It is effective to a certain extent, but if the grey level of the eyelashes, eyebrows or some other part of the image is lower than that of the pupil, it is not able to detect the exact boundary of the pupil. Xu Guan-zhu et al. [9] detected the pupil region by first dividing the eye image into small rectangular blocks of fixed size and then finding the average intensity value of each block. The minimum average intensity value was selected as a threshold to find the pupil region. Finally, the iris boundaries were detected in predefined regions. Z. He et al. [5] also presented a two stage algorithm. In the first stage, specular reflections were eliminated and an initial approximation of the iris center was obtained by using the cascaded Adaboost method. Later, the points of the iris boundary were detected by using a pulling and pushing model based on Hooke's law. In this paper, a new feedback method for pupil and iris localization is proposed. The method first locates the pupil on the basis of adaptive thresholding, which iteratively seeks the region that has the highest probability of enclosing the pupil. Once the pupil is localized, the boundary of the iris is extracted based on the peaks of the first derivative of the image intensity. For experimental results, the proposed method has been applied to CASIA v1.0 [10] and MMU v1.0 [11]. The remainder of the paper is organized in the following manner: Section 2 covers our proposed method. Section 3 covers the details of the experimental results. Finally, conclusions are provided in the last section.

2 Proposed Method

The proposed method comprises two stages. In the first stage, the pupil is localized in the given image, and in the second stage, the iris is localized based on the coordinates of the pupil.

2.1 Pupil Localization

The proposed method for pupil localization is basically a feedback system that works in an iterative manner. The details of the proposed algorithm are given below:

1. For the very first iteration, apply histogram based contrast stretching in order to make the dynamic range of the input image span 0 to 255. We have applied the following equation for contrast stretching [12]:

   In = ((In - min) / (max - min)) * 255    (1)

where In is the image of the nth iteration, and min and max are the minimum and maximum grey levels of In, respectively. For the first iteration, I1 is the original eye image.
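As a minimal illustration, Eq. (1) can be sketched in Python; the function name and the list-of-lists image representation are our own choices, not from the paper:

```python
def contrast_stretch(image):
    """Histogram-based contrast stretching (Eq. 1): remap the grey levels
    so the image spans the full dynamic range 0..255."""
    pixels = [p for row in image for p in row]
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                     # flat image: nothing to stretch
        return [row[:] for row in image]
    return [[round((p - lo) * 255 / (hi - lo)) for p in row] for row in image]
```

After this step the darkest pixel maps to 0 and the brightest to 255, which makes the fixed midpoint split of the following steps meaningful.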

A Novel and Ecient Feedback Method for Pupil and Iris Localization

81

2. Find the minimum and the maximum grey levels in the image and name them minn and maxn, respectively.
3. Calculate the frequency of each grey level in the image and look for the grey level with the maximum frequency in the lower range lrn, i.e. from minn to ceil((minn + maxn)/2), and name it lrleveln. Then look for the grey level with the maximum frequency in the upper range upn, i.e. from ceil((minn + maxn)/2) + 1 to maxn, and name it upleveln.
4. Find the coordinates of the pixels that have grey level lrleveln and the coordinates of the pixels that have grey level upleveln, as given in the following equation:

   [xlrn, ylrn] = find(In == lrleveln)
   [xupn, yupn] = find(In == upleveln)    (2)

Then take the standard deviations of xlrn, ylrn, xupn and yupn separately, and find the mean of the standard deviations for lrn and for upn independently, as given below:

   sdlrn = (std(xlrn) + std(ylrn)) / 2
   sdupn = (std(xupn) + std(yupn)) / 2    (3)
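Steps 2-4 can be sketched as below. This is a sketch under our own naming; the paper's `find` and `std` are MATLAB-style, and we use the population standard deviation here:

```python
import math
import statistics
from collections import Counter

def split_levels(image):
    """Steps 2-3: split the grey-level range [min, max] at its midpoint and
    return the most frequent grey level in each half (lrlevel, uplevel)."""
    pixels = [p for row in image for p in row]
    lo, hi = min(pixels), max(pixels)
    mid = math.ceil((lo + hi) / 2)
    freq = Counter(pixels)
    lrlevel = max((g for g in freq if g <= mid), key=lambda g: freq[g])
    uplevel = max((g for g in freq if g > mid), key=lambda g: freq[g], default=hi)
    return lrlevel, uplevel

def coords_of(image, level):
    """Step 4 (Eq. 2): coordinates of all pixels equal to `level`."""
    return [(x, y) for x, row in enumerate(image)
                   for y, p in enumerate(row) if p == level]

def spread(coords):
    """Eq. 3: mean of the standard deviations of the x and y coordinates.
    A spatially compact cluster (such as the pupil) has a small spread."""
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (statistics.pstdev(xs) + statistics.pstdev(ys)) / 2
```

Step 5 then keeps whichever of the two levels has the smaller spread: pupil pixels form one tight blob, whereas background pixels of a common grey level are scattered over the image.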

5. Finally, sdlrn and sdupn are compared, and the range with the smaller mean standard deviation is selected as the one with the higher chance of containing the pupil; the minimum and maximum grey levels of that range then become minn and maxn. If range lrn has the smaller mean, the threshold is set to lrleveln; otherwise upleveln is selected as the threshold.
6. Repeat steps 3 to 5, iteration by iteration, until the threshold value Tn stops changing. To be on the safe side, we add and subtract a small factor ε from this threshold to give a small range that we expect to cover the grey levels of the pupil, as given below. As we are dealing with digital images, ε should be an integer; for the experimental results we have chosen ε = 5.

   indn = find(In >= Tn - ε & In <= Tn + ε)    (4)

where indn are the indices corresponding to the grey levels of the pupil in In, as shown in Fig. 1(b). Indices corresponding to isolated points in Fig. 1(b) are then removed by morphological operations, and the remaining horizontal and vertical indices are named xn and yn, as shown in Fig. 1(c). These indices give the radius and center of the pupil, as shown in the equations below:

   rxn = round((max(xn) - min(xn)) / 2)
   ryn = round((max(yn) - min(yn)) / 2)
   cxn = round((max(xn) + min(xn)) / 2)
   cyn = round((max(yn) + min(yn)) / 2)    (5)

The maximum of rxn and ryn gives the radius of the pupil, Rpupiln; ideally rxn and ryn should be equal. The point (cxn, cyn) gives the center of the pupil. A circle of radius Rpupiln at (cxn, cyn) then represents the outer boundary of the pupil, as shown in Fig. 1(d).

Fig. 1. (a) Human eye, (b) localized pupil before morphological operations, (c) localized pupil after morphological operations, and (d) outer boundary of the pupil superimposed on (a)

Now steps 1 to 6 are repeated, with the segmented pupil pupiln given as feedback to our method: the pupiln of the nth iteration becomes the input image for the next iteration. The first iteration yields pupil1, whose center (cx1, cy1) gives the location of pupil1. In the next iteration, if the difference between cxn-1 and cxn, or between cyn-1 and cyn, is non-zero, the location of the pupil has moved, and the whole procedure is applied again on pupiln. When this difference becomes zero, we have the exact location and radius of the pupil, as shown in Fig. 2. We name the final location and radius of the pupil (Crow, Ccol) and Rpupil.

2.2 Iris Localization

The center of the iris lies within the pupil, but the two centers are not necessarily concentric [7]. We therefore limit the search space to a circular region whose radius is 3 times the radius of the pupil, neglecting all of the eye image beyond that region, as shown in Fig. 3(a). We then look for the center of the iris within the pupil. The search starts by applying a 1-D low-pass filter to the rows within the area of the pupil and then taking the first derivative of each row independently. Searching along rows makes it easier to find the boundary of the iris, because there are cases in which the eye is only partially open, making it impossible to locate the exact boundary by taking the derivative in the vertical direction.

Fig. 2. (a) Human eye, (b) pupil after 1st iteration, (c) pupil after 2nd iteration, (d) pupil after 3rd iteration and (e) final localized pupil

Let us explain the proposed algorithm for a single row; it is the same for the rest of the rows that belong to the area within the pupil. Let this row be row1, as shown in Fig. 3(b). Before taking the first derivative, we apply the 1-D low-pass filter in order to remove spurious points caused by noise. It is clear from Fig. 3(c) that when the first-derivative filter enters from the black region into the grey-level region, we get a positive response, as shown in Fig. 3(d), and as soon as it moves towards the boundary of the iris, there is a negative change in grey level, i.e. from high to low, also shown in Fig. 3(d). We should keep in mind that such peaks can also be caused by eyelashes. To overcome this problem, we apply a 1-D accumulator of length 7 to the result of the first derivative, so that the effect of the small peaks caused by eyelashes is minimized, as shown in Fig. 3(e). The maximum negative peak from the left is the boundary of the iris, represented by a red circle; we name the location of this negative peak p1. The same happens when the filter leaves the boundary of the iris: we get a maximum positive response, as shown in Fig. 3(f), represented by a green circle, and we name the location of this positive peak p2. We then calculate the difference of the locations of the detected peaks p1 and p2 from the central column of the pupil, Ccol, as given in the equation below:

   D = abs((Ccol - p1) - (p2 - Ccol))    (6)
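The row scan just described can be sketched as below. Where the paper leaves details open we make our own choices: a 3-tap moving average stands in for the unspecified 1-D low-pass filter, while the accumulator length of 7 follows the paper.

```python
def iris_edge_peaks(row, win=7):
    """Smooth a grey-level row, take its first derivative, then run a 1-D
    accumulator (moving sum of length `win`) to damp narrow eyelash peaks.
    Returns (p1, p2): the index of the strongest negative response (bright
    sclera falling to darker iris on the left) and of the strongest
    positive response (iris rising back to sclera on the right)."""
    # 1-D low-pass filter: 3-tap moving average (our stand-in choice)
    smooth = [sum(row[max(0, i - 1):i + 2]) / len(row[max(0, i - 1):i + 2])
              for i in range(len(row))]
    # First derivative of the smoothed row
    deriv = [smooth[i + 1] - smooth[i] for i in range(len(smooth) - 1)]
    # 1-D accumulator of length `win`
    acc = [sum(deriv[i:i + win]) for i in range(len(deriv) - win + 1)]
    p1 = min(range(len(acc)), key=lambda i: acc[i])  # strongest negative peak
    p2 = max(range(len(acc)), key=lambda i: acc[i])  # strongest positive peak
    return p1, p2
```

On a synthetic row shaped sclera-iris-pupil-iris-sclera, p1 falls near the left outer iris boundary and p2 near the right one, since those transitions are larger than the pupil edges.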

Fig. 3. (a) Segmented eye, (b) row within the pupil, (c) result of applying the 1-D low-pass filter to (b), (d) result of applying the first derivative to (c), (e) result of applying the 1-D accumulator of length 7 to (d) and (f) detected points of the boundary of the iris, represented by circles

After testing our algorithm on the 1206 images of the two databases, we have set the threshold on D to 20. If the value of D is greater than 20, one of the extracted peaks is due to a very strong eyelash, so we move on to the next row until D is less than 20. The peaks p1 and p2 of a row for which D is less than 20 give us Ccoliris by simply taking the mean of p1 and p2. As we know, the number of pixels between the two boundary points of a circle is maximum along the row that carries the diameter of the circle. So we apply the above algorithm to the rows within the area of the pupil, and the row that gives the maximum number of pixels between its two peaks, while satisfying the condition on D, gives us the diameter of the iris and hence its radius Riris; the index of that row gives the vertical location of the iris, Crowiris. Finally, we draw a circle of radius Riris at the point (Crowiris, Ccoliris), as shown in Fig. 4.
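The row-selection rule, Eq. (6) plus the widest-row criterion, can be sketched as below. The packaging is our own: `rows_peaks` pairs each row index with its detected peaks (p1, p2); the names are illustrative, not from the paper.

```python
def locate_iris(rows_peaks, c_col, d_max=20):
    """Among rows whose peaks (p1, p2) are roughly symmetric about the
    pupil column C_col (Eq. 6: |(C_col - p1) - (p2 - C_col)| < d_max),
    pick the widest one; its half-width is R_iris, the midpoint of its
    peaks is C_col_iris, and its index is C_row_iris."""
    best = None
    for row_idx, (p1, p2) in rows_peaks:
        d = abs((c_col - p1) - (p2 - c_col))
        if d < d_max and (best is None or p2 - p1 > best[2] - best[1]):
            best = (row_idx, p1, p2)
    if best is None:
        return None                   # no row satisfied the D condition
    row_idx, p1, p2 = best
    return row_idx, (p1 + p2) // 2, (p2 - p1) // 2   # C_row_iris, C_col_iris, R_iris
```

For instance, with the pupil column at 100, a row with peaks (50, 148) is nearly symmetric (D = 2) and, being the widest accepted row, determines the iris circle.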

3 Experimental Results

In order to verify our proposed method, we have tested it on the two most widely used public databases, namely the CASIA v1.0 database [10] and the MMU v1.0 database [11]. We have used the accuracy rate [13,6] as a measure to evaluate the performance of our proposed method. The accuracy rate (ACrate) is defined as follows:

   ACrate = (Nsuccess / Ntotal) * 100    (7)
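For completeness, Eq. (7) is a simple percentage; the success count of 752 in the example below is our back-calculation from the 99.47% reported for CASIA v1.0, not a figure stated by the paper:

```python
def accuracy_rate(n_success, n_total):
    """Eq. 7: percentage of images whose iris was localized correctly,
    where n_success is the number of successful localizations and
    n_total the number of images in the database."""
    return n_success / n_total * 100
```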


Fig. 4. Results of our proposed method on some randomly selected images from the CASIA v1.0 database

where Nsuccess is the number of eye images in which the iris has been successfully localized and Ntotal is the total number of images in the database. The details of the experimental results are presented in the following sections.

3.1 Experiments Set-1

In our first set of experiments, the results are collected on the CASIA v1.0 iris database. It contains 756 eye images of 108 different subjects, with a resolution of 320 x 280 pixels. Each subject contributed seven images over two different sessions: three images were acquired in the first session and four in the second. The proposed method is tested on the whole database, with an accuracy rate of 99.47%. Table 1 shows the comparison of our proposed method with some of the existing methods on the CASIA v1.0 iris database. Fig. 4 shows the results of our proposed method on some randomly selected images from the CASIA v1.0 database.

3.2 Experiments Set-2

In our second set of experiments, the results are collected on the MMU v1.0 database. It consists of 450 eye images, with a resolution of 320 x 240 pixels, from 45 subjects. The proposed method is tested on the whole database, with an accuracy rate of 99.11%. Table 2 shows the accuracy rate of our proposed iris localization method. Fig. 5 shows the results on some randomly selected images from the MMU v1.0 database.


Fig. 5. Results of our proposed method on some randomly selected images from the MMU v1.0 database

Table 1. Comparison of accuracy rates on the CASIA v1.0 database

   Masek [14]                 82.54%
   Alvaro et al. [15]         94.84%
   Mateo [16]                 95.00%
   Xu et al. [17]             98.42%
   Daugman [18]               98.60%
   Cui [19]                   99.34%
   Weiqi Yuan et al. [20]     99.45%
   Our proposed method        99.47%

Table 2. Comparison of accuracy rates on the MMU v1.0 database

   Masek [14], as reported in [13]          83.92%
   Daugman [21], as reported in [13]        85.64%
   Ma et al. [22], as reported in [13]      91.02%
   Daugman New [23], as reported in [13]    98.23%
   Somnath et al. [13]                      98.41%
   Our proposed method                      99.11%

4 Conclusions

In this paper, a novel feedback method for pupil and iris localization has been proposed. It is clear from the results that the performance of our proposed method is not affected by reflections in the pupil, and it is still capable of localizing the iris in the eye image. Taking the first derivative of the horizontal rows within the pupil, together with the difference condition on D, overcomes the difficulty of extracting the boundary of the iris in the case of a partially open eye. Judging by the accuracy rates, our proposed method has outperformed many existing methods for the localization of the iris in digital eye images.

References

1. Ross, A.: Iris Recognition: The Path Forward. IEEE Computer 43(2), 30-35 (2010)
2. Wildes, R., Asmuth, J., Green, G., Hsu, S., Kolczynski, R., Matey, J., McBride, S.: A system for automated iris recognition. In: Proceedings IEEE Workshop on Applications of Computer Vision, Sarasota, FL, pp. 121-128 (1994)
3. Tisse, C., Martin, L., Torres, L., Robert, M.: Person identification technique using human iris recognition. In: International Conference on Vision Interface, Canada, vol. 2, pp. 249-299 (2002)
4. Ma, L., Wang, Y., Tan, T.: Iris recognition using circular symmetric filters. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences 2 (2002)
5. He, Z., Tan, T., Sun, Z., Qiu, X.: Toward accurate and fast iris segmentation for iris biometrics. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(9) (2009)
6. Dey, S., Samanta, D.: Fast and accurate personal identification based on iris biometric. International Journal of Biometrics 2(3), 250-281 (2010)
7. Daugman, J.: Biometric personal identification system based on iris analysis. U.S. Patent 5,291,560 (1994)
8. Bai, X., Wenyao, L., et al.: Research on iris image processing algorithm. Journal of Optoelectronics-Laser 14, 741-744 (2003)
9. Guang-zhu, X., Zai-feng, Z., Yi-de, M.: A novel and efficient method for iris automatic location. Journal of China University of Mining and Technology 17, 441-446 (2007)
10. Specifications of CASIA iris image database (ver. 1.0). Chinese Academy of Sciences (March 2007), http://www.nlpr.ia.ac.cn/english/irds/irisdatabase.htm
11. Multimedia University iris image database (2007), http://www.persona.mmu.edu.my.ccteo/
12. Gonzalez, R.C., Woods, R.E.: Digital Image Processing, 3rd edn. Prentice-Hall, Englewood Cliffs (2008)
13. Dey, S., Samanta, D.: A novel approach to iris localization for iris biometric processing. International Journal of Biological, Biomedical and Medical Sciences 3, 180-191 (2008)
14. Masek, L., Kovesi, P.: MATLAB source code for a biometric identification system based on iris patterns. The School of Computer Science and Software Engineering, The University of Western Australia (2003)
15. A.M., Zuniga, G.: A fast and robust approach for iris segmentation. In: Symposium II Peruvian Computer Graphics and Image Processing (SCGI 2008), pp. 1-10 (December 2008)
16. Otero-Mateo, N., Vega-Rodríguez, M.A., Gómez-Pulido, J.A., Sánchez-Pérez, J.M.: A fast and robust iris segmentation method. In: Martí, J., Benedí, J.M., Mendonça, A.M., Serrat, J. (eds.) IbPRIA 2007. LNCS, vol. 4478, pp. 162-169. Springer, Heidelberg (2007)


17. Guang-zhu, X., Zai-feng, Z., Yi-de, M.: A novel and efficient method for iris automatic location. Journal of China University of Mining and Technology 17, 441-446 (2007)
18. Daugman, J.G.: High confidence visual recognition of persons by a test of statistical independence. IEEE Trans. on Pattern Analysis and Machine Intelligence 15, 1148-1161 (1993)
19. Cui, J., Wang, Y., Tan, T., Ma, L., Sun, Z.: A fast and robust iris localization method based on texture segmentation. In: Proceedings of the SPIE, vol. 5404, pp. 401-408 (2004)
20. Yuan, W., Lin, Z., Xu, L.: A rapid iris location method based on the structure of human eyes. In: Engineering in Medicine and Biology Society, pp. 3020-3023 (2005)
21. Daugman, J.G.: How iris recognition works. IEEE Trans. on Circuits and Systems for Video Technology 14(1), 21-30 (2004)
22. Ma, L., Tan, T., Wang, Y., Zhang, D.: Local intensity variation analysis for iris recognition. Pattern Recognition 37, 1284-1298 (2004)
23. Daugman, J.: New methods in iris recognition. IEEE Trans. on Systems, Man, and Cybernetics, Part B: Cybernetics 37, 1167-1175 (2007)
