
BioRFID: A Patient Identification System using Biometrics and RFID

By

David Vassallo

A DISSERTATION

Submitted to

The University of Liverpool

in partial fulfillment of the requirements


for the degree of

MASTER OF SCIENCE
in
COMPUTER INFORMATION SECURITY

December 2016
ABSTRACT

BioRFID: A Patient Identification System using Biometrics and RFID

By

David Vassallo

Patient misidentification and mislocation are completely preventable causes of hundreds of patient deaths, even in modern, first-world countries. In this thesis we discuss a possible technical solution to this problem via the use of vein patterns as a biometric for identification and RFID as a technology to enable patient location tracking. We demonstrate that the proposed system is capable of identifying patients with high accuracy (around 91%) and subsequently reliably locating patients within a medical facility. Further, we show that both patients and doctors were very interested in using the system and found it very easy to use without being intrusive.

The proposed system uses a Raspberry Pi based minicomputer to take near-infrared images of a patient's wrist area. The infrared images expose vein structures, which are then fed into an image enhancement algorithm. Enhanced images are then processed by a sparse-coding algorithm that decomposes each image into its sparse vectors. The Euclidean distance between these sparse vectors and template vectors is then used to identify which patient is being photographed. An RFID tag is subsequently attached to the patient, who can be tracked via a network of RFID antennas and readers. This information is presented to healthcare professionals via a web-based interface.

The thesis focuses on biometric and RFID use in a healthcare setting; however, the concepts and techniques used can be applied to several other security applications such as remote banking, physical access control and computer system access control.


DECLARATION

I hereby certify that this dissertation constitutes my own product, that where the language of others is set forth, quotation marks so indicate, and that appropriate credit is given where I have used the language, ideas, expressions, or writings of another.

I declare that the dissertation describes original work that has not previously been presented for the award of any other degree of any institution.

Signed,

David Vassallo

This dissertation contains material that is confidential and/or commercially sensitive. It is included here on the understanding that this will not be revealed to any person not involved in the assessment process.

Student, Supervisors and Classes:

Student name: David Vassallo

Student ID number: H00037626

GDI name: Dr Lalit Garg

CRMT class ID: LAUR-906, March 2016

DA name: Dr Lalit Garg

CAC class ID: UKL1.CKIT.702.H00023870


ACKNOWLEDGMENTS

I would like to acknowledge several important people who made this dissertation possible:

Mr Ivan Bartolo, the CEO of 6PM PLC, who provided sponsorship and premises for this project and unwavering confidence in me

Mr Brian Zarb Adami, the CTO of 6PM PLC, who provided sound advice and ideas on the implementation of the project, as well as the RFID infrastructure

Dr Lalit Garg, my supervisor at UoL, who provided invaluable advice and help with the dissertation

Mr Robert Grech, R&D engineer at 6PM PLC, who provided invaluable assistance and advice during the hardware implementation phase of this project

My family and friends, for putting up with my long hours and absences due to the demands of this project

Participants of the study, for allowing me to involve them in the study often at short notice, as well as for the interest they expressed and the important feedback they gave.
TABLE OF CONTENTS
Page
ABSTRACT.............................................................................................................................................2
DECLARATION......................................................................................................................................3
ACKNOWLEDGMENTS........................................................................................................................4
LIST OF TABLES...................................................................................................................................7
LIST OF FIGURES.................................................................................................................................8
Chapter 1. Introduction...........................................................................................................................1
1.1 Problem Statement...........................................................................................................................1
1.2 Scope................................................................................................................................................3
1.3 Approach..........................................................................................................................................4
1.4 Outcome...........................................................................................................................................5
Chapter 2. Background and review of Literature....................................................................................6
2.1 Background......................................................................................................................................6
2.2 Literature Review.............................................................................................................................7
2.2.1 Patient Identification Errors.......................................................................................................7
2.2.2 Current approaches to the problem............................................................................................8
2.2.3 Biometrics................................................................................................................................12
2.2.4 Biometric Features...................................................................................................................16
2.2.5 Vein pattern use cases..............................................................................................................18
2.2.6 Biometric systems implementation in literature......................................................................20
2.2.7 RFID.........................................................................................................................................22
2.3 Theory............................................................................................................................................24
2.3.1 Theoretical Framework............................................................................................................25
2.3.1.1 Patient Wrist.......................................................................................................................25
2.3.1.2 NIR Camera - Vein Pattern Capture...................................................................................25
2.3.1.3 Image Enhancement...........................................................................................................26
2.3.1.4 Training Phase....................................................................................................................27
2.3.1.5 Template Database.............................................................................................................27
2.3.1.6 Sparse Coding Algorithm (training phase).........................................................................28
2.3.1.7 Template Sparse Representations.......................................................................................30
2.3.1.8 Sparse Coding algorithm (identification phase).................................................................31
2.3.1.9 Image Sparse representations.............................................................................................31
2.3.1.10 Comparison Algorithm.....................................................................................................31
2.3.1.11 Patient Identity.................................................................................................................33
2.3.1.12 Available RFID Tags........................................................................................................34
2.3.1.13 Track Patient Through RFID structure.............................................................................34
2.3.1.14 Monitor patient via web UI..............................................................................................34
2.4 Terms and Definitions....................................................................................................................35
Chapter 3. Analysis and Design............................................................................................................37
3.1 Introduction....................................................................................................................................37
3.2 Experimental Design......................................................................................................................38
3.3 Hardware Components...................................................................................................................40
3.3.1 Vein Pattern Capture................................................................................................................40
3.3.2 RFID Infrastructure..................................................................................................................42
3.4 Software Components....................................................................................................................44
3.4.1 High Level Architecture...........................................................................................................45
3.4.1.1 Training Phase....................................................................................................................45
3.4.1.2 Identification Phase............................................................................................................47
3.4.1.3 Image Enhancement...........................................................................................................49
3.4.1.4 Mapping biometrics to RFID.............................................................................................51
3.4.1.5 RFID Software...................................................................................................................52
3.4.1.6 RFID LLRP Listener..........................................................................................................53
3.4.1.7 Filtering and storing the RFID reads..................................................................................55
3.4.1.8 Mapping RFID information to physical location...............................................................56

3.4.1.9 The Web UI........................................................................................................................59
3.5 Qualitative Analysis: Surveys........................................................................................................61
3.5.1 End User (Patient) Survey Questions.......................................................................................62
3.5.2 Expert User (Healthcare workers) Survey Questions.............................................65
Chapter 4. Implementation....................................................................................................................69
4.1 Introduction....................................................................................................................................69
4.2 Hardware Implementation..............................................................................................................69
4.2.1 Vein Pattern Capture................................................................................................................70
4.2.2 RFID Infrastructure..................................................................................................................76
4.3 Software Implementation...............................................................................................................79
4.3.1 Back-end Implementation........................................................................................................80
4.3.1.1 Biometric functions............................................................................................................81
4.3.1.2 Image Enhancement...........................................................................................................81
4.3.1.3 Training phase....................................................................................................................83
4.3.1.4 Identification phase............................................................................................................86
4.3.1.5 RFID functions...................................................................................................................88
4.3.1.6 Patient functions.................................................................................................................89
4.3.1.7 Render functions................................................................................................................89
4.3.2 Front-end Implementation........................................................................................................90
4.3.2.1 Administrator / Operator front-end....................................................................................90
4.3.2.2 Vein Scanner front-end.......................................................................................................93
4.4 Survey Implementation..................................................................................................................96
4.4.1 Recruitment Plan......................................................................................................................97
4.4.1.1 Recruitment plan for end users..........................................................................................97
4.4.1.2 Recruitment plan for healthcare professionals...................................................................98
4.4.2 Delivery of Questionnaires and collection of results...............................................................99
Chapter 5. Testing and Results............................................................................................................100
5.1 Introduction..................................................................................................................................100
5.2 Vein Pattern Capture System........................................................................................................100
5.2.1 Testing Method.......................................................................................................................101
5.3 Results..........................................................................................................................................105
5.3.1 Sample NIR photos................................................................................................................105
5.3.2 Accuracy Results....................................................................................................................107
5.4 RFID Infrastructure......................................................................................................................109
5.4.1 Testing Method.......................................................................................................................109
5.4.2 Results....................................................................................................................................109
5.5 User Experience and Feedback....................................................................................................110
5.5.1 Testing Method.......................................................................................................................110
5.5.2 End User Results....................................................................................................................111
5.5.3 Healthcare Professional User Results....................................................................................113
Chapter 6. Conclusions.......................................................................................................................119
6.1 Lessons Learned...........................................................................................................................119
6.2 Applications.................................................................................................................................121
6.3 Limitations...................................................................................................................................122
6.4 Recommendations & Prospects for Future Research / Work......................................................123
REFERENCES CITED........................................................................................................................125
APPENDICES......................................................................................................................................134
Appendix A. DS Proposal......................................................................................................................134
Appendix B. User Interface Screenshots...............................................................................................145
Appendix C. Code Listing.....................................................................................................................149

LIST OF TABLES

Page
Table 1: Summary of the strengths and weaknesses of each approach with respect to patient identifica-
tion...........................................................................................................................................................10
Table 2: Summary of strengths and weaknesses in different biometric approaches................................14
Table 3: Summary of Vein Pattern use cases...........................................................................................19
Table 4: Summary of comparison algorithms..........................................................................................32
Table 5: Terms and Definitions................................................................................................................35
Table 6: Summary of experimental designs for technology validation (Zelkowitz and Wallace, 1998,
p.5)...........................................................................................................................................................38
Table 7: Sample RFID to Location Mapping...........................................................................................58
Table 8: Implementation differences between Euclidean Distance and SGD classifiers.........................86
Table 9: Evolutionary Testing of Dictionary Learning Algorithm.........................................................103
Table 10: Required hardware, provider and associated costs................................................................141
Table 11: High Level Timetable, Milestones & Tasks...........................................................................141
Table 12: Project Risk Assessment........................................................................................................142

LIST OF FIGURES
Page
Figure 1: BioRFID Components...............................................................................................................4
Figure 2: Percentage of wristband errors by category (Howanitz, Renner, and Walsh, 2002).................9
Figure 3: Theoretical framework............................................................................................................25
Figure 4: Sparse Coding Illustration.......................................................................................................29
Figure 5: Vein Pattern hardware block diagram......................................................................................41
Figure 6: RFID Hardware block diagram...............................................................................................42
Figure 7: Training Phase Data Flow Diagram........................................................................................45
Figure 8: Identification phase DFD........................................................................................................47
Figure 9: Image Enhancement DFD.......................................................................................................49
Figure 10: Mappings table schema.........................................................................................................51
Figure 11: RFID Software DFD..............................................................................................................52
Figure 12: Decision flowchart for filtering tag reads..............................................................................56
Figure 13: Example RFID reader and antennas placement in a medical clinic......................................57
Figure 14: Web UI Storyboard................................................................................................................59
Figure 15: End User / Patient Questionnaire..........................................................................................64
Figure 16: Medical Professional Questionnaire - Part 1.........................................................................67
Figure 17: Medical Professional Questionnaire - Part 2.........................................................................68
Figure 18: Top-view of the vein capture prototype.................................................................................71
Figure 19: Raspberry Pi and supporting circuitry mounted on the underneath of the arc......................72
Figure 20: Small HDMI screen mounted on the Raspberry Pi, which can be used to provide visual feedback to the users.......................................................................................................................73
Figure 21: Labelled setup of the NIR vein scanner................................................................................74
Figure 22: The passive RFID tags used in this project...........................................................................76
Figure 23: RFID Antennas......................................................................................................................77
Figure 24: RFID Readers........................................................................................................................78
Figure 25: Software implementation block diagram...............................................................................79
Figure 26: Vein pattern image enhancement algorithm implementation, showing original image (top
left) and the final enhanced image (bottom right)....................................................................................82
Figure 27: The administrator front-end...................................................................................................90
Figure 28: Operator front-end.................................................................................................................92
Figure 29: User Function Menu..............................................................................................................93
Figure 30: "Identify" functionality results..............................................................................94
Figure 31: Set of figure showing the data set sample images................................................................106
Figure 32: Average Accuracy on data sets............................................................................................107
Figure 33: End-user reaction to "Did the system feel intrusive?".........................................111
Figure 34: End-user reaction to "Was it easy to understand how to use the system?".........111
Figure 35: End-user reaction to "How long did it take to use the system?".........................112
Figure 36: Healthcare professionals survey results to describe their role.............................................113
Figure 37: Healthcare professionals survey results to rate the system ease of use, from 1 (very difficult)
to 5 (very easy).......................................................................................................................................114
Figure 38: Healthcare professionals survey results to rate the system disruption, from 1 (not disruptive)
to 5 (very disruptive)..............................................................................................................................114
Figure 39: Healthcare professionals survey results to rate difficulty of identifying a patient, before the
system was used, from 1 (difficult) to 5 (easy)......................................................................................115
Figure 40: Healthcare professionals survey results to rate difficulty of identifying a patient, after the
system was used, from 1 (difficult) to 5 (easy)......................................................................................115
Figure 41: Healthcare professionals survey results to rate difficulty of locating a patient, before the
system was used, from 1 (difficult) to 5 (easy)......................................................................................116
Figure 42: Healthcare professionals survey results to rate difficulty of locating a patient, after the sys-
tem was used, from 1 (difficult) to 5 (easy)...........................................................................................116
Figure 43: Healthcare professionals survey results to rate the beneficial impact of the system, from 1
(no impact) to 5 (large impact)...............................................................................................................117
Figure 44: BioRFID Sections................................................................................................................137

Figure 45: Login page with role selection............................................................................................145
Figure 46: Administrator > RFID Readers Settings Page.....................................................................145
Figure 47: Administrator > Map Locations Settings Page....................................................................146
Figure 48: Administrator > Sparse Dictionary Settings Page...............................................................146
Figure 49: Administrator > Enrollment > Patient Profiles....................................................................147
Figure 50: Administrator > Enrollment > Patient Biometrics...............................................................147
Figure 51: Operator > Audit Screen......................................................................................................148
Figure 52: Operator > Last Seen Screen................................................................................148

CHAPTER 1. INTRODUCTION

Patient misidentification is a problem in the world's hospitals. Wrong-patient and related errors account for about 4% of medical errors in the US alone (Rosenthal, 2003) and cost the UK's NHS £466 million a year (NHS England, 2014). While there has been much research into tracking patients using technology such as Radio Frequency Identification (RFID), the majority of patient misidentification occurs during the patient identification phase (i.e. determining which ID number a patient is associated with).

1.1 Problem Statement

Patient misidentification is a widely reported problem in the medical literature. For example, the National Patient Safety Agency cited this problem as a significant risk in the NHS (Thomas & Evans, 2004). The proposed solution aims to help alleviate the problem of patient misidentification in healthcare facilities. To this end, we stipulate two hypotheses:

Hypothesis 1: Vein pattern biometrics significantly increase the ease and accuracy of patient identification.

Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solutions to track patients, providing an end-to-end identification and tracking platform for patient and carer safety.

This project attempts to verify the above two hypotheses and build a system that will serve as a proof of concept showcasing a fully functional patient identification and tracking system, including both hardware and software components. Current solutions deal with each problem separately (Lahtela, Hassinen, and Jylha, 2008; Probst et al., 2016). RFID tracking systems are quite mature and well established, especially in the retail sector. Biometrics is also maturing quickly, especially with the introduction of fingerprint, voice and face recognition in smartphones. However, the two fields have not yet been explored in conjunction. Solutions based solely on RFID can still misidentify a patient and cannot guarantee a patient's presence. On the other hand, solutions based solely on biometrics provide identification but not location tracking. In addition, the previously mentioned biometric systems (fingerprint, voice and face recognition) are not particularly suited to a hospital environment, since many patients may have physical or mental conditions that render such biometrics ineffective. The proposed solution uses vein biometrics to overcome these problems, in conjunction with RFID to provide both identification and tracking.

1.2 Scope

The scope of this project is to produce a working prototype, including both hardware and software, as the proof-of-concept system. The prototype will need to demonstrate:

- Identifying a patient using biometrics with a high degree of confidence. This includes:
  - Building a near-IR camera to capture vein patterns
  - Coding the software required for the image processing and data mining techniques used to match captured vein patterns with known patterns. In addition, the proof of concept will include a web portal to show results to an operator such as a healthcare professional.
- Assigning the identified patient an RFID number. This includes:
  - Investigating the use of RFID tags
  - Writing code to enable the web portal operator to enroll a patient (i.e. bind their biometric identity to the RFID tag ID)
- Subsequently tracking the assigned ID number using an RFID system. This includes:
  - Using RFID readers and antennas to detect RFID tags
  - Writing code to process RFID reader output
- Using the biometric system to confirm a patient's identification. This includes writing software that prompts the patient to scan their vein patterns and confirms the patient's identity via the use of biometrics.
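The enrollment step above, binding a patient's biometric identity to an RFID tag ID, can be sketched as a simple two-way mapping. The following is an illustrative sketch only, not the project's actual code; all class, method and identifier names here are hypothetical:

```python
# Hypothetical sketch of the enrollment mapping between a patient's
# biometric identity and an RFID tag ID. Names are illustrative only.

class EnrollmentRegistry:
    """Binds patient identities (from the biometric system) to RFID tag IDs."""

    def __init__(self):
        self._patient_to_tag = {}   # patient_id -> tag_id
        self._tag_to_patient = {}   # tag_id -> patient_id

    def enroll(self, patient_id, tag_id):
        # An RFID tag must never be shared between two patients.
        if tag_id in self._tag_to_patient:
            raise ValueError("tag %s already assigned" % tag_id)
        self._patient_to_tag[patient_id] = tag_id
        self._tag_to_patient[tag_id] = patient_id

    def patient_for_tag(self, tag_id):
        # Used when a reader sees a tag and the system must name the patient.
        return self._tag_to_patient.get(tag_id)


registry = EnrollmentRegistry()
registry.enroll("patient-042", "EPC-3000-1234")
print(registry.patient_for_tag("EPC-3000-1234"))  # patient-042
```

The uniqueness check mirrors the safety requirement that a tag read must resolve to exactly one patient; in the real system this mapping would live in the template database rather than in memory.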

1.3 Approach

The actual proof of concept solution consists of two broad categories of tasks, those re-

lating to the hardware of the proposed solution, and those relating to the software. Each

of these categories can be further subdivided into RFID and biometric components, as

shown in Figure 1.

Figure 1: BioRFID Components


Each section provides different functionality, which at a high level can be summarized as follows:

- Hardware (RFID): reads RFID tags attached to patients using a network of RFID antennas and RFID readers.
- Hardware (Biometrics): a Near Infrared camera rig that takes pictures of a subject's wrist area, which are subsequently processed by a biometric algorithm to identify the patient to whom the vein pattern belongs.
- Software (RFID): code which receives and parses RFID data from the RFID readers and translates it to a physical location.
- Software (Biometrics): image enhancing code to extract vein patterns, and machine learning algorithms to identify which individual a vein pattern belongs to.

To evaluate the proof of concept system, quantitative methods based on statistics were used to test the accuracy of the vein pattern matching algorithms. Statistical methods such as leave-one-out cross-validation were used to determine the accuracy of the proposed vein pattern biometric system. A total of 33 participants volunteered their vein patterns and had their wrists scanned by the proof of concept system. In addition, we used qualitative methods, issuing questionnaires to both end users (patients) and expert users (health-care workers), to evaluate whether the system helps reduce identification errors, is easy to use, and helps in day-to-day tasks. The questionnaires were distributed online, and the anonymous answers were statistically analyzed at a 95% confidence level. Throughout the project, a Lessons Learned experimental design was followed, where we iterated over results to continuously improve the proof of concept (see Section 3.2).
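The leave-one-out procedure can be sketched in a few lines of Python. This is a toy illustration, not the thesis code: the two-element feature vectors and the nearest-neighbour matcher stand in for the real vein-pattern pipeline, and the normal-approximation interval mirrors the 95% confidence level used in the analysis.

```python
import math

def nearest_neighbour(sample, templates):
    """Return the label of the template closest (in Euclidean distance) to the sample."""
    return min(templates, key=lambda t: math.dist(sample, t[1]))[0]

def leave_one_out_accuracy(dataset):
    """dataset: list of (label, feature_vector) pairs, one per scan.

    Each scan in turn is held out and matched against all the others;
    accuracy is the fraction of held-out scans identified correctly.
    """
    hits = 0
    for i, (label, sample) in enumerate(dataset):
        templates = dataset[:i] + dataset[i + 1:]
        if nearest_neighbour(sample, templates) == label:
            hits += 1
    return hits / len(dataset)

# Toy data: two scans per "patient"; real feature vectors would come
# from the image-processing stage described later.
scans = [
    ("alice", (0.1, 0.9)), ("alice", (0.2, 0.8)),
    ("bob",   (0.9, 0.1)), ("bob",   (0.8, 0.2)),
]
accuracy = leave_one_out_accuracy(scans)  # → 1.0 on this toy data

# Normal-approximation 95% confidence interval for the accuracy estimate.
margin = 1.96 * math.sqrt(accuracy * (1 - accuracy) / len(scans))
```

Because every scan serves once as the test sample, leave-one-out makes full use of a small dataset such as the 33-participant sample collected here.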

1.4 Outcome

This project demonstrates that vein pattern biometrics can be used to identify patients

with up to 91% accuracy using very affordable off-the-shelf components. The resulting

proof of concept system can successfully identify the patient and track their location using an RFID network. In addition, we demonstrate that the majority of participants who used the system felt very comfortable doing so, finding it non-intrusive and easy to understand. Healthcare professionals surveyed likewise found the system easy to use and non-intrusive, and believed that it would make it easier to identify and locate patients.

The following chapters will proceed as follows. We will present a comprehensive litera-

ture review of biometrics and tracking technologies as well as review related work in

Chapter 2. Chapters 3 and 4 deal with the analysis, design, and implementation of the

system. In Chapter 5 we present the results of the proof of concept system, and we conclude with lessons learned and potential improvements in Chapter 6.

CHAPTER 2. BACKGROUND AND REVIEW OF LITERATURE

2.1 Background

The core issue being tackled by this dissertation is increasing patient safety in healthcare envi-

ronments. A recent paper by Makary and Daniel (2016) indicates that medical errors are now

the 3rd leading cause of death in the United States [US] alone. Other parts of the world report

facing the same problem. For example, the National Patient Safety Agency identified this issue

as a significant risk in the NHS (Thomas & Evans, 2004). "Medical errors" is quite a generic term encompassing a variety of mistakes. However, patient misidentification is a significant

contributor to medical errors. In a 2012 survey conducted by the College of Healthcare Informa-

tion Management Executive [CHIME] in the US, 20% of respondents could attribute at least

one adverse medical event to patient identification or matching mistakes (Probst and Branzell,

2016). An executive brief from the highly respected ECRI Institute - which deals with patient

safety - lists patient misidentification in second place in the Top 10 Patient Safety Concerns for

Healthcare Organizations (ECRI Institute, 2016). Regardless, patient misidentification issues

remain under-reported, with medical literature not properly discussing protocols and procedures

for patient identification (Chassin, 2002).

2.2 Literature Review

2.2.1 Patient Identification Errors

As we just saw, patient misidentification has been classified as a serious problem in today's

healthcare environments. What are the real world implications of patient misidentification, and

what impact does it have on patients and healthcare workers?

Medical literature routinely highlights the need for substantial changes in the delivery of healthcare. Medical errors result in at least 44,000 unnecessary deaths each year in the United

States, with the most vulnerable patients such as the old or chronically ill bearing the brunt of

these errors (Weingart, et al, 2000). In the UK, around 5% of patients admitted annually experience some kind of medical error, which in turn has a measurable economic impact, costing around £1 billion in extra bed days (Murphy and Kay, 2004). While medical errors take place in

many aspects of healthcare, such as diagnostic and surgical procedures, adverse drug reac-

tions and laboratory tests - accurate and efficient patient identification is a critical aspect in all

of these procedures. In blood transfusion particularly, patient misidentification can have catastrophic effects: it is the single largest contributing factor to mistransfusion, and it occurs frequently enough that the risk of mistransfusion is much greater than that of HIV transmission through blood, with the identification process actually getting worse as time goes by (Murphy and Kay, 2004). Murphy and Kay

(2004) note that the reasons for patient misidentification include:

[...]
- The (conscious) patient is not asked to state their name (and date of birth) and these are not checked against the same details on the wristband and other written documentation, such as the request form and the medical notes
- The patient is not wearing an identification wristband
- The patient details on the wristband are illegible
- Staff do not check the details on the wristband
- Staff rely on self-identification by the patient
- A surrogate identifier such as bed number is used to identify the patient.
[...] (Murphy and Kay, 2004)

Worldwide, problems like the above mean that millions of people are not afforded the most ba-

sic checks during even routine medical procedures like blood transfusion, with the problem be -

coming more pronounced as time goes by and the healthcare infrastructure becomes more

overloaded.

2.2.2 Current approaches to the problem

The oldest approach to the problem of patient identification errors is the patient wristband. Originally this was a simple piece of paper fastened or attached to the patient, with handwritten notes pertaining to the individual such as name, ID, blood type and so on. Handwritten

wristbands soon gave way to printed information, often in the form of 1D or 2D barcodes. However these are still prone to errors, as shown in Figure 2.

Figure 2: Percentage of wristband errors by category (Howanitz, Renner, and Walsh, 2002)

It is interesting to note that a missing wristband is the leading cause of error. This is a fundamental problem of any system that relies on a possession for identification: systems based on barcode, RFID, NFC, or Bluetooth tokens all share it. It is for this reason that biometrics are very appealing for solving the identification problem, since they rely on a physiological characteristic that is always on one's person and cannot be misplaced.

In spite of this, barcodes remain a very popular means of patient identification, and they introduce real benefits. For example, staff typically find barcode identification systems easy to operate, and prefer them to standard procedures. Barcoding also encourages adherence to a standard procedure, reduces transcription errors, and can automatically record a user's actions

(Murphy and Kay, 2004). However, barcodes require line of sight to be scanned properly, and hence may not be appropriate in some clinical environments or for locating a patient in an unknown location.

RFID on the other hand does not require line of sight, and forms part of a class of solutions that

instead rely on radio frequency, such as NFC or bluetooth. Lahtela, Hassinen, and Jylha (2008)

investigate the use of both RFID and NFC (Near Field Communication) in healthcare. As the

name implies, NFC is suited for very short range communication and tracking (typically in the

range of a few centimeters). This limitation makes the technology unsuitable for the purposes of

this project which focuses on locating the patient inside a hospital or ward, and hence requires

longer ranges (ideally of several meters). Another interesting technology that researchers have

used is Bluetooth Low Energy [BLE]. BLE can be used to build indoor location services with

an improved accuracy over the traditional WiFi technologies (Faragher and Harle, 2014). We

have tested both RFID and BLE technologies (Vassallo, 2016a; Vassallo, 2016b), and both seem very similar in terms of being capable of locating a patient in a hospital, with some changes to the hardware setup. However, RFID is significantly cheaper than BLE when tracking large numbers of patients: RFID tags cost a few cents, while BLE beacons currently cost a few tens of dollars on average. Therefore for the purposes of this project we will concentrate on

RFID technology.

RFID technology still does not address the problem of missing wristbands. To address this

particular problem, most approaches today rely on biometrics, with healthcare being second

only to the financial industry in the adoption of biometric identification systems (Mordini and Ot -

tolini, 2007). In addition, while patient misidentification can occur at any stage of the healthcare

process, proper patient identification begins with proper patient registration (Schulmeister,

2008), which essentially means correctly identifying a patient. Indeed there are several com-

mercial biometric offerings which focus specifically on patient identification, such as US-based

Right Patient (Right Patient, n.d.).

Table 1: Summary of the strengths and weaknesses of each approach with respect to patient identification

Color Coded Armbands (Pasfield, 1991; Probst et al, 2016)
  Advantages: cheap to implement; easy to set up and use; no battery or electronics, highly resilient.
  Disadvantages: still requires barcodes for conveying more information; bad layout can lead to patient misidentification; insufficient best-practice guides to armband design.

Barcodes (Murphy and Kay, 2004)
  Advantages: extremely cheap tags and infrastructure; easy to set up and use; no battery required.
  Disadvantages: runs the risk of being misplaced; requires line of sight.

RFID (Lahtela, Hassinen, and Jylha, 2008)
  Advantages: does not require line of sight; no battery required; supports long ranges; easy to use and set up; mature technology; cheap tags.
  Disadvantages: runs the risk of being misplaced; expensive infrastructure.

NFC (Lahtela, Hassinen, and Jylha, 2008)
  Advantages: does not require line of sight; no battery required; easy to use; cheap tags; smartphone compatible.
  Disadvantages: runs the risk of being misplaced; complex to set up; very short range; relatively new technology.

BLE (Faragher and Harle, 2014)
  Advantages: does not require line of sight; easy to use; supports long ranges; smartphone compatible.
  Disadvantages: runs the risk of being misplaced; complex to set up; expensive tags; battery powered; relatively new technology.

Biometrics (Mordini and Ottolini, 2007; Schulmeister, 2008)
  Advantages: requires the patient to be present, making mistakes less likely; no battery required; easy to use.
  Disadvantages: complex to set up; privacy and security concerns; difficult to implement continuous monitoring; risk of false positives / negatives.

Table 1 highlights the strengths and weaknesses of different approaches to patient identification

discussed throughout this section. In this project, we propose the use of vein pattern biometrics

and RFID technology to register patients and subsequently identify and track their physical lo -

cation, combining the best aspects of the two technologies. The following sections explore why

these techniques and technologies were chosen, as well as alternatives that may be used in

the healthcare industry.

2.2.3 Biometrics

The term "biometrics" is derived from the Greek words bio (meaning life) and metric (meaning to measure). Biometrics is the measurement of an individual's physical or behavioural characteristics, and it has become a very popular means of establishing personal identity, especially with the proliferation of smartphones. The primary advantage of using biometrics for identification over other methods such as smartcards (something you have) or passwords (something you know) is that biometrics cannot be forgotten or misplaced; they are in essence something that you are (Jain and Jain, 2002). Because of this, biometrics are highly suited to identification

in sensitive or high security areas. For example, biometric identification is very well suited to al -

leviate insider fraud or friendly fraud problems - where impersonators may have legitimate

access passwords or ID cards (such as family members of a patient) or where a user denies

that a legitimate action has been taken (Kahn and Roberds, 2008).

Ideally, to qualify as a valid identification parameter, a biometric trait should have the following

properties (Jain and Jain, 2002):

- Universality - everybody should have the characteristic being measured
- Uniqueness - while everyone should have the characteristic, each instance of the characteristic should be unique to the individual
- Permanence - the characteristic should not change much over time
- Collectability - the characteristic can be measured quantitatively
- Performance - the characteristic should be easy to measure
- Acceptability - refers to the extent people are willing to use and accept measurement of the characteristic; this is often influenced by factors such as the invasiveness of the measuring procedure, religious beliefs and hygiene
- Circumvention - refers to how difficult it is to fool the system, such as by copying the biometric or spoofing it

In light of the above, we can now start to judge the applicability of various possible biometric

traits to the healthcare problem domain. Not every accepted biometric is suitable for a given problem. For example, let us consider three of the most popular biometric traits currently seeing

wide adoption: fingerprints, facial recognition, and voice recognition. These biometrics are cer-

tainly applicable to everyday use by consumers, such as in the banking sector (Fatima, 2011),

however they may not be suitable for a healthcare environment such as the Accident & Emer -

gency [A&E] ward of a hospital. In such a case, it is quite common to have patients who may

have been involved in an incident where their face or fingers have been altered (e.g. a car acci-

dent or beating), have skin conditions making fingerprints unreadable, or even be unconscious

or incoherent hence making voice recognition very difficult. Especially when considering finger-

prints, spoofing attack techniques using electronic ink or gummy fingers are becoming preva-

lent (Galbally-Herrero et al, 2006) - considering healthcare fraud is a billion dollar worldwide

problem, the ease of fingerprint spoofing attacks calls into question the efficacy of the biometric

in the healthcare environment. Table 2 summarizes the advantages and disadvantages of dif-

ferent biometric systems.

Table 2: Summary of strengths and weaknesses in different biometric approaches

Facial Recognition (Jain and Jain, 2002; Prokoski, 2000) - Accuracy: Medium
  Advantages: only requires a simple camera or infrared camera; applies to the entire human population.
  Disadvantages: will not work if the patient's face is bloody or has some other condition obscuring the face (if visual light is used); sensitive to temperature variations, e.g. patients running fevers (if FIR cameras are used); possibility of false positives / negatives.

Iris Recognition (Jain and Jain, 2002; Fatima, 2011) - Accuracy: High
  Advantages: only requires a simple camera; can be extremely accurate.
  Disadvantages: expensive; easily obscured by contact lenses, glasses, etc.; very short range; possibility of false positives / negatives.

Fingerprints (Galbally-Herrero et al, 2006; Jain and Jain, 2002; Fatima, 2011) - Accuracy: High
  Advantages: fingerprint sensors embedded in most smartphones; accepted forensic feature.
  Disadvantages: will not work if the patient's finger is bloody or has some other condition obscuring the finger; possibility of false positives / negatives; becoming easier to spoof.

Voice Recognition (Jain and Jain, 2002; Fatima, 2011) - Accuracy: Low
  Advantages: only requires a simple microphone; simple to implement.
  Disadvantages: will not work if the patient is unconscious; easily distorted by noise or sickness; possibility of false positives / negatives.

Vein Patterns (Fatima, 2011) - Accuracy: High
  Advantages: very difficult to spoof; internal body feature, hence resistant to external obscuring factors; liveness check is inherent.
  Disadvantages: better accuracy requires a modified camera; possibility of false positives / negatives.

EEG (Paranjape et al, 2001) - Accuracy: High
  Advantages: very difficult to spoof; internal body feature, hence resistant to external obscuring factors; liveness check is inherent.
  Disadvantages: depends on brain activity, hence will not work on patients with abnormal mental states (such as being unconscious, agitated, or undergoing chemotherapy or hormone therapy); considered quite intrusive; takes a long time to take a biometric sample, limiting scalability.

Heartbeat Signals from facial video (Nasrollahi, Haque, Irani, and Moeslund, 2016) - Accuracy: Low
  Advantages: contactless, hence very unintrusive.
  Disadvantages: heartbeat signals read in this manner lack the distinctiveness required to use them as biometrics.

ECG (Lugovaya, 2005; Odinaka et al, 2012; Sufi, Khalil, and Mahmood, 2011) - Accuracy: Medium
  Advantages: very difficult to spoof; internal body feature, hence resistant to external obscuring factors; liveness check is inherent; very small template size possible due to compression.
  Disadvantages: takes a relatively long time to obtain a biometric sample (about 20-30 seconds); accuracy decreases when ECGs are performed in different sessions (training data taken weeks before identification data).

For these reasons, this dissertation will focus on the use of subcutaneous vein patterns as bio -

metric features.

2.2.4 Biometric Features
Universality

This is straightforward since every living person has a vascular system, hence this biometric

feature can be applied to all humans. In practice however, certain individuals may not have any

hands or feet, or have vein patterns which are difficult to read due to fatty tissue. Regardless,

research indicates that vein recognition systems can be used on 99.9% of the population (Wil-

son, 2011), which is especially significant when compared to fingerprint systems, which are effective for only 95% of the population.

Uniqueness

While there is no statistical model to quantitatively prove that vein patterns are unique to indi -

viduals, it has been shown that there is a high variety of branching patterns, leading to the

widely held assumption that vein patterns are in fact unique. This assumption has yet to be disproved for finger veins, as well as left and right palm veins (Nadort, 2007).

Permanence

In industry, the only reported variance in vein patterns is that due to the natural growth of hu -

mans. However, Nadort (2007) identifies several diseases and surgery related procedures that

may cause veins to change. That being said, most of the described causes for change do not

affect the area of interest of this dissertation (i.e. the wrist region), and hence would only affect a very small percentage of the population.

Collectability

Vein patterns are collected by visual means. While a variety of methods can be used, such as

X-Ray or ultrasound, the invasive nature of these methods means that the two most common

methods of collecting vein patterns are Near InfraRed [NIR] and Far InfraRed [FIR]. Of the two,

Wang, Leedham, and Cho (2007) note that NIR is more effective at capturing vein patterns and

is more tolerant to environmental changes. On the other hand, the authors also note that NIR is

more susceptible to pattern corruption due to its sensitivity to surface patterns such as skin

hair. NIR is absorbed by haemoglobin in the blood, therefore veins appear to be darker in color

than the rest of the hand.

Performance

This dissertation will be based on capturing vein patterns via NIR. NIR cameras are cheap and

easy to source. Several researchers have used cheap and highly customizable embedded sys-

tems such as the Raspberry Pi computer to capture NIR vein pattern images in real time (Joar-

dar, Chatterjee, and Rakshit, 2015).

Acceptability

In contrast to other biometrics systems like fingerprint scanning, NIR based systems do not re-

quire contact with the sensor, hence it is considered quite hygienic (Wilson, 2011). Due to its

hygienic nature and non-invasiveness, vein pattern biometrics promote user acceptance

(Watanabe et al, 2005).

Circumvention

Because veins are an internal feature of the body, and NIR imaging relies on the presence of blood, vein patterns are considered remarkably difficult to circumvent (Watanabe et al, 2005). In addition, since vein patterns are hidden beneath the skin, they are difficult to forge (Hashimoto, 2006), especially when compared to fingerprint systems, which have recently come under attack by various forgery techniques such as electronic ink or dental putty replicas.

2.2.5 Vein pattern use cases

Vein patterns have a variety of use cases, mainly within identification. Recently, vein patterns have been used for forensic purposes: in 2011, the FBI identified a terrorist as a journalist's beheader by matching the veins in his hand with those visible in a video of the beheading (Farmer, 2011).

The main use cases for vein patterns remain in the realm of biometrics. The finance industry,

which invests significantly in preventing identity fraud, has already begun using vein patterns to

identify individuals. As far back as 2004, ATMs in Japan began using palm vein patterns to au -

thenticate users (Kallender, 2004). More recently, Europe has followed suit, with ATMs in

Poland using finger vein pattern matching (Collinson, 2014).

Vein pattern technology is not just present in specialist hardware installations such as ATMs.

Again, the finance industry has examples of consumer-grade vein pattern recognition technology which banks use to authenticate Internet banking customers (Gompertz, 2014). Just as fingerprints did a few years ago, vein patterns are seeing a number of initiatives placing them into more consumer products. There are efforts to integrate vein pattern capturing technology into smartphones (Yalavarthy, Nundy, and Sanyal, 2009), and major electronics vendors like Samsung are applying for patents to integrate vein pattern recognition in wearables such as smartwatches (Patently Mobile, 2016).

Table 3: Summary of Vein Pattern use cases

- Farmer (2011): vein patterns used as a forensic method of identifying individuals - in this case a known terrorist.
- Kallender (2004); Collinson (2014): vein patterns used to identify authorized account holders using bank ATMs, considered more secure than just entering a PIN number.
- Gompertz (2014): vein pattern biometric consumer devices being used to identify and authenticate Internet banking users.
- Yalavarthy, Nundy, and Sanyal (2009); Patently Mobile (2016): use of vein patterns to identify and authenticate users on mobile platforms such as smartphones and smartwatches.

Table 3 lists several examples in the literature where vein patterns are used for different applications.

2.2.6 Biometric systems implementation in literature

Having determined the applicability of vein patterns as a biometric feature, we turn our attention

to the implementation of the system. Several researchers have built NIR image rigs to capture

hand vein patterns based on NIR capable cameras and used a variety of techniques to extract

the actual patterns. In general, biometric systems pass through two implementation phases: en-

rollment and identification. In the first phase, a subject is enrolled into the system by providing

biometric templates which are used in the second phase of identification to match subsequent

biometric samples to the nearest matching template. With vein patterns, the process involves

capturing the image, and applying a number of image enhancement techniques to make the

vein patterns more recognizable (Aboalsamh, Alhashimi, and Mathkour, 2012). After enhancing

the images, features which constitute identifiable material from the image are extracted and

stored. For example, some researchers have used vein bifurcation points as the image feature

(Soni et al, 2010). Others have used thinned vein pattern images, i.e. images where veins

have been reduced to white lines on a black background (Gayathri, Nigel, and Prabakar, 2013).

Once a template image for an individual has been obtained, processed and stored, that individ-

ual can be subsequently identified using a variety of identification techniques. Some studies ap-

proached the identification problem using relatively simple methods, such as minimizing the Euclidean distance between the feature sets of two biometric images (Soni et al, 2010; Sahu and Bharathi, 2015). Other studies compared the two biometric images directly rather than using features, employing image similarity measures, similar to image convolution, to identify a subject (Badawi, 2006). However, more intelligent approaches exist that pursue this latter strategy of direct image comparison. In essence, the problem now becomes one of image recognition,

which is widely studied especially in the field of computer vision, so literature suggests a large

number of approaches to the problem, such as K Nearest Neighbor and support vector ma-

chines (Kim, Kim, and Savarese, 2012). For example, one popular technique used in computer

vision is neural networks, and in fact, some researchers have applied neural networks to vein

pattern recognition (Kocer, Tutumlu, and Allahverdi, 2012). Similarly, support vector machines

[SVM] have also been applied to vein pattern recognition with good results (Lee et al, 2010).

However, the latter two techniques have a limitation when applied to the healthcare environment: they are difficult to scale efficiently. Advanced machine learning algorithms such as neural networks or SVMs need to be retrained whenever new subjects are enrolled in the system. In a healthcare environment which deals with thousands of patients and has a very high patient turnover, this would cripple the performance of the system. For this reason, the application of sparse representation-based classification [SRC], also known as sparse coding, to the vein pattern recognition problem is of particular interest (Joardar, Chatterjee, and Rakshit, 2015). The concept of sparse coding will be further explored in Section 2.3 (Theory). In a nutshell, sparse coding does not require retraining whenever a new subject is enrolled into the system.
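Since SRC is central to the chosen approach, a minimal sketch may help fix ideas. The greedy pursuit below and the toy two-element templates are illustrative simplifications under our own assumptions, not the algorithm of Joardar, Chatterjee, and Rakshit (2015); the key property it demonstrates is that enrolling a new subject merely appends columns to the dictionary, with no retraining step.

```python
import numpy as np

def omp(D, y, n_nonzero=2):
    """Greedy (orthogonal) matching pursuit: find a sparse code x with at
    most `n_nonzero` non-zero entries such that D @ x approximates y."""
    residual = y.astype(float)
    support, x = [], np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # Pick the dictionary atom most correlated with the current residual.
        atom = int(np.argmax(np.abs(D.T @ residual)))
        if atom not in support:
            support.append(atom)
        # Re-fit the selected atoms jointly by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        x = np.zeros(D.shape[1])
        x[support] = coef
        residual = y - D @ x
    return x

def identify(D, labels, sample, n_nonzero=2):
    """Sparse-representation classification: the predicted identity is the
    one whose atoms alone reconstruct the sample with the least error."""
    x = omp(D, sample, n_nonzero)
    def class_error(label):
        mask = np.array([l == label for l in labels])
        return np.linalg.norm(sample - D[:, mask] @ x[mask])
    return min(set(labels), key=class_error)

# Toy dictionary: each column is one enrolled template (hypothetical
# 2-element feature vectors; real vectors come from the NIR images).
D = np.array([[1.0, 0.9, 0.0, 0.1],
              [0.0, 0.1, 1.0, 0.9]])
labels = ["alice", "alice", "bob", "bob"]

identify(D, labels, np.array([0.95, 0.05]))  # → "alice"
```

Enrolling a third patient would simply stack their template columns onto `D` and extend `labels`, which is what makes the approach attractive for a high-turnover ward.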

2.2.7 RFID

Having reviewed the biometric aspect of the project, we now review what happens after a sub-

ject has been identified. In reality, any action can be taken after identification, such as authenti -

cating a patient into an online service, or allowing them access to a restricted area. In the case

of this project, our main aim is to reduce medical errors, therefore we focus on ensuring the

right patient is in the right place at the right time, avoiding misdiagnosis or wrong medication.

RFID (Radio Frequency Identification) is a mature and well-established technology that has al-

ready been implemented in a variety of applications (Betances and Huerta, 2012):

Access control.

Inventory Management.

Baggage identification and screening.

Industrial production chains.

Library book input and output.

Identification and location of animals.

Healthcare.

RFID systems consist of RF-emitting antennas and RFID tags. Tags can be either passive or

active in design, the difference being the former are powered from captured RF energy while

the latter are battery powered (Want, 2006). When powered, the tags emit a unique number

which serves as an identification to whatever item or person the tag is attached to. Hence for

the purposes of this project, it is possible to positively identify a patient using biometrics and

subsequently bind that identity to a unique RFID tag that is issued to the patient. Most industry

RFID systems operate in the Ultra High Frequency [UHF] range, typically between 860 and 960MHz, though some systems use the 2.4GHz microwave band.
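The binding-and-tracking logic can be sketched as follows. The reader IDs, room names, and tag ID format are invented for illustration; a real deployment would parse the vendor's reader protocol rather than receive clean (reader, tag) pairs.

```python
# Hypothetical reader-to-room mapping; real antenna placement is a
# deployment decision covered in the design chapters.
READER_LOCATIONS = {"reader-01": "Ward A", "reader-02": "Radiology"}

class PatientTracker:
    def __init__(self):
        self.tag_to_patient = {}  # tag ID -> patient identity (from biometrics)
        self.last_seen = {}       # patient identity -> room name

    def enroll(self, patient_id, tag_id):
        """Bind a biometrically confirmed identity to an RFID tag."""
        self.tag_to_patient[tag_id] = patient_id

    def on_tag_read(self, reader_id, tag_id):
        """Translate a raw reader event into a patient location update."""
        patient = self.tag_to_patient.get(tag_id)
        room = READER_LOCATIONS.get(reader_id)
        if patient and room:
            self.last_seen[patient] = room
        return patient, room

tracker = PatientTracker()
tracker.enroll("patient-042", "E200-3412-0001")      # tag ID is made up
tracker.on_tag_read("reader-02", "E200-3412-0001")
tracker.last_seen["patient-042"]                     # → "Radiology"
```

Keeping the tag-to-identity mapping server-side, rather than encoding patient data on the tag itself, is also one way to mitigate the privacy concerns noted above.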

With respect to healthcare applications, Yao, Chu and Li (2010) outline the following five problem areas that RFID technology can help address:

Medical mistakes

Increased costs

Theft loss

Drug counterfeiting

Inefficient workflows

Yao, Chu and Li (2010) note that RFID can be used to tackle the above issues by applying the

technology to tracking, identification, and verification. Of particular interest is the authors' observation that privacy is one of the main obstacles to more widespread adoption of the technology, an aspect we hope to address in this project.

2.3 Theory

In order to reduce medical error due to patient misidentification, this project poses two hypotheses:

Hypothesis 1: Vein pattern biometrics significantly increase the ease and accuracy of patient identification.

Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solu-

tions to track patients, providing an end-to-end identification and tracking platform for

patient and carer safety

In the previous section we have seen why vein pattern biometrics are a better choice than other

biometric features in a healthcare environment, as well as explored RFID and alternative tech-

nologies to locate a patient once they have been registered in the system. In this section we explore the theory and high-level components behind a system that integrates vein pattern biometrics and RFID in a way that addresses some of the privacy concerns that usually curb the adoption of these technologies. Based on the discussion in the Literature Review, we identify a theoretical framework for the system, as shown in Figure 3.

Figure 3: Theoretical framework

2.3.1 Theoretical Framework

2.3.1.1 Patient Wrist

We chose the patient's wrist as the region of interest [ROI] from which NIR images are acquired. In practice, various parts of the hand can be used to provide the biometric feature,

such as the palm, wrist, and back of the hand, all with positive results (Wang, Leedham, and

Cho, 2007).

2.3.1.2 NIR Camera - Vein Pattern Capture

As previously mentioned, NIR will be used to capture vein patterns. A NIR image of the vein

patterns is captured in a very similar way to a normal photograph. In order to make images as

similar as possible and to avoid scale, skew or rotational problems, a physical guide is built to show patients how to place their hands to get the best possible image

of their vein patterns. We drew ideas from previous research, such as the image capture rig

built by Suarez Pascual et al (2010), with some modifications to the design.

2.3.1.3 Image Enhancement

The image enhancement stage is extremely important since in this stage we ensure the veins

are made more prominent by removing noise and increasing contrast. In order to do this, we

implement the following algorithms:

Image erosion and dilation

Image erosion and dilation are a pair of mathematical morphology functions (Haralick, Sternberg, and Zhuang, 1987) that together increase the contrast of an image. Erosion shrinks bright

areas of the image while enlarging dark areas. Dilation removes the resulting small bright spots

(also known as salt in digital image processing) and connects small dark regions. Therefore

dilation tends to increase the relative area of dark gaps between brighter areas. With respect to

vein patterns, since NIR is absorbed by blood, the dark gaps correspond to the veins themselves; therefore, we are interested in making these as dark as possible.
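As an illustrative sketch only, the two operations can be chained with scikit-image; the image here is a random placeholder for a NIR frame, and the structuring-element radius is an assumption that would be tuned experimentally:

```python
import numpy as np
from skimage.morphology import erosion, dilation, disk

# Placeholder 8-bit greyscale NIR frame; in practice this comes from the camera.
image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

selem = disk(3)                      # circular structuring element (radius tuned by experiment)
eroded = erosion(image, selem)       # shrink bright areas, enlarge dark vein regions
enhanced = dilation(eroded, selem)   # remove "salt" noise left after erosion
```

Erosion followed by dilation with the same structuring element is morphological opening, so the output is never brighter than the input, which is what makes the dark vein regions stand out.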

Image equalization using adaptive histogram

The resulting image is still of relatively low contrast, so the next image processing step is image equalization, which enhances the contrast in images by spreading out (i.e. making lighter or darker) the most commonly occurring intensities. This allows smaller features in the image to be more easily discernible. In particular, the adaptive histogram variation of this algorithm is used. In this variation, the algorithm calculates the equalization using a histogram computed by splitting the image into tiles rather than using the entire image. Local details can therefore be better enhanced even in image regions which are darker or lighter than the image average. There are a number of variations of the adaptive histogram itself, but in theory the contrast limited adaptive histogram equalization [CLAHE] technique should produce images in which "the noise content of an image [...] is not excessively enhanced, but in which sufficient contrast enhancement is provided for the visualization of structures within the image" (Zimmerman et al, 1988).
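A sketch of this step using scikit-image's CLAHE implementation; the tile size and clip limit below are assumptions that would need tuning against real NIR images:

```python
import numpy as np
from skimage import exposure

image = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # placeholder NIR frame
# clip_limit bounds how much noise may be amplified; kernel_size sets the tile size.
equalized = exposure.equalize_adapthist(image, kernel_size=16, clip_limit=0.03)
equalized8 = (equalized * 255).astype(np.uint8)  # back to 8-bit for later stages
```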

Applying a median filter to the image.

The last image pre-processing step is further noise reduction. To this end, we apply a median filter to the image, which in effect smooths the image, reducing noise while retaining edges (Weiss, 2006), which is extremely important for vein pattern recognition.
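This step can be sketched with scikit-image's median filter; the neighbourhood radius below is an assumption, and the input is again a random stand-in for a NIR frame:

```python
import numpy as np
from skimage.filters import median
from skimage.morphology import disk

noisy = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # placeholder NIR frame
smoothed = median(noisy, disk(2))   # 2-pixel-radius neighbourhood keeps vein edges sharp
```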

2.3.1.4 Training Phase

Every biometric system typically passes through two phases: a training or enrollment phase,

followed by the identification phase. In the enrollment phase, subjects provide samples of their

biometric features to the system so they can be subsequently used in the identification phase.

In this particular case, the biometric features are the image enhanced vein patterns which are

then stored as templates. The templates should ideally be taken at a variety of angles and under a variety of lighting conditions so as to match, as closely as possible, conditions that may arise in the identification

phase. In the identification phase, the subject's vein pattern is matched against all stored templates

gathered during the enrollment phase.

2.3.1.5 Template Database

The template database is a secure storage where vein patterns gathered during enrollment are

stored. It is important that the template database is only accessed by authorized personnel,

since an attacker could insert unauthorized templates or modify templates of existing personnel

and thus compromise the identification phase.

However, it is worthwhile noting that due to privacy reasons, it is not advisable to store or even

use the actual vein patterns once the enrollment phase is over. Instead, we should use a trans-

formation of these templates (Prabhakar, Pankanti, and Jain, 2003). Since humans have a very

limited set of biometrics (for example, we only have two wrists), a biometric compromise may

be difficult to recover from. If an attacker got hold of vein patterns which they could use to impersonate a subject, then that subject would have to re-enroll, assuming their other wrist's vein patterns have not already been compromised. Using a transformation of the vein patterns makes it easier to change compromised credentials, since the system would only need to

change the transformation being used, not the actual biometric data. This same concept of us-

ing transformations also enhances privacy. If biometric templates are used to identify a subject

directly, different service providers who utilize biometric identification can be leveraged to build

a picture of a subject's activities by correlating biometric identification activity across different service providers, much in the same way that third-party cookies in a web browser can build a complete picture of a subject's browsing habits (Roesner, Kohno, and Wetherall, 2012). By using transformations, privacy is enhanced since different service providers use different transforms to describe the same individual, hence protecting his or her identity. In the sections below, we outline how we transform the biometric templates that are stored in order to enhance security and privacy to a certain degree.

2.3.1.6 Sparse Coding Algorithm (training phase)

In section 2.2.1 the concept of sparse coding was mentioned. Sparse coding attempts to de-

scribe a large vector of inputs by a weighted sum of a number of basis functions. When ap-

plied to images, sparse coding approximates the behaviour of neurons in the brain's visual cortex (Lee et al, 2006). During the training phase, given a set of images that (ideally) completely

describe or approximate any subsequent input images, sparse coding will build a dictionary of

these images by decomposing them into their basis functions - or features which make up an

image. Hence sparse coding represents images more efficiently than simple pixels. To better

understand this process, we present an oversimplified example which illustrates the algorithm

(Ng, 2010):

Figure 4: Sparse Coding Illustration

In figure 4, the natural images on the left are decomposed into the base functions that are dis-

played in the matrix on the right. Any subsequent test image can be expressed as a function of

the learned bases. The test example in the bottom left is a small region of the test image. In fig-

ure 4, this example can be expressed as components of three learned bases. Hence the entire

region can be expressed by the formula at the bottom of figure 4:

x ≈ (0.8 × b36) + (0.3 × b42) + (0.5 × b63), where b36, b42 and b63 are the 36th, 42nd and 63rd learned bases

Therefore, the whole test region has been converted to a very succinct representation, and by

extension the whole image can be represented as a set of functions.
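The reconstruction in figure 4 can be mimicked numerically. The bases and coefficient positions below are random placeholders standing in for the learned bases of the figure; only the structure (a weighted sum of a few bases) matches the example:

```python
import numpy as np

rng = np.random.RandomState(0)
bases = rng.rand(64, 16)   # 16 hypothetical learned bases for 8x8 pixel patches

# A sparse code: only three of the 16 coefficients are non-zero, mirroring
# the three active bases in the figure 4 example (indices here are arbitrary).
code = np.zeros(16)
code[[3, 7, 11]] = [0.8, 0.3, 0.5]

x_hat = bases @ code       # the reconstructed patch: a weighted sum of bases
```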

With the above example in mind, we now apply the foundations of sparse coding to the biomet-

ric system. The template database is used to build the learned bases in figure 4, also referred

to as the dictionary. There are a number of methods for learning the dictionary, however one

of the most scalable and performant options is to use online batch dictionary learning (Mairal et

al, 2009). Online learning methods can update their learned dictionaries incrementally when

new templates are provided, rather than having to learn the entire dictionary from scratch.
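A minimal sketch of this idea using scikit-learn's MiniBatchDictionaryLearning, which implements an online variant of the algorithm of Mairal et al (2009); the patch data and dictionary size below are placeholders:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Placeholder training data: each row stands in for a flattened 8x8 patch
# taken from enhanced template images.
rng = np.random.RandomState(0)
patches = rng.rand(200, 64)

learner = MiniBatchDictionaryLearning(n_components=32, batch_size=10,
                                      random_state=0)
learner.fit(patches)                   # learn the initial dictionary
codes = learner.transform(patches)     # sparse representation of each patch

# New enrollments update the dictionary incrementally instead of from scratch:
learner.partial_fit(rng.rand(20, 64))
```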

2.3.1.7 Template Sparse Representations

The template sparse representation database is the result of passing each biometric template

through the sparse coding algorithm using the learned dictionary. Each template image will be

reduced to its sparse representation, similar to the equation presented in figure 4. Of particular

note is that whenever the batch dictionary learning algorithm is run from scratch, a new dictio-

nary is built, even if the exact same templates are provided as learning material. If the dictio-

nary changes, then so do the sparse representations of the images. Looking at figure 4, we intuitively understand that if the learned bases are changed, then so too must the resulting formula describing the test example image region change. This is the basis of the system's privacy and security features:

Strictly speaking, the system now does not need the actual vein pattern images,

but only their sparse representation. Therefore the template database collected in

stage 5 can be archived or destroyed, reducing its chance of compromise. This

feature can be used to address some of the security and privacy concerns of

users.

Since the same template images generate a different dictionary, different biometric

providers will have different sparse representations for the same users, increasing

their privacy.

If a template sparse representation database is compromised, the system doesn't

require a new set of biometric vein patterns. All that is required is retraining the

batch dictionary algorithm to produce a different dictionary, and hence changing

the sparse representation of its users. While this is certainly a time-consuming

process, it is still preferable to requiring users to submit a new set of biometric vein patterns since, as we previously discussed, humans have only two

sets of wrist vein patterns. Note that it is imperative that the actual template data-

base (as opposed to the template sparse representation database) is not compromised, since we would then require a completely new set of biometric vein patterns.

2.3.1.8 Sparse Coding algorithm (identification phase)

During the identification phase, a subject who needs to be identified presents their wrist to capture their vein patterns. This biometric sample is then passed through the image enhancement

techniques and the sparse coding algorithm to obtain its sparse representation. The dictionary

that was built during the training phase is used.

2.3.1.9 Image Sparse representations

The resulting image sparse representation from section 2.3.1.8 is now used as the subject's vein pattern signature. The vein pattern signature is hence reduced to a two-dimensional sparse matrix.

2.3.1.10 Comparison Algorithm

Once the system has obtained the sparse representation of the subject's vein patterns, it proceeds to make a one-to-many comparison of that sparse representation against all stored template sparse representations. This is potentially the most time-consuming aspect of the identification process, so a good choice of comparison algorithm is essential. The most intuitive comparison algorithm is to use a distance function which computes the distance between the subject's sparse representation and each template's representation. The identity chosen is the individual whose template produces the minimum distance. Two popular distance functions are Euclidean distance and Cosine Angle distance, both of which have similar performance characteristics (Qian et al, 2004). The advantage of these simple distance functions is that they are easily parallelizable and hence can be distributed over a large number of computers or servers as the number of templates grows.
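The Euclidean-distance comparison can be sketched in a few lines of Python; the identifiers and two-dimensional codes below are purely illustrative (real sparse codes have many more dimensions):

```python
import numpy as np

def identify(sample, templates, ids):
    """Return the id of the template sparse code nearest to the sample."""
    dists = np.linalg.norm(templates - sample, axis=1)   # Euclidean distances
    best = int(np.argmin(dists))
    return ids[best], float(dists[best])

# Toy 2-dimensional sparse codes; one row per enrolled template.
templates = np.array([[0.0, 1.0], [1.0, 0.0], [0.9, 0.1]])
who, dist = identify(np.array([0.8, 0.2]), templates, ["p01", "p02", "p03"])
```

Because each distance is computed independently, the template set can be partitioned across machines and the per-partition minima combined, which is the parallelization property noted above.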

Another option for the comparison algorithm is to treat the problem as a classification problem,

where each set of sparse codes representing an individual's vein pattern templates can be placed into a single class, where a class corresponds to an individual. Classification problems are well

studied in artificial intelligence research, and the reduction of biometric templates into sparse

codes makes the problem similar to text classification (another sparse feature problem), which

is also widely studied, hence we have a number of options to explore, notably algorithms such

as Support Vector Machines, k-nearest neighbor, and ridge regression (Yang, Zhang, and

Kisiel, 2003). However, as suggested in previous sections, plain vanilla implementations (i.e.

without modifications) of these algorithms suffer from scalability issues since they are usually

not easy to train incrementally, or train online. Therefore when a new individual is enrolled into

the system, the comparison algorithm would need to be retrained, which is not feasible for this

project. In fact, when scaling to larger datasets, practical machine learning programming li-

braries suggest using incremental learning classifiers such as Linear Stochastic Gradient De-

scent (Scikit Learn, 2014). Table 4 summarizes the comparison algorithms along with relevant

citations in literature.

Table 4: Summary of comparison algorithms


Euclidean Distance Function (Qian et al, 2004)
Advantages: simple to implement; easily parallelizable; can be highly accurate.
Disadvantages: scaling issues: requires iterating over all possible templates.

Neural Networks (Kocer, Tutumlu, and Allahverdi, 2012)
Advantages: large number of libraries implementing well-researched neural network architectures (e.g. Google's TensorFlow); used extensively with good results; offers relatively good performance and does not require iterating over the whole template data set.
Disadvantages: complex to implement; requires very precise tuning; can have high performance requirements; depending on the algorithm chosen, may require retraining.

Nearest Neighbor (Yang, Zhang, and Kisiel, 2003)
Advantages: easy to use, conceptualize and implement; offers relatively good performance and does not require iterating over the whole template data set.
Disadvantages: difficult to implement online/incremental learning, therefore new enrollments require retraining; needs tuning; does not perform too well on high dimensional data (like sparse codes); does not scale well to thousands of data points.

Support Vector Machines (Yang, Zhang, and Kisiel, 2003)
Advantages: easy to use, conceptualize and implement; offers relatively good performance and does not require iterating over the whole template data set; better performance with bigger data points; offers good accuracy with high dimensional data.
Disadvantages: difficult to implement online/incremental learning, therefore new enrollments require retraining (unless paired with SGD, see below).

Ridge Regression (Yang, Zhang, and Kisiel, 2003)
Advantages: performs very well with sparse feature sets; relatively easy to implement; offers relatively good performance and does not require iterating over the whole template data set.
Disadvantages: difficult to implement online/incremental learning, therefore new enrollments require retraining; requires careful variable selection due to collinearity.

Linear Stochastic Gradient Descent (Scikit Learn, 2014)
Advantages: modifies popular classifiers like SVM to easily implement online/incremental learning; performs very well with sparse feature sets; offers relatively good performance and does not require iterating over the whole template data set.
Disadvantages: sensitive to feature scaling; needs tuning and proper hyperparameter selection.
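As a sketch of the incremental-learning option, an SGD-based linear classifier can absorb a newly enrolled patient without retraining from scratch; the data, dimensions and the choice to reserve ten class labels up front are placeholders for illustration:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)
X = rng.rand(30, 16)                    # sparse codes of enrolled templates
y = np.repeat(np.arange(3), 10)         # three enrolled patients (ids 0-2)

clf = SGDClassifier(random_state=0)
clf.partial_fit(X, y, classes=np.arange(10))  # declare room for future ids up front

# Enrolling a fourth patient later needs no retraining from scratch:
clf.partial_fit(rng.rand(10, 16), np.full(10, 3))
pred = clf.predict(X[:1])
```

Note that scikit-learn requires the full set of class labels on the first partial_fit call, so some headroom for future enrollments has to be declared in advance under this scheme.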

2.3.1.11 Patient Identity

The comparison algorithm outputs its best guess of whom the presented vein patterns belong to. The system has now identified the patient and, if required, this can be verified by a carer against a photo ID or other form of secondary identification.

2.3.1.12 Available RFID Tags

Once the patient identity is verified, the carer assigns an RFID tag from available stock to the

patient, and inputs the unique RFID serial number into the system. At this stage, the system

now equates the patient identity with a particular RFID serial number and a mapping between

RFID and patient identity is created. In other words, the sparse code representing the patient's vein pattern now needs to be mapped to an RFID serial number. It is worth noting that for the

purposes of this project, a one-to-one mapping between sparse code and RFID serial number

is sufficient. However, it is also possible to have a one-to-many relationship between sparse

code and RFID serial numbers, that is, a single patient can be assigned multiple RFID serial

numbers. Furthermore, for security and privacy reasons it may be desirable that the mapping

server does not store the actual mappings themselves, but rather a transformation function that

returns the correct RFID serial number only if the correct patient sparse code representation is

presented. An implementation of this scheme is presented in Sutcu, Sencar, and Memon

(2005), based on Gaussian curves.
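For illustration, the plain one-to-many mapping (deliberately without the protective transformation of Sutcu, Sencar, and Memon, 2005) can be sketched as follows; the patient id and RFID serials are invented examples:

```python
from collections import defaultdict

tag_map = defaultdict(list)      # patient id -> list of assigned RFID serials

def assign_tag(patient_id, rfid_serial):
    """Record that a tag from stock has been given to an identified patient."""
    tag_map[patient_id].append(rfid_serial)   # one-to-many is allowed

assign_tag("patient-001", "E200-3412-DC03")
assign_tag("patient-001", "E200-3412-DC04")   # e.g. a second tag on the bed
```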

2.3.1.13 Track Patient Through RFID structure

The patient is given the RFID tag to keep on their person. This is trivial since RFID tags come

in many shapes and sizes, ranging from patient wristbands, to battery assisted tags that can be

attached to wheeled beds. The RFID tag is tracked through RFID antennas placed at strategic

positions around the facility (such as at ward entrances and exits), and the system registers

which antennas have last picked up the RFID tag. The system will mark the patient's location as the antenna which most recently detected the RFID tag with the highest signal strength.
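This location rule can be sketched as follows; the antenna names and the five-second recency window are assumptions for illustration:

```python
def locate(reads, window=5):
    """reads: list of (timestamp, antenna, rssi_dB) tuples for one tag.
    Returns the antenna with the strongest RSSI among the most recent reads."""
    if not reads:
        return None
    latest = max(ts for ts, _, _ in reads)
    recent = [r for r in reads if latest - r[0] <= window]
    return max(recent, key=lambda r: r[2])[1]   # strongest signal wins

reads = [(100, "ward-A-entrance", -60),
         (104, "corridor-1", -48),
         (104, "ward-A-exit", -55)]
```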

2.3.1.14 Monitor patient via web UI

All interaction between the operator and the system happens through a web portal, requiring no special software other than a normal browser. The above information regarding the patient's location is output on a map for operators to monitor.

2.4 Terms and Definitions

Table 5 provides a glossary of some of the terms used throughout this disserta-

tion.

Table 5: Terms and Definitions


Contrast Limited Adaptive Histogram Equalization [CLAHE]: A computer imaging technique used to improve contrast in images. Rather than using a single histogram computed over the entire image, CLAHE improves local contrast by first subdividing an image into sub-images and calculating individual histograms for each.

Don't Repeat Yourself [DRY]: A software paradigm where a programmer strives to encapsulate code in atomic functions that are re-used as much as possible throughout the program, avoiding code bloat and repetition.

Electroencephalogram [EEG]: A record of brain activity measured via electric signal detection.

General Purpose Input Output [GPIO]: A set of hardware pins connected to a circuit board that can either input or output digital or analogue signals to or from compatible peripherals.

Low Level Reader Protocol [LLRP]: An industry standard protocol used to interact with RFID readers. Several software libraries implementing this protocol exist in popular programming languages.

Near InfraRed [NIR]: The spectrum of light just beyond the visible red wavelength. Typically found in TV remote controls and security equipment, it operates at wavelengths of around 700nm - 950nm.

Neural Networks: A popular form of artificial learning that emulates a human brain's neurons. Neural networks have been used for various applications such as classification of images and powering self-driving cars.

POST requests: One form of HTTP request. HTTP requests can be of several types; the two most common are GET (used to retrieve information) and POST (used to send information).

Python: A very popular dynamic scripting language that has gained popularity in scientific, academic and industrial settings.

Region Of Interest [ROI]: An area in the image that is kept for further processing. The rest of the image is cropped out, saving both processing and storage.

Received Signal Strength Indicator [RSSI]: A term used in wireless networks to measure the strength of an antenna or access point relative to a client's position. Usually this is measured in dB.

Radio Frequency Identification [RFID]: Refers to a small chip that responds to radio frequencies (typically at 13.56MHz or 960MHz) with a unique identifier.

Training Phase: Machine learning and AI algorithms need to be fed labeled/classified data to be able to generate models. These models are subsequently used on new data in order to classify it based on previously seen data. This model generation is referred to as the training phase.

Software Development Kit [SDK]: A toolkit provided by a vendor to interact with software they wrote. Typically this toolkit takes the form of software code such as dynamic linked libraries or shared libraries.

Sparse Representation-based Classification [SRC]: A classifier based on sparse representation, i.e. a set of sparse vectors (vectors which are mostly null or 0) that can solve a set of equations.

Storyboard: A visual representation of how a user interacts with a piece of software, usually through a GUI (Newman and Landay, 2000).

Subcutaneous: Underneath the skin.

Support Vector Machines: A machine learning technique that strives to divide a set of data by introducing hyperplanes: lines that divide the data with maximal distance between different classes.

In the next chapter we will introduce the experimental design of the system. We outline the
analysis and design decisions taken with respect to both hardware and software used in the
system.

CHAPTER 3. ANALYSIS AND DESIGN

3.1 Introduction

This chapter will tackle the experimental design of the project, where we will outline the

design in terms of established experimental models. We then proceed to discuss the philosophy behind the design and implementation of the project, the objectives to be met by

the design, and why particular design choices were made. The majority of this chapter

will deal with two aspects in the project. The first is the technical aspect regarding the

analysis and design involved in building the proof of concept IT artefact. The second as-

pect is designing the questionnaires that will support the qualitative part of the project, where we ask real-life users whether the system helps in their healthcare visits and whether it reduces medical errors.

In the first sections we will analyse the Lessons Learned experimental model used to

validate the hypothesis posited in this project. As explained in section 3.2, the Lessons

Learned model (Zelkowitz and Wallace, 1998) allows us to examine qualitative data

from completed projects. In the case of this project, the completed project is the working

proof of concept that is designed and developed as described throughout this dissertation, while the qualitative data is obtained from the questionnaires presented in section

3.5. Subsequent sections will analyse the high level architecture [HLA] of the hardware

and software, by describing the individual modules and processes. The HLA is described

via the use of data flow diagrams [DFD] using the notation described by Chen (2009).

Once the technical aspect has been covered, we proceed to the design of the question-

naires and present a qualitative research design including a metasummary of the find-

ings as originally outlined by Sandelowski, Barroso and Voils (2007).

3.2 Experimental Design

There are a number of experimental designs that can be used for validating technology.

Table 6 summarizes these models:

Table 6: Summary of experimental designs for technology validation (Zelkowitz and Wallace, 1998, p.5)

For the purposes of this project, we will be using the Lessons Learned experimental

model. Once the hardware and software components have been completed and com-

bined into a working proof of concept IT artefact, the Lessons Learned model requires us

to gather qualitative data in order to examine the efficacy of the project. This qualitative

data is gathered from the questionnaires that are presented in section 3.5. Essentially

the project adheres to the single-group, pre- and post-test experimental design, where a

single group of users (especially healthcare workers) are asked about their experiences

in patient identification before and after the system is used. The surveys are designed to

compare how much of an impact the system has on reducing patient identification and

location errors, while still being easy to use and applicable to the majority of use cases

that may arise in a healthcare environment.

As indicated in table 6, there are two main disadvantages to this approach: the lack of

quantitative data (i.e. having to rely on subjective data) and not being able to control or

constrain all factors. As regards the former, this is an inherent problem when determining

the success of a user-facing computer system. We discuss in section 2.5 that the main

measure of success of a computer system is how well users react to the system, and

how useful they find the system. Different users in the same category of user groups

have different notions of usefulness; for example, some users might give priority to design over function, while others may do the opposite. This also contributes to the latter

problem, that we are unable to constrain all factors. For example, even within the health-

care workers user group, there may be users who are more computer and technically

savvy than others, which may result in more favourable results being obtained from the

more technology-literate individuals. In order to mitigate these risks, we first try to recruit a significant number of participants in this study, so that these variables will balance each

other out, while randomly selecting individuals to make sure that answers are statistically

relevant.

3.3 Hardware Components

The first stage of the technical analysis is determining which hardware components will

be used to actually build the system. As pointed out in previous chapters, the project re-

quires hardware to support two features: the biometric vein capture and the subsequent

RFID tracking of the patient. The primary objectives of the hardware analysis phase are

to:

Keep costs to a minimum, since this will mean lower per-unit costs to the final

product

Use off-the-shelf components as much as possible. This translates into lower costs, while also keeping components simple to procure, build, and repair, and reducing build times.

Wherever possible, use open source hardware unless this conflicts with the

above two points.

3.3.1 Vein Pattern Capture

Vein pattern capture involves taking Near InfraRed [NIR] photographs of a particular re-

gion of interest [ROI]. As described in section 2.3.1.1, we intend to use the patient wrist

as a ROI, so a suitable guide is to be built to clearly show end users where to place their

hand so that the NIR camera can capture the ROI properly, without scale or rotation variances between pictures. The guide can be a simple cardboard, wooden or plastic mould.

The choice of NIR camera is important as it largely affects the quality of the images

taken. Options for the NIR camera range from modifying a DSLR camera (Cardinal,

2013) to using specially made infrared Raspberry Pi cameras, known as the Pi Noir

camera (Raspberry Pi, n.d.), and in fact, both options have been used in literature (Soni,

Gupta, Rao and Gupta, 2010; Joardar, Chatterjee and Rakshit, 2015). In keeping with

the objectives outlined in the previous section, we will use the Raspberry Pi computer

coupled with the Pi Noir camera, since it is a much cheaper option (costing about $75)

as well as being open source, off the shelf hardware. Since the Raspberry Pi is a fully-

fledged mini computer that is capable of running Linux, it gives the added flexibility of

performing some of the calculations, such as image enhancement, on-board rather than

adding load to the central server. The Raspberry Pi can optionally also be attached to an

HDMI monitor to give real-time feedback to the end user. In addition, the Raspberry Pi 3

has inbuilt WiFi and ethernet, offering flexible TCP/IP connectivity options. The proposed

hardware block diagram for vein pattern capture is depicted in Figure 5:

Figure 5: Vein Pattern hardware block diagram

The central processing server will perform most of the software tasks such as discussed

in subsequent sections of this chapter.

3.3.2 RFID Infrastructure

The second aspect of the hardware analysis concerns tracking an identified patient's location via RFID. There are a large number of RFID vendors available on the market. For

the purposes of this project, RFID equipment from Impinj (Impinj, n.d.) was chosen.

While not open source - the hardware platform is proprietary - it is relatively cheap and

widely used. In addition, we have had previous positive experience using Impinj equipment in other projects. Figure 6 depicts the block diagram for the RFID components:

Figure 6: RFID Hardware block diagram

Each RFID reader can have multiple antennas attached to it, each in a different physical

location. The RFID readers chosen are Impinj Speedway Revolution general purpose

readers (Impinj, n.d.), while the antennas used are those recommended for use by Imp-

inj, manufactured by the vendor Laird (Laird, n.d.). Each antenna reports the RFID tag ID, along with its Received Signal Strength Indicator [RSSI]. The RSSI can be used to de -

termine which antenna the RFID tag is closer to, which in turn determines an approxi-

mate physical location for the tag. This information is fed into the RFID reader, which re-

lays this information via TCP/IP to the central processing server. Since the central pro-

cessing server will store both the RFID tag ID and the patient identity via biometric vein

patterns, it can link a patient to a particular RFID tag, and it can then parse the informa-

tion from the RFID readers to present real-time patient location information to healthcare

providers.

In general, RFID antennas are placed at natural chokepoints throughout the facility.

Good examples of chokepoints are the entrances and exits of a ward or room, in stair-

wells and corridors. Using at least two RFID antennas, one can also determine the direc-

tion of travel of a patient. For example, two antennas can be mounted on each end of a

corridor, and depending on which antenna reads the tag first, and which reads the tag

second, the patient's direction of travel (up or down the corridor) can be easily inferred.

Hence it is important to include timestamp information with every RFID tag read.
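The direction-of-travel inference described above can be sketched as follows; the antenna names and direction labels are illustrative:

```python
def direction(reads, a_end="corridor-north", b_end="corridor-south"):
    """reads: list of (timestamp, antenna) tuples for one tag.
    Infers travel direction from which corridor antenna first read the tag."""
    first = {}
    for ts, antenna in sorted(reads):
        first.setdefault(antenna, ts)      # keep each antenna's earliest read
    if a_end in first and b_end in first:
        return "north-to-south" if first[a_end] < first[b_end] else "south-to-north"
    return None                            # not enough reads to infer direction

reads = [(10, "corridor-north"), (15, "corridor-south"), (16, "corridor-north")]
```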

3.4 Software Components

The software section of the project closely mirrors the structure of the hardware just presented. Just like the hardware, the software components can also be split by those com-

ponents dealing with the biometric aspect, and those dealing with the RFID aspect. The

software is designed to be modular in nature, adhering to Dont Repeat Yourself [DRY]

software principles (Hunt and Thomas, 2004). The objectives of the software analysis is

to design a system which:

Uses open source software and programming languages wherever possible

Re-uses as many software libraries/modules as possible. This raises the quality

of the system, as well-established libraries tend to be much better written, opti-

mized and tested than custom built software

Is easy to extend, modify and has a shallow programming learning curve

The above objectives lead to the choice of Python (Van Rossum, 2007) as the programming language, since it is open source, very easy to learn and has a vast

choice of libraries and modules. At the time of writing, Python is a very popular choice

among AI programmers and data scientists, meaning there is an active community, mak-

ing help and quality code easier to find. For example, the Python machine learning li-

brary Scikit-learn (Pedregosa et al, 2011) and its sister library Scikit-image for image

processing (Van Der Walt et al, 2014) provide many of the required operations that will

be used in the project.

3.4.1 High Level Architecture

In this section we review the high-level design of the proposed system and its constituent modules.

3.4.1.1 Training Phase

The software data flows change depending on which phase the system is currently operating in. Before entering production and identifying patients, the software must first be trained. It is in this phase that the dictionary used by sparse coding is built. This

phase is depicted in the DFD shown in figure 7:

Figure 7: Training Phase Data Flow Diagram

A sample of appropriate users is selected to train the software. Ideally, the sample in-

cludes as diverse a range of users as possible. Additionally, the photos of the user's wrist should ideally be taken in the same conditions as when the system is run in production, i.e. with similar lighting conditions. These two prerequisites ensure that the training

is run on a representative data set, giving more accurate results in the subsequent identi-

fication phase.

Photographs are taken of each subject's region of interest (P1) and passed through an image enhancement algorithm that helps subsequent algorithms better identify the vein structure (P2). The image enhancement algorithm will be presented in section 3.4.1.3.

The enhanced images are stored in a datastore with appropriate annotation describing

which image belongs to which patient (D1). These are the template images which will be

used to build a sparse coding dictionary (P3) as described in section 2.3.1.6. The sparse

coding dictionary is stored (D2) for subsequent use in the identification phase.

N.B.: The process described so far need only be run once (or at least very infrequently)

on system setup, to obtain the sparse code dictionary. The rest of the process needs to

be run every time a new user is enrolled into the system.

Once the sparse coding dictionary has been obtained, it is used by the sparse coding algorithm to represent any template vein pattern image in terms of the dictionary, i.e. the system calculates the sparse representation of the template. Every user who is enrolled provides a set of template images. These template images (usually three per patient) are

taken when the patient first comes into contact with the system, in different positions and lighting conditions, to reflect as accurately as possible the conditions under which photographs will be taken in subsequent identification stages. The template images are

then converted into their sparse code representation (P4). Strictly speaking, the template

images are no longer required unless the sparse code dictionary changes, which is an

important improvement when considering privacy and security as discussed in section

2.3.1.7. The system only needs regular access to the sparse code dictionary (D2) and

the template sparse code representation (D3), not the annotated image template data-

base (D1). Optionally, the sparse codes are used to train a classifier that will be used as

a classification algorithm in the identification phase (P5). This step is optional since it de-

pends on the choice of classification algorithm that yields the most accurate results, as

shall be outlined in the following section.
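The dictionary-building (P3) and template-encoding (P4) steps above can be sketched with scikit-learn's DictionaryLearning and sparse_encode. The array sizes and parameters below are illustrative stand-ins, not the tuned values used by the prototype, and random data stands in for the enhanced template images:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

rng = np.random.RandomState(0)
# Stand-in for flattened, enhanced template images (20 images x 64 pixels).
templates = rng.rand(20, 64)

# P3: learn the sparse coding dictionary from the training sample.
learner = DictionaryLearning(n_components=10, transform_algorithm='omp',
                             max_iter=20, random_state=0)
learner.fit(templates)
dictionary = learner.components_            # D2: stored for later reuse

# P4: represent each template as a sparse vector over the dictionary.
codes = sparse_encode(templates, dictionary, algorithm='omp')  # D3
print(codes.shape)  # (20, 10): one sparse code per template image
```

As noted above, once the codes (D3) exist, only the dictionary (D2) and codes need to be kept; the raw templates (D1) are no longer required for identification.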

3.4.1.2 Identification Phase

Once the sparse dictionary is built, we proceed to the identification phase, as depicted in

Figure 8. Identification cannot be performed successfully on a subject until that subject

has been successfully enrolled into the system, though enrollment and identification can

occur in parallel.

Figure 8: Identification phase DFD

As can be seen in figure 8, similarly to the training phase, during identification a subject's

wrist is photographed using NIR (P1) and passed through the same image enhancement

algorithm (P2) as explained in section 3.4.1.3. The enhanced image is then operated on

by the sparse code algorithm and decomposed into its sparse representation in terms of

the sparse code dictionary (P3) obtained in the training phase (section 3.4.1.1). Again, it is worth noting from the privacy and security perspective that from this point on the system does not require the actual biometric image, and hence it does not need to be stored.

Once the sparse code of the subject to be identified has been obtained, the system is re-

quired to match this to known individual sparse codes obtained during enrollment. For

this purpose we use a comparison algorithm (P4) that compares the subject's sparse code to all known sparse codes in order to identify the individual. The simplest way of doing this is to use a distance measure and identify the individual by selecting the template with the least distance between the sparse codes, with the advantages and disadvantages described in section 2.3.1.10. In this case, we will use Euclidean distance as the distance measure. We also investigated supporting the linear stochastic gradient descent [SGD] classifier, a more advanced classification algorithm based on regression. The advantage of the SGD classifier over Euclidean distance is that it should be computationally faster, since it does not require the system to iterate over all template sparse codes. The SGD classifier also supports incremental / on-line training. In the implementation of this project we will compare the two comparison methods and evaluate which to use in terms of performance vs. accuracy trade-offs.
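The two comparison methods can be sketched as follows; the enrolled labels and codes are synthetic stand-ins for real template sparse codes, and the patient identifiers are hypothetical:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(1)
# Hypothetical enrolment data: 3 template sparse codes for each of 4 patients.
labels = np.repeat(["P1", "P2", "P3", "P4"], 3)
codes = rng.rand(12, 10)

def identify_euclidean(query):
    # Scan all enrolled codes and pick the nearest template (least distance).
    dists = np.linalg.norm(codes - query, axis=1)
    return labels[np.argmin(dists)]

# Alternative: a linear SGD classifier trained on the same codes. Prediction
# cost does not grow with the number of enrolled templates, and partial_fit
# would allow incremental (on-line) enrolment.
clf = SGDClassifier(random_state=1).fit(codes, labels)
sgd_guess = clf.predict(codes[4].reshape(1, -1))[0]

query = codes[4] + 0.01 * rng.rand(10)   # a slightly perturbed template of P2
print(identify_euclidean(query))         # P2
```

The distance scan is O(number of templates) per query, which is the performance cost the SGD alternative is meant to avoid.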

3.4.1.3 Image Enhancement

The image enhancement module is used to process the NIR images, making the vein

patterns more discernible by removing noise, increasing contrast and enhancing the

veins. The whole process is implemented using the Python library Scikit-Image

(Van Der Walt et al, 2014), shown in Figure 9. The results of passing a sample image

through each process are presented in Chapter 4, Implementation.

Figure 9: Image Enhancement DFD

The image to be enhanced is output from the NIR camera in JPEG or PNG format. The

first operation is to crop the image and convert it to grayscale (P1). This reduces

the amount of data that is required to be processed in subsequent stages and ensures

that redundant features like background are not processed. Next, the image is down-

scaled using the local mean algorithm (P2). The local mean algorithm splits the image

into configurable blocks, each of which is replaced by a single pixel whose value is equal

to the mean of the original block. This has the effect of reducing salt and pepper image

noise as well as removing smaller features such as skin pores and small hairs. The

downscaled image is cloned and one copy of the image is dilated (P3) while the other is

eroded (P4). Image erosion and dilation are algorithms associated with image recon-

struction and morphology (Scikit Image, n.d.). Image erosion shrinks bright regions and

enlarges dark regions, in practice connecting dark regions of the image together. On the

other hand, image dilation shrinks dark regions and enlarges bright ones. Veins will ab-

sorb more NIR energy and hence appear as darker regions in the image. By subtracting

the dilated image from the eroded image, we remove all the relatively large bright re-

gions of the image, leaving veins in high contrast. The high contrast image is added back

into the original image to make the veins more visible. This is the operation performed by the feature reconstruction / enhancement process (P5).

This enhanced image is then equalized using the adaptive histogram technique (P6). Im-

age histogram equalization algorithms are designed to enhance low-contrast images

(Scikit Image, n.d.), resulting in images whose histograms follow a linear cumulative dis-

tribution function. There are a number of variations to histogram equalization, the one

used in this project in particular is Contrast Limited Adaptive Histogram Equalization

[CLAHE] (Zimmerman et al, 1988), which once again splits the images into blocks. For

each block, a histogram is calculated, and equalization is performed on those histograms

rather than on a global image histogram, resulting in local details being enhanced even

in lighter or darker regions.

Last, the equalized image is blurred using a median filter (P7). Blurring is a common im-

age manipulation technique usually used to remove noise, in this case using a median

filter (Weiss, 2006). A median filter simply replaces each pixel with the median value of

its neighbors. In the case of vein patterns, median filter blurring has the effect of slightly

thickening the veins themselves, making them more prominent. The resulting image is

passed into the training and/or identification algorithms as described in the previous two sections.
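The pipeline (P1 through P7) can be sketched with Scikit-Image as below. The downscale factor and other parameters are illustrative assumptions rather than the tuned prototype values, and a random array stands in for a cropped NIR photo:

```python
import numpy as np
from skimage import color, exposure, filters, morphology, transform, util

def enhance(rgb):
    gray = color.rgb2gray(rgb)                             # P1: grayscale
    small = transform.downscale_local_mean(gray, (2, 2))   # P2: local-mean downscale
    dilated = morphology.dilation(small)                   # P3: shrink dark regions
    eroded = morphology.erosion(small)                     # P4: enlarge dark regions
    veins = eroded - dilated                               # dark (vein) regions stand out
    enhanced = np.clip(small + veins, 0.0, 1.0)            # P5: add back into the image
    equalized = exposure.equalize_adapthist(enhanced)      # P6: CLAHE
    return filters.median(util.img_as_ubyte(equalized))    # P7: median-filter blur

out = enhance(np.random.rand(64, 64, 3))   # stand-in for a cropped NIR wrist photo
print(out.shape)  # (32, 32)
```

Each call maps to one process bubble of Figure 9; real input would come from the Pi Noir camera rather than a random array.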

3.4.1.4 Mapping biometrics to RFID

Once a patient's sparse code has been obtained and verified, the system is required to map this to a single, unique RFID tag for tracking purposes. In this project, a one-to-one relationship will be enforced between sparse code and RFID serial number, therefore a simple database table is sufficient to store the mapping information. Figure 10 represents the proposed table schema (a full database schema is shown in the code listing in Appendix C).

Figure 10: Mappings table schema
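A minimal sketch of such a mappings table follows, using SQLite for illustration. The column names and tag serial are assumptions (the authoritative schema is in Appendix C); the UNIQUE constraints on both columns are what enforce the one-to-one relationship:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mappings (
        id          INTEGER PRIMARY KEY,
        patient_id  TEXT NOT NULL UNIQUE,  -- patient tied to a sparse code
        rfid_tag_id TEXT NOT NULL UNIQUE   -- serial number of the assigned tag
    )""")
conn.execute("INSERT INTO mappings (patient_id, rfid_tag_id) VALUES (?, ?)",
             ("Patient123", "E200-3412-0123"))
row = conn.execute("SELECT rfid_tag_id FROM mappings WHERE patient_id = ?",
                   ("Patient123",)).fetchone()
print(row[0])  # E200-3412-0123
```

Attempting to assign a second tag to the same patient (or the same tag to a second patient) would violate a UNIQUE constraint and be rejected by the database.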

3.4.1.5 RFID Software

At this stage, a patient has been identified, assigned an RFID tag, and mapped to that

particular RFID tag serial number. The patient can now be physically tracked as they

move around the premises. Figure 11 depicts the DFD of the RFID software which en-

ables patient tracking.

Figure 11: RFID Software DFD

3.4.1.6 RFID LLRP Listener

The standard API to most RFID readers on the market today is the Low Level Reader

Protocol [LLRP] (Krishna and Husak, 2007). LLRP operates over a standard TCP

socket, requiring an RFID server to open communication with the reader, instructing it to

send tag reports back. As stated in section 3.4.2, Impinj has been chosen as the RFID

equipment supplier. Impinj also supply a Software Development Kit [SDK] referred to as

Octane SDK (Impinj, n.d.). Octane SDK provides Java APIs and leverages LLRP, but

significantly simplifies programming RFID readers and receiving RFID tag reports when

compared to using LLRP directly.

Therefore, the RFID server will use Octane SDK to both setup the reader (P1), and listen

for RFID tag reports over LLRP (P2). During reader setup, the RFID server instructs the

RFID reader to send back the following information for each RFID tag read:

- The RFID tag ID
- The Antenna ID that read the RFID tag
- The IP address of the RFID reader

The above information will be used by the RFID server to uniquely identify a physical location. Since each RFID reader can be connected to more than one antenna, it is important to not only identify the reader by its IP address, but also to include the antenna ID.

Each location of interest should be covered by a single antenna as detailed in previous

sections. Ideally, there should be no overlap between the antenna fields so that each lo-

cation can be mapped uniquely to each antenna ID.

The LLRP listener implemented with the Octane SDK (P2) proceeds to buffer the tag

reads into memory (P3). This step alleviates scaling issues, since in a busy environment

tag reads can amount to thousands per second, and buffering tag reads ensures that po-

tentially slower subsequent stages do not result in dropped reads. Apart from scalability

and performance, the buffer also allows us to decouple the LLRP listener from subsequent stages, which results in additional flexibility. For example, the LLRP listener that

writes into the buffer can be implemented in Java as required by the Octane SDK, but

subsequent stages that read from the buffer can be implemented in Python to conform

with the rest of the project. As a buffer, the popular in-memory data structure store Redis

is used (Redis, n.d.). Redis is an easy to use, open source program that has multi-lan-

guage bindings, and provides extremely good performance.
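The decoupling can be illustrated with a plain queue standing in for the Redis list: the real system would use redis-py's LPUSH from the Java listener side and BRPOP from the Python side against a running Redis server, so the field names and queue name here are illustrative only:

```python
from collections import deque

# A deque stands in for the Redis list: the Java LLRP listener would LPUSH
# serialized reads (redis-py: r.lpush("tag_reads", ...)) and the Python
# filtering stage would BRPOP them (r.brpop("tag_reads")).
buffer = deque()

def producer(tag_id, reader_ip, antenna_id):
    # Called from the Octane SDK tag-report callback (LPUSH in Redis terms).
    buffer.appendleft({"tag": tag_id, "reader": reader_ip, "antenna": antenna_id})

def consumer():
    # The filtering stage pops the oldest buffered read (BRPOP in Redis terms).
    return buffer.pop() if buffer else None

producer("E200-1111", "192.168.1.1", 1)
producer("E200-2222", "192.168.1.1", 2)
print(consumer()["tag"])  # E200-1111
```

Because the producer and consumer only share the queue, either side can be reimplemented (Java writer, Python reader) without the other noticing, which is exactly the flexibility described above.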

3.4.1.7 Filtering and storing the RFID reads

A Python program reads the RFID reads from the buffer and performs basic filtering (P4). The primary purpose of this stage is to avoid huge storage costs by removing duplicate reads. A duplicate read is defined as an RFID tag read that contains the same reader IP and antenna ID as the preceding read of the same RFID tag ID. In such a situation, the same antenna is continuously picking up the same RFID tag, meaning the patient is not moving. If all these tag reads were to be stored, unnecessary storage would be wasted on duplicate data. Hence we filter duplicate tag reads and only store those tag reads which indicate that the tag has changed position.

This decision process is shown in figure 12. Valid tag reads will be stored in a database

table, and can be used for auditing and reporting.

Figure 12: Decision flowchart for filtering tag reads
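The filtering rule of Figure 12 can be sketched as follows; the record field names are illustrative assumptions:

```python
# Keep only reads where the (reader IP, antenna ID) location of a tag has
# changed since that tag's previous read.
def filter_duplicates(reads):
    last_seen = {}      # tag ID -> (reader_ip, antenna_id) of its last read
    kept = []
    for read in reads:
        location = (read["reader"], read["antenna"])
        if last_seen.get(read["tag"]) != location:
            kept.append(read)               # moved (or first sighting): store it
        last_seen[read["tag"]] = location   # remember where we last saw this tag
    return kept

reads = [
    {"tag": "T1", "reader": "192.168.1.1", "antenna": 1},
    {"tag": "T1", "reader": "192.168.1.1", "antenna": 1},  # duplicate: dropped
    {"tag": "T1", "reader": "192.168.1.1", "antenna": 2},  # moved: kept
]
print(len(filter_duplicates(reads)))  # 2
```

Note the comparison is per tag: two different tags read by the same antenna are never treated as duplicates of each other.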

3.4.1.8 Mapping RFID information to physical location

Once the RFID tag reads are filtered and stored, we next need to display the information

in a user-friendly format to the end user. In order to achieve this, the first step is to translate or map a reader IP and antenna ID into a user-friendly location name. Figure 13 further illustrates this point.

Figure 13: Example RFID reader and antennas placement in a medical clinic

Internally, the system only stores tuples in the form (Reader IP : Antenna ID), which is

cumbersome for a user to interpret. Therefore we need a location mapping table to

convert these tuples into locations.

Taking the scenario depicted in Figure 13, the location mapping result would be as

shown in table 7.

Table 7: Sample RFID to Location Mapping


Reader IP Antenna ID Location

192.168.1.1 1 M.D. Office

192.168.1.1 2 Registration

192.168.1.1 3 Reception
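The lookup implied by Table 7 can be sketched as a dictionary keyed by the (reader IP, antenna ID) tuple the system stores internally; the fallback string is an assumption for this sketch:

```python
# Location mapping table from Table 7, keyed by the internally stored tuple.
LOCATIONS = {
    ("192.168.1.1", 1): "M.D. Office",
    ("192.168.1.1", 2): "Registration",
    ("192.168.1.1", 3): "Reception",
}

def location_of(reader_ip, antenna_id):
    return LOCATIONS.get((reader_ip, antenna_id), "Unknown location")

print(location_of("192.168.1.1", 2))  # Registration
```

In the real system this mapping lives in a database table populated via the WebUI during setup rather than in code.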

During setup, the above information is entered into the system via a WebUI. The above design assumes that a fixed IP address is given to the RFID reader, which is a reasonable assumption in most of today's networks. The WebUI will show the above information to a user, essentially displaying the current location of a tag, as well as an audit trail showing past locations of the RFID tag.

3.4.1.9 The Web UI

The final stage of the technical design is a storyboard specifying how the user will interact with the system. Figure 14 shows the storyboard for this system,

where we can see the screens that will be available to the user depending on their role,

as well as the proposed functionality of each screen.

Figure 14: Web UI Storyboard

The system will define two roles: administrative and regular users. Administrative

users will be able to perform functions such as enroll new patients into the system, re -

build the sparse dictionary, and map RFID reader antennas to locations. Regular users will

be able to identify patients, assign them an RFID tag for subsequent tracking, and view

the records for patients locations, both current and historical.

In keeping with the objectives and choices laid out in section 3.4, the Web UI will also be

built in Python, specifically using the Python-Flask microframework (Grinberg, 2014).

Python Flask will provide the server and runtime environment for the WebUI. The actual

UI will be written in standard HTML, CSS and Javascript. Writing the UI using web tech -

nologies has the advantage that the UI becomes device agnostic so long as the device

operating system has an Internet browser. This means the web UI of the project will be

accessible via desktop PCs, tablets, and smartphones without having to rewrite any

code.

3.5 Qualitative Analysis: Surveys

We now turn our attention to the design of questionnaires that will be used in surveys of two user groups that will be in contact with the system: patients and healthcare workers. The surveys will be the basis of the qualitative analysis that answers the fundamental question of whether the system adds any value or benefit to the current healthcare landscape. As Sandelowski (2004) points out, qualitative research is a very useful tool when directed by evidence-based practice.

The evidence points towards problems in identifying patients reliably as we saw in Chapter 2,

so investigating solutions to the problem is warranted. However, the main measure of success

of any IT project is how well users react to the system, how useful they find the system and

how helpful the system is to them. In this respect, qualitative analysis is the best way to capture

the success of the project.

The qualitative analysis will consist of two independent surveys conducted via questionnaires.

Each questionnaire should not take more than 5 minutes to answer in order to keep inconve-

nience to a minimum and increase response rates. For the same reason, questionnaires will be

kept simple and easy to understand, consisting almost entirely of multiple choice questions.

Following the guidelines of Sandelowski, Barroso and Voils (2007) to describe our findings from the surveys, we will use metasummaries to extract results from the questionnaires, including statis-

tics such as frequency effect sizes which measure the effect the system has on both patients

and healthcare providers. This will provide our results for the Lessons Learned experimental

model introduced in section 3.1. In the following two sections we present the two question-

naires that will be used to extract information from users who will come into direct contact with

the system, namely patients and healthcare workers.

3.5.1 End User (Patient) Survey Questions

The first user group surveyed are the end users, or patients. This user group can be comprised of any normal, healthy individuals who are able to:

1. Have their wrist area scanned in order to extract a valid vein pattern

2. Answer an online survey designed to determine if the system is easy to use and

unintrusive.

The first item is a prerequisite for the entire system, considering that the biometric fea-

ture chosen is vein patterns. However as discussed in Chapter 2, extracting valid vein

patterns is possible for 95% of the human population, leaving very few individuals who cannot participate in the study for medical reasons. Therefore the potential pool of partic-

ipants is not limited to any particular group, other than being physically able to use the

system. Participants are recruited via a simple mailshot and/or via social network post-

ings. As regards the second item, as previously discussed the success of an IT project

often depends on how easy the system is to use, while a major obstacle to using biomet-

rics from an end user's perspective is how intrusive taking the biometric reading is.

Hence the survey will ask how easy the system is to use (essentially getting their wrist

scanned) and if they consider the system intrusive or not.

The survey will take the form of multiple choice questions, asking the user specific ques-

tions, as follows:

1. Did the system feel intrusive?

2. Was it easy to understand how to use the system?

3. How long did it take to use the system?

4. Optional: Do you have any concerns or feedback about the system?

Question 1 directly relates to whether the system felt intrusive, while questions 2 and 3

relate to ease of use of the system. Two important factors for ease of use are how long

it takes to use the system - with the assumption that longer use times means the system

is more difficult to use - and ease of understanding. The questions are answered by se-

lecting a number between 1 and 5, each number representing varying degrees of diffi-

culty and/or intrusiveness, with 1 being not easy / intrusive at all, and 5 being very easy /

intrusive. Figure 15 shows a screenshot of the online survey as presented to the user.

Question 4 is an optional, text-based answer that prompts the user for any general feed -

back they might have. This question was kept as optional so as to ensure that the user

can complete the questionnaire in as little time as possible, hopefully encouraging partic-

ipation. Users who answer this question may raise concerns that were not thought of

during the design of the system, giving a better chance at more concrete improvement of

the project.

Figure 15: End User / Patient Questionnaire.

3.5.2 Expert User (Healthcare Workers) Survey Questions

The second user group questioned are the expert users that will operate and use the

system. This user group is composed of individuals who:

1. Are involved in the healthcare industry and give care to patients. Examples of

such individuals are:

1. Nurses

2. Admissions Staff

3. Doctors and Surgeons

4. Medical Researchers

5. Pharmacists

6. Medical Insurance Personnel

2. Are proficient in using IT (but to varying degrees)

As can be seen above, the system can be used by anyone who comes into contact with

a patient, hence widening the pool of potential candidates to the study. The participants

will be recruited by advertising at their place of work (e.g. hospitals and clinics). There

are no special restrictions on who can participate other than having an appropriate

role and having at least basic IT knowledge.

Similar to the previous section, the questions asked will be answered in a multiple choice

fashion, as can be seen in Figures 16 and 17. The questions asked to the participants

are as follows:

1. Which category best describes your role?

2. Is it easy to use the system?

3. Does the system disrupt your daily task flow?

4. Before using the system, how easy was it to identify a patient?

5. After using the system, how easy was it to identify a patient?

6. Before using the system, how easy was it to locate a patient?

7. After using the system, how easy was it to locate a patient?

8. Did the system have a meaningful impact on your daily work?

9. Optional: Do you have any concerns or feedback about the system?

The first question categorizes the candidate by profession, which may be useful in ex-

tracting results in the metasummary of the findings, for example finding a covariance be-

tween industry/profession and a users acceptance of the system. Questions 2-7 are de-

signed to check how easy it is to use the system in its two main functions: identifying and

locating a patient. Question 8 is designed to get an overall indication for how useful the

participants believe the system to be, while question 9 is designed to be a catch-all for is-

sues or feedback which were not foreseen. A preview of the questionnaire is shown in

figures 16 and 17.

Figure 16: Medical Professional Questionnaire - Part 1

Figure 17: Medical Professional Questionnaire - Part 2

In the next chapter, we will present the implementation of the system according to the design

specifications we laid out above.

CHAPTER 4. IMPLEMENTATION

4.1 Introduction

In this chapter we will outline the implementation of the design described in Chapter 3.

We discuss how we executed the designs and present real-life photos of the prototype

being built and being used, both from a hardware and software perspective. We also

highlight any deviations from the original design ideas and discuss why the decisions

were taken. This chapter also explores the technical details of the prototype build, con-

sidering factors such as equipment, programming languages, and frameworks used.

Code listings and screenshots of the prototype are included in Appendices B and C.

Similarly to Chapter 3, this chapter will be split into three main categories: the implemen-

tation of hardware, software, and the end survey questionnaire. We first discuss the

hardware choices and implementation, for biometrics and RFID. Next we discuss soft-

ware implementation for both the front-end (user facing) and back-end components.

Last, we present the implementation of the questionnaire that records users' reactions to

the system.

4.2 Hardware Implementation

Hardware was required for two main areas in the prototype: building the biometric vein

capture rig, and sourcing RFID equipment such as RFID antennas and readers to track

patients throughout a premises.

4.2.1 Vein Pattern Capture

A cheap and easily iterable prototype was needed in order to capture Near Infrared [NIR]

photos of a patient's wrist area. As described in Chapter 3, a Raspberry PI mini computer

was used to power the prototype, while a Pi Noir camera was used as a NIR sensitive

camera (Raspberry Pi, n.d.). Initially we thought of using a 3D printed chassis for the pro-

totype, however this turned out to be a very expensive option and instead we opted for

simple sturdy cardboard. In a production environment sheet metal seems to be the

cheapest option for the chassis material. However if the prototype is commercialized at

sufficient scale then production should shift to plastic injection moulding as this is

cheaper, stronger, and more hygienic than cardboard.

In figure 18 we can see a top-view photo of the cardboard chassis. The chassis has a

box-like structure, with simple screws serving as guides to help the patient place their

hand in the right position. In the photo one can see the faint outline of a hand drawn for

scale.

Figure 18: Top-view of the vein capture prototype

The arc-like structure going over the tray supports the Raspberry Pi and camera, pic-

tured in subsequent photos. The raspberry pi can be mounted underneath the arc as

shown in figure 19, or on top of the arc.

Figure 19: Raspberry Pi and supporting circuitry mounted on the underneath of the arc.

The Raspberry Pi can be mounted on top of the arc which makes it easier to access,

which is especially convenient if the Raspberry PI is fitted with a small HDMI screen as

shown in figure 20, rather than a larger external HDMI monitor. Which option is used de-

pends on the physical requirements of the hospital using the platform.

Figure 20: Small HDMI screen mounted on the raspberry PI, which can be used to
provide visual feedback to the users.

An HDMI monitor is a necessity to give feedback to the users of the platform. For exam-

ple, the screen can output a preview of the image that the camera sees, so more accu-

rate photos can be taken of the wrist veins. The monitor will also output the results of the

patient identified and allows for easily enrolling new patients to the platform in the

field (i.e. without requiring patients to enroll using different hardware or in a different envi-

ronment).

Figure 21: Labelled setup of the NIR vein scanner

In figure 21 we can see a clearer picture of the raspberry pi and associated hardware

stuck to the underside of the cardboard arc. The components are labeled as follows:

Raspberry PI and case: The minicomputer powering the setup along with a pro-

tective case. Figure 21 shows the Raspberry Pi model 2, however this can be upgraded to the more powerful Raspberry Pi model 3 if more performance is required (Raspberry Pi, n.d.).
NIR LEDs: The surface mounted Near Infrared LEDs which provide NIR illumi-

nation at a wavelength of 950nm. The LEDs pictured in figure 21 are Brightek's

N0F14S89 3535 2.0t Series (Brightek, n.d).

Switch to power on NIR LEDs: This is a simple push button switch that

switches on the NIR LEDs to prevent excessive power consumption which can

lead to excessive heat being generated. The switch was included to increase the

LED lifespan of the prototype. In production this switch would be controlled by

software to automatically switch on when a user presents their wrist.

Pi Noir Camera: This camera is specifically designed by Raspberry Pi to be

sensitive to NIR illumination. Pictured in figure 21 is the first Pi Noir camera

model, however this can be upgraded to the raspberry Pi Noir 2 camera should

better resolution be required (Raspberry Pi, n.d).

Switch to trigger photo: The Pi Noir camera is controlled by software which de-

tects when a particular General Purpose Input Output [GPIO] hardware pin on

the raspberry pi is set to ground. This switch toggles the GPIO and hence allows

the user to trigger a photograph (please see subsequent sections for a descrip-

tion and the appendix for a full code listing).

Connector to power supply: This connector provides power to the hardware.

4.2.2 RFID Infrastructure

The RFID infrastructure is made up of the following components:

RFID Tags: In this project passive RFID tags were used to track patients. RFID

tags come in a very large variety of physical forms, ranging from simple paper

adhesives as pictured in figure 22, to rubber bracelets and more durable epoxy-

embedded tags. The choice of tags depends on budget, operating environment

and range required.

Figure 22: The passive RFID tags used in this project

RFID antennas: these antennas are powered and are sensitive to the RFID

tags. They pick up the unique ID of any RFID tag in their field. Different antennas

have different field patterns. The particular ones used for the purposes of this

project are general purpose, wide-field antennas from Laird as shown in Figure

23 (Laird, n.d.)

Figure 23: RFID Antennas

RFID readers: these readers from Impinj (Impinj, n.d.), shown in figure 24, are

designed to power and aggregate data from RFID antennas. The reader has

simple firmware on-board which translates the RFID signals from the antennas

into TCP packets that can be sent over a standard TCP/IP network to a server.

The manufacturer also provides a SDK that can be used to facilitate develop-

ment (see subsequent sections for a description and the appendix for a full code

listing)

Figure 24: RFID Readers

4.3 Software Implementation

The software implementation of this project was developed using an iterative, incremen-

tal software development life cycle (Jacobson et al, 1999). The requirements were split

into several categories and each category was developed incrementally until all func-

tional requirements were validated by testing. Broadly speaking, the requirements were

split into back-end and front-end components as shown in the block diagram in figure 25.

Figure 25: Software implementation block diagram

The following sections explore each of the above categories and their sub categories.

4.3.1 Back-end Implementation

The backend is in essence a web server which is responsible for serving REST API calls.

These API calls invoke functions that control functionality such as patient and RFID ad-

ministration, building a biometric sparse dictionary and identifying uploaded vein pat-

terns. The Flask python microframework was used to handle the web server functional-

ity, including parsing HTTP requests, and providing a URL router (Grinberg, 2014). The

Flask URL router is responsible for invoking the appropriate function depending on which

HTTP URL was visited by the client. For example, the following code:

from flask import Flask, request

app = Flask(__name__)

@app.route('/echoer', methods=['POST'])
def echoer():
    print(request.form)
    return str(request.form)

This code will invoke the function echoer whenever an HTTP POST request is sent to /echoer. In this case the function simply echoes back any data sent with the POST request. All the functions presented below follow the same general structure.
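A self-contained version of this route can be exercised without running a server by using Flask's built-in test client, as sketched below (the route body and form field are illustrative):

```python
from flask import Flask, request

app = Flask(__name__)

@app.route('/echoer', methods=['POST'])
def echoer():
    # Echo back the submitted form fields as a plain dict.
    return str(request.form.to_dict())

# Flask's test client exercises the URL router without starting a server.
with app.test_client() as client:
    response = client.post('/echoer', data={'patient': 'Patient123'})
    print(response.get_data(as_text=True))  # {'patient': 'Patient123'}
```

This is also how the back-end API calls can be unit tested during the iterative development cycle described later in this chapter.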

4.3.1.1 Biometric functions

The following sections describe functions that provide the biometric capability of the

system.

4.3.1.2 Image Enhancement

Biometric functions essentially handle the training and identification phases previously

described in sections 3.4.1.1 and 3.4.1.2. As a result, the biometric functions also handle

the image enhancement algorithms presented in section 3.4.1.3. Figure 26 shows a

sample vein pattern being passed through the various stages of the image enhancement

algorithm presented in Figure 9.

Figure 26: Vein pattern image enhancement algorithm implementation, showing original image
(top left) and the final enhanced image (bottom right)

As can be seen in figure 26, the image enhancement algorithm presented in section 3.4.1.3 significantly enhances the vein patterns of the patient's wrist. This is crucial for the training and identification phases since it makes it easier for the algorithms to correctly identify patients.

4.3.1.3 Training phase

With respect to the training phase, the backend provides two API calls:

/buildSparseDict

/uploadPatientTemplate

The first, buildSparseDict, is meant to be used only rarely. The function is envisaged to be used in the following scenarios:

The buildSparseDict function is primarily called in the initial stages of deployment, when the sparse dictionary is being trained with sample NIR vein pattern images for the first time (denoted by P3 in Figure 7). Once the sparse dictionary is built, the function re-iterates over the image templates to produce a sparse code for each one (the process denoted by P4 in Figure 7). Both the sparse dictionary and the template sparse codes (D2 and D3 respectively in Figure 7) are stored as Python dictionary objects on the filesystem. In its current form, the prototype accepts a local file system directory as a container for the annotated image templates. Each image file should be a JPG file and follow the naming convention <patient identifier>_<photo number>.jpg. The patient identifier can be generic; valid examples include a patient's name used directly, an ID number, or an anonymized random number. For example, the following are all valid names for training images:

DavidVassallo_1.jpg
S1_1.jpg, S1_2.jpg [...]
Patient123_1.jpg, Patient123_2.jpg [...]

The naming convention is required so as to correctly annotate the images. In this way, the training algorithm knows which image template belongs to which patient.

The buildSparseDict function may also be called if a security breach occurs. Depending on the scope of the breach, it is possible for an attacker to have stolen the templates used to identify patients, or even the sparse dictionary itself. In such cases, it becomes necessary to rebuild a new sparse dictionary to prevent a malicious actor from spoofing patients with existing templates.

buildSparseDict is a resource-intensive function, and its runtime depends on the number of template images included in the directory: the more template images, the more accurate the identifier, but the longer the function takes to run. Since the function is heavy on resources, it is an important design and implementation feature that it does not need to be run often.
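The following sketch illustrates the core of this training step, assuming scikit-learn's MiniBatchDictionaryLearning (the learner suggested by the parameters discussed later in section 5.2.1); the function name and parameter values are illustrative assumptions, not the prototype's exact code:

```python
# Hedged sketch of buildSparseDict's core, assuming scikit-learn's
# MiniBatchDictionaryLearning; parameter values are illustrative only.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

def build_sparse_dict(images, n_components=36, alpha=1.0, batch_size=3):
    """images: list of equally sized 2-D arrays (enhanced vein patterns).
    Returns the fitted learner and one sparse code per template image."""
    data = np.array([np.asarray(img, dtype=float).ravel() for img in images])
    data -= data.mean(axis=0)  # center the samples before learning
    learner = MiniBatchDictionaryLearning(n_components=n_components,
                                          alpha=alpha, batch_size=batch_size)
    codes = learner.fit(data).transform(data)  # template sparse codes
    return learner, codes
```

In the prototype, the resulting dictionary and per-template sparse codes (D2 and D3 in Figure 7) would then be serialized to the filesystem.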

The uploadPatientTemplate function can only be called once the sparse dictionary has been built as outlined above. This function simply accepts a NIR vein pattern image, along with a patient identifier. The function then calculates the image's sparse code representation in terms of the previously derived dictionary. The resulting template sparse codes are associated with the provided patient identifier and stored on disk. This function is quite lightweight and does not consume as many resources as buildSparseDict, hence it can be used during normal operation.

4.3.1.4 Identification phase

With respect to the identification phase, the backend exposes a single API call for simplicity and ease-of-use: the /identifyPatient API call. The API call accepts a single file containing the photo of the NIR vein pattern, within an HTTP POST request. The function then derives the image's sparse code representation (process P3 in Figure 8), and passes the results through the classification algorithm (process P4 in Figure 8). Once this is done, the identification function returns the patient most closely matching the vein pattern it was presented with.

In the design phase, we presented two different methods of classification: the Euclidean Distance Classifier (Qian et al, 2004) and the Stochastic Gradient Descent Classifier (Scikit Learn, 2014). During implementation testing, both classifiers performed roughly equally in terms of accurately classifying patients; however, several differences emerged during implementation, as can be seen in table 8. Due to these differences, the decision was taken to deviate from the original design plans in chapter 3 and implement only the Euclidean distance classifier in the final prototype.

Table 8: Implementation differences between Euclidean Distance and SGD classifiers

Performance:
- Euclidean Distance: can be parallelized on one server (via multiple threads) as well as over multiple hosts (via distributed computing), allowing for massive scalability, which is extremely important in large hospitals.
- SGD Classifier: can only be parallelized on one host (via multiple threads), hence only allows for moderate scaling.

Stability:
- Euclidean Distance: accuracy was invariant to changes in the sparse code dictionary.
- SGD Classifier: accuracy had a very slight sensitivity to changes in the sparse code dictionary (rebuilding the sparse code dictionary would result in lower or higher accuracies than previous test runs).

Simplicity:
- Euclidean Distance: extremely easy to implement.
- SGD Classifier: requires hyperparameter tuning techniques such as grid searches, which increase the complexity of the program.

Ability to generalize to unseen/new patients:
- Euclidean Distance: no issues encountered when introducing/enrolling new patients into the system.
- SGD Classifier: even though SGD supports online training, the SKLearn library with which SGD was implemented requires that all possible patients are known beforehand. This is reflected in the library documentation, which states: "For classification, a somewhat important thing to note is that although a stateless feature extraction routine may be able to cope with new/unseen attributes, the incremental learner itself may be unable to cope with new/unseen target classes. In this case you have to pass all the possible classes to the first partial_fit call using the classes= parameter." (Scikit Learn, 2014). While it is possible to work around this by initializing the classifier with a large number of empty classes (i.e. patients), this workaround hinders scalability and simplicity.
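To make the chosen classifier concrete, the following minimal sketch shows nearest-template matching by Euclidean distance; the templates structure (patient identifier mapped to a stored sparse-code vector) and the function name are illustrative assumptions rather than the prototype's exact data layout:

```python
# Minimal sketch of nearest-template matching by Euclidean distance.
# templates maps each patient identifier to a stored sparse-code vector;
# this structure is an assumption for illustration.
import numpy as np

def identify(sparse_code, templates):
    """Return (patient, distance) for the closest template sparse code."""
    best_patient, best_dist = None, float("inf")
    for patient, template in templates.items():
        dist = np.linalg.norm(np.asarray(sparse_code) - np.asarray(template))
        if dist < best_dist:
            best_patient, best_dist = patient, dist
    return best_patient, best_dist
```

Because each comparison is independent, the loop can be split across threads or hosts, which is the scalability property noted in table 8.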

4.3.1.5 RFID functions

The back-end implementation is responsible for gathering data from the RFID readers as described in section 3.4.1.5.1 and section 3.4.1.5.2. The Low Level Reader Protocol [LLRP] listener described in section 3.4.1.5.1 is implemented in Java (readTags.java) since the provided manufacturer SDK was written in Java. In essence, the Java program:

- Fetches the configured reader IP addresses from the SQLite database
- For each reader IP, creates a separate thread which monitors the reported tags
- For each tag read, inserts an entry into a Redis First In, First Out [FIFO] memory queue in the form:

<reader_ip>,event.tag.arrive,tag_id=<tag_id>,antenna=<antenna>,rssi=<rssi>,<timestamp>

A separate Python script takes over processing at this point, and implements the design described in section 3.4.1.5.2. The script pops individual entries from the queue and processes them as follows:

- Converts the RFID reader IP address into the internal RFID reader ID stored in the database
- Checks if the tag's current location has changed. If it has changed, it creates a new row in the relevant SQLite table; otherwise the entry is discarded. These row entries are subsequently displayed to the user via a web GUI as described in subsequent sections.
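The queue-draining steps above can be sketched as follows. The table and column names (readers, sightings) and the queue key (tag_events) are illustrative assumptions, not the prototype's actual schema:

```python
# Sketch of the queue-processing script described above; schema names
# are assumptions for illustration.
import sqlite3

def parse_entry(entry):
    """Split '<ip>,event.tag.arrive,tag_id=<t>,antenna=<a>,rssi=<r>,<ts>'
    into (reader_ip, tag_id, timestamp)."""
    fields = entry.split(",")
    kv = dict(f.split("=", 1) for f in fields[2:5])
    return fields[0], kv["tag_id"], fields[5]

def record_if_moved(db, reader_id, tag_id, ts):
    """Insert a new sighting row only when the tag's location has changed."""
    last = db.execute(
        "SELECT reader_id FROM sightings WHERE tag_id = ? "
        "ORDER BY rowid DESC LIMIT 1", (tag_id,)).fetchone()
    if last is None or last[0] != reader_id:
        db.execute(
            "INSERT INTO sightings (tag_id, reader_id, ts) VALUES (?, ?, ?)",
            (tag_id, reader_id, ts))
        db.commit()
        return True
    return False  # location unchanged: the entry is discarded

if __name__ == "__main__":
    import redis  # popping from the Redis FIFO queue on the prototype host
    r = redis.Redis()
    db = sqlite3.connect("biorfid.db")
    while True:
        _, raw = r.blpop("tag_events")  # blocks until a tag event arrives
        ip, tag_id, ts = parse_entry(raw.decode())
        reader_id = db.execute(
            "SELECT id FROM readers WHERE ip = ?", (ip,)).fetchone()[0]
        record_if_moved(db, reader_id, tag_id, ts)
```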

For a full code listing of these programs, please refer to appendix C.

4.3.1.6 Patient functions

Patient functions in the backend are mainly limited to simple Create, Read, Update and Delete [CRUD] functions. These allow the front-end to instruct the server to:

- Add or remove patients (via the /delPatientProfile and /addPatientProfile API calls)
- Display configured patients (via the /getPatientProfiles API call)
- Bind a patient identity to an RFID code (via the /updatePatientRfid API call)
- Determine whether a patient has been biometrically enrolled, i.e. check if a patient has an associated sparse code or not (via the /getPatientBiometrics API call).
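A hedged sketch of how such CRUD endpoints could look, in the same Flask style as the earlier echoer example; the in-memory patients dictionary stands in for the prototype's SQLite storage, and the form field names are assumptions:

```python
# Sketch of the CRUD endpoints listed above. The patients dict is a
# stand-in for SQLite storage; form field names are assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)
patients = {}  # patient identifier -> profile data

@app.route('/addPatientProfile', methods=['POST'])
def add_patient_profile():
    patients[request.form['patient_id']] = {'rfid': None}
    return jsonify(status='ok')

@app.route('/getPatientProfiles', methods=['GET'])
def get_patient_profiles():
    return jsonify(sorted(patients))

@app.route('/updatePatientRfid', methods=['POST'])
def update_patient_rfid():
    # Bind a patient identity to an RFID tag code
    patients[request.form['patient_id']]['rfid'] = request.form['rfid']
    return jsonify(status='ok')
```

In the prototype these handlers would read and write SQLite rows instead of a dictionary, but the request and response shapes stay the same.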

4.3.1.7 Render functions

Render functions are those functions which return HTML, CSS and Javascript to the client browser that requests them. These functions return the front-end, i.e. the web UI which the end users interact with. The render functions consist of the following three API calls:

- / (root URL): returns the login.html template, where a user selects which role they would like to assume. Depending on their choice, they will be redirected either to the administrator web UI or to the operator web UI
- /admin: returns the admin.html template, which interfaces with administrative functions such as creating patient profiles, building the sparse dictionary and managing RFID readers
- /operator: returns the operator.html template, which interfaces with operator functions such as viewing patient locations.

4.3.2 Front-end Implementation

The front end is responsible for the interaction between the end user and the system. The backend server returns HTML, CSS and Javascript code to the user's browser, which then renders that code into the front-end. The front-end is mainly written using the ReactJS framework from Facebook (Vipul & Sonpatki, 2016). ReactJS was chosen since it simplified the code and made it very easy to modularize the code into re-usable components, while enforcing a well-defined structure that adheres to current industry best practices rather than an ad-hoc standard. The ReactJS code is stored in .JS files in the /scripts folder.

4.3.2.1 Administrator / Operator front-end

Depending on the role chosen by the end user, the front-end will render either the administrator or the operator console, as can be seen in the annotated screenshots shown in figures 27 and 28.

Figure 27: The administrator front-end

Figure 27 is a screenshot of the administrator front-end with the following annotations:

1: The navigation bar allows the administrator to switch between Settings and Enrollment. The former displays functionality that is seldom used or is a one-time action, mainly managing RFID readers, managing locations and building the sparse dictionary. The latter is used for more day-to-day tasks, mainly enrolling users into the biometric system and assigning them RFID tags.

2: The tabs allow the administrator to switch between several sub-categories of the navbar option described above. Depending on the navbar option chosen in (1), the tabs change to offer additional functionality.

3: The main actions area. Depending on the choices selected in (1) and (2) above, the main actions area changes to show the appropriate functionality. In this particular screenshot, the area is showing functionality pertaining to the management of RFID readers.

Figure 28: Operator front-end

The operator front-end has a similar layout to the administrator front-end previously described, albeit with fewer options. In figure 28 we can see a screenshot of the operator front-end with the following annotations:

1: The tabs allow the operator to choose between two functionalities:

- Audit: The audit screen is intended to be the main function offered to end users. It allows the operator to query the database either by RFID Tag ID or by location to see patient locations.
- Last Seen: The last seen screen is intended to be a quick lookup of exactly where a particular patient was last detected by the RFID readers.

2: The main action area. Depending on the user choices in (1) above, the main action area changes to offer the selected functionality.

For additional screenshots please refer to appendix B.

4.3.2.2 Vein Scanner front-end

The vein scanner front-end is responsible for interacting with the end user, specifically with patients and with the healthcare professionals identifying those patients. Initially, during the design phase, it was envisaged that the front-end would be a web-based user interface residing directly on the Raspberry PI that powers the vein scanners. This design has changed slightly: the front-end still resides directly on the Raspberry PI, however it is no longer web-based. This decision was taken due to implementation specifics of the library that controls the Raspberry PI NoIR camera. This library allows the developer to specify that a live preview should be displayed on screen, showing a video feed of whatever the camera is viewing. This preview function is used to allow patients or healthcare workers to more accurately position their wrists so that a better picture can be taken - the same principle as looking through a camera's viewfinder to better align a photograph. Unfortunately, this preview function can only display the video feed within a native operating system window, therefore the vein scanner front-end was written to use native operating system windows as well for conformity. Figures 29 and 30 illustrate the front-end in action.

Figure 29: User Function Menu

In figure 29 we can see the two main functions that the vein scanner performs: identifying a patient, or enrolling a new patient into the system. Selecting the identify option causes a preview window to be displayed, whereupon a patient can place their wrist to be scanned. Once the picture is taken, the photo is sent to the server in the background, which in turn attempts to identify the patient and returns its results to the user as shown in Figure 30.

Figure 30: Identify functionality results

The enroll function behaves very similarly to the aforementioned identify procedure, however before taking a picture, the user is prompted to enter the patient identifier via a text dialog box. This identifier is subsequently sent to the server along with the picture taken of the patient's wrist vein patterns. This picture is treated as the template image for the patient specified, at which point the server calculates the sparse codes for the image and stores them under the patient's identifier. The patient can then be identified using the previously described identify function.

This enroll function is designed to be used on a day-to-day basis to enroll patients who were not part of the initial sparse dictionary training phase. The only prerequisite to using the enroll function is to have already built a sparse code dictionary. While building the sparse code dictionary via the use of template images is a time-consuming process, enrolling patients using the front-end just described is very quick (the process takes under a minute to complete) and hence should have minimal impact on day-to-day operations.

4.4 Survey Implementation

As described in section 3.5, two separate surveys were implemented; one questionnaire

aimed at end users (representative of patients in a hospital), and another questionnaire

aimed at healthcare professionals who would be operating the system. The participants

for each survey were recruited according to the recruitment plans outlined in section

4.4.1.

In order to encourage responses to the surveys, the questionnaires were purposely made easy to access, easy to understand, and quick to answer so as to have as little impact as possible on participants. As such, the questionnaires are hosted online and can be accessed from any device with internet connectivity, so participants could answer at their own leisure and without any external pressures. The questions are almost entirely composed of multiple choice questions to make them easy to answer. The end user survey consists of only 3 compulsory questions while the healthcare worker survey consists of 8 compulsory questions. Participants are also encouraged to leave any additional feedback they may have in the free-text sections of the questionnaire.

4.4.1 Recruitment Plan

4.4.1.1 Recruitment plan for end users

1. Send out an advertisement via email to friends/family/colleagues/acquaintances
2. Send out the same advertisement via Facebook
3. Any replies to the above will be assessed against the following criteria:
- Does the candidate have any damage to both wrists? (if yes, exclude from study)
- Does the candidate report directly to me or work on any projects I am currently involved in? (if yes, exclude from study)
- Is the candidate expecting remuneration, monetary or otherwise? (if yes, exclude from study)
4. Accepted candidates will be asked to read, understand, and sign the consent form.
5. If participants agree to the consent form and sign the document, their name, surname and contact details will be recorded in the participant information sheet.
6. The data collected in step 5 will be used to communicate with participants in order to set up a convenient time for them to physically meet with me to get their wrist scanned.
7. The data collected in step 5 will be used to communicate with participants in order to ask them to fill in the online questionnaire.

4.4.1.2 Recruitment plan for healthcare professionals

1. Send out an advertisement via email to friends/family/colleagues/acquaintances
2. Send out the same advertisement to hospital contacts
3. Any replies to the above will be assessed against the following criteria:
- Does the candidate have a healthcare role such as admissions clerk, nurse, doctor, or surgeon? (if not, exclude from study)
- Does the candidate report directly to me or work on any projects I am currently involved in? (if yes, exclude from study)
- Is the candidate expecting remuneration, monetary or otherwise? (if yes, exclude from study)
4. Accepted candidates will be asked to read, understand, and sign the consent form.
5. If participants agree to the consent form and sign the document, their name, surname and contact details will be recorded in the participant information sheet.
6. The data collected in step 5 will be used to communicate with participants in order to set up a convenient time for them to physically meet with me so I can show them how the system works and allow them to use it for as long as they wish.
7. The data collected in step 5 will be used to communicate with participants in order to ask them to fill in the online questionnaire about their experience.

4.4.2 Delivery of Questionnaires and collection of results

The questionnaires were implemented in and delivered via Google Forms. This allows participants to access the surveys online and answer the questions in their own time, without any pressure. Google Forms also conveniently collects and processes the answers to survey questions, summarizing responses as pie charts or bar charts, which we use to summarize the results presented in section 5.5.

In the next chapter we present the results of the accuracy of the system, as well as user feedback after having used the system, from both an end user and a healthcare professional point of view.

CHAPTER 5. TESTING AND RESULTS

5.1 Introduction

In this chapter we will present how each major component of the proof of concept was tested and the results from these tests. The major components under test are as follows:

- The vein pattern capture system
- The RFID infrastructure
- The user experience and feedback

For each of these components, the following sections define our testing methodology and present the test results.

5.2 Vein Pattern Capture System

The vein pattern capture system was the main technical focus of the testing process

since it is entirely custom built without any reliance on 3rd party vendors and is the major

contribution of this project to the project sponsor. Both the hardware and the software of

the vein pattern capture system had to be tested, with the objective of maximising accu-

racy of patient detection.

5.2.1 Testing Method

During testing of the vein pattern system, both hardware and software variables needed to be changed so as to test the limits of the system. In order to replicate different real world scenarios, the testing for the vein pattern system was split into four scenarios, which resulted in four datasets.

- Data set 1: Participants had NIR photos taken of their wrists in optimal conditions. The proof of concept was placed in a darkened room, and the NIR LEDs were kept on at maximum brightness throughout the time that the photos were taken, allowing for the maximum amount of exposure possible. In addition, the participants were instructed to keep their hands very still when taking the photos.

- Data set 2: Participants had NIR photos taken of their wrists in near-optimal conditions. The proof of concept was placed in a darkened room, and the NIR LEDs were kept on throughout the time that the photos were taken, however at lower brightness, allowing for a medium amount of exposure. In addition, the participants were instructed to keep their hands still when taking the photos.

- Data set 3: Participants had the NIR photos taken in less favourable conditions. The room was well lit with fluorescent lighting, however the NIR LEDs were kept on throughout the time that the photos were taken, to allow for maximum exposure. The participants were instructed that they were able to move their wrists into different positions under the camera.

- Data set 4: Participants had the NIR photos taken in even less favourable conditions. The system was placed in a well lit room and participants were instructed that they were able to move their wrists as before. However, this time the NIR LEDs were programmatically switched on just before the photo was taken and switched off immediately after, allowing for only minimal NIR exposure. The idea behind this scenario was to decrease the power consumption of the system.

In each data set, four photos were taken from each participant, two from each wrist. We then employed leave-one-out cross-validation (Refaeilzadeh, Tang, & Liu, 2009) by designating one photo as a testing photo and the other three as training photos, and measuring the accuracy of the recognition system. The process was repeated four times so that every image was used as the testing photo once.
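The evaluation loop just described can be sketched as follows, with train_and_identify standing in for the prototype's actual train-then-classify step (the function names are illustrative):

```python
# Sketch of the leave-one-out evaluation described above: each of a
# participant's four photos takes a turn as the held-out test photo while
# the other three serve as training photos.
def loocv_accuracy(photos_by_patient, train_and_identify):
    """photos_by_patient: dict mapping patient id -> list of photos.
    train_and_identify(train, photo) -> predicted patient id."""
    correct = total = 0
    n_photos = len(next(iter(photos_by_patient.values())))
    for held_out in range(n_photos):
        # Hold out the same photo index for every patient this round
        train = {p: [ph for i, ph in enumerate(phs) if i != held_out]
                 for p, phs in photos_by_patient.items()}
        for p, phs in photos_by_patient.items():
            total += 1
            if train_and_identify(train, phs[held_out]) == p:
                correct += 1
    return correct / total  # fraction of correctly identified subjects
```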

In order to increase accuracy, all the pictures from the training dataset were used to optimize the parameters used by the sparse dictionary learning algorithm. After training and testing the system, its accuracy was measured as the percentage of correctly identified subjects with respect to the total number of tested subjects. In order to optimize the parameters, an evolutionary search pattern was employed:

- A single parameter in the algorithm is changed, and the resulting accuracy of the system is compared to the previously obtained accuracy. If an improvement is observed, the change is kept.
- Otherwise, the change is reset to its previous value and a new parameter is changed.
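A minimal sketch of this greedy, one-parameter-at-a-time search follows; evaluate stands in for "train the dictionary learner and measure accuracy on the held-out photos", and the candidate grids are illustrative assumptions:

```python
# Sketch of the greedy parameter search described above. evaluate() is a
# stand-in for the train-and-measure-accuracy step; grids are illustrative.
import random

def evolve(params, candidates, evaluate, rounds=20):
    """Mutate one parameter per round; keep a mutation only if it improves."""
    best = evaluate(params)
    for _ in range(rounds):
        name = random.choice(list(candidates))
        previous = params[name]
        params[name] = random.choice(candidates[name])  # mutate one parameter
        score = evaluate(params)
        if score > best:
            best = score             # improvement observed: keep the change
        else:
            params[name] = previous  # no improvement: revert the change
    return params, best
```

Changing only one parameter per round is what makes it possible to attribute any accuracy change to that specific mutation, as discussed below.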

Table 9 shows a sample of this process used to optimize several parameters used internally by the vein recognition system (Scikit Image, n.d.):

- n_components: number of components or dictionary elements with which to represent an image
- alpha: sparsity control
- n_iter: number of iterations to perform when learning the sparse dictionary
- batch_size: number of sample images to use in each iteration
- image_processing: any additional pre-processing steps taken on the training images (for example, cropping an image)

Table 9: Evolutionary Testing of Dictionary Learning Algorithm

Note that table 9 is a sample of the accuracy measurements and not the final result. Final accuracy measures are presented in Section 5.2.2.2. Several items in table 9 are highlighted in bold to denote which of the parameters were changed in order to optimize the algorithm. For example, in the second data row, "uncropped image" is highlighted in bold, indicating that this parameter was changed from its previous value (which was "cropped image"). In addition, the second row also has "yes" under "keep mutation?" highlighted in bold, meaning that the changed parameter was retained since it improved accuracy. In the third row, batch_size was changed from 3 to 10, so it is highlighted in bold, and it also improved accuracy, so "yes" is once again highlighted in bold. The process continues down the table until accuracy peaks. In each row, only one parameter is changed in order to make sure that we can effectively keep track of which changes result in better accuracy and which do not.

Once the parameters were finalised, the sparse dictionary learning algorithm was run on the training set and tested on the test set images. The following section analyses the overall accuracy of the system under different conditions.

5.3 Results

5.3.1 Sample NIR photos

In this section we present sample NIR photos from each dataset.

Figure 31 presents one sample image per dataset: Data Set 1 (dark room, maximum NIR exposure), Data Set 2 (dark room, medium NIR exposure), Data Set 3 (light room, maximum NIR exposure) and Data Set 4 (light room, minimum NIR exposure).

Figure 31: Set of figures showing the data set sample images

As can be seen from figure 31, the vein pattern can be best observed visually in the first dataset, with the smaller vein structures as well as the major veins being clearly visible.

5.3.2 Accuracy Results

Analysis of the accuracy of the system revealed that there were no significant changes in accuracy between the last three datasets; however, there are marked differences in accuracy between the first dataset and the others.

Figure 32: Average Accuracy on data sets

Figure 32 illustrates that the first dataset gave the best results, with an average of 91% accuracy. If we consider the potential population of users in a single hospital as being around 1200 (University Hospitals Birmingham NHS Foundation Trust, 2015), we can conclude that the accuracy of the system including the confidence interval is 91% ± 9.79% at a confidence level of 95%. This indicates that under proper conditions it is possible to accurately identify individuals via their wrist vein patterns. It is possible to further improve the accuracy by testing a variety of improvements, such as:

- Adding more NIR LEDs, such as the Bright Pi system (Pi Supply, n.d.)
- Opting for a more expensive but better performing NIR camera, such as those offered by E-Con Systems (E-Con Systems, n.d.)
- If the system is going to be used in a bright area, applying NIR photography filters to filter out visible light (Amazon, n.d.)
- Adding more guides for users to consistently present the same wrist patterns to the system, by using hardware such as a non-reflective plastic mould where they can place their hand.
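The confidence interval quoted above can be approximately reproduced with the standard margin-of-error formula for a proportion, with finite population correction. The sample size of 33 tested participants is an assumption for illustration:

```python
# Margin of error for a proportion, with finite population correction.
# n=33 (tested participants) is an assumption for illustration.
import math

def margin_of_error(p, n, population, z=1.96):
    """Margin of error, as a percentage, for proportion p at ~95% confidence."""
    se = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((population - n) / (population - 1))
    return z * se * fpc * 100

print(round(margin_of_error(0.91, 33, 1200), 2))  # prints a value close to the quoted 9.79
```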

5.4 RFID Infrastructure

The RFID infrastructure was limited due to budget, and consisted of:

- A single RFID reader from Impinj
- A single RFID antenna from Laird
- Generic RFID tags

5.4.1 Testing Method

Testing of the RFID infrastructure consisted of ensuring that the antennas could pick up the RFID tags at the expected range, given that a patient was wearing the tags on their person - such as on a bracelet, attached to their clothing, or on a lanyard. In addition, the communication between the readers and the system frontend was tested to ensure proper functionality.

5.4.2 Results

The functionality of the frontend worked as expected; however, the range of the generic RFID tags varied somewhat. The range of the RFID tags was highly dependent on the orientation of the user with respect to the antenna. If a clear line of sight was available, the RFID tags could be picked up at a range of approximately 4 meters. However, this range was reduced by 50% if the patient oriented their body in such a way as to have their body between the tag and the reader. This is to be expected, considering the attenuation a human body would introduce. Due to this, it is recommended to use specially designed RFID tags that have more power, often referred to within the industry as battery-assisted passive tags (CoreRFID, n.d.). In addition, care should be taken to place tags strategically (such as embedded in the front of a patient gown or embedded in a bracelet).

5.5 User Experience and Feedback

In this section we evaluate users' reactions to using the system. The section first explains how reactions were tested, followed by two sections presenting the results. The first results section focuses on end users, who represent patients. Their interaction with the system is mainly limited to the vein pattern capture system; therefore their responses are a measure of how well the hardware of the system works. The second results section focuses on the healthcare professionals who use the system frontend (the Web UI). Their interaction with the system is mostly centered on using the software of the system to track patient locations, and therefore their responses are a measure of how well the frontend of the system works.

5.5.1 Testing Method

In order to measure user reactions to the system, questionnaires were used. In the case of end-users, questionnaires were completed after the users had been asked to use the system in real-life scenarios. In the case of healthcare professionals, questionnaires were completed after the professionals were given a quick tutorial on how to use the system and then allowed to experiment with the front-end without supervision.

In the following sections we present summary statistics of the responses for each of the questions. We assume an end user population size of 7 billion, which is almost equal to the world's population. With respect to healthcare professionals, we assume a population size of 59,220,000, which is the latest count of worldwide healthcare workers provided by the WHO in 2006 (World Health Organization, 2006).

5.5.2 End User Results

In total, 33 participants responded to the questionnaire. Figures 33 to 35 present summary statistics for each question they were asked:

Figure 33: End-user reaction to "Did the system feel intrusive?"

Figure 34: End-user reaction to "Was it easy to understand how to use the system?"

Figure 35: End-user reaction to "How long did it take to use the system?"

Figure 33 shows that the majority of the users felt that the vein scanning system was not intrusive at all, with 93.9% ± 8.29% (at a 95% confidence level) of the respondents marking the minimum of 1 on an intrusiveness scale ranging from 1 to 5. The majority of the users also thought the system was very easy to understand (72.7% ± 15.44% at a 95% confidence level), as shown in Figure 34. It is worth noting that most users were unsure of how and where to place their hands when having their vein patterns captured, further reinforcing the suggestion made in Section 5.2.2.2 of incorporating some sort of handle or mould for users to grasp or place their hand on, reducing their queries and improving the accuracy of the system.

Finally, figure 35 shows that a very large percentage of users (97% ± 5.91% at a 95% confidence level) reported minimal time spent using the system - under 2 minutes. It is worth noting that in this case the users' responses include their experience of enrolling into the system. After enrollment and during day-to-day operations, it is expected that the overwhelming majority of users will report having to use the system for under a minute.

5.5.3 Healthcare Professional User Results

In total, 3 participants responded to the questionnaire; all were doctors or surgeons, as figure 36 shows.

Figure 36: Healthcare professionals survey results to describe their role

Figures 37 to 43 present summary statistics of each question they were asked:

Figure 37: Healthcare professionals' survey results rating the system's ease of use, from 1 (very difficult)
to 5 (very easy)

Figure 38: Healthcare professionals' survey results rating the system's disruption, from 1 (not disruptive)
to 5 (very disruptive)

Figure 39: Healthcare professionals' survey results rating the difficulty of identifying a patient before the
system was used, from 1 (difficult) to 5 (easy)

Figure 40: Healthcare professionals' survey results rating the difficulty of identifying a patient after the
system was used, from 1 (difficult) to 5 (easy)

Figure 41: Healthcare professionals' survey results rating the difficulty of locating a patient before the
system was used, from 1 (difficult) to 5 (easy)

Figure 42: Healthcare professionals' survey results rating the difficulty of locating a patient after the
system was used, from 1 (difficult) to 5 (easy)

Figure 43: Healthcare professionals' survey results rating the beneficial impact of the system, from 1 (no
impact) to 5 (large impact)

Figure 37 shows that the majority of the healthcare professionals (99.9% ± 11.26% with a 95%

confidence interval) found the proof of concept system very easy to use in general. Figures 39

and 40 show a marked improvement in the doctors' assessment of how easy it is to identify a

patient once the system is introduced. Figures 41 and 42 show the same improvement for

subsequently locating a patient. Most doctors also felt that the system would not disrupt their

workflow, as can be seen in Figure 38. The feedback from the doctors was lukewarm about how

beneficial the system would be to their daily work, as illustrated in Figure 43; however, they gave

valuable feedback about how to improve this:

- Integration with other medical systems would be extremely beneficial to them.

- Certain organizations would regard the system as more useful than others.
  Specifically, most doctors brought up the example of a psychiatric or mental
  hospital where patients should be allowed to move around, but under supervision.

- Certain healthcare roles would find the system much more useful than others.
  While doctors and surgeons might not find the system brings that much benefit,
  this is mostly because they are generally insulated from tasks like identifying
  and locating patients by nurses and admissions staff, who would in turn find
  this proof of concept system much more beneficial.

The results are very promising and encourage further development of the system,

considering that even this very basic proof of concept system received an excellent

reception from both patients and doctors.

In the next chapter, we present our conclusions to the thesis.

CHAPTER 6. CONCLUSIONS

In this project we built a working proof of concept system which successfully identified a

small sample of patients with an average of 91% accuracy. Subsequently these patients

were tracked across a medical facility using RFID technology. Surveys of both patients

and healthcare professionals using the system showed that both groups were very re-

ceptive to using the system and we believe that a product based on the proof of concept

presented here has the potential to alleviate the problem of medical errors due to patient

misidentification and mislocation.

6.1 Lessons Learned

This project touched on several challenging areas such as image processing, biometrics,

RFID, healthcare and patient safety, and the results are very encouraging, justifying ad-

ditional development of the proof of concept into a fully-fledged product. We can summa-

rize the lessons learned as follows:

- With respect to patient identification and biometrics, high identification accuracy
  (91% and above) can be achieved using very cheap (sub-$200 total cost)
  off-the-shelf components. However, the quality of the hardware used to take
  patient vein pattern photographs had a very large impact on the system's
  resulting accuracy. The larger the patient population that uses the system, the
  larger the investment in hardware should be; specifically:

  - The NIR camera should be the best one can afford.
  - Proper design of the vein capturing rig is essential; if possible, include
    hand guides or handles to make sure the patient's wrist is in relatively
    the same position when their veins are scanned.

- With respect to RFID, results were excellent; however, once again we see better
  results in terms of RFID read range and read rates as we invest in better tags.
  Simple paper RFID tags may be sufficient in some environments, but in busy
  areas such as hospitals it becomes necessary to invest in better tags, such as
  battery-assisted tags, to more reliably locate patients.

- With respect to user acceptance, both in terms of end users and healthcare
  professionals, the feedback regarding the proof of concept was very good. The
  proof of concept needs some more work in terms of user experience, with a more
  intuitive design and a more professional look, as well as further software
  development to integrate with already existing hospital systems. The more
  integration between components in a hospital environment, the better the rate of
  acceptance among hospital staff.

We believe the dissertation has proved that the original two hypotheses (reiterated below)

are indeed valid.

Hypothesis 1: The vein pattern biometrics significantly increases the ease and ac-

curacy of patient identification.

With high accuracy rates achieved even using basic hardware, it is easy to see that with

additional investment it is very possible to have vein pattern biometrics identify

individuals within a hospital's patient population, eliminating the problems with human

error and without the need for patients to carry any additional data such as ID cards.

Hypothesis 2: Biometric systems can be successfully integrated with existing

RFID solutions to track patients, providing an end-to-end identification and track-

ing platform for patient and carer safety

From a technical perspective, integrating biometrics with RFID solutions was not prob-

lematic, and it is indeed a viable proposition.

6.2 Applications

The proof of concept illustrates that the concepts used to build the system provide a solid

foundation for any scenarios where identification and tracking are required. In this disser-

tation, identification and tracking have been applied together to build a system that would

help positively identify and subsequently track patients as they move across a medical

facility. Other applications for identification and tracking are rather varied, especially if

one considers the two separately. Apart from the patient identification and tracking sce-

nario described in this dissertation, some healthcare applications where the system can

be used include:

- Pharmaceutical automatic dispensation: Vein pattern identification can be used
  not only to verify that a patient is who they say they are, but also to make sure
  they are present, helping to deter prescription fraud by medical identity theft.
  Prescription fraud has been known to affect over 60% of the US population
  (Imandoust, n.d.). Tracking can be used to track medicine location and alert on
  expiry dates, unauthorized movement and so on.

- Restricting physical access to authorized personnel only: In this application,
  vein pattern identification and RFID tracking work together to increase physical
  security. Not only does a user require a specific RFID tag to access a restricted
  area, but they also need to present the appropriate vein pattern.

- Securing sensitive remote access to hospital / clinic networks: The proposed
  system is cheap enough to be deployed at scale, and it is possible to build
  small, self-contained units that perform vein pattern recognition for a particular
  user. The user would need to present the appropriate vein pattern to this device,
  which in turn releases an authentication token, giving remote network access
  (such as VPN or web portal sign-in) to the user. This can be used in conjunction
  with more traditional security mechanisms such as passwords or hardware tokens
  generating a random PIN.

6.3 Limitations

The proof of concept also highlighted several issues with the proposed system:

- With respect to the vein identification system, the positioning of the wrist was
  highly influential on the resulting accuracy. If users are not given proper
  guidance on exactly where to place their hands for their wrists to be scanned,
  the accuracy drops dramatically. This need for control over positioning may be
  an issue in some environments where users are actively trying to avoid
  identification.

- There is no "one size fits all" when dealing with RFID tags. For example, an
  RFID tag which works well in one environment, such as tracking a patient in a
  normal room, will not work in another environment, such as tracking a patient
  lying on a metal bed (since the metal interferes with RF signals). Careful
  consideration must be given to where and how the system is going to be used in
  order to recommend the correct type of RFID tags.

- There is currently still some manual intervention required in the system when
  pairing patients to RFID tags (i.e. even though a patient is identified
  automatically, an operator would still need to subsequently manually enter an
  RFID code to associate this with the patient).

- From a user interface perspective, the system still lacks many features that a
  production system would require, such as:

  - Enhanced search and auditing capabilities
  - Robust authentication and authorization capabilities
  - Exposing an API to allow for integration with other hospital systems
  - Addition of visual aids such as maps to help in tracking patient movements
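The manual tag-pairing step could be streamlined with a small amount of glue logic. The sketch below is hypothetical (the reader callback and identifiers are illustrative, not part of the proof of concept) and shows how the next tag read after a successful identification could be associated automatically:

```python
class PairingSession:
    """Pairs the most recently identified patient with the next RFID tag
    presented to a desktop reader, avoiding manual EPC entry."""

    def __init__(self):
        self.pending_patient = None
        self.pairings = {}  # tag EPC -> patient id

    def patient_identified(self, patient_id: str) -> None:
        # Called once the vein-recognition step returns a match.
        self.pending_patient = patient_id

    def tag_read(self, epc: str) -> bool:
        # Called by the desktop reader for each tag report.
        if self.pending_patient is None:
            return False  # no identification in progress; ignore stray reads
        self.pairings[epc] = self.pending_patient
        self.pending_patient = None
        return True
```

A real deployment would wire `tag_read` to the reader vendor's event API and persist the pairing to the patient database rather than an in-memory dictionary.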

6.4 Recommendations & Prospects for Future Research / Work

The project opens up several avenues of improvement and future research, as outlined

below:

- Hardware is continuously improving. During the writing of this dissertation, the
  Raspberry Pi Foundation released an improved NIR camera featuring better
  resolution (Raspberry Pi, n.d.). One immediate improvement would be to test the
  system using the new camera and compare it to the old one. Similarly, other
  camera improvements could be made, such as investing in more specialized (and
  expensive) NIR cameras.

- As mentioned in the previous section, currently an operator needs to manually
  input an RFID code to represent an identified patient. The system could reduce
  human error by instead prompting the operator to place the RFID tag over a
  desktop RFID tag reader and automatically entering the RFID code for the
  identified patient.

- With respect to the vein pattern recognition system, we identify the following
  areas of improvement:

  - Further testing and improvement of image enhancing techniques (as presented
    in Section 4.3.1.1.1) that would make vein patterns more visible and
    consistent even under different lighting conditions
  - Testing of multiple image feature sets. The project in its current form uses
    image features extracted from sparse coding. Other image feature extraction
    algorithms such as ORB (Rublee et al., 2011) could be combined with the
    sparse coding features to create a richer and more stable feature set with
    which to recognize the vein patterns
  - Testing of various classification techniques. The project in its current form
    uses Euclidean distance as a basis for matching vein patterns to patients. It
    would be interesting to see how other classification algorithms
    such as SVM would perform.
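The Euclidean-distance matching rule described above can be sketched in a few lines; feature extraction is omitted, and the rejection threshold is an illustrative addition rather than part of the current system:

```python
from math import dist  # Euclidean distance, Python 3.8+

def identify(sample, templates, max_distance=None):
    """Return the patient id whose stored sparse-code feature vector is
    closest (in Euclidean distance) to the sample vector, or None if no
    enrolled template is within max_distance."""
    best = min(templates, key=lambda pid: dist(sample, templates[pid]))
    if max_distance is not None and dist(sample, templates[best]) > max_distance:
        return None  # reject: treat as an unenrolled wrist
    return best

# templates maps patient ids to enrolled feature vectors, e.g.
# {"patient-1": (0.2, 0.0, 0.7, ...), "patient-2": (0.9, 0.1, 0.0, ...)}
```

Swapping this rule for a trained classifier (e.g. an SVM over the same feature vectors) would be a drop-in change, which is what makes the comparison suggested above straightforward to run.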

REFERENCES CITED

Aboalsamh, H.A., Alhashimi, H.T. and Mathkour, H.I., 2012, January. Applying Recent Vein Im-

age Enhancement Techniques In Vain Biometrics. In Proceedings of the International Con-

ference on Image Processing, Computer Vision, and Pattern Recognition (IPCV) (p. 1). The

Steering Committee of The World Congress in Computer Science, Computer Engineering

and Applied Computing (WorldComp).

Abdi, H. and Williams, L.J., 2010. Principal component analysis. Wiley Interdisciplinary Re-

views: Computational Statistics, 2(4), pp.433-459.

Amazon, n.d. Hoya 52mm RM72 Infrared Filter [online] <Available from:

https://www.amazon.com/Hoya-52mm-RM72-Infrared-Filter/dp/B0000AI1FZ> (Accessed

October 2016)

Badawi, A.M., 2006. Hand Vein Biometric Verification Prototype: A Testing Performance and

Patterns Similarity. IPCV, 14, pp.3-9.

Betances, R.I.G. and Huerta, M.K., 2012. A review of automatic patient identification options for

public health care centers with restricted budgets. Online Journal of Public Health Informat-

ics, 4(1).

Brightek, n.d. N0F14S89 datasheet, [online] <Available from:

http://www.brightekeurope.com/productcart/pc/catalog/N0F14S89.pdf> (Accessed August

2016)

Cardinal, D. 2013. How to turn your DSLR into a full spectrum super camera, ExtremeTech,

[online] <Available from: http://www.extremetech.com/electronics/144388-how-to-turn-your-

dslr-into-a-full-spectrum-super-camera> (Accessed June 2016)

Chassin, M.R. 2002 The wrong patient, Annals of Internal Medicine, 136(11), p. 826. doi:

10.7326/0003-4819-136-11-200206040-00012.

Chen, Y.L., 2009. Data Flow Diagram. In Modeling and Analysis of Enterprise and Information

Systems (pp. 85-97). Springer Berlin Heidelberg.

CoreRFID, n.d. Battery Assisted Passive Tags, [online] <Available from:

http://www.corerfid.com/rfid-technology/rfid-tracking/battery-assisted-passive-tags/> (Ac-

cessed November 2016)

Collinson, P. 2014. Forget fingerprints - banks are starting to use vein patterns for ATMs, The

Guardian, UK, [online] <Available from:

http://www.theguardian.com/money/2014/may/14/fingerprints-vein-pattern-scan-atm> (Ac-

cessed May 2016)

E-Con Systems, n.d. See3CAM_12CUNIR - 1.3 MP Monochrome USB NIR Camera [online]

<Available from: https://www.e-consystems.com/1MP-USB3-Near-IR-Camera.asp#Key-

Features> (Accessed October 2016)

ECRI Institute, 2016. Top 10 Patient Safety Concerns for Healthcare Organizations, Executive

Brief. ECRI Institute [online] <Available from:

https://www.ecri.org/EmailResources/PSRQ/Top10/2016_Top10_ExecutiveBrief_final.pdf>

Gompertz, S. 2014. Bank customers to sign in with 'finger vein' technology, BBC [online]

<Available from: http://www.bbc.com/news/business-29062901> (Accessed May 2016)

Faragher, R. and Harle, R., 2014. An analysis of the accuracy of bluetooth low energy for in-

door positioning applications. In Proceedings of the 27th International Technical Meeting of

the Satellite Division of the Institute of Navigation (ION GNSS 14).

Farmer, B. 2011. Daniel Pearl was beheaded by 9/11 mastermind , The Telegraph, UK, [on-

line] <Available from:

http://www.telegraph.co.uk/news/worldnews/asia/afghanistan/8271845/Daniel-Pearl-was-

beheaded-by-911-mastermind.html> (Accessed May 2016)

Fatima, A 2011, 'E-Banking Security Issues -- Is There A Solution in Biometrics?', Journal Of In-

ternet Banking & Commerce, 16, 2, pp. 1-9, Business Source Complete, EBSCOhost,

viewed 6 May 2016.

Galbally-Herrero, J, Fierrez-Aguilar, J, Rodriguez-Gonzalez, J, Alonso-Fernandez, F, Ortega-

Garcia, J, & Tapiador, M 2006, 'On the vulnerability of fingerprint verification systems to

fake fingerprint attacks', Conference On Crime Countermeasures And Security. Proceed-

ings, p. 130, SwePub, EBSCOhost, viewed 6 May 2016.

Gayathri, S., Nigel, K.G.J. and Prabakar, S., 2013. Low cost hand vein authentication system

on embedded linux platform. Int J Innovative Technol Exploring Eng, 2(4), pp.138-141.

Grinberg, M., 2014. Flask Web Development: Developing Web Applications with Python.

O'Reilly Media, Inc.

Haralick, R.M., Sternberg, S.R. and Zhuang, X., 1987. Image analysis using mathematical mor-

phology. Pattern Analysis and Machine Intelligence, IEEE Transactions on, (4), pp.532-550.

Hashimoto, J., 2006, June. Finger vein authentication technology and its future. In VLSI Cir-

cuits, 2006. Digest of Technical Papers. 2006 Symposium on (pp. 5-8). IEEE.

Howanitz, P.J., Renner, S.W. and Walsh, M.K., 2002. Continuous wristband monitoring over 2

years decreases identification errors: a College of American Pathologists Q-Tracks Study.

Archives of pathology & laboratory medicine, 126(7), pp.809-815.

Hunt, A. and Thomas, D., 2004. OO in one sentence: keep it DRY, shy, and tell the other guy.

Software, IEEE, 21(3), pp.101-103.

Imandoust, S. n.d. Prescription Fraud Resulting From Identity Theft, Identity Theft Resource

Centre, [online] <Available from: http://www.idtheftcenter.org/Identity-Theft/prescription-

fraud-resulting-from-identity-theft.html> (Accessed November 2016)

Impinj, n.d. [online] <Available from: http://www.impinj.com/> (Accessed June 2016)

Impinj, n.d. Speedway Revolution, [online] <Available from:

http://www.impinj.com/products/readers/speedway-revolution/> (Accessed June 2016)

Impinj, n.d. Octane SDK, [online] <Available from: https://support.impinj.com/hc/en-

us/articles/202755268-Octane-SDK> (Accessed June 2016)

Lahtela, A., Hassinen, M. and Jylha, V., 2008, January. RFID and NFC in healthcare: Safety of

hospitals medication care. In Pervasive Computing Technologies for Healthcare, 2008. Per-

vasiveHealth 2008. Second International Conference on (pp. 241-244). IEEE.

Laird, n.d. S8658WPR, Laird, [online] <Available from:

http://www.lairdtech.com/products/s8658wpr> (Accessed June 2016)

Lee, H., Battle, A., Raina, R. and Ng, A.Y., 2006. Efficient sparse coding algorithms. In Ad-

vances in neural information processing systems (pp. 801-808).

Lee, H.C., Kang, B.J., Lee, E.C. and Park, K.R., 2010. Finger vein recognition using weighted

local binary pattern code based on a support vector machine. Journal of Zhejiang Univer-

sity SCIENCE C, 11(7), pp.514-524.

Lugovaya T.S. 2005. Biometric human identification based on electrocardiogram. [Master's the-

sis] Faculty of Computing Technologies and Informatics, Electrotechnical University "LETI",

Saint-Petersburg, Russian Federation.

Jain, A. and Jain, A.K. (2002) Biometrics: Personal identification in Networked society. Edited

by Ruud Bolle and Sharath Pankanti. Cleveland: Kluwer Academic Publishers.

Jacobson, I., Booch, G., Rumbaugh, J., Rumbaugh, J. and Booch, G., 1999. The unified soft-

ware development process (Vol. 1). Reading: Addison-wesley.

Joardar, S., Chatterjee, A. and Rakshit, A. (2015) A real-time palm Dorsa Subcutaneous vein

pattern recognition system using collaborative representation-based classification, IEEE

Transactions on Instrumentation and Measurement, 64(4), pp. 959-966. doi:

10.1109/tim.2014.2374713.

Kahn, C.M. and Roberds, W. (2008) Credit and identity theft, Journal of Monetary Economics,

55(2), pp. 251-264. doi: 10.1016/j.jmoneco.2007.08.001.

Kallender, P. 2004. Japanese banks choose vein-recognition security system, ComputerWorld,

[online] <Available from:

http://www.computerworld.com/article/2566919/security0/japanese-banks-choose-vein-

recognition-security-system.html> (Accessed May 2016)

Kim, J., Kim, B.S. and Savarese, S., 2012. Comparing image classification methods: K-

nearest-neighbor and support-vector-machines. Ann Arbor, 1001, pp.48109-2122.

Krishna, P. and Husak, D., 2007. RFID infrastructure. IEEE Communications Magazine, 45(9),

p.4.

Kocer, H.E., Tutumlu, H. and Allahverdi, N., 2012. An Efficient Hand Dorsal Vein Recognition

Based on Neural Networks. Journal of Selcuk University Natural and Applied Science, 1(3),

pp.pp-28.

Mairal, J., Bach, F., Ponce, J. and Sapiro, G., 2009, June. Online dictionary learning for sparse

coding. In Proceedings of the 26th annual international conference on machine learning

(pp. 689-696). ACM.

Makary, M.A. and Daniel, M. (2016) Medical error - the third leading cause of death in the US,

BMJ, p. i2139. doi: 10.1136/bmj.i2139.

Mordini, E. and Ottolini, C., 2007. Body identification, biometrics and medicine: ethical and so-

cial considerations. ANNALI-ISTITUTO SUPERIORE DI SANITA, 43(1), p.51.

Murphy, M.F. and Kay, J.D.S., 2004. Patient identification: problems and potential solutions.

Vox sanguinis, 87(s2), pp.197-202.

Nadort, A., 2007. The hand vein pattern used as a biometric feature. Master Literature Thesis

of Medical Natural Sciences at the Free University, Amsterdam.

Nasrollahi, K., Haque, M.A., Irani, R. and Moeslund, T.B., 2016. Contact-Free Heartbeat Signal

for Human Identification and Forensics. In Handbook of Biometrics for Forensic Science.

Springer.

Newman, M.W. and Landay, J.A., 2000, August. Sitemaps, storyboards, and specifications: a

sketch of Web site design practice. In Proceedings of the 3rd conference on Designing in-

teractive systems: processes, practices, methods, and techniques (pp. 263-274). ACM.

NHS England, 2014. Improving medication error incident reporting and learning, [online]

<Available from: https://www.england.nhs.uk/wp-content/uploads/2014/03/psa-sup-info-

med-error.pdf> (Accessed November 2016)

Ng, A. 2010. ECCV10 Tutorial - Image Classification By Sparse Coding. Presentation, Univer-

sity of Stanford, California.

Odinaka, I., Lai, P.H., Kaplan, A.D., O'Sullivan, J.A., Sirevaag, E.J. and Rohrbaugh, J.W., 2012.

ECG biometric recognition: A comparative analysis. Information Forensics and Security,

IEEE Transactions on, 7(6), pp.1812-1824.

Paranjape, R.B., Mahovsky, J., Benedicenti, L. and Koles, Z., 2001. The electroencephalogram

as a biometric. In Electrical and Computer Engineering, 2001. Canadian Conference on

(Vol. 2, pp. 1363-1366). IEEE.

Pasfield, G., 1991. Color care coded patient identification system. U.S. Patent 5,026,084.

Patently Mobile, 2016. Samsung invents a new User ID System for Smartwatches using Hand

Vein Patterns, [online] <Available from: http://www.patentlymobile.com/2016/02/samsung-

invents-a-new-user-id-system-for-smartwatches-using-hand-vein-patterns.html> (Accessed

May 2016)

Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Pret-

tenhofer, P., Weiss, R., Dubourg, V. and Vanderplas, J., 2011. Scikit-learn: Machine learn-

ing in Python. The Journal of Machine Learning Research, 12, pp.2825-2830.

Pi Supply, n.d. Bright Pi Bright White and IR Camera Light for Raspberry Pi [online] <Avail-

able from: https://www.pi-supply.com/product/bright-pi-bright-white-ir-camera-light-

raspberry-pi/> (Accessed October 2016)

Prabhakar, S., Pankanti, S. and Jain, A.K., 2003. Biometric recognition: Security and privacy

concerns. IEEE Security & Privacy, (2), pp.33-42.

Probs, M. and Branzell, R. (2016) CHIME initiatives advance patient safety - healthcare IT -

CHIME. Available at: https://chimecentral.org/chime-initiatives-advance-patient-safety/ (Ac-

cessed: 5 May 2016).

Probst, C.A., Wolf, L., Bollini, M. and Xiao, Y., 2016. Human factors engineering approaches to

patient identification armband design. Applied Ergonomics, 52, pp.1-7.

Prokoski, F., 2000. History, current status, and future of infrared identification. In Computer Vi-

sion Beyond the Visible Spectrum: Methods and Applications, 2000. Proceedings. IEEE

Workshop on (pp. 5-14). IEEE.

Qian, G., Sural, S., Gu, Y. and Pramanik, S., 2004, March. Similarity between Euclidean and

cosine angle distance for nearest neighbor queries. In Proceedings of the 2004 ACM sym-

posium on Applied computing (pp. 1232-1237). ACM.

Raspberry Pi, n.d. Raspberry Pi 2 Model B, [online] <Available from:

https://www.raspberrypi.org/products/raspberry-pi-2-model-b/> (Accessed September

2016)

Raspberry Pi, n.d. PI NOIR CAMERA, [online] <Available from:

https://www.raspberrypi.org/products/pi-noir-camera/> (Accessed June 2016)

Raspberry Pi, n.d. Camera Module v2, [online] <Available from:

https://www.raspberrypi.org/products/camera-module-v2/> (Accessed November 2016)

Redis, n.d. [online] <Available from: http://redis.io/> (Accessed June 2016)

Refaeilzadeh, P., Tang, L. and Liu, H., 2009. Cross-validation. In Encyclopedia of database sys-

tems (pp. 532-538). Springer US.

Right Patient, n.d. [online] <Available from: http://www.rightpatient.com/> (Accessed May 2016)

Roesner, F., Kohno, T. and Wetherall, D., 2012. Detecting and defending against third-party

tracking on the web. In Proceedings of the 9th USENIX conference on Networked Systems

Design and Implementation (pp. 12-12). USENIX Association.

Rosenthal, M. M., 2003. Check the Wristband, Patient Safety Network, [online] <Available

from: https://psnet.ahrq.gov/webmm/case/22#references> (Accessed November 2016)

Rublee, E., Rabaud, V., Konolige, K. and Bradski, G., 2011, November. ORB: An efficient alter-

native to SIFT or SURF. In 2011 International conference on computer vision (pp. 2564-

2571). IEEE.

Sahu, A.P. and Bharathi, H.N., 2015. Veins based Authentication System. International Journal

of Computer Applications, 120(20).

Sandelowski, M., 2004. Using qualitative research. Qualitative Health Research, 14(10),

pp.1366-1386.

Sandelowski, M., Barroso, J. and Voils, C.I., 2007. Using qualitative metasummary to synthe-

size qualitative and quantitative descriptive findings. Research in nursing & health, 30(1),

pp.99-111.

Schulmeister, L. (2008) Patient Misidentification in oncology care, Clinical Journal of Oncology

Nursing, 12(3), pp. 495-498. doi: 10.1188/08.cjon.495-498.

Scikit Image, n.d. Morphological Filtering, [online] <Available from: http://scikit-

image.org/docs/dev/auto_examples/applications/plot_morphology.html> (Accessed June

2016)

Scikit Image, n.d. Histogram Equalization , [online] <Available from: http://scikit-

image.org/docs/dev/auto_examples/plot_equalize.html> (Accessed June 2016)

Scikit Learn, n.d. MiniBatchDictionaryLearning, [online] <Available from: http://scikit-

learn.org/stable/modules/generated/sklearn.decomposition.MiniBatchDictionaryLearning.ht

ml> (Accessed October 2016)

Scikit Learn, (2014) Strategies to scale computationally: bigger data, Scikit Learn Documenta-

tion, [online] <Available from: http://scikit-learn.org/stable/modules/scaling_strategies.html>

(Accessed May 2016)

Soni, M., Gupta, S., Rao, M.S. and Gupta, P., 2010. A new vein pattern-based verification sys-

tem. International Journal of computer science and information security, 8(1), pp.58-63.

Suarez Pascual, J.E., Uriarte-Antonio, J., Sanchez-Reillo, R. and Lorenz, M.G., 2010, October.

Capturing hand or wrist vein images for biometric authentication using low-cost devices. In

Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), 2010 Sixth In-

ternational Conference on (pp. 318-322). IEEE.

Sufi, F., Khalil, I. and Mahmood, A., 2011. Compressed ECG biometric: a fast, secured and effi-

cient method for identification of CVD patient. Journal of medical systems, 35(6), pp.1349-

1358.

Sutcu, Y., Sencar, H.T. and Memon, N., 2005, August. A secure biometric authentication

scheme based on robust hashing. In Proceedings of the 7th workshop on Multimedia and

security (pp. 111-116). ACM.

Thomas, P. & Evans, C., 2004. An identity crisis? Aspects of patient misidentification. AVMA
Medical & Legal Journal, 10(1), pp.18-22.
University Hospitals Birmingham NHS Foundation Trust, 2015. About Us [online] <Available
from: http://www.uhb.nhs.uk/about-us.htm> (Accessed November 2016)
Van Der Walt, S., Schönberger, J.L., Nunez-Iglesias, J., Boulogne, F., Warner, J.D., Yager, N.,
Gouillart, E. and Yu, T., 2014. scikit-image: image processing in Python. PeerJ, 2, p.e453.
Van Rossum, G., 2007, June. Python Programming Language. In USENIX Annual Technical
Conference (Vol. 41).
Vassallo, D. (2016a) Indoor GPS demo - powered by angular, pouchdb and ble beacons. Avail-
able at: https://www.youtube.com/watch?v=7rt9hTj26ak (Accessed: 12 May 2016).
Vassallo, D. (2016b) RFID powered indoor GPS. Available at: https://www.youtube.com/watch?
v=EzPnv8N_cYA (Accessed: 12 May 2016).
Vipul, A.M. and Sonpatki, P., 2016. ReactJS by Example-Building Modern Web Applications
with React. Packt Publishing Ltd.
Yalavarthy, P.K., Nundy, K.K. and Sanyal, S., 2009. Integrable Vein Viewing System in Hand
Held Devices, Indian Institute Of Science, Bangalore
Yang, Y., Zhang, J. and Kisiel, B., 2003, July. A scalability analysis of classifiers in text catego-
rization. In Proceedings of the 26th annual international ACM SIGIR conference on Re-
search and development in information retrieval (pp. 96-103). ACM.
Yao, W., Chu, C.H. and Li, Z., 2010, June. The use of RFID in healthcare: Benefits and barri-
ers. In RFID-Technology and Applications (RFID-TA), 2010 IEEE International Conference
on (pp. 128-134). IEEE.
Wang, L., Leedham, G. and Cho, S.. (2007) Infrared imaging of hand vein patterns for biomet-
ric purposes, IET Computer Vision, 1(3), pp. 113-122. doi: 10.1049/iet-cvi:20070009.
Want, R., 2006. An introduction to RFID technology. Pervasive Computing, IEEE, 5(1), pp.25-
33.
Watanabe, M., Endoh, T., Shiohara, M. and Sasaki, S., 2005, September. Palm vein authentica-
tion technology and its applications. In Proceedings of the biometric consortium conference
(pp. 19-21).
Weiss, B., 2006, July. Fast median and bilateral filtering. In Acm Transactions on Graphics
(TOG) (Vol. 25, No. 3, pp. 519-526). ACM.
Weingart, S.N., Wilson, R.M., Gibberd, R.W. and Harrison, B., 2000. Epidemiology of medical
error. Western Journal of Medicine, 172(6), p.390.

Wilson, C. (2011) Vein pattern recognition: A privacy-enhancing Biometric. United States: CRC
Press.
World Health Organization, 2006 The World Health Report 2006, Chapter 1, Table 1.1, [online]
<Available from: http://www.who.int/whr/2006/06_chap1_en.pdf> (Accessed November 2016)
Zelkowitz, M.V. and Wallace, D.R., 1998. Experimental models for validating technology. Com-
puter, 31(5), pp.23-31.
Zimmerman, J.B., Pizer, S.M., Staab, E.V., Perry, J.R., McCartney, W. and Brenton, B.C., 1988.
An evaluation of the effectiveness of adaptive histogram equalization for contrast enhance-
ment. Medical Imaging, IEEE Transactions on, 7(4), pp.304-312.

APPENDICES

APPENDIX A. DS PROPOSAL

Student's Name: David Vassallo


Student's Number: H00037626
Student's Email Address: david.vassallo@my.ohecampus.com

Project Title:
BioRFID: A Patient Identification System using Biometrics and RFID

Proposal Submission Date: April 2016

Version Number of the Proposal: 0.3

DA Class ID: UKL1.CKIT.702.H00023870

Name of DA: Lalit Garg

RMT Class ID: LAUR-906, March 2016

Name of GDI: Lalit Garg

Ethical Checklist Completed: Yes

Name of SSM:

The Programme: MSc in Computer Security

Domain: CKIT-511: Security Engineering

Proposal approved by: Lalit Garg

Date of the approval: July 2016

Approval confirmed in MiTSA by the Lead Faculty (Dissertation): (To be completed by the Lead
Faculty)

Sponsor's Details:
6PM LTD, 6PM Business Center, Triq it-Torri, Swatar, B'Kara BKR 4012, Malta, Europe

Sponsor's Background:
Healthcare IT provider with an interest in providing affordable IT healthcare solutions. Primary market is
the UK's NHS.

Sponsor's Agreement:
Yes, agreement to be posted pending some legal clarifications that have been submitted to Laureate Lens
already.

The Project Aims and Objectives:


Patient misidentification is a widely reported problem in medical literature. For example, the National
Patient Safety Agency identified this problem as a significant risk in the NHS (Thomas & Evans, 2004).
The proposed project aims to help alleviate the problem of patient misidentification in healthcare facili-
ties. To this end, we stipulate two hypotheses:

Hypothesis 1: The vein pattern biometrics significantly increases the ease and accuracy of patient
identification.

Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solutions to
track patients, providing an end-to-end identification and tracking platform for patient and carer
safety

The project attempts to verify the above two hypotheses and build a system that will serve as a proof-of-
concept that showcases a fully functional patient identification and tracking system, including both hard-
ware and software system components. Current solutions deal with each problem separately.
RFID tracking systems are quite mature and well-established, especially in the retail sector. Biometrics is
also quickly maturing, especially with the introduction of fingerprint, voice and face recognition being
incorporated into smartphones. However, the two fields have not yet been explored in conjunction. Solu-
tions based solely on RFID still misidentify the patient and cannot guarantee the presence of a patient.
On the other hand solutions based solely on biometrics provide identification but not tracking. In addi -
tion, the previously mentioned biometric systems (fingerprint, voice recognition, face recognition) are
not particularly suited for a hospital environment since most patients might have physical or mental con-
ditions that render such biometrics ineffective. The proposed solution investigates the use of vein biomet-
rics to overcome these problems, in conjunction with RFID to provide both identification and tracking.

In the table below, please state your hypothesis or hypotheses; the research methods you will use to
guide the development of your IT artefact; the kind of IT artefact you will produce; and the means by
which you will evaluate the IT artefact in the light of the hypothesis.

Step | Short Description

Problem: Patient misidentification [PmID] is a problem in the world's hospitals. It accounts for about 200,000 patient deaths every year in the US alone and costs the UK's NHS a projected £466 million a year. While there has been much research into identifying patients using ID technology such as RFID, the majority of PmID occurs during the patient identification phase (i.e. determining which ID number a patient is associated with).

Hypothesis:
Hypothesis 1: Vein pattern biometrics significantly increase the ease and accuracy of patient identification, assigning patients the correct ID/RFID number.
Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solutions to track patients, providing an end-to-end identification and tracking platform for patient and carer safety.

Research Methods: The dissertation will be based mainly around quantitative methods. Statistical methods will be used to determine the accuracy of the proposed vein pattern biometric system, and a working proof of concept will illustrate how the system would work in real scenarios. However, the evaluation will be based on qualitative methods, where users and experts will give subjective opinions on the effectiveness of the proposed system.

IT Artefact: A working prototype, including both hardware and software, as the proof-of-concept system. The prototype will need to demonstrate:
Identifying a patient using biometrics with a high degree of confidence
- Hardware: near-IR camera to capture vein patterns
- Software: image processing and data mining techniques to match the captured vein patterns with known patterns; web portal to show results
Assigning the identified patient an ID number
- Hardware: RFID tags
- Software: web portal to enroll a patient (i.e. bind their biometric identity to the RFID tag ID)
Subsequently tracking the assigned ID number using an RFID system
- Hardware: RFID readers
- Software: server to process RFID reader output
Using the biometric system to confirm a patient's ID number
- Software: web portal that prompts the patient to scan their vein patterns and shows the carer their RFID number and last known location, hence confirming a patient's identity via the use of biometrics

Evaluation: The evaluation of the proof-of-concept deliverable will be based around both expert and end-user reviews, based on user-centric questionnaires. Two groups will be asked for feedback regarding how effective the system is, effectively providing a complete user satisfaction survey:
- Carers and hospital staff (experts)
- Patients (end-users)
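The vein-matching step listed under "IT Artefact" above (matching captured patterns with known patterns) can be sketched as a nearest-template search over sparse codes, using Euclidean distance to enrolled template vectors. The patient labels, vectors, and dimensionality below are illustrative placeholders, not values from the actual system.

```python
import numpy as np

def identify_patient(query_code, templates):
    """Return the label of the enrolled patient whose template sparse
    vector is closest (Euclidean distance) to the query sparse vector.
    `templates` maps a patient label to a list of template vectors."""
    best_label, best_dist = None, float("inf")
    for label, vectors in templates.items():
        for template in vectors:
            dist = np.linalg.norm(query_code - template)
            if dist < best_dist:
                best_label, best_dist = label, dist
    return best_label, best_dist

# Hypothetical 4-dimensional sparse codes for two enrolled patients
templates = {
    "patientA": [np.array([1.0, 0.0, 0.5, 0.0])],
    "patientB": [np.array([0.0, 1.0, 0.0, 0.7])],
}
query = np.array([0.9, 0.1, 0.4, 0.0])  # closest to patientA's template
label, dist = identify_patient(query, templates)
```

In the real prototype the query and template vectors come from the sparse-coding dictionary (see Appendix C); the lookup itself reduces to this nearest-neighbour search.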

Project Outline

The project consists of two broad categories of tasks: those relating to the hardware of the proposed solution, and those relating to the software. Each of these categories can be further subdivided into RFID and biometric components, as shown in Figure 44 below.

Figure 44: BioRFID Sections

Each section consists of several steps, which at a high level can be summarized as follows:
Hardware (RFID): Obtain, configure and test RFID tags, antennas and readers.
Hardware (Biometrics): Obtain components to build a near infrared camera rig that will take pictures of a subject's wrist area.
Software (RFID): Write code to receive and parse RFID data from RFID readers and translate that to a physical location.
Software (Biometrics): Write image enhancing code to extract vein patterns, and machine learning algorithms to identify which individual the vein pattern belongs to.
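As an illustration of the "Software (RFID)" step, the sketch below parses a single reader event and translates it to a physical location. The event format loosely follows the reader-output format noted in the code listing (Appendix C); the IP address, tag ID, and location map are hypothetical.

```python
def parse_tag_event(line):
    """Parse one reader event of the (assumed) form:
    '<reader_ip>,event.tag.arrive tag_id=<id>,antenna=<n>,rssi=<db>'"""
    reader_ip, rest = line.split(",", 1)
    event, _, kv = rest.partition(" ")
    fields = dict(p.split("=", 1) for p in kv.split(",") if "=" in p)
    return reader_ip, fields["tag_id"], int(fields["antenna"])

# (reader_ip, antenna) -> named location, configured per deployment
LOCATIONS = {("10.0.0.5", 1): "Ward 1", ("10.0.0.5", 2): "X-Ray"}

reader, tag, antenna = parse_tag_event(
    "10.0.0.5,event.tag.arrive tag_id=E2003412,antenna=2,rssi=-54")
location = LOCATIONS[(reader, antenna)]
```

In the prototype this mapping is kept in the `rfidAntennas` database table rather than an in-memory dictionary, but the translation logic is the same.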

Quantitative, statistics-based methods will be used to test the accuracy of the vein pattern matching algorithms. To evaluate the resulting proof of concept, we will then proceed to qualitative methods, such as issuing questionnaires to both end users (patients) and expert users (healthcare workers), to evaluate whether the system helps reduce identification errors, is easy to use, and helps in day-to-day tasks.

Literature Survey / Resources List:

The problem being tackled: patient misidentification


Schulmeister, L., 2008. Patient misidentification in oncology care. Clinical Journal of Oncology Nursing, 12(3), p.495.
This paper showcases some real-world problems that occurred when patients were misidentified, including some patient deaths. The author indicates that misidentification can occur at any stage of patient care, but proper identification starts with proper patient registration.

ECRI Institute, 2016. Top 10 Patient Safety Concerns for Healthcare Organizations, Executive
Brief. ECRI Institute [online] <Available from:
https://www.ecri.org/EmailResources/PSRQ/Top10/2016_Top10_ExecutiveBrief_final.pdf>
This executive brief comes from the highly respected ECRI Institute, which deals with patient safety. Coming in second place on the list, patient misidentification is acknowledged to be a very real risk in today's healthcare environments.

Chassin, M.R. and Becher, E.C., 2002. The wrong patient. Annals of Internal Medicine, 136(11),
pp.826-833.
This paper argues that patient misidentification is under-reported and that medical literature does not adequately discuss the problem. The authors also point out that the most remediable problem in patient misidentification is the absence of protocols and procedures for patient identification.

Registering & Identifying patients using vein pattern biometrics


Nadort, A., 2007. The hand vein pattern used as a biometric feature. Master Literature Thesis of
Medical Natural Sciences at the Free University, Amsterdam.
This thesis goes into an in-depth discussion as to why vein patterns are a suitable feature that can be used
to identify people definitively. Of particular note is the section in which the author discusses the non-
variability of vein patterns, and under which medical conditions this may change. The thesis describes
how to capture vein pattern images, which features can be used in identification, and how a malicious ac-
tor may attempt to subvert the process. It concludes that vein biometrics are a suitable way of identifying
subjects for high security applications.

Aboalsamh, H.A., Alhashimi, H.T. and Mathkour, H.I., 2012, January. Applying Recent Vein Im-
age Enhancement Techniques In Vain Biometrics. In Proceedings of the International Conference
on Image Processing, Computer Vision, and Pattern Recognition (IPCV) (p. 1). The Steering Com-
mittee of The World Congress in Computer Science, Computer Engineering and Applied Comput-
ing (WorldComp).
This paper is useful in highlighting methods that can be used in the preprocessing stage, that is, when preparing a captured vein pattern image for information extraction.

Joardar, S., Chatterjee, A. and Rakshit, A., 2015. A Real-Time Palm Dorsa Subcutaneous Vein Pat-
tern Recognition System Using Collaborative Representation-Based Classification. Instrumenta-
tion and Measurement, IEEE Transactions on, 64(4), pp.959-966.

This paper is of significant interest since in it the authors describe the workings of a low-cost vein pattern recognition system based on the Raspberry Pi microcomputer - the same microcomputer that will be used in the proposed system. Also of interest is the authors' discussion of using sparse representation-based classification as a means of identifying which subject a vein pattern belongs to.

Identifying & locating patients using RFID


Yao, W., Chu, C.H. and Li, Z., 2010, June. The use of RFID in healthcare: Benefits and barriers. In
RFID-Technology and Applications (RFID-TA), 2010 IEEE International Conference on (pp. 128-
134). IEEE.
This paper explains how RFID technology can be used to better identify patients and increase patient safety by reducing the number of medical errors that currently occur. Also of interest is the authors' mention of privacy concerns when deploying RFID solutions. This privacy aspect is very important when promoting the system, and hence should be discussed where possible in the proposed system.

Lahtela, A., Hassinen, M. and Jylha, V., 2008, January. RFID and NFC in healthcare: Safety of
hospitals medication care. In Pervasive Computing Technologies for Healthcare, 2008. Pervasive-
Health 2008. Second International Conference on (pp. 241-244). IEEE.
This paper's main premise is similar to the previous one: that RFID can help reduce medical errors. However, this paper is of note because it also mentions NFC, a competing or complementary technology to RFID that can also help in reducing medical errors. It would be interesting to note in the proposed system which technologies like NFC can be used as an alternative to the RFID technology being proposed, and why.

Scholarly Contributions of the Project


As mentioned before, patient misidentification is a serious problem. For example, in the US in 1999, the estimated hospital mortality rate due to medication error stood at 98,000 patients per year. As part of an effort to reduce these errors, the US-based Centers for Medicare & Medicaid Services [CMS] mandated that hospitals begin tracking medications from when a medication order is initiated until its administration to the patient (Uy, Kury & Fontelo, 2015). The CMS suggested that technologies like biometrics and RFID can help in achieving this goal (Uy, Kury & Fontelo, 2015). However, especially with respect to biometrics, hospital adoption rates of these technologies remain low. This project will contribute by giving hospitals a cheap but viable alternative that can help in reducing errors.

Interestingly, registered nurses do not seem to think that patient misidentification is a big problem, with only about 9% admitting to problems in a recent survey (Bártlová et al., 2015). While this is primarily an education issue, as the authors of that same survey point out:

"education, changes in protocols, and new technologies are needed to improve the precision of patient identification." (Bártlová et al., 2015)

The proposed system will be a direct contribution to this issue. In addition, the impact of the proposed system can be more far-reaching than the traditional hospital/patient setting. For example, biometric identification of patients has been shown to improve data used in healthcare research in Africa (Odei-Lartey et al., 2016), as well as to further promote the use of eHealthcare systems (Kachurina et al., 2015). The proposed system hopes to make a contribution in this respect by further exploring which biometric techniques can be applied to help in these scenarios, as well as investigating how biometrics can be supplemented with more traditional technology like RFID.

Last but not least, when reviewing most of the sources cited above, one notes that the main focus of biometrics research seems to be on fingerprint and iris identification. There seems to be a lack of discussion around vein patterns even though they have proved to be a very accurate and viable alternative. This dissertation aims to fill this gap and identify why (or why not) vein patterns should be considered and how to integrate them into a healthcare environment.

Description of the Deliverables:


The deliverables for this project will consist of the following:
Hardware: custom-built non-invasive near infrared [NIR] camera rig to capture vein patterns, along with industry standard RFID readers and antennas
Software: custom software which will:
enhance the captured non-invasive NIR images
identify who the vein patterns belong to
assign the identified patient an RFID tracking number
track the patient as he/she changes location

Evaluation Criteria:

The project key objectives to success are:

The system must be accurate and consistent in identifying patients
The system must successfully track a patient's physical location
The system must be easy-to-use for end users
The system must help in patient ID & tracking tasks imposed on healthcare workers

In order to evaluate the above, the project will use a mixture of quantitative and qualitative methods. Statistical analysis and blind testing will be used to measure the accuracy of patient identification. The project is aiming for a minimum identification accuracy of around 90%.

In addition, a web-based UI will be used to demonstrate the ability of the system to physically track users
in a location. Last, statistical analysis of surveys will be used to evaluate the last two listed objectives
above, namely:

Survey: questionnaire results from the two user groups listed below to assess if the system is effective, easy to use, and helps in day-to-day healthcare tasks:
expert users (healthcare workers)
end users (patients)
The survey will use a rating-based system to gauge users' experience of the system.

Resource Plan:
Hardware
Table 10: Required hardware, provider and associated costs

Part Description | Provider | Cost
Raspberry Pi w/ appropriate PSU & SD Card | https://www.adafruit.com/ | $50
Pi NoIR Camera | https://www.adafruit.com/ | $29.95
NIR LED Illuminator | eBay | $15
NIR LEDs | https://www.adafruit.com/ | $7.95

Software
All software will be written from scratch, based on the Python programming language and open source libraries; hence there are no associated costs.

Personnel
No additional personnel will be required during the design and implementation of the system. However, personnel in the form of end users and expert users will be required at the end of the project to test and assess the system.

Project Plan and Timing

Table 11: High Level Timetable, Milestones & Tasks

Time Period (M = Month) | Milestones and Tasks
M1 - M2 | Finish 'Comp Research Methods Training' class; literature review; submit proposal
M3 - M5 | Literature review; draft project specification; draft project design; submit Project Specification & Design Report
M6 - M9 | Implementation of design:
- Build
  - Hardware: RFID; NIR / biometric
  - Software: image acquisition; image enhancement; image classification (who does this vein pattern belong to?); map vein pattern to RFID tag; process RFID tags and display on website
- Test: image classification accuracy; RFID tag range; end-to-end usability
- Implement
- Acceptance: end user questionnaire; expert user questionnaire
- Draft Dissertation Report
M8 | Draft of Dissertation Report submission
M9 | Dissertation Report submission

Risk Assessment:

Table 12: Project Risk Assessment

Risk Description | Probability | Impact | Risk Mitigation
Parts ordered are not appropriate or sufficient (e.g. insufficient non-invasive NIR LEDs) | MEDIUM | HIGH | Order different types of parts (e.g. different specifications) to test and iterate during build of prototype
Parts are unavailable | LOW | VERY HIGH | Check with vendor if required items are in stock via website before ordering
Parts ship late | LOW | MEDIUM | Order parts as early as possible; reuse personal equipment where possible
Python software libraries for specialized functions (e.g. image classification) are unavailable | LOW | HIGH | Survey existing libraries (e.g. Scikit, Scikit-Image) and survey literature, giving preference to those algorithms which are already available before committing to a specific vein pattern recognition algorithm
Difficulty in interfacing with RFID readers | LOW | HIGH | Communicate with RFID vendor support team and request developer materials / documentation / SDK
Insufficient number of expert users to participate in survey | LOW | MEDIUM | Start recruitment process as early as possible, and involve local hospitals / local academia

Quality Assurance
The project implementation will be split into stages that will happen sequentially. The first stage is build-
ing and testing the hardware components of the system. The second stage is building and testing the soft-
ware components of the system.

During the hardware stage, quality assurance will measure success as follows:

RFID readers and server:


The RFID server should output all the RFID tags that are in range, identified by their unique tag ID

Non-Invasive Near Infrared image acquisition rig:


The Non-Invasive NIR image acquisition rig should output images of a subjects wrist region.

Once the above two stages pass QA, we will proceed to the software stage, where QA would consist of building a testing framework. The testing framework will split the images captured in the previous stage into training and test sets. The training image set will be used to train the AI image classification algorithms, while the test set will be left to gauge the accuracy of the algorithms. Each image in the test set will be labelled, and the output of the algorithm under test will be compared to the labels. In this way, the accuracy of the algorithm can be determined by checking how many images were assigned the correct label. Once the framework has been finalized, QA success is determined by maximizing the accuracy of the algorithm under test. The following approach will be taken:

1. Determine which algorithms from literature are to be tested
2. Take a set of images with very basic image enhancement
3. Determine which algorithm from step one offers the best accuracy
4. Perform various image enhancement techniques and determine which of them gives the best accuracy for the algorithm identified in step 3.
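The accuracy measurement at the heart of this framework can be sketched as follows. The filename-prefix labelling convention matches the one used by the prototype's `getPatientIdentifier` helper (Appendix C); the stand-in classifier and filenames below are hypothetical.

```python
import os

def accuracy(test_images, classify):
    """Fraction of test images assigned the correct label. Each image's
    ground-truth label is its filename prefix before the first '_',
    the labelling convention used by the prototype code."""
    correct = 0
    for path in test_images:
        truth = os.path.basename(path).split("_")[0]
        if classify(path) == truth:
            correct += 1
    return correct / float(len(test_images))

# Hypothetical stand-in classifier that gets 3 of 4 test images right
fake_predictions = {"alice_1.jpg": "alice", "alice_2.jpg": "alice",
                    "bob_1.jpg": "bob", "bob_2.jpg": "alice"}
acc = accuracy(list(fake_predictions), fake_predictions.get)
```

In the real framework, `classify` would be the trained sparse-coding classifier and `test_images` the held-out labelled image set.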

Once the image enhancement and classification algorithms have passed QA (we expect to have at least a
90% accuracy rate), we then proceed to perform a technical end-to-end test of the system. At this stage, a
successful QA would entail the following sequence of events:

1. An enrolled subject presents his/her wrist to the system
2. The system acquires a useful non-invasive NIR image of their wrist
3. Using the image enhancement and classification techniques, the system correctly identifies the subject
4. The system prompts the expert user to enter a corresponding RFID code to track the subject
5. The system displays whether the RFID tag corresponding to the subject is within range of the RFID antenna or not.
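Step 5 above (deciding whether a subject's tag is within range) can be sketched as a time-windowed check over recent tag reads; the window length, tag IDs, and timestamps are illustrative assumptions.

```python
from datetime import datetime, timedelta

def tag_in_range(tag_id, reads, window_seconds=30, now=None):
    """A tag counts as 'within range' if any antenna reported it inside
    the last `window_seconds`. `reads` is a list of (tag_id, timestamp)
    tuples; all names here are illustrative."""
    now = now or datetime.now()
    cutoff = now - timedelta(seconds=window_seconds)
    return any(t == tag_id and ts >= cutoff for t, ts in reads)

now = datetime(2016, 10, 1, 12, 0, 0)
reads = [("TAG42", datetime(2016, 10, 1, 11, 59, 50)),   # seen 10 s ago
         ("TAG99", datetime(2016, 10, 1, 11, 0, 0))]     # seen 1 h ago
```

The prototype performs the equivalent check against the `rfidTagReads` table rather than an in-memory list.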

If steps 1-5 are completed successfully, then the project would have passed technical QA. The project
will now proceed to a subjective QA in which two sets of users fill in a questionnaire:

End users (subjects that are identified and tracked by the system) will describe how easy or difficult it was to use the system, and whether the system was a source of discomfort or impeded their experience in some form. They will also be asked if they felt confident the system would help reduce identification errors.
Expert users (users who actually operate the system, such as hospital personnel) will be asked how easy or difficult it was to use the system, and whether the system helps them in their day-to-day tasks by easily identifying and locating patients. They will also be asked if such a system would help reduce medical errors if integrated with other medical systems present in the facility.

References

Bártlová, S., Hajduchová, H., Brabcová, I. and Tóthová, V., 2015. Patient misidentification in nursing care. Neuro Endocrinology Letters, 36(Suppl 2), pp.17-22.

Kachurina, P., Buccafurri, F., Bershadskaya, L., Bershadskaya, E. and Trutnev, D., 2015. Biometric Iden-
tification in eHealthcare: Learning from the Cases of Russia and Italy. In Electronic Government and the
Information Systems Perspective (pp. 103-116). Springer International Publishing.

Odei-Lartey, E.O., Boateng, D., Danso, S., Kwarteng, A., Abokyi, L., Amenga-Etego, S., Gyaase, S., As-
ante, K.P. and Owusu-Agyei, S., 2016. The application of a biometric identification technique for linking
community and hospital data in rural Ghana. Global Health Action, 9.

Thomas, P. & Evans, C., 2004. An identity crisis? Aspects of patient misidentification. AVMA Medical &
Legal Journal, 10(1), pp.18-22.

Uy, R.C.Y., Kury, F.P. and Fontelo, P.A., 2015. The State and Trends of Barcode, RFID, Biometric and
Pharmacy Automation Technologies in US Hospitals. In AMIA Annual Symposium Proceedings (Vol.
2015, p. 1242). American Medical Informatics Association.

APPENDIX B. USER INTERFACE SCREENSHOTS

Figure 45: Login page with role selection

Figure 46: Administrator > RFID Readers Settings Page

Figure 47: Administrator > Map Locations Settings Page

Figure 48: Administrator > Sparse Dictionary Settings Page

Figure 49: Administrator > Enrollment > Patient Profiles

Figure 50: Administrator > Enrollment > Patient Biometrics

Figure 51: Operator > Audit Screen

Figure 52: Operator > Last Seen Screen

APPENDIX C. CODE LISTING

pre_process.py
from skimage.morphology import opening
from skimage.color import rgb2gray
from skimage import data, exposure
from skimage.morphology import disk
from skimage.transform import downscale_local_mean, resize
from skimage.exposure import rescale_intensity
import matplotlib.pyplot as plt


# BEGIN FUNCTION DEFINITIONS

def load_crop_gray(image, debug=False):
    image1 = data.load(image)
    if debug:
        plt.title('Original Image')
        plt.imshow(image1, cmap=plt.cm.gray)
        plt.show()
    if debug:
        plt.title('Cropped Image')
        plt.imshow(image1, cmap=plt.cm.gray)
        plt.show()
    image1 = rgb2gray(image1)
    if debug:
        plt.title('Grayscale Image')
        plt.imshow(image1, cmap=plt.cm.gray)
        plt.show()
    image1 = downscale_local_mean(image1, (25, 25))
    image1 = rescale_intensity(image1)
    if debug:
        plt.title('Downscaled Image')
        plt.imshow(image1, cmap=plt.cm.gray)
        plt.show()
    selem = disk(4)
    opened = opening(image1, selem)
    image1 = rescale_intensity(opened)
    if debug:
        plt.title('Feature Reconstruction / Enhancement')
        plt.imshow(image1, cmap=plt.cm.gray)
        plt.show()
    return image1


def pre_process_dave(image, debug=False):
    image1 = load_crop_gray(image, debug)
    img_eq1 = exposure.equalize_adapthist(image1)  # , clip_limit=0.3)
    img_eq1 = rescale_intensity(img_eq1)
    if debug:
        plt.title('Adaptive Histogram Equalization')
        plt.imshow(img_eq1, cmap=plt.cm.gray)
        plt.show()
    return img_eq1

webserver.py
from flask import request, session
from flask import render_template
from flask import Flask
from werkzeug.utils import secure_filename
import numpy as np
from numpy.random import RandomState
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.neighbors import DistanceMetric
import sqlite3, json
import glob, os
import cPickle as pickle
import sys
from pre_process import pre_process_dave

app = Flask(__name__)
# set the secret key. keep this really secret:
app.secret_key = 'lbfsO20498U9WE08HJFD89EWQTFCDHUKJASHDFAO87Glkgads'
UPLOAD_FOLDER = '/tmp'
ALLOWED_EXTENSIONS = set(['jpg', 'jpeg', 'png'])
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER


############### BEGIN UTILS SECTION ##############

def allowed_file(filename):
    print filename
    return '.' in filename and \
        filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS


def getPatientIdentifier(filename):
    patientIdentifier = os.path.basename(filename).split('_')[0]
    patientIdentifier = patientIdentifier.split('.')[0]
    return patientIdentifier


def getPatientNumericalLabel(patientIdentifier):
    try:
        patientIdentifier = int(patientIdentifier)
    except:
        patientIdentifier = ''.join([str(ord(c)) for c in patientIdentifier])
    return patientIdentifier
############### BEGIN DB SECTION ###############

def connectToDB(dictionary=False):
    conn = sqlite3.connect('bioRFID.db')
    if dictionary:
        conn.row_factory = sqlite3.Row
    return conn


def createTablesDB():
    conn = connectToDB()
    # create rfid readers table
    conn.execute('''CREATE TABLE IF NOT EXISTS rfidReaders
                 (ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
                 ipAddress TEXT NOT NULL);''')
    print "rfidReaders table created successfully"
    # create rfid antennas to location mapping table
    conn.execute('''CREATE TABLE IF NOT EXISTS rfidAntennas
                 (ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
                 readerID INTEGER NOT NULL,
                 antennaID INTEGER NOT NULL,
                 locationName TEXT NOT NULL);''')
    print "rfidAntennas table created successfully"
    # create patient table
    conn.execute('''CREATE TABLE IF NOT EXISTS patients
                 (ID INTEGER PRIMARY KEY AUTOINCREMENT,
                 name TEXT NOT NULL,
                 rfidCode TEXT NOT NULL,
                 sparseCode TEXT NOT NULL);''')
    print "patients table created successfully"
    # create rfid tag reads table
    # data should be in the form:
    # reader_ip,event.tag.arrive tag_id={},antenna={},rssi={},timestamp
    conn.execute('''CREATE TABLE IF NOT EXISTS rfidTagReads
                 (ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
                 timestamp DATETIME NOT NULL,
                 readerID INTEGER NOT NULL,
                 antennaID INTEGER NOT NULL,
                 tagID TEXT NOT NULL);''')
    print "rfidTagReads table created successfully"
    conn.commit()
    conn.close()
def getLocations(readerID=None):
    conn = connectToDB()
    if readerID:
        cursor = conn.execute("SELECT ipAddress, locationName, antennaID FROM rfidAntennas INNER JOIN rfidReaders ON "
                              "rfidReaders.ID = readerID WHERE readerID=" + readerID + ";")
    else:
        cursor = conn.execute("SELECT ipAddress, locationName, antennaID FROM rfidAntennas INNER JOIN rfidReaders ON "
                              "rfidReaders.ID = readerID;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def getReaders():
    conn = connectToDB()
    cursor = conn.execute("SELECT ID, ipAddress FROM rfidReaders;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def getPatientProfiles():
    conn = connectToDB()
    cursor = conn.execute("SELECT name, rfidCode FROM patients;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def getPatientBiometrics():
    conn = connectToDB()
    cursor = conn.execute("SELECT name, sparseCode FROM patients;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def filterReadsByLocation(location):
    conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID AND "
                          "rfidTagReads.antennaID = rfidAntennas.antennaID WHERE "
                          "locationName='" + location + "' ORDER BY date(timestamp) DESC LIMIT 1;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def getAllReads():
    conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID;")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result
def lastKnownLocation(patientName):
    conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID WHERE name='" + patientName + "' "
                          "ORDER BY date(timestamp) DESC LIMIT 1;")
    result = cursor.fetchone()
    conn.commit()
    conn.close()
    return result


def filterReadsByTag(tagID):
    conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID WHERE tagID='" + tagID + "';")
    result = cursor.fetchall()
    conn.commit()
    conn.close()
    return result


def insertPatientDB(name, rfidCode, sparseCode='&#xd7;'):
    conn = connectToDB()
    conn.execute("INSERT INTO patients (name, rfidCode, sparseCode) "
                 "VALUES ('" + name + "', '" + rfidCode + "', '" + sparseCode + "' );")
    conn.commit()
    conn.close()


def deletePatientDB(name):
    conn = connectToDB()
    conn.execute("DELETE FROM patients WHERE name='" + name + "';")
    conn.commit()
    conn.close()


def updatePatientBioDB(name, sparseCode):
    conn = connectToDB()
    cursor = conn.execute("SELECT name FROM patients WHERE name = '" + name + "';")
    if cursor.fetchone():
        conn.execute("UPDATE patients set sparseCode='" + sparseCode + "' WHERE name='" + name + "';")
    else:
        conn.execute("INSERT INTO patients (name, rfidCode, sparseCode) "
                     "VALUES ('" + name + "', '', '" + sparseCode + "');")
    conn.commit()
    conn.close()


def updatePatientRfidDB(name, rfidCode):
    conn = connectToDB()
    conn.execute("UPDATE patients set rfidCode='" + rfidCode + "' WHERE name='" + name + "';")
    conn.commit()
    conn.close()


def insertRfidReaderDB(ipAddress):
    conn = connectToDB()
    conn.execute("INSERT INTO rfidReaders (ipAddress) "
                 "VALUES ('" + ipAddress + "' )")
    conn.commit()
    conn.close()


def deleteRfidReaderDB(ipAddress):
    conn = connectToDB()
    cursor = conn.execute("SELECT ID FROM rfidReaders WHERE ipAddress='" + ipAddress + "';")
    result = cursor.fetchone()
    if result is not None:
        readerID = result[0]
        conn.execute("DELETE FROM rfidAntennas WHERE readerID='" + str(readerID) + "';")
        conn.execute("DELETE FROM rfidReaders WHERE ipAddress='" + ipAddress + "';")
    conn.commit()
    conn.close()


def insertRfidAntennaDB(ipAddress, antennaID, locationName):
    conn = connectToDB()
    cursor = conn.execute("SELECT ID from rfidReaders WHERE ipAddress='" + ipAddress + "';")
    result = cursor.fetchone()
    rfidReaderId = result[0]
    conn.execute("INSERT INTO rfidAntennas (readerID, antennaID, locationName) "
                 "VALUES (" + str(rfidReaderId) + "," + str(antennaID) + ",'" + str(locationName) + "' );")
    conn.commit()
    conn.close()


def deleteRfidAntennaDB(locationName=None, ipAddress=None, antennaID=None):
    conn = connectToDB()
    if locationName:
        conn.execute("DELETE from rfidAntennas WHERE locationName='" + locationName + "';")
    elif ipAddress and antennaID:
        cursor = conn.execute("SELECT ID from rfidReaders WHERE ipAddress='" + ipAddress + "';")
        result = cursor.fetchone()
        rfidReaderId = result[0]
        conn.execute("DELETE from rfidAntennas WHERE readerID=" + str(rfidReaderId) + " AND antennaID=" + antennaID + ";")
    conn.commit()
    conn.close()
############### END DB SECTION ###############
############### BEGIN SPARSE CODE SECTION ###############
rng = RandomState(0)
def buildSparseDict(trainingDir =
'/home/dvas0004/Pictures/chime/NEW_RIG/train'):
numberOfTrainingFiles = 0
trainingDir = trainingDir.rstrip('/')
#build sparse code dictionary
print "building dictionary data"
dictionary_data = {}
training_images = glob.glob(trainingDir+'/*.jpg')
biggestImageSize = 0
for training_image in training_images:
print 'Processing: '+training_image
numberOfTrainingFiles += 1
threshold_image = pre_process_dave(training_image, debug=False)
orig_data = np.reshape(threshold_image,(1, -1)).astype(float)
mean = np.mean(orig_data)
std = np.std(orig_data)
data = orig_data
data -= mean
data /= std
imageSize=np.size(data)
if imageSize > biggestImageSize:
biggestImageSize = imageSize
patientIdentifier = getPatientIdentifier(training_image)
try:
dictionary_data[patientIdentifier].append(data)
except KeyError:
dictionary_data[patientIdentifier] = [data]
# define mini batch dictionary

153
mbdl = MiniBatchDictionaryLearning(n_components=30, transform_n_nonzero_coefs=30,
transform_alpha=0.01, alpha=0.001, n_iter=50, batch_size=15,
random_state=100, shuffle=True, split_sign=False, n_jobs=-1,
transform_algorithm='lars')
# fit sparse code dictionary
# fit data to array
mdbl_data= np.zeros((numberOfTrainingFiles, biggestImageSize))
counter = 0
for patient in dictionary_data:
for data in dictionary_data[patient]:
mdbl_data[counter] = data
counter += 1
print "fitting sparse code dictionary using "+str(numberOfTrainingFiles)+" training images..."
mbdl.fit(mdbl_data)
print "saving dictionary..."
pickle.dump(mbdl, open("sparse_dict.p", "wb"))
def getImageSparseCode(imageFilename):
# build sparse codes for training set, based on the saved dictionary
print "building sparse codes for template image..."
print 'Processing: '+imageFilename
threshold_image = pre_process_dave(imageFilename, debug=False)
orig_data = np.reshape(threshold_image,(1, -1)).astype(float)
mean = np.mean(orig_data)
std = np.std(orig_data)
data = orig_data
data -= mean
data /= std
#load dictionary
mbdl = pickle.load(open("sparse_dict.p", "rb"))
templateSparseCode = mbdl.transform(data)
return templateSparseCode
def initTemplateSparseStore(trainingDir = '/home/dvas0004/Pictures/chime/NEW_RIG/train', debug=False):
trainingDir = trainingDir.rstrip('/')
#build template sparse code store
print "building template sparse code store"
templateSparseCodeStore = {}
if debug:
import matplotlib.pyplot as plt
pca_data=[]
pca_labels=[]
subjectsDone=[]
training_images = glob.glob(trainingDir+'/*.jpg')
for training_image in training_images:
print 'Processing: '+training_image
patientIdentifier = getPatientIdentifier(training_image)
templateSparseCode = getImageSparseCode(training_image)
try:
templateSparseCodeStore[patientIdentifier].append(templateSparseCode)
except KeyError:
templateSparseCodeStore[patientIdentifier] = [templateSparseCode]
if debug:
#if patientIdentifier not in subjectsDone:
pca_data.append(templateSparseCode[0])
pca_labels.append(patientIdentifier)
subjectsDone.append(patientIdentifier)
updatePatientBioDB(patientIdentifier, '&#x2714;')
if debug:
from sklearn.decomposition import PCA
pca = PCA(n_components=2)
pca_data = pca.fit_transform(pca_data)
dist = DistanceMetric.get_metric('euclidean')
print '***************'
print np.mean(dist.pairwise(pca_data))

counter=0
x_d1=[]
x_d2=[]
x_d3=[]
y_d1=[]
y_d2=[]
y_d3=[]
d1 = []
d2 = []
d3 = []
for dp in pca_data:
if pca_labels[counter].endswith('Right'):
d1.append([dp[0],dp[1]])
x_d1.append(dp[0])
y_d1.append(dp[1])
else:
d2.append([dp[0],dp[1]])
x_d2.append(dp[0])
y_d2.append(dp[1])
plt.annotate(
pca_labels[counter],
xy = (dp[0], dp[1]), xytext = (-20, 20),
textcoords = 'offset points', ha = 'right', va = 'bottom',
bbox = dict(boxstyle = 'round,pad=0.5', fc = 'yellow', alpha = 0.5),
arrowprops = dict(arrowstyle = '->', connectionstyle = 'arc3,rad=0'))
counter += 1
plt.plot(x_d1, y_d1, 'ro', x_d2, y_d2, 'bo')#, x_d3, y_d3, 'go')
plt.show()
print "saving template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
def addTemplateCodeToStore(templateFilename, recordToDB=False):
templateSparseCode = getImageSparseCode(templateFilename)
patientIdentifier = getPatientIdentifier(templateFilename)
#load template sparse code store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
try:
templateSparseCodeStore[patientIdentifier].append(templateSparseCode)
except KeyError:
templateSparseCodeStore[patientIdentifier] = [templateSparseCode]
print "saving updated template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
if recordToDB:
updatePatientBioDB(patientIdentifier,'&#x2714;')
def removePatientTemplateCodes(patientIdentifier):
#load template sparse code store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
try:
del templateSparseCodeStore[patientIdentifier]
except KeyError:
print 'Patient Sparse Codes not present in template store'
print "saving updated template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
############### END SPARSE CODE SECTION ###############
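For reference, the dictionary-learning step above can be reproduced on toy data; a minimal sketch (synthetic samples and smaller hyper-parameters, chosen only for illustration, written for current scikit-learn versions):

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.RandomState(0)
X = rng.randn(20, 8)  # toy "images": 20 samples, 8 features each

# learn a 5-atom dictionary, then express each sample as a sparse code
# with at most 3 non-zero coefficients (via the LARS solver)
dico = MiniBatchDictionaryLearning(n_components=5, alpha=0.1, random_state=0,
                                   transform_algorithm='lars',
                                   transform_n_nonzero_coefs=3)
codes = dico.fit(X).transform(X)
print(codes.shape)  # one 5-dimensional sparse code per sample
```

The listing above follows the same fit/transform pattern: `buildSparseDict` learns the dictionary from training images, and `getImageSparseCode` later calls `transform` on a single image.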
############### BEGIN COMPARISON SECTION ###############
def eucledianDistComparison(sparseCode):
dist = DistanceMetric.get_metric('euclidean')
currentBestDistance = None
guess1 = None
guess2 = None
currentBestGuess = None
#loadTemplate Sparse Code Store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
for patient in templateSparseCodeStore:
for templateSparseCode in templateSparseCodeStore[patient]:
euc_dist = dist.pairwise(sparseCode.tolist(), templateSparseCode.tolist())

if currentBestDistance is None:
currentBestDistance = euc_dist
currentBestGuess = patient
else:
if euc_dist < currentBestDistance:
currentBestDistance = euc_dist
guess2 = guess1
guess1 = currentBestGuess
currentBestGuess = patient
print "{}/{}/{}".format(currentBestGuess, guess1, guess2)
return currentBestGuess
############### END COMPARISON SECTION ###############
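The comparison above reduces to nearest-neighbour matching under Euclidean distance; a self-contained sketch with made-up three-dimensional sparse codes (patient names and values are hypothetical):

```python
import numpy as np

def euclidean(a, b):
    # straight-line distance between two equal-length vectors
    return float(np.sqrt(np.sum((np.asarray(a) - np.asarray(b)) ** 2)))

# hypothetical probe code and two stored template codes
probe = [1.0, 0.0, 2.0]
templates = {'patientA': [1.0, 0.0, 1.0],
             'patientB': [4.0, 0.0, 2.0]}

# the identified patient is the template with the smallest distance
best = min(templates, key=lambda name: euclidean(probe, templates[name]))
print(best)  # patientA (distance 1.0 versus 3.0)
```

`eucledianDistComparison` does the same search, but over every stored template of every enrolled patient, using scikit-learn's `DistanceMetric` rather than a hand-rolled function.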
############### BEGIN IDENTIFICATION SECTION ###############
def identifyPatient(imageFilename):
subjectSparseCode = getImageSparseCode(imageFilename)
bestGuessID = eucledianDistComparison(subjectSparseCode)
return bestGuessID
############### END IDENTIFICATION SECTION ###############
############### BEGIN WEB API SECTION ###############
@app.route('/identifyPatient', methods=['POST'])
def flaskIdentifyPatient():
if request.method == 'POST':
# check if the post request has the file part
if 'file' not in request.files:
return 'No files present in request'
file = request.files['file']
# if the user does not select a file, the browser
# submits an empty part without a filename
if file.filename == '':
return 'No selected file'
if file and allowed_file(file.filename):
filename = secure_filename(file.filename)
templateAbsoluteLocation = os.path.join(app.config['UPLOAD_FOLDER'], filename)
file.save(templateAbsoluteLocation)
bestGuessID = identifyPatient(templateAbsoluteLocation)
return bestGuessID
else:
return 'File not in allowed Extensions'
@app.route('/uploadPatientTemplate', methods=['POST'])
def uploadPatientTemplate():
if request.method == 'POST':
# check if the post request has the file part
if 'file' not in request.files:
return 'No files present in request'
file = request.files['file']
# if the user does not select a file, the browser
# submits an empty part without a filename
if file.filename == '':
return 'No selected file'
if file and allowed_file(file.filename):
filename = secure_filename(file.filename)
templateAbsoluteLocation = os.path.join(app.config['UPLOAD_FOLDER'], filename)
file.save(templateAbsoluteLocation)
addTemplateCodeToStore(templateAbsoluteLocation)
return 'OK'
else:

return 'File not in allowed Extensions'
@app.route('/setUserType', methods=['POST'])
def setUserType():
userType = request.form['userType']
session['userType'] = userType
return 'OK'
@app.route('/addRfidReader', methods=['POST'])
def addRfidReader():
ipAddress = request.form['ipAddress']
insertRfidReaderDB(ipAddress)
return 'OK'
@app.route('/delRfidReader', methods=['POST'])
def delRfidReader():
ipAddress = request.form['ipAddress']
deleteRfidReaderDB(ipAddress)
return 'OK'
@app.route('/addLocation', methods=['POST'])
def addLocation():
locationName = request.form['location']
ipAddress = request.form['ipAddress']
antennaID = request.form['antennaID']
insertRfidAntennaDB(ipAddress,antennaID,locationName)
return 'OK'
@app.route('/addPatientProfile', methods=['POST'])
def addPatientProfile():
patientName = request.form['name']
rfidCode = request.form['rfidCode']
insertPatientDB(patientName,rfidCode)
return 'OK'
@app.route('/updatePatientRfid', methods=['POST'])
def flaskUpdatePatientRfid():
patientName = request.form['patient']
newRFID = request.form['rfid']
updatePatientRfidDB(patientName, newRFID)
return 'OK'
@app.route('/delLocation', methods=['POST'])
def delLocation():
locationName = request.form['location']
deleteRfidAntennaDB(locationName=locationName)
return 'OK'
@app.route('/delPatientProfile', methods=['POST'])
def delPatientProfile():
patientName = request.form['patient']
deletePatientDB(patientName)
removePatientTemplateCodes(patientName)
return 'OK'
@app.route('/getReaders', methods=['POST'])
def flaskGetReaders():
results = getReaders()
resultArray = []
for result in results:
resultArray.append({'id':str(result[0]),'ip':str(result[1])})
return json.dumps(resultArray)
@app.route('/getLocations', methods=['POST'])
def flaskGetLocations():
results = getLocations()
resultArray = []
for result in results:
resultArray.append({'ip':str(result[0]),'location':str(result[1]),'antenna':str(result[2])})
return json.dumps(resultArray)
@app.route('/getPatientProfiles', methods=['POST'])
def flaskGetPatientProfiles():
results = getPatientProfiles()
resultArray = []
for result in results:

resultArray.append({'name':str(result[0]),'rfid':str(result[1])})
return json.dumps(resultArray)
@app.route('/getPatientBiometrics', methods=['POST'])
def flaskgetPatientBiometrics():
results = getPatientBiometrics()
resultArray = []
for result in results:
resultArray.append({'name':str(result[0]),'sparse':str(result[1])})
return json.dumps(resultArray)
@app.route('/filterLocation', methods=['POST'])
def filterLocation():
locationName = request.form['location']
result = filterReadsByLocation(locationName)
return json.dumps(result)
@app.route('/getAllRecords', methods=['POST'])
def flaskGetAllRecords():
result = getAllReads()
return json.dumps(result)
@app.route('/filterTag', methods=['POST'])
def filterTag():
tagID = request.form['tagID']
result = filterReadsByTag(tagID)
return json.dumps(result)
@app.route('/lastKnown', methods=['POST'])
def lastKnown():
name = request.form['name']
result = lastKnownLocation(name)
return json.dumps(result)
@app.route('/buildSparseDict', methods=['POST'])
def flaskBuildSparseDict():
trainingFolder = request.form['trainingFolder']
buildSparseDict(trainingDir=trainingFolder)
initTemplateSparseStore(trainingDir=trainingFolder)
return 'OK'
@app.route('/echoer', methods=['POST'])
def echoer():
print request.form
return str(request.form)
############### END WEB API SECTION ###############
############### BEGIN WEB FRONTEND SECTION ###############
@app.route('/')
def login():
return render_template('login.html')
@app.route('/admin')
def adminPage():
return render_template('admin.html')
@app.route('/operator')
def operatorPage():
return render_template('operator.html')
############### END WEB FRONTEND SECTION ###############
############### BEGIN TESTING SECTION ###############
def buildConfusionMatrix():
#clear previous runs
try:
os.remove("sparse_dict.p")
os.remove("template_sparse_store.p")
except:
pass
#start by building sparse dictionary
print "Building Sparse Dictionary"
trainingFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/lensPics/samples/train/'
buildSparseDict(trainingDir=trainingFolder)
initTemplateSparseStore(trainingDir=trainingFolder, debug=False)
# print "Adding Templates"

# templatesFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/Pcitures/train/templates'
# template_images = glob.glob(templatesFolder+'/*.jpg')
# for template_image in template_images:
# addTemplateCodeToStore(template_image)
#start identifying patients
print "Starting Testing"
testingFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/lensPics/samples/test/'
testing_images = glob.glob(testingFolder+'/*.jpg')
totalNumberTested = 0
correctlyIdentified = 0
incorrectGuesses = []
for testing_image in testing_images:
patientIdentifier = getPatientIdentifier(testing_image)
totalNumberTested += 1
patientGuess = identifyPatient(testing_image)
print "Patient Guess: {}".format(patientGuess)
if patientIdentifier == patientGuess:
correctlyIdentified += 1
else:
incorrectGuesses.append(patientIdentifier)
percentageCorrect = (float(correctlyIdentified)/totalNumberTested)*100
incorrectlyIdentified = totalNumberTested - correctlyIdentified
percentageIncorrect = (float(incorrectlyIdentified)/totalNumberTested)*100
print "--------------------------------------------------"
print "Results: "
print "--------------------------------------------------"
print "Total number of testing images: {}".format(totalNumberTested)
print "Correctly identified images: {} / {}%".format(correctlyIdentified, percentageCorrect)
print "Incorrectly identified images: {} / {}%".format(incorrectlyIdentified, percentageIncorrect)
print incorrectGuesses
print "--------------------------------------------------"
############### END TESTING SECTION ###############
createTablesDB()
try:
if sys.argv[1] == "testing":
buildConfusionMatrix()
exit(0)
else:
app.run(host='0.0.0.0', port=5001, debug=True)
except IndexError:
app.run(host='0.0.0.0', port=5001, debug=True)

raspiClient.py

# setup:
# pip install requests
# pip install
import sys
import RPi.GPIO as GPIO
from picamera import PiCamera
import requests
from PyQt4 import QtGui, QtCore
from PyQt4.QtGui import *
class veinCamera(object):
def __init__(self, patientID, URL):
self.camera = PiCamera()
self.patientID = patientID
self.URL = URL
self.veinPhoto = ''
GPIO.setmode(GPIO.BCM)
GPIO.setup(4, GPIO.IN)
def takePicture(self):
self.veinPhoto = str('/home/pi/Desktop/' + self.patientID + '.jpg')
picOutput = open(self.veinPhoto, 'wb')
print self.veinPhoto
self.camera.start_preview()
while GPIO.input(4) != 0:
pass
self.camera.capture(self.veinPhoto)
self.camera.stop_preview()
picOutput.close()
self.camera.close()
def postPicture(self):
files = {'file': open(self.veinPhoto, 'rb')}
r = requests.post(self.URL, files=files)
return r.text
class Initial_Window(QtGui.QWidget):
def __init__(self):
QtGui.QWidget.__init__(self)
self.button_id = QtGui.QPushButton('Identify', self)
self.button_enroll = QtGui.QPushButton('Enroll', self)
self.button_id.clicked.connect(self.handleButton_id)
self.button_enroll.clicked.connect(self.handleButton_enroll)
layout = QtGui.QVBoxLayout(self)
layout.addWidget(self.button_id)
layout.addWidget(self.button_enroll)
self.setWindowTitle('BioRFID')
self.resize(320,240)
self.patientID = ''
def handleButton_id(self):
url = 'http://192.168.2.233:5001/identifyPatient'
vc = veinCamera('unknown', url)
vc.takePicture()
bestGuess = vc.postPicture()
msg = QMessageBox()
msg.setIcon(QMessageBox.Information)
msg.setText("Patient Best Guess:")
msg.setInformativeText('<strong>'+bestGuess+'</strong>')
msg.setWindowTitle("BioRFID")
if (bestGuess=='s1'):
msg.setIconPixmap(QPixmap("/home/pi/Desktop/1cf11d7.jpg"))
else:
msg.setIconPixmap(QPixmap("/home/pi/Desktop/1cf11d6.jpg"))
msg.exec_()
def handleButton_enroll(self):
print self.patientID
input = QInputDialog()

input.setTextValue(self.patientID)
text, ok = input.getText(self, 'BioRFID Patient Enroll', 'Enter patient name:', text=self.patientID)
if ok:
print text
self.patientID = text
url = 'http://192.168.2.233:5001/uploadPatientTemplate'
vc = veinCamera(self.patientID,url)
vc.takePicture()
result = vc.postPicture()
msg = QMessageBox()
msg.setIcon(QMessageBox.Information)
msg.setText("Patient Submitted")
msg.setInformativeText(result)
msg.setWindowTitle("BioRFID")
msg.exec_()
app = QtGui.QApplication(sys.argv)
window = Initial_Window()
window.show()
sys.exit(app.exec_())
