
Data matching schemes to improve accuracy and completeness of the electoral registers: evaluation report

March 2012

Translations and other formats
For information on obtaining this publication in another language or in a large-print or Braille version, please contact the Electoral Commission:
Tel: 020 7271 0500
Email: publications@electoralcommission.org.uk
© The Electoral Commission 2012

Contents
Executive summary 1
1 Introduction 13
2 Set-up and coordination 21
3 Databases and the matching process 28
4 Pilot authorities: overview and emerging issues 36
5 Data matching results: Department for Work and Pensions 44
6 Data matching results: Driver and Vehicle Licensing Agency 63
7 Data matching results: Education databases 71
8 Data matching results: Ministry of Defence 75
9 Data matching results: Citizen Account 79
10 Pilot costs 82
11 Conclusions and recommendations 92
Appendices
Appendix A: Local authority pilot profiles 99
Appendix B: Data tables 147

Acknowledgements
The Electoral Commission would like to thank all the staff at both the local authorities and the data holding organisations for the time and effort they devoted to these data matching pilot schemes. We would also like to thank the Cabinet Office for their assistance in the collection of data from the pilots.

Executive summary
Background
As part of the proposed shift to individual electoral registration (IER), the UK Government is exploring the extent to which the use of national public databases can help Electoral Registration Officers (EROs) improve the accuracy and completeness of their electoral registers. The Electoral Commission was given a statutory responsibility to report on the effectiveness of the data matching schemes. The schemes were based on the piloting of a range of national public databases in 2011 by 22 individual local authorities in England and Scotland.1

Our statutory evaluation considers the degree to which data matching schemes assisted EROs in improving the completeness and accuracy of their registers; resulted in any issues around administration, time and costs; or prompted objections to the schemes. Our findings are based on the data and feedback received from local authorities, data-holders and others during the course of the pilot schemes.

In February 2012 the UK Government published its response to pre-legislative scrutiny and public consultation on the IER White Paper. In the response, the UK Government indicated its intention, subject to the results of the evaluation of pilot schemes and further testing, to widen the scope of data matching to simplify the transition to IER.2 The UK Government indicated that, rather than using data matching, as originally intended, to check accuracy and to identify people who may be eligible to register to vote and then invite them to apply to register,3 it was now their intention that the names and addresses of all individuals currently on an electoral register will be matched against the data held by public bodies such as the Department for Work and Pensions. Where information can be matched, the individual will be automatically placed onto the new IER register and would not need to take any further action to be registered.

1 At the outset there was a pilot authority in Wales, but they dropped out early in the process.
2 HM Government (2012) Government Response to pre-legislative scrutiny and public consultation on Individual Electoral Registration and amendments to Electoral Administration law, Cm 8245
3 HM Government (2011) Individual Electoral Registration, Cm 8108

Electors whose details could not be matched in this way would be asked to apply individually and to supply personal identifiers. This proposal has not been tested by these pilots. Further piloting is needed to ensure that the advantages and disadvantages of these proposals are understood. In this report we set out some of the key questions we think need to be answered to help understand the issues.

Set-up and coordination


The Cabinet Office managed the overall pilot process and was also involved in the delivery of the pilots. The Commission advised the Cabinet Office throughout the set-up period, in particular on the need for a clear common framework for delivering the pilots, but the final decisions on the processes were taken by the Cabinet Office.

The open application and selection process used for the pilots, and the absence of a clear, common framework, contrary to our advice, led to significant variation in the planned approaches of the pilots. This made it challenging for the evaluation to compare the results of the different schemes and therefore to draw consistent conclusions. It also created challenges for local authorities delivering the pilots, because in several cases the methodology they originally planned to use proved to be based on incorrect assumptions.

The timing of the pilots, which took place alongside the annual canvass, coupled with delays to the process, put pressure on the capacity of the local authority teams involved and added to the difficulty in this evaluation of drawing firm conclusions from the pilot schemes as a whole. Local authorities reported varying levels of communication with the Cabinet Office and identified areas for improvement, including a better understanding of what data was going to be shared with them.

The pilots did not follow the processes, in terms of the IT systems and matching arrangements, which would be used for nationwide data matching. The evaluation cannot therefore draw conclusions about how the costs of these pilots would translate to a national roll-out.

HM Government (2012) Government Response to pre-legislative scrutiny and public consultation on Individual Electoral Registration and amendments to Electoral Administration law, Cm 8245

The databases and the matching process


Ten databases were due to be tested as part of the scheme. These were the:

- Department for Work and Pensions (DWP) Centric database
- Driver and Vehicle Licensing Agency (DVLA) Driver database
- Student Loans Company (SLC) database
- National Pupil Database (NPD) (through the Department for Education)
- Individual Learner Record (ILR) (through the Department for Business, Innovation and Skills)
- Citizens Account (CA) database (through the Improvement Service in Scotland)
- Ministry of Defence (MoD) Joint Personnel Administration database
- Anite housing database
- Higher Education Funding Council for England (HEFCE) student database
- Royal Mail change of address database

However, not all were tested to the same extent. In particular, there were difficulties in accessing HEFCE and Royal Mail data.

A key component of the trials was the matching process between databases to identify people to invite to register to vote. The processes employed could not be rolled out nationally but do allow for a greater understanding of the requirements of any framework for national data matching. Two different processes were used for matching the electoral registers: one for the DWP data, and another for the DVLA and education databases. These different rules make it difficult to compare the results from the two processes.
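The sensitivity of match rates to the rules applied can be illustrated with a small sketch. This is a hypothetical example, not the software used in the pilots: it applies a strict (exact equality) rule and a lenient (normalised) rule to the same pair of records, and only the lenient rule finds a match, which is one reason results produced under different matching processes are hard to compare.

```python
# Illustrative sketch only: two matching rules applied to the same records
# can yield different match rates. The records and rules are invented.

def normalise(s):
    """Lower-case and strip punctuation and extra spaces (assumed rule)."""
    return " ".join("".join(c for c in s.lower() if c.isalnum() or c.isspace()).split())

def strict_match(reg, ext):
    # Exact equality on every field, as a stricter process might require.
    return reg == ext

def lenient_match(reg, ext):
    # Normalised comparison tolerates case and punctuation differences.
    return all(normalise(reg[k]) == normalise(ext[k]) for k in reg)

register = [{"name": "J. Smith", "address": "1 High Street"}]
external = [{"name": "j smith", "address": "1 High Street"}]

strict = sum(strict_match(r, e) for r in register for e in external)
lenient = sum(lenient_match(r, e) for r in register for e in external)
print(strict, lenient)  # 0 1: the same records match under one rule but not the other
```

The point is not the particular rules but that any comparison of match rates only makes sense when the rules are held constant across pilots.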

Pilot authorities: overview and emerging issues


There were 22 data matching pilots testing various combinations of databases.

The complex nature and format of the data supplied to local authorities highlighted the need for good data management and analysis skills. Many of the pilots either had these skills available within the local authority or recruited additional staff using Cabinet Office funding. However, several did not and struggled to use the information provided. The process, as tested in these pilots, was labour intensive with significant work required to analyse the data. Those involved felt that the level of work required would not be sustainable in the future.

A number of pilot authorities were able to use locally-held data to interrogate the data received from the national databases. The results of this activity suggest that there is scope for more use to be made of local data, both to complement any future national data matching and to improve accuracy and completeness in general.

Data matching results


Department for Work and Pensions (DWP)
Eighteen pilots accessed the DWP Centric database. The level of match between the electoral registers and the DWP data varied significantly between local authorities. For those areas matching the whole register it ranged from 57.6% to 82.4%. These differences are partly due to different interpretations across the pilots, in the absence of a consistent framework, of what constitutes a match, but are also likely to be driven by differences between the local authorities in terms of demographics.

6,573 people were added to the registers as a result of follow-up activity undertaken using names suggested by the DWP Centric database. This was 13.2% of all names followed up. The response rate for the pilot follow-up was affected by its timing relative to the annual canvass. Where it took place during the canvass, the pilot response was depressed by the fact that many of the names identified by the match registered through the canvass.

The pilots highlighted crucial differences in address formats between the electoral registers and the other national databases. This meant that many records could not be matched because simple address differences were not recognised. This problem could have been significantly reduced if time had been allowed for an address cleansing exercise. The absence of a unique identifier attached to each address on the public national databases was a key issue for the pilots. Such identifiers would have allowed for a more straightforward matching process for local authorities.

Many of the potential new electors suggested by the match with the DWP Centric database proved to be based on out-of-date or incorrect information. The problems posed by this could have been reduced by the inclusion of the date when the DWP record was last changed, something which DWP were willing to provide but were not asked to do. The absence of nationality information meant that several pilot authorities conducted follow-up with people ineligible to register. However, the scope of this issue is not clear from these pilots and will have varied depending on the demographics of the local authority area.
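The address-format problem, and the effect an address cleansing exercise can have, can be sketched as follows. The abbreviation table and rules here are illustrative assumptions, not the cleansing software any pilot actually used:

```python
# Minimal address-cleansing sketch (assumed rules, hypothetical data):
# expanding common abbreviations and stripping punctuation lets records
# that differ only in formatting be matched.

ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue"}  # illustrative only

def cleanse(address):
    # Lower-case, drop punctuation, then expand known abbreviations token by token.
    tokens = "".join(c for c in address.lower() if c.isalnum() or c.isspace()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

# "1, High St." and "1 High Street" would fail an exact comparison,
# but cleanse to the same canonical form and so become matchable.
assert cleanse("1, High St.") == cleanse("1 High Street")
```

Real cleansing software handles far more variation (flat numbers, building names, postcodes), but even this simple normalisation shows why "simple address differences" defeated exact matching in the pilots.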

Driver and Vehicle Licensing Agency (DVLA)


Match levels between the DVLA Driver database and the electoral registers were lower than between the registers and the DWP Centric database, partly because the match process used stricter criteria. The match levels varied from 51.7% to 67.3%.

Although the match suggested a number of potential new electors, responses to follow-up activity indicated that this was more likely to reflect poor data currency than significant under-registration.

208 people were added to the register as a result of pilot follow-up activity. This was 4.1% of all names followed up. Many of the responses to follow-up activity indicated that the person written to was no longer resident, suggesting that the DVLA Driver database is not current. The DVLA data was more effective at targeting 16- and 17-year-olds than the population as a whole.

Education databases
There were very few registrations from data matching with the Student Loans Company (SLC) database. This, and responses to the follow-up activity, support the view expressed by the SLC that the data used for these pilots (taken at the end of the academic year) was sometimes out of date.

The National Pupil Database (NPD) and Individual Learner Record (ILR) proved effective at identifying attainers5 in these pilots. However, while the NPD and ILR identified attainers successfully, the majority of registrations were achieved through the annual canvass, which was taking place alongside the pilots, and not in response to follow-up activity through the pilots. Under IER, unlike in the current household system, individual attainers might need to complete their own form (rather than being registered by adults in the household). It is therefore possible that the number of registered attainers will fall, and the ability to use data to target them in this way may become a more useful tool for EROs in the future.

Ministry of Defence (MoD)


The MoD provided limited data for these pilots. They were able to confirm that existing service voters were still resident but not provide details of potential new service voters. They also provided details of addresses occupied by service personnel in the area but this excluded barracks.

5 An attainer is a 16- or 17-year-old who will reach voting age (18 years old) during the life of a current electoral register.

There was therefore no real prospect of addressing the completeness of service voter registrations in the pilot areas. Two pilots were able to use the MoD data to improve the accuracy of their registers, amending or deleting a number of their records (9.6% and 13.2% of the total number of service voters held on the register).

Citizens Account (CA)


The CA database is administered by the Improvement Service in Scotland and is intended to be a record of all residents within a participating local authority area. However, the CA database is not as comprehensive as the pilot authority originally anticipated: the total number of records provided by the CA represented only 27% of the Renfrewshire electorate. The level of match between the CA data and the electoral register was high, with 88.8% of the CA records also found on the register. The matching exercise suggested a small number of potential new electors (1.7% of the size of the register after local matching). Follow-up activity was still under way at the time of publication.

Pilot costs
The overall cost of the pilots is estimated at around £425,910, against an original budget of £1.2m. These figures exclude staff costs for the Cabinet Office. The under-spend is largely explained by initial budgeting over-estimates by both the Cabinet Office and the pilot authorities, due to a lack of clarity about what the pilot process would entail, and by many local authorities not completing some of the activities (e.g. follow-up work) which they originally planned for. Given the size of the over-estimates, it seems unlikely that the pilots would ever have cost the full amount budgeted.

The main item of expenditure reported by local authorities was the cost of additional staff, which accounted for about 50% of the total spent by local authorities. Local authorities reported that the process was labour intensive and that they needed to incur much of this cost before they could begin the process of contacting potential new electors. Staff costs could be reduced by improving the quality of the data matched and automating more of the process, but we cannot conclude, from the information gathered in these pilots, what the cost would be of any national data matching roll-out. There was some limited expenditure on databases that were not used in the pilots and so did not deliver any benefit.

While the costs of these pilots appear high in terms of the numbers of people added to the registers, this does not mean that data matching could not be cost effective if implemented differently. In order to assess the potential scalability of data matching, it would be necessary to have more consistent information than is available about the costs incurred, including the additional internal costs incurred by both local authorities and data-holding organisations.

Conclusions
Our conclusions broadly follow the statutory evaluation criteria set out in Sections 35 and 36 of the Political Parties and Elections Act 2009. These criteria concern the degree to which data matching schemes assisted EROs in improving the completeness and accuracy of their registers; resulted in any issues around administration, time and costs; or prompted objections to the schemes.

The registration objectives: completeness and accuracy


On the whole, these pilots did not prove very effective at getting new electors on to the registers. Despite the efforts invested by authorities in the data pilots, very few additions (only 7,917) were subsequently made to the registers. However, better results were achieved where the local authority was able to begin their pilot follow-up activity before, or at a very early stage of, their annual canvass. This was largely because, where the follow-up did not begin until later, many people had already registered through the canvass.

In these pilots, the most useful databases in terms of adding people to the registers were those which targeted specific under-registered groups (e.g. 16- and 17-year-olds), such as the National Pupil Database (NPD) and the Individual Learner Record (ILR). The issues surrounding the currency of address information on some of the other databases would need to be addressed in order to improve their effectiveness at finding new electors. However, the low number of registrations does not mean that the principle of data matching is not worth pursuing further, and many local authorities were clear that they still see potential in it. Refinements to the matching process, such as improvements to the currency, quality and compatibility of the data provided, would need to be in place before this objective could be fully tested.

In relation to improving the accuracy of the registers, the MoD data was useful, up to a point, in helping EROs to amend or delete the records of service voters. Two local authorities amended or deleted a number of their records, representing 9.6% and 13.2% of the total number of service voters held on the register. However, there was limited testing of the usefulness of the other databases for improving accuracy, with only one pilot providing information to the Commission on this aspect.

Finally, not all the public national databases included in the current scheme were tested to the same degree. As set out earlier in the report, there was no testing of HEFCE or Royal Mail data by the pilot authorities and we are unable to draw any conclusions about the usefulness of this data in addressing the registration objectives.

Objections to the schemes


At the outset there were concerns that the use of public data in this way could generate objections from the public. However, where data has been provided, local authorities indicated they received few objections to the schemes. Where local authorities did receive queries, the vast majority of people were content with the use of the data when the purposes of the schemes were explained to them. This indicates that the data matching pilots did not generate any substantial level of concern amongst the public. However, any future testing or roll out of data matching would need to be well implemented in order to ensure there is continued public support.

Ease of administration
Many pilots raised concerns that in its current format the process of data matching was too labour intensive for regular use. Additional staff resource was required by many of the authorities. In the main, this tended to be due to the large volumes of data received, issues with data compatibility and the workload involved in sorting the data for use. Many authorities also emphasised the need to understand the skill sets required for this kind of activity, and highlighted that in many cases these skills were not held by those currently working on registration activities. In the interim, developing the process of local data matching would not only be useful to EROs in maintaining the registers but would also help to build skills which could be used to understand and manipulate data provided from national databases.

Time and costs


The pilot schemes proved to be both time consuming and costly. However, it is not possible to draw robust conclusions about the long-term cost effectiveness of data matching from these pilots, as the processes used here would not be repeated in a nationwide system of data matching. Nevertheless, it is clear that unless the process is made substantially more straightforward, few authorities would have the resources available to undertake data matching without additional finance, which in this case was provided by the Cabinet Office.

Recommendations
This section sets out our recommendations for future data matching activities.

Pilot processes
Further testing of national databases by local authorities would need to be undertaken before data matching could be made available for use by all local authorities. Any further testing needs to be set up in a way that addresses the limitations set out in this report in order to ensure that meaningful data can be collated. The Electoral Commission would encourage the Government to consult us in detail in order to achieve this.


We recommend that any further piloting (with a focus on improving accuracy and completeness):

- takes place outside of the annual canvass period and avoids other significant electoral events. Piloting data matching alongside the annual canvass added a layer of complexity to the testing process and meant it was harder for local authorities to isolate the impact of the data matching as opposed to canvass response rates. It also had consequences for local authority capacity to utilise the data when it was available to them. Several EROs thought that data matching could be more useful following the canvass, to pick up new registrants in the run-up to elections.
- has a clear framework for the use of data that all participating authorities can follow. The current scheme allowed local authorities to adopt varying approaches to piloting the data they received. The differing methodologies meant it was harder to draw conclusions about the effectiveness of the data and thus the future of the registration system. A clear framework would help to ensure comparability between the pilots but still allow for some local differences, for example targeting particular groups and making use of local databases.
- tests, as closely as possible, the process which would be made available to all local authorities if data matching was to be rolled out nationally.
- ensures that participating areas are sufficiently staffed and have appropriate expertise to complete the pilot and test the data provided.
- allows for a better understanding of the benefits of access to national data compared to existing local databases.
- allows for a clearer analysis of the cost of data matching through more informed budgeting and prescribed reporting of costs incurred.
- ensures that good communication between the pilots, the data holders and the Cabinet Office is maintained throughout the process.

Databases
In relation to the specific databases included in these schemes:


There is merit in re-testing nearly all of the databases included in these pilots, provided the specific issues identified in this evaluation are addressed, namely that:

- address format compatibility issues should be mitigated where possible. The planned inclusion of Unique Property Reference Numbers (a unique identifier for each address held) on the DWP database will help with this issue, as will plans for a single national address file. Other mitigating steps could be taken for matches with other databases, for example using address cleansing software.
- data currency issues should be tackled by ensuring that, where possible, the information shared includes details of the dates on which database records were last updated.
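To illustrate the value of a shared Unique Property Reference Number, the sketch below (with invented UPRNs, names and record layouts) shows how a common address key reduces matching to a simple lookup and cleanly separates matched records from potential new electors:

```python
# Hypothetical data: with both datasets keyed on the same UPRN, matching
# needs no error-prone address string comparison at all.

register = {100012345: "J Smith", 100012346: "A Jones"}      # UPRN -> elector
dwp_records = {100012345: "J Smith", 100012347: "B Brown"}   # UPRN -> record holder

# Matched: same UPRN in both datasets and the names agree.
matched = {u for u in register if u in dwp_records and register[u] == dwp_records[u]}

# Potential new electors: UPRNs known to DWP but absent from the register.
potential_new = set(dwp_records) - set(register)
```

Here `matched` contains the one shared record and `potential_new` the one address on the external database but not the register; the same two-way split is what the pilots had to reconstruct laboriously from free-text addresses.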

We would not recommend further testing of the MoD data unless the range of data which can be shared is increased. While the data supplied in these schemes was useful for the pilot authorities, it was limited in scope.

Proposals for verifying identity


As outlined earlier, the Government is currently considering whether the results from the data matching exercise could be used to confirm the identity of individuals captured by the household canvass during the transition to IER. In relation to this, we recommend that:

- There is a need for more evidence to support this proposal, given that this was not an objective of these pilots. Any future piloting that includes this as an objective should allow for an analysis of matched and non-matched records in order to check the accuracy of the matching process used. It is possible that this analysis could use the annual canvass process. As a result, the timing of these pilots may need to differ slightly from that of any pilots focused on accuracy and completeness.
- These plans should also stay abreast of developments elsewhere in government. There are other initiatives within government on the processes that might be used in the future to verify identity. Learning lessons and adopting best practice from these other initiatives is important in order to ensure that the approach to verification followed under IER, and therefore the security of the registers, is as robust as possible.

1 Introduction
1.1 This report presents our statutory evaluation of the 2011 data matching pilot schemes. The schemes were based on the matching of electoral registers against a range of national databases by Electoral Registration Officers (EROs) in 22 local authorities. This was the first time that EROs have been able to test the usefulness of national data for improving the quality of their electoral registers.

1.2 The overall aim of the pilot schemes was for EROs to test whether national public databases can help to improve the accuracy and completeness of their electoral registers.

Background
Accuracy and completeness of the electoral registers
1.3 Electoral registers underpin elections by providing the list of those who are eligible to vote. Those not included on the registers cannot take part in elections. Registers are also used for other important civic purposes, including selecting people to undertake jury service, and calculating electorates to inform Parliamentary and local government boundary reviews, which are the basis for ensuring representative democracy. People not registered are therefore not counted for these purposes either.

1.4 In addition, credit reference agencies may purchase complete copies of electoral registers, which they use to confirm addresses supplied by applicants for bank accounts, credit cards, personal loans and mortgages.

1.5 Great Britain does not have one single electoral register. Rather, each local authority appoints an ERO who has responsibility for compiling an accurate and complete electoral register for their local area.

1.6 Accuracy: the accuracy of the electoral registers is a measure of the percentage of entries on the registers which relate to verified and eligible voters who are resident at that address. Inaccurate register entries may relate to entries which have become redundant (for example, due to people moving home), which are for people who are ineligible and have been included unintentionally, or which are fraudulent.

1.7 Completeness: completeness refers to the percentage of eligible people who are registered at their current address. The proportion of eligible people who are not included on the register at their current address constitutes the rate of under-registration.

1.8 Great Britain's electoral registers 20116 provided the first national estimates of the completeness of the electoral registers since estimates of the 2000 England and Wales registers, as well as the first national estimates of the accuracy of the registers since 1981. This study was funded by the Cabinet Office in order to inform the development of the approach to the introduction of individual electoral registration (IER).

1.9 The research estimated the April 2011 Parliamentary registers to be 82.3% complete; the comparable figure for the local government registers was 82.0%. This equates to approximately 8.5 million unregistered people in Great Britain as of April 2011. However, this does not mean that these registers should have had 8.5 million more entries, because many, but not all, of those not registered correctly may still have been represented on the registers by an inaccurate entry (for example, at a previous address).

1.10 The April 2011 Parliamentary registers were 85.5% accurate; the comparable figure for the local government registers was 85.4%.

1.11 The research also demonstrates the extent to which both the accuracy and completeness of the registers deteriorate between the publication of the registers in December each year and the time when elections are usually held in the following spring. Although in December 2010 the estimated number of people not registered in Great Britain was at least six million, by April 2011 the number had grown to around 8.5 million (17.7%).

Current system of updating the electoral registers


1.12 At present, EROs use an annual canvass and rolling registration to update their registers. Individual electors can register to vote throughout the year through rolling registration. However, most updates to the registers take place during the annual canvass, which is undertaken each autumn. At its simplest, the canvass involves delivering a registration form to each household and following up, via postal reminders and personal visits, those households who do not respond. Revised registers are then published on 1 December.

1.13 Almost all EROs use locally held data, such as council tax and housing records, to improve the effectiveness of their registration activity. However, EROs have not been able to make use of national databases in order to improve the quality of their local registers.

6 http://www.electoralcommission.org.uk/__data/assets/pdf_file/0007/145366/Great-Britainselectoral-registers-2011.pdf

Data matching and the move to individual electoral registration


1.14 The previous UK Government, during the passage of the Political Parties and Elections Act 2009 (PPE Act), introduced legislation providing for the phased introduction of individual electoral registration (IER) in Great Britain. The PPE Act made provision for IER to be introduced in accordance with a statutory timetable. The PPE Act also included provisions to allow data matching pilot schemes to be carried out, with a view to establishing which national public databases might be useful to EROs in helping maintain electoral registers during the transition to IER.

1.15 Under the PPE Act, data matching schemes approved by the Secretary of State would require a public or local authority to supply an ERO with data which they could then use for the purpose of maintaining complete and accurate registers.

1.16 In June 2011 the Coalition Government published a White Paper setting out its plans to speed up the implementation of IER in Great Britain. The new system, to be implemented from 2014, will require each elector to register individually (unlike the current system, where registration takes place predominantly by household) and to supply personal information for verification purposes prior to names being added to the electoral register.

1.17 The IER White Paper explained that the UK Government would explore the use of data matching to identify people eligible to vote but missing from the register, so that EROs could invite them to register.7 If successful, the Government indicated that it would look at how data matching used in this way could be extended across the country and support the move to IER.

7 HM Government (2011) Individual Electoral Registration, Cm 8108, p11.


1.18 In February 2012 the UK Government published its response to pre-legislative scrutiny and public consultation on the IER White Paper.8 In the response, the UK Government indicated its intention, subject to the results of the evaluation of pilot schemes and further testing, to widen the scope of data matching. The UK Government indicated that, rather than only using data matching to identify potential electors, it was now their intention that the names and addresses of all individuals currently on an electoral register will be matched against the data held by public bodies such as the DWP and local authorities themselves. Where information can be matched, the individual will be automatically placed onto the new IER register and would not need to take any further action to be registered.9 Electors whose details could not be matched in this way would be asked to apply individually and to supply personal identifiers.

1.19 The UK Government has acknowledged that this would represent a significant change to the position set out in the White Paper, which envisaged all potential electors applying individually and supplying personal identifiers, with data matching used as a means of identifying potential electors. It stated its intention to have an efficient and effective system ready in time to support the implementation of IER.10

The Electoral Registration Data Schemes Order 2011


1.20 The Electoral Registration Data Schemes Order 2011 (the 2011 Order), made on 9 June 2011, gave effect to proposals by local authorities to run data matching schemes. Under the 2011 Order, an agreement between the data-holding organisation and the ERO needed to be in place before personal data could be shared between the two parties. The purpose of the agreement was to explain governance arrangements for data transfer and matching, explain the expected outputs and inputs for this process, set out information security standards, and detail timescales.

1.21 The Cabinet Office was responsible for the selection and coordination of the schemes. The process for recruiting local authorities to run pilots was by open application, with the Government wanting to see how people responded to the idea of using national databases to help maintain the electoral register.

8 HM Government (2012) Government Response to pre-legislative scrutiny and public consultation on Individual Electoral Registration and amendments to Electoral Administration law, Cm 8245.
9 Ibid.
10 Ibid.

Aims and objectives of pilots


1.22 The overall aim of the pilots was for EROs to test whether public databases can be useful for improving the accuracy and completeness of their electoral registers. However, in practice, the majority of pilots were more focused on completeness (finding people eligible to vote but missing from the register) than they were on accuracy (finding and removing inaccurate entries on the register).

1.23 The detailed objectives of the schemes varied due to the open application process and lack of a common framework. Each authority submitted a proposal on how they would undertake a data matching exercise based on particular challenges in their area. These proposals varied in terms of both scale and focus. For example, some pilots matched their whole register with the available data while others targeted particular wards with historically low response rates to the annual canvass. Some areas were particularly focused on certain demographic groups, e.g. attainers or the over-70s, while others looked at all residents. The objectives of individual pilot schemes are examined in more detail later in this report.

Role of the Commission


1.24 The Commission was given a statutory responsibility to report on the effectiveness of the data matching schemes. The approach we have adopted is based on the requirements for an evaluation set out in Sections 35 and 36 of the PPE Act.

1.25 The PPE Act requires our evaluation to include:

- a description of the scheme
- an assessment of the extent to which the scheme assists the ERO in meeting the registration objectives,11 which are:
  - that persons who are entitled to be registered on a register are registered on it
  - that persons who are not entitled to be registered on a register are not registered on it, and
  - that none of the information relating to a registered person that appears on a register or other record kept by a registration officer is false
- whether there was an objection to the scheme
- how easy the scheme was to administer
- the extent to which the scheme resulted in savings of time and costs, or the opposite
- anything else specified in the order under Section 35. The 2011 Order did not specify any further matters.

11 Registration objectives are set out in Section 31.8 of the PPE Act 2009.

Our approach
1.26 Our approach to the evaluation has been based on our statutory responsibilities outlined above. We have assessed:

- The administration of the pilots: the way the schemes were run, any difficulties experienced or lessons learned by local authorities, data-holders and other organisations involved, and the objections to the scheme.
- Data quality: the potential for data matching to improve the registration process.
- Resources: resources and skills necessary for administering the pilots, their costs and the extent to which data matching can result in cost and time savings.

1.27 We worked with the Cabinet Office during the set-up of the schemes with the aim of allowing for an effective evaluation of each pilot as well as the schemes as a whole. We emphasised in particular the desirability of consistency across key components of the schemes (methodology, matching and follow-up process), in order that the findings could be compared across areas and across databases. However, the final decisions made by the Cabinet Office did not always reflect the advice given and no clear, common framework for the pilots was established.


1.28 Together with the Cabinet Office, we monitored the work of the participating local authorities throughout the running of the schemes and were in contact with the authorities to provide assistance and address issues.

1.29 The evaluation is based on a range of qualitative and quantitative data collected before, during and at the end of the process. Data and other evidence were collected from:

- Questionnaires from local authorities: each authority submitted a proposal before the start of the pilots which outlined their objectives and their approach to delivering the scheme.
- Data from local authorities: we designed a template, with the input of the Cabinet Office, for collecting data from the local authorities about the various databases and the results from the follow-up activities. Local authorities were asked to submit interim data (between August and October) and a final return with all results by 14 December 2011. However, not all authorities met this deadline or provided data in the format requested.
- Evaluation reports from local authorities: all authorities were required to submit an evaluation of their pilot by 23 December 2011 using a template designed by us and the Cabinet Office. The reports cover the key areas of the evaluation.
- Interviews with local authorities: we conducted individual interviews with each participating local authority between the end of October 2011 and the beginning of January 2012.
- Interviews with data-holders and software suppliers: we also conducted interviews with those organisations that hold the datasets being tested in the pilots and software suppliers who had assisted local authorities with the data.
- Regular contact with the Cabinet Office: we liaised closely with the Cabinet Office throughout the project and were part of a Registration Improvements Board, which monitored the progress of the pilots.

This report
1.30 This report considers the effectiveness of the data matching schemes in improving the accuracy and completeness of the electoral registers.


1.31 The remainder of this report is divided into the following:

- Chapter 2 summarises the set-up and coordination of the pilot schemes by the Cabinet Office, including details of the selection process and issues relating to the timing of the pilots.
- Chapter 3 summarises the national databases included in the pilot schemes.
- Chapter 4 sets out details of each specific pilot area and issues encountered by the pilots in delivering the schemes.
- Chapters 5, 6, 7, 8 and 9 set out the key data, provided by the local authority pilots, for each of the national databases accessed. These chapters review the quality of data returned to each local authority and the usefulness of that data in meeting the registration objectives set out for the schemes.
- Chapter 10 summarises the costs of the data matching schemes.
- Chapter 11 summarises the key findings and recommendations.


2 Set-up and coordination


2.1 This chapter sets out how the pilots were set up and coordinated by the Cabinet Office. It also considers the impact of the approach to the management of the pilots on the findings of the evaluation.

Key points

- The Cabinet Office managed the overall pilot process and was also involved in the delivery of the pilots. The Commission advised the Cabinet Office throughout the set-up period, in particular on the need for a clear, common framework for delivering the pilots, but the final decisions on the processes were taken by the Cabinet Office.
- The open application and selection process used for the pilots, and the absence of a clear, common framework, contrary to our advice, led to significant variation in the planned approaches of the pilots. This introduced challenges for the evaluation in comparing the results of the different schemes and therefore the ability to draw consistent conclusions. This also created challenges for local authorities delivering the pilots because in several cases the methodology they originally planned to use proved to be based on incorrect assumptions.
- The timing of the pilots, which took place alongside the annual canvass, coupled with delays to the process, put pressure on the capacity of the local authority teams involved and added to the difficulty in this evaluation of drawing firm conclusions from the pilot schemes as a whole.
- Local authorities reported varying levels of communication with the Cabinet Office and identified areas for improvement, including a better understanding of what data was going to be shared with them.
- The pilots did not follow processes, in terms of the IT systems and matching arrangements, which would be used for nationwide data matching. The evaluation cannot therefore draw conclusions about how the costs of these pilots would translate to a national roll-out.


Overview
2.2 The Cabinet Office's role encompassed:

- design of the pilot framework, including drafting of secondary legislation that set out how the pilots were to operate and when the pilots were to be undertaken
- issuing to all local authorities an invitation to participate, and selecting which areas were to take part in the scheme
- overseeing the delivery of the pilots by local authorities
- negotiating with data-holders to allow for matching to take place
- ensuring appropriate confidentiality and data security agreements were in place with participating areas and data-holders
- developing the data matching process for some databases, overseeing the match with the register
- providing funding for the scheme and overseeing payments to data-holding organisations and local authorities

2.3 There are several aspects of the set-up and management of the schemes which introduced challenges for local authorities delivering the pilots. They have also made it more difficult for our evaluation to draw clear conclusions on the success of the pilots. These are considered below.

Selection of the pilot schemes


2.4 The Cabinet Office issued an invitation, in September 2010, to all local authorities in Great Britain to pilot data matching. To participate, authorities were required to submit a proposal outlining their objectives for data matching, how they would deliver the scheme, and estimated costs. Each of the authorities then provided further information about the proposed delivery of their pilot in order to inform the selection process.

2.5 The Cabinet Office assessed the ability of the authorities to meet the requirements of the scheme and selected participants based on the quality of their application, taking into account the demographic groups they wanted to target, any innovative ideas they proposed and the estimated budget for the activity. The geographic spread of the final group of selected authorities was also considered. The final group of pilots was chosen in January 2011 and the statutory instrument for the schemes was confirmed in June 2011.

Variable methodologies
2.6 As noted, local authorities were encouraged to submit their own proposals and suggestions as to how data matching might work in their area. The Cabinet Office's rationale for the open application process was to allow local authorities to identify ways in which data matching might help them to address the particular challenges or target audiences relevant to their local area.

2.7 While there are advantages to encouraging ideas and innovative approaches from local authorities, we consistently stressed, in our advice to the Cabinet Office, the need for a clear framework for the pilots, which would provide consistency in delivery and therefore allow for an effective evaluation. We also formally raised this need as part of our response to the Cabinet Office consultation on the Electoral Registration Data Schemes Order 2011 and the Representation of the People (Electoral Registration Data Schemes) Regulations 2011.12

2.8 However, no specific instructions were given to local authorities about how to implement the schemes, and no clear framework was put in place to ensure consistent delivery, although some support was available from the Cabinet Office.

2.9 The absence of a clear framework for delivery meant that a wide variety of approaches were adopted for implementing pilots, and this wide variation has made it more difficult to draw clear comparisons in this evaluation. For example, authorities differed in how they treated the match scores in the data returned to them (for more information see Chapter 5). A register entry which was matched against an entry on the Department for Work and Pensions (DWP) Centric database would be scored between 10 and 100 depending on the exact nature of the match. Some areas chose to treat all scores above 55 as a match while others chose all scores above 80. The Cabinet Office did not attempt to impose any standardisation of approach. This has implications when comparing the quality of the results across local authorities.
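The effect of the differing cut-offs can be illustrated with a short sketch. The scores below are invented for illustration and are not drawn from pilot data; the point is only that two authorities applying different thresholds to the same file would report different numbers of "matched" records.

```python
# Illustrative only: shows how the choice of cut-off score changes
# which records are treated as "matched". Scores are invented.
match_scores = [100, 99, 94, 90, 80, 65, 60, 50, 45, 20, 10]

def matched(scores, threshold):
    """Return the records at or above a given match-score threshold."""
    return [s for s in scores if s >= threshold]

# An authority using a cut-off of 55 accepts more records as matches
# than one using 80, so their headline match rates are not comparable.
print(len(matched(match_scores, 55)))  # 7 records treated as matched
print(len(matched(match_scores, 80)))  # 5 records treated as matched
```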

12 www.electoralcommission.org.uk/__data/assets/pdf_file/0011/117695/ElectoralCommission-consultation-response-Data-matching-SI.pdf


2.10 The approach adopted for contacting people identified as a result of data matching varied across authorities, involving either one or more letters to names identified, or one or more visits by canvassers to the addresses associated with those names, or a combination of both letters and visits. The variety of approaches taken complicates the analysis of the results, as a high response rate in one pilot may have more to do with the use of canvassers than the quality of the data for that area.

2.11 Finally, the open nature of the application process meant that the initial proposals also often made assumptions about certain processes or criteria being in place for delivering the pilot schemes. Consequently, when some of these assumptions proved to be incorrect, authorities struggled to deliver the data matching scheme. As one local authority set out in their evaluation report: 'Maybe for future work, the Cabinet Office needs to be a little more prescriptive on the processes and outcomes it requires.'

Timing
2.12 The data matching schemes had originally been due to commence in June 2011 with all activities (including evaluations) to be completed by September 2011. During this set-up period, we emphasised the importance of avoiding significant overlap with the annual canvass.

2.13 However, the Cabinet Office decided to allow pilot activity to continue until the end of November 2011 with evaluations taking place afterwards. In addition, there were delays at the outset that compounded the problem. The authorities had expected to receive the data in late June 2011. However, due to delays in ensuring all the necessary technical arrangements and data access agreements were in place, the matching of the registers did not commence until July or August for most authorities. These delays meant that a number of authorities had to adapt their approach to testing the data because they no longer had the resources available to manage the process, or because they had anticipated contacting residents in advance of their canvass beginning but were no longer able to do so. One local authority commented: 'Slipping of the timetable made it impossible to complete the pilot as it was first intended.'

2.14 Running the pilots alongside the annual canvass added a complicating factor both for the delivery of the pilot schemes and also for assessing the value of data matching. It also had an impact on the capacity and resources of authorities to use the data returned to them (these issues are considered further in Chapter 4). It was in anticipation of these problems that we raised concerns during the initial planning phase about the proposed timing of the schemes.

Control groups
2.15 For most areas it was not therefore feasible to contact local residents before the annual canvass had begun across their area. To address this issue, we encouraged pilots to create control groups of names identified from the national data, where no dedicated follow-up would take place and the names would subsequently be tracked in the annual canvass.

2.16 This was intended to determine how many would have been registered anyway in the absence of the pilot. However, not all the authorities were able to put in place a clear process for separating out the canvass from the data matching activities, and often people identified to be followed up by letter were found to have already registered through the canvass.

2.17 For the purposes of this evaluation this means that data on the response rates for those names followed up by pilot authorities has to be viewed in the context of how the authority was able to manage the two processes of the pilot and the annual canvass.

2.18 As the example below shows, in many cases the fact that people were registering through the canvass depressed the response rate to letters issued through the pilot process.


Effect of the canvass on pilot response rates

The matching process suggests 500 names that appear to be resident in the area (because they appear on another database) but are not found on the electoral register. The canvass has already begun by the time the authority is in a position to write to these individuals, and when the 500 names are checked against canvass returns 150 are found to have registered already. The pilot can therefore only write to the remaining 350 names.

From the 350 letters issued, 50 respondents register to vote, equating to a 14% response. There are fewer responses because it is very likely that there will be proportionately more incorrect names, ineligible people or people less likely to register among the 350 than among the original 500. This is mainly because 150 people who are resident, eligible and interested have already been removed.

But if all 150 had responded to the letter as they did to the canvass form, the response rate would have been 40%, and even if only half (75) had responded it would still have been notably higher at 25%.
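The arithmetic in the example can be reproduced directly. The figures below are those used in the illustration above, not pilot data:

```python
# Reproduces the illustrative figures above: 500 names suggested by
# matching, 150 already registered via the canvass before letters
# could be issued.
suggested = 500
already_registered = 150
letters_sent = suggested - already_registered      # 350
responses = 50

pilot_rate = responses / letters_sent              # 50/350, about 14%
# Counterfactual rates had the canvass registrations instead come
# through the pilot letters:
rate_if_all = (responses + already_registered) / suggested        # 40%
rate_if_half = (responses + already_registered // 2) / suggested  # 25%

print(f"{pilot_rate:.0%} {rate_if_all:.0%} {rate_if_half:.0%}")  # 14% 40% 25%
```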

Communication
2.19 The Cabinet Office had intended to run monthly meetings with the pilot areas. While some meetings took place, they were less frequent and more sporadic than had originally been anticipated. Notwithstanding this, the Cabinet Office also made themselves available to local areas to discuss issues and this was noted by several authorities. For example, one authority reported: 'Throughout the project general communication with the Cabinet Office and the provision of update information was effective.'

2.20 However, some areas also commented that in the immediate run-up to the matching taking place they noticed decreasing contact from the Cabinet Office. They did not feel fully informed about changes to the process and in some instances noted that queries went unanswered.

2.21 For example, they had expected that the data returned from the DWP would include unique property reference numbers to ensure that addresses on their electoral register could be found on the DWP database. They had also thought that the data would include dates of record changes. Neither of these elements was included in the data returned. Some of these things were crucial to the effectiveness of the pilots and are discussed further below.

2.22 Several pilot authorities also indicated that they did not know what the format or layout of the data matching results would be before they were sent to them. Practically, these issues meant there was a period of confusion among several pilots when they initially received the results of the matching activity.

2.23 Several pilot authorities and the DWP felt that it would have been beneficial to have had more direct communication, rather than always using the Cabinet Office as a go-between. This may have helped to ensure the pilots were more up to date about the process and in a better position to interpret the outputs from the matching process.

Scalability
2.24 The technical matching processes and the IT systems used in these pilots could not be scaled up and rolled out across Great Britain. For example, data files were sent to and from local authorities by email, with some matching carried out by DWP directly and some by a team within the Cabinet Office (see Chapter 3 for more details). The approach worked for this limited number of pilots but would not be sustainable for every local authority in Great Britain.

2.25 This also has an impact on the analysis of the costs of these pilots as the individual budgets relate to processes which would not be replicated. As a result this evaluation can make only limited comment on the value for money of data matching.

2.26 Nonetheless, running the data matching schemes has allowed for a greater understanding of the requirements of any framework for national data matching.


3 Databases and the matching process


3.1 This chapter considers the national public databases that were included in the pilot schemes. As noted above, the Cabinet Office arranged for access to a range of public databases through discussions with the relevant data-holding organisations. The databases that could be accessed by each pilot were then set out in the statutory instrument for the schemes.13

Key points

- Ten databases were due to be tested as part of the scheme. However, not all were tested to the same extent. In particular, there were difficulties in accessing Higher Education Funding Council for England (HEFCE) and Royal Mail data.
- A key component of the trials was the matching process between databases to identify people to invite to register to vote. The processes employed could not be rolled out nationally but do allow for a greater understanding of the requirements of any framework for national data matching.
- Two different processes were used for matching the electoral registers: one for the Department for Work and Pensions (DWP) data, and another for the Driver and Vehicle Licensing Agency (DVLA) and education databases. These different rules make it difficult to compare the results from the two processes.

Overview of databases
3.2 Broadly, the databases fall into two groups: those with wide coverage of the adult population, principally the DWP Centric database (everyone with a national insurance number) and the DVLA driver database (the Department for Transport estimated that in 2010, 80% of men and 66% of women had a driving licence); and those covering specific groups, namely the education and Ministry of Defence databases.

3.3 Table 1 sets out which databases were included in the statutory instrument. It also sets out the coverage of each database and a brief overview of how they are updated. While each database contains different information, the pilots only accessed the specific fields needed to match to the electoral registers: name, full address and, in some cases, date of birth (so although, for example, the Centric database includes national insurance numbers, this information was not included in the data supplied to local authorities).

13 www.cabinetoffice.gov.uk/sites/default/files/resources/schemes-order-draft.pdf

Access to the data


3.4 Between them, the participating authorities were due to test all the databases included in Table 1. However, there were some difficulties in accessing some of the databases, which meant that authorities were not able to use them as had originally been anticipated.

HEFCE data

3.5 HEFCE decided not to provide the data directly to local authorities, instead restricting access to a computer screen at the Cabinet Office. This meant that local authorities could not adequately compare the data against their registers or locally held data. It also prevented them from testing the quality of the data through contacting any of the names on the HEFCE database but not on the register.

Royal Mail data

3.6 There were delays in Royal Mail agreeing and signing the Article 4 agreement which was required before data could be transferred. By the time that the legal agreements were in place only one pilot was still interested in the data (Colchester). The data was therefore matched but was only available to be sent to Colchester on 30 November, when the staff at the local authority were participating in strike action. The data was therefore not sent to Colchester, although they would not have been able to make significant use of it at that point anyway.


Table 1: Databases – key information

Department for Work and Pensions (DWP)
- Database: Centric
- Coverage of pilot data: all those with either a national insurance number or a child reference number
- Updates: information is updated daily by a range of sources including benefits offices, pension providers and employers

Driver and Vehicle Licensing Agency (DVLA)
- Database: Driver database
- Coverage of pilot data: all those holding a provisional or full driving licence
- Updates: driver details are updated online or by form when the driver provides the information

Department for Education (DfE)
- Database: National Pupil Database (NPD)
- Coverage of pilot data: all pupils in state or partially state-funded schools in England
- Updates: information is collected annually from each school via the relevant local authorities

Department for Business, Innovation and Skills (BIS)
- Database: Individual Learner Record (ILR)
- Coverage of pilot data: all learners at state-funded further education institutes
- Updates: information is collected at set points during the year

Student Loans Company (SLC)
- Database: Customer database
- Coverage of pilot data: all current students with a loan or grant
- Updates: student initiated; details are updated online, by phone or by form

Ministry of Defence (MoD)
- Databases: Joint Personnel Administration; Anite housing database
- Coverage of pilot data: all service voters (Joint Personnel Administration); all addresses classed as service family accommodation (Anite housing database)
- Updates: ad hoc updates by individual service voters (Joint Personnel Administration); centrally managed by Anite (Anite housing database)

Table 1: Databases – key information (continued)

Improvement Service14
- Database: Citizens Account
- Coverage of pilot data: all individuals who chose to maintain an electronic record
- Updates: ad hoc updates by individuals and updates linked from other sources (where consent has been given)

Higher Education Funding Council for England (HEFCE)
- Database: Higher Education Statistics Agency (HESA) individualised student record
- Coverage of pilot data: all students at state-funded higher education institutions
- Updates: information is collected annually from higher education institutions

Royal Mail
- Database: Change of Address
- Coverage of pilot data: all those who register their change of address with the Royal Mail
- Updates: information is provided directly by home movers close to the time they move house

14 The Improvement Service is a partnership between the Convention of Scottish Local Authorities (COSLA) and the Society of Local Authority Chief Executives (SOLACE). It is a company limited by guarantee.

The matching process


3.7 The first step in the process was for participating areas to provide their electoral registers (either to the data-holding organisation or to the Cabinet Office) for matching. The matched data was then returned to the local authorities, who used the data to decide who to contact to register. In practice this meant that, following interrogation, local authorities followed up names found on the national databases and not on their register. Figure 1 below illustrates how the process of data matching broadly worked.

Figure 1: The pilot process

1. All or some of electoral register extracted by pilot authority.
2. Sent as an encrypted ZIP file by secure email to DWP/Cabinet Office or MoD.
3. Match file produced detailing results of process.
4. Sent as an encrypted ZIP file by secure email to relevant pilot authority.
5. Results checked against local data sources for an additional level of check.
6. Identify names to be followed up, either to encourage registration or to query validity of existing registration.
7. Follow-up activity, e.g. issuing letters or sending canvassers.
8. Responses resulting in new registration, deletion, amendment or no action.
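At its core, the interrogation step of the process is a comparison of two name-and-address lists. The minimal sketch below uses invented records and a deliberately simplified exact-match rule; the pilots themselves used fuzzy matching on name, address and postcode:

```python
# Simplified sketch: find people on a national database who do not
# appear on the electoral register, as candidates for follow-up.
# All records are invented, and exact tuple equality stands in for
# the fuzzy matching actually used in the pilots.
register = [
    ("ANNE SMITH", "1 HIGH ST", "AB1 2CD"),
    ("JOHN JONES", "2 HIGH ST", "AB1 2CD"),
]
national_db = [
    ("ANNE SMITH", "1 HIGH ST", "AB1 2CD"),   # matched: no action
    ("MARY BROWN", "3 HIGH ST", "AB1 2CD"),   # unmatched: follow up
]

register_keys = set(register)
follow_up = [rec for rec in national_db if rec not in register_keys]
print(follow_up)  # [('MARY BROWN', '3 HIGH ST', 'AB1 2CD')]
```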

3.8 The match file returned to each authority contained their register entries alongside the national database records they had been matched against, with the accompanying match score (see below for further information). It also contained those records which did not match either register entries or national database entries.

Data transfer arrangements

3.9 Throughout these pilot schemes data was transferred as attachments by secure email. However, one data-holding organisation stressed to us that this was not their preferred method for sending sensitive data and that their future involvement in any further piloting would be at least partially dependent on more robust data transfer processes being put in place.

3.10 In addition, the use of email attachments led, in one instance, to the match file for one local authority (containing electoral register entries and data from one national database) being returned to another. In this case the mistake was swiftly identified and the local authority that wrongly received the data deleted the file. However, it is clearly important that any future data matching system (potentially involving hundreds of local authorities) avoids such errors.

Variation in matching processes


3.11 Although Figure 1 provides a generic step-by-step guide to the data piloting process, there were three separate processes in relation to the accessing and matching of the registers to the different databases:

- For the match with data from the DWP Centric database, the matching process was carried out by the DWP and the results provided to the local authorities.
- For the match with the MoD data, the matching of personnel records to the register was completed by the MoD and the results provided to local authorities.
- The matching with all other databases was carried out by staff within the Cabinet Office and the results were returned to authorities in a single file.

Matching process for the Department for Work and Pensions

3.12 The process used for matching against the DWP Centric database was a new, previously untested approach, designed by the Cabinet Office. It used the first name (F), surname (S), first line of address (A) and the postcode (P) from register entries in order to match them against the DWP Centric database. Each element could be matched either exactly or 'fuzzily': fuzzy matches cover examples where there is a small difference in spelling, where one name sounds like another or where one name includes another.

3.13 A score was assigned to a match depending on the interaction of the four variables and whether they were exactly or fuzzily matched. In the list below fuzzy matches are denoted by an apostrophe. So, for example, a score of 80 would be awarded for a fuzzy match first name and surname and an exact match postcode and first line of address (F'S'PA). Within each group of elements, lower scores reflect fuzzy rather than exact matches on one or more of the elements:

FSPA = 100 / 99 / 95 / 94
FSA = 90 / 85
FSP = 65 / 60
FPA = 50
PA = 45 / 40 / 20 / 10
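The scoring idea can be sketched in code. This is illustrative only: the text above confirms just two anchor points (100 for an exact match on all four elements, 80 for fuzzy first name and surname with exact postcode and address line); the DWP's actual fuzzy-match rules were not published, so the use of difflib, the 0.8 cutoff, and the collapsing of all other combinations to zero are assumptions of this sketch:

```python
import difflib

def fuzzy(a, b, cutoff=0.8):
    """Illustrative fuzzy comparison; the DWP's actual rules differ."""
    return difflib.SequenceMatcher(None, a, b).ratio() >= cutoff

def match_score(reg, dwp):
    """Score a register entry against a DWP record on
    (first name, surname, first address line, postcode).
    Only the 100 (FSPA) and 80 (F'S'PA) cases confirmed in the text
    are implemented; all other combinations collapse to 0 here."""
    exact = [r == d for r, d in zip(reg, dwp)]
    if all(exact):
        return 100  # FSPA: exact match on all four elements
    first_ok, sur_ok = fuzzy(reg[0], dwp[0]), fuzzy(reg[1], dwp[1])
    if first_ok and sur_ok and exact[2] and exact[3]:
        return 80   # F'S'PA: fuzzy names, exact address line and postcode
    return 0

entry = ("WILLIAM", "SMITH", "1 HIGH ST", "AB1 2CD")
print(match_score(entry, ("WILLIAM", "SMITH", "1 HIGH ST", "AB1 2CD")))  # 100
print(match_score(entry, ("WILIAM", "SMYTH", "1 HIGH ST", "AB1 2CD")))   # 80
```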

3.14 The matching process used for all the other databases, apart from those owned by the MoD, was explained by the Cabinet Office as follows: 'The matching process used for all the other databases (apart from MoD) was based on a complex matching process contained in an IBM proprietary product (IBM was commissioned to provide the central hub services). This approach either marked each record as unmatched or gave it a score ranging from 81 to 118. The matching algorithm used in this process was very sophisticated but (unlike at DWP) tended only to identify firm matches in the great majority of cases.'

Impact of variation

3.15 For the purposes of evaluating the comparative strengths and weaknesses of these databases in updating the electoral registers, the use of several processes was not ideal. It also added to confusion among the local authorities over how to interpret the data and hampered attempts to cross-reference information provided by the DWP with information from other databases.

3.16 However, the most important difference was that the process used for the DWP Centric match was less strict than that used for matching against the DVLA and education databases. As a result matches against the DVLA, for example, which would have matched (at least partially) through the DWP process were not counted as matched for DVLA. This has clear implications for the comparability of the results from the DWP match and the other databases.


4 Pilot authorities: overview and emerging issues


4.1 This chapter sets out details of each of the pilot schemes in terms of the databases they accessed and the groups or areas they targeted. It goes on to consider some of the key issues identified by the local authorities in the delivery of the data matching pilots.

Key points

- There were 22 data matching pilots testing various combinations of databases.
- The complex nature and format of the data returned to local authorities highlighted the need for good data management and analysis skills. Many of the pilots either had these skills available within the local authority or recruited additional staff using Cabinet Office funding. However, several did not and struggled to use the information provided.
- The process, as tested in these pilots, was labour intensive with significant work required to analyse the data. Those involved felt that the level of work required would not be sustainable in the future.
- A number of pilot authorities were able to use locally-held data to interrogate the data received from the national databases. The results of this activity suggest that there is scope for more use to be made of local data both to complement any future national data matching and to improve accuracy and completeness in general.

Overview
4.2 Twenty-two local authorities were selected by the Cabinet Office to take part in the data matching schemes. Table 2 provides the full list of each participating authority and which databases they planned to access. It also outlines whether they matched their full register or part of their register, and which groups they were targeting as part of the data matching scheme.

4.3 These differences should be remembered when considering the results from each pilot. In addition, some local authorities opted to conduct a targeted follow up either in specific areas or with specific groups, while others followed up with a random sample of names from across their area. The results from these different exercises are not, therefore, always comparable.

4.4 Further detail on each pilot's approach and results is provided in the profiles in Appendix A.


Table 2: Data matching pilots overview

Local authority | Database(s) requested | Target groups | Area
Blackpool | DWP Centric, NPD, Royal Mail, ILR, HEFCE, DVLA | Empty properties in low responding areas | Six electoral wards
Camden | DWP Centric, ILR, SLC, NPD, HEFCE | Students, young people and the mobile population | Whole register
Colchester | DWP Centric, SLC, MoD, Royal Mail | General under-registered and service personnel | Whole register
Forest Heath | DWP Centric | Young people and the mobile population | Whole register
Forest of Dean | DWP Centric, DVLA, ILR, NPD, HEFCE | Attainers | Whole register
Glasgow | DWP Centric, DVLA, SLC | Students, young people and the mobile population | Two electoral wards
Greenwich | DWP Centric, DVLA, ILR, NPD, HEFCE, MoD | Young people, BME groups and those under-registered for financial reasons | Whole register
Lothian | DWP Centric | General under-registered | Whole register
Manchester | DWP Centric, DVLA, SLC | Empty properties, students and BME groups | Whole register
Newham | DWP Centric | Young people and the mobile population | Whole register
Peterborough | DWP Centric | Seasonal workers and those living in houses of multiple occupation (HMOs) | One electoral ward
Renfrewshire | Citizen Account | General under-registered | Whole register
Rushmoor | MoD | Service personnel | Service voters list
Shropshire | MoD | Service personnel | Service voters list
Southwark | DWP Centric, Royal Mail | General under-registered | Three electoral wards


Table 2: Data matching pilots overview (continued)

Local authority | Database(s) requested | Target groups | Area
Stratford-on-Avon | DWP Centric, MoD | Attainers, over-70s and Service personnel | Whole register
Sunderland | DWP Centric, ILR, NPD, SLC, HEFCE | Students and BME groups | One electoral ward
Teignbridge | DWP Centric, DVLA | Attainers, young people and the mobile population | Wards with 5%+ non-response to canvass
Tower Hamlets | DWP Centric, ILR, NPD, HEFCE | General under-registered | Whole register
Wigan | DWP Centric, DVLA | Students and young people | Whole register
Wiltshire | MoD | Service personnel | Service voters list
Wolverhampton | DWP Centric, NPD, ILR, HEFCE | Attainers, young people, BME groups and the mobile population | Whole register


Emerging issues
4.5 There are several key evaluation findings which relate to the different skills, capacity and experience of the local authorities involved in the pilots.

Skills
4.6 There was significant variation between the pilots in terms of the skills available within the local authority as a whole. Each pilot was generally led by electoral administrators, who used support available to them within their team, within the local authority or externally.

4.7 The pilot authorities divided into three groups with regard to how they managed the data:

- Those who managed the pilot within their existing electoral services team and who had access to existing data management or IT expertise within the wider local authority
- Those who intended to manage the pilot within their existing electoral services team with no dedicated local authority data handling team
- Those who used pilot funding to recruit additional, temporary staff for the purposes of data analysis and data management

4.8 Those authorities with data management support were able to interrogate the information provided to a much greater extent, while others were unable to do so and used the data as it was provided. In the absence of data analysis skills, and with some authorities receiving hundreds of thousands of lines of data, there was a clear need to automate some of the process. In extreme cases the electoral services team were simply overwhelmed by the volume of data provided and, in the absence of data management skills, could do little with the information. For example, one area explained:

'Because of the unexpectedly large numbers of apparently probable new identities found in the data matching, and the quantity and difficulty of dealing with such large volumes of data, it was decided to limit the number of potential electors…'


4.9 Another area noted:

'Some of the technical issues relating to local management of received data, and conversion to a form that could be used for our purposes, needed considerable time to deal with, and suggests there is a need to develop a range of data management knowledge and skills not available within the current local electoral services team.'

4.10 In a couple of pilots, as a result of delays or a lack of skills and capacity, no follow-up activity was undertaken even where data matching identified people potentially eligible but not on the registers.

4.11 Electoral administrators in the authorities with good data management support were also clear that they could not have coped with this volume of data in the absence of that support as they do not have the skills themselves.

4.12 Although any future roll-out of data matching across the country would not follow the process used in these pilots, any process which requires electoral administrators to manipulate and analyse data would require a change in the skill sets of many electoral registration teams. The closer any data matching process gets to an automated provision of lists of potential new electors (which can be easily integrated into the software used to manage the registers), the smaller the required change will be.

Capacity
4.13 In addition to the skills required to make full use of this data there was a more general need for additional capacity within many teams. The volume of data provided meant that several areas could not do as much with the information as they had originally intended. This was exacerbated by the delays, which pushed the process further into the canvass period.

4.14 Several pilots raised concerns that in its current format the process of data matching was too labour intensive for regular use. Many also pointed out that they are currently facing significant cuts in budgets and as a result they can only see a future for data matching if it is able to improve registration with no additional, or a reduced, financial burden for the authority.

4.15 This is significant not just because local authorities are unlikely to be expanding their electoral services teams in the near future but also because it calls into question the likelihood of significant resources being devoted to training existing staff or providing new data management software without clear evidence that it may lead to cost savings later. For example, one area reported that:

'Currently there are too many records to make this a viable exercise with the resources available.'

Local data matching


4.16 Related to the availability of relevant data management skills is the variation between the pilots in their existing use of locally-held data to assist with electoral registration. EROs have powers to inspect other data held by the local authority for the purposes of maintaining the register and the vast majority of EROs make some use of this information, e.g. from council tax records. However, there is substantial variation in both the range of data accessed and the methods by which it is used.

4.17 For example one of the pilots, Newham, has developed a system for use across the authority, which draws together information from sources including council tax, housing benefit, libraries and leisure centre records to create a searchable electronic database of residents in the borough. But another similar authority only regularly accesses council tax records provided in Excel spreadsheet format. The different starting points of these two pilot authorities coming into the pilot process meant that while the first could draw on the expertise built up during the development of their in-house system, the latter had no equivalent experience to draw on.

4.18 This also meant that Newham could check the data provided through the pilot against the data they already held on their systems, gathered locally. As a result Newham only issued letters to names which were suggested by DWP Centric and could be corroborated by local data. As the data in Chapter 5 indicates, this did not result in significantly higher registrations than other areas but, unlike several pilots, they received very few responses indicating that the person written to was no longer resident.

4.19 This evidence is not conclusive but it does suggest that there is significant scope for more use to be made of local data.
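The corroboration step Newham applied can be pictured as a simple set intersection between nationally suggested names and locally held records. This is a sketch only – the datasets, names and addresses below are invented for illustration, and Newham's actual system is an integrated residents database rather than anything this minimal.

```python
# Sketch of corroborating nationally suggested names against local data
# before inviting anyone to register. All records here are invented.

dwp_suggestions = {("John Smith", "1 Acacia Avenue"),
                   ("Maria Garcia", "2 Acacia Avenue"),
                   ("Li Wei", "3 Acacia Avenue")}

# Locally held sources the ERO may inspect (e.g. council tax,
# housing benefit), reduced here to (name, address) pairs.
council_tax = {("John Smith", "1 Acacia Avenue")}
housing_benefit = {("Maria Garcia", "2 Acacia Avenue")}

local_evidence = council_tax | housing_benefit
to_invite = dwp_suggestions & local_evidence  # corroborated names only

print(sorted(to_invite))
# The third suggestion has no local corroboration, so it would be held
# back rather than written to - reducing letters to non-residents.
```

The trade-off is the one the chapter describes: filtering on local evidence cuts wasted letters to people who have moved on, at the cost of possibly missing genuine new residents who have not yet appeared in any local dataset.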

Conclusion
4.20 This chapter has provided an overview of the pilot schemes, setting out the databases they accessed and the groups or areas they targeted. It has also considered the key issues that cut across all the pilot areas – the skills and capacity of electoral services teams – as well as the importance of good use of local data.

4.21 The following chapters go on to consider the results of the matching exercise and follow up, in turn, for each national database.


5 Data matching results: Department for Work and Pensions


Introduction
5.1 This section sets out the key data produced by the pilot authorities using the DWP Centric database. It focuses on the levels of match found between the electoral registers and the DWP Centric database as well as the results of the follow-up activity undertaken using the data provided.

Key points

- Eighteen pilots accessed the DWP Centric database.
- The level of match between the electoral registers and the DWP database varied significantly between local authorities. For those areas matching the whole register it ranged from 57.6% to 82.4%. These differences are partly due to different interpretations across the pilots, in the absence of a consistent framework, of what constitutes a match, but are also likely to be driven by differences between the local authorities in terms of demographics.
- 6,573 people were added to the registers as a result of follow-up activity undertaken using names suggested by the DWP Centric database. This was 13.2% of all names followed up.
- The response rate for the pilot follow up was affected by whether the activity took place before, during or after canvass. Where it took place during the canvass, the pilot response was depressed by the fact that many of the names identified by the match registered through the canvass.
- The pilots highlighted crucial differences in addressing formats between the electoral registers and the public national databases. This meant that many records could not be matched as simple addressing differences were not recognised. This problem could have been significantly reduced if time had been allowed for an address cleansing exercise.

- The absence of a unique identifier attached to each address on the national databases was a key issue for the pilots. These would have allowed for a more straightforward matching process for local authorities.
- Many of the potential new electors suggested by the match with the DWP Centric database proved to be based on out-of-date or incorrect information. The problems posed by this could have been reduced by the inclusion of the date when the DWP record changed – something which DWP were willing to provide but were not asked to do.
- The absence of nationality information meant that several pilot authorities conducted follow up with people ineligible to register. The scope of this issue is not clear from these pilots and will have varied depending on the demographic nature of the local authority area.
- The UK Government's intention, set out in its response to pre-legislative scrutiny of its policy on individual electoral registration, to use national data sources to verify the identity of electors has not been tested by these pilots. Further piloting would be required to ensure that the advantages and disadvantages of these proposals are understood.

Matching results
5.2 Eighteen of the 22 pilot areas accessed data from the DWP Centric database. The results of the matching process are presented in Table 3 and are based on the data supplied by local authorities.

5.3 The results vary considerably across different local authorities and this is partly explained by the different approaches adopted by each pilot area (as set out in Chapter 2). For example, the Stratford pilot, with the highest reported level of match, focused its attention on attainers and the over-70s, which made it more likely they would see a higher match level (as both groups are less likely to change address frequently).

5.4 Also, as mentioned above, the match score at which a pilot accepted a result as a match varied significantly and this has led to greater variation in the data returned (see Chapter 3 for a full explanation of the match process and scores). For example, Peterborough appears to record a relatively low match level (54.7%) but they accepted only those records which scored 99 and above as a match. Wigan records a high match rate (82.4%) but accepted all records scoring 45 and above as a match. If a score of 65, between those two levels, was accepted as a match in both areas, the two reported match rates would be more directly comparable.

5.5 Those pilots with the ability to fully interrogate the data found that the likelihood of a match did not necessarily increase with the score. For example, Colchester found that while those records that scored 65 were very likely to be real matches, there were many with scores above that which proved to be false positives, i.e. they received a high match score but, on checking, were found not to be true matches. Where a local authority had this capacity to check the data received from DWP, the match rate cannot easily be compared like-for-like with those that did not.

5.6 The data presented in Table 3 should therefore be treated with some caution. Nonetheless the results show:

- There is substantial variation across local authorities regarding the level of match between the electoral registers and the DWP Centric database, ranging from 45.7% to 85.3%.
- In total, 1,925,336 register entries were sent for matching and 1,370,006 were found on the DWP Centric database.15 That equates to a match level of 71.2%.
- The percentage of register entries sent for matching but not found on DWP Centric varied across local authorities from 12.4% to 47.6%.
- The number of records found on the DWP Centric database and not on the register, as a percentage of the total number of entries sent for matching, varied from 4% to 73.5%.

5.7 In relation to the last bullet point, it is significant that, excluding the pilots that targeted particular wards, the areas where DWP Centric data suggested the most new names (relative to the number of register entries sent for matching) are also those we would expect to have the most population churn (i.e. a high number of people moving in and out of the area): Newham, Glasgow, Southwark and Camden.

5.8 In other words it is likely that the DWP Centric database is capturing information on a large proportion of the people who move through an area even if they are only resident for a short period of time. These records are then being returned in the matching process as resident but unregistered.

5.9 In relation to the potential benefits from greater use of local data (see paragraphs 4.16-4.19), it is interesting that Southwark discovered that, of all the names found on the DWP Centric database and not on the electoral register, 14% could also be found on local data sources. Again this suggests that there is some potential benefit to effective use of local data, especially as access to it should be substantially easier to manage than access to national data.

15 The figures shown are totals for those pilots where data has been provided to us on both the number of records sent for matching and the number of matches.


Table 3: DWP Centric matching: data analysis

Local authority | Part of the register sent for matching | Score accepted as match(16) | ERO records sent(17) | ERO records matched | % match | On register / not on DWP | % on register only | On DWP / not on register | % on DWP only
Blackpool | Six out of 21 wards | 30 and up | 33,210 | - | - | 15,798 | 47.6% | 5,907 | 17.8%
Camden | Whole | 60 and up | 153,290 | 98,286 | 64.1% | 38,630 | 25.2% | 30,978 | 20.2%
Colchester | Whole | Varied | 126,983 | 100,512 | 79.2% | 26,653 | 21.0% | 20,119 | 15.8%
Forest Heath | Whole | 94 and up | 45,696 | - | - | - | - | - | -
Forest of Dean | 16/17 year olds | 55 and up | 1,143 | 828 | 72.4% | 315 | 27.6% | 116 | 10.1%
Glasgow | Student wards with lowest response rate | 40 and up | 47,666 | 21,762 | 45.7% | 21,612 | 45.3% | 16,590 | 34.8%
Greenwich | Whole | 55 and up | 184,438 | 130,048 | 70.5% | 31,120 | 16.9% | 47,384 | 25.7%
Lothian | Whole | Varied | 654,515 | 474,113 | 72.4% | 139,613 | 21.3% | 65,853 | 10.1%
Manchester | 5% random sample | - | 210,000 | 121,000 | 57.6% | 89,000 | 42.4% | 101,000 | 48.1%
Newham | Whole | - | - | - | - | - | - | - | -
Peterborough | Ward with lowest response rate | 99 and up | 8,009 | 4,379 | 54.7% | 1,795 | 22.4% | 5,886 | 73.5%
Southwark | Partial | Varied | 30,840 | 18,204 | 59.0% | 5,445 | 17.7% | 8,849 | 28.7%
Stratford-on-Avon | Attainers/Over 70 years old | 95/100 and up | 17,342 | 14,787 | 85.3% | - | - | 932 | 5.4%
Sunderland | One out of 25 wards | 45 and up | 10,659 | 5,927 | 55.6% | 1,328 | 12.5% | 2,619 | 24.6%

16 Pilots shown as 'Varied' did not apply a single cut-off but made judgements on the nature of the matches. All accepted match scores are approximate as each pilot may have carried out limited work on records which scored below the level indicated in the table.
17 Some of these figures reported by the local authorities may include empty properties in addition to the number of electors sent for matching. Where voids are included the match rate shown will be lower than the true rate.

Table 3: DWP Centric matching: data analysis (continued)

Local authority | Part of the register sent for matching | Score accepted as match | ERO records sent | ERO records matched | % match | On register / not on DWP | % on register only | On DWP / not on register | % on DWP only
Teignbridge | Polling districts with 95% or lower 2010 canvass response | 99 and up | 37,000 | 30,972 | 83.7% | - | - | 3,368 | 9.1%
Tower Hamlets | Whole | - | - | - | - | - | - | - | -
Wigan | Whole | 45 and up | 250,710 | 206,678 | 82.4% | 31,095 | 12.4% | 10,025 | 4.0%
Wolverhampton | Whole | 65 and up | 192,741 | 142,510 | 73.9% | 36,566 | 19.0% | 30,788 | 16.0%
Total | | | 2,004,242 | 1,370,006 | 68.4% | 438,970 | 21.9% | 350,414 | 17.5%
Average | | | | | 68.3% | | 25.5% | | 22.9%

Note: Where a cell contains '-' the required data was not provided to the Electoral Commission before publication of this report.

Follow-up activity
5.10 A total of 6,573 individuals were added to the electoral registers as a result of the follow-up activities undertaken as part of the pilot, representing 13.2% of the total records that were followed up from data matching. Table 4 sets out the results for each of the pilots using this data, in terms of the numbers followed up and registered.

5.11 The percentage registering from all those followed up ranged from 1% to 44%. However, that 44% was a significant outlier: the next highest response rate was 28%.18 The majority of the pilots reporting response rates at the higher end were ones where a specific area or target group was the focus. Pilots which matched the whole register and followed up a random sample tended to record lower rates.

5.12 The relatively small number of electors being added to the registers from the follow-up, particularly in response to the letters that were issued, is likely to be the result of several factors, including:

- many of the names provided were for people who are no longer resident (or never were, in the case of correspondence addresses on the DWP Centric database)
- the fact that the pilots took place alongside the canvass, when people are also registering through that process

5.13 Although the pilot groups should, ideally, not have been included in the canvass, in some areas the delays in the process and the complexity of the task meant that all properties were canvassed as normal ahead of contact being made with the pilot group. For example, Greenwich originally identified just over 4,000 names to receive a direct invitation to register but nearly 500 of them were subsequently found to have registered anyway during the canvass. Of the remaining names who were contacted, only 5.7% registered.

5.14 The figures also show that the control groups (see paragraphs 2.15-2.18 for more information on control groups) commonly recorded a better registration rate than the pilot groups (22% against 13.2%). This is unsurprising as the control groups were subject to the full annual canvass, including door knocking and reminders, which would have maximised response rates.

5.15 A more revealing comparison is between those pilots (matching the whole register) that sent their pilot follow-up letters at the same time as canvass forms were first issued (Southwark and Wolverhampton) rather than during the canvass (e.g. Camden, Greenwich). Both Wolverhampton and Southwark recorded higher response rates than other authorities matching the whole register, at least partly because the canvass had not already captured the registration of anyone included in the pilot group.

18 The percentage registering in the Blackpool pilot was higher at 29.9% but the denominator for this calculation was the number of properties, not people, and as a result the response rate is inflated.


Table 4: DWP Centric data: Numbers followed up and registered and control groups

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to electoral registers | New electors as a % of those followed up | Control group | % of control group registered
Blackpool | Alongside annual canvass | 2,467 | 727(19) | 29.5% | 2,466 | 31.2%
Camden | After the first canvass reminder letter | 9,230 | 387 | 4.2% | 1,234 | 15.9%
Colchester | Final month of annual canvass (November) | 3,693 | 169 | 4.6% | 1,678 | 4.2%
Forest Heath | After annual canvass | 4,696 | - | - | 70 | 72.9%
Forest of Dean | After first annual canvass letter | 33 | 5 | 15.2% | Two polling districts | 80.3%
Glasgow | - | 331 | 94 | 28.4% | 4,176 | -
Greenwich | After annual canvass | 3,713 | 211 | 5.7% | 9,875 | 13.0%
Lothian | Prior to commencement of annual canvass | 10,215 | 1,139 | 11.2% | - | 31.0%
Manchester | No follow up | 0 | 0 | - | No control group | -
Newham | Stage 3: door-knock stage | 1,902 | 79 | 4.2% | No control group | -
Peterborough | No follow-up | No follow up | 0 | - | No control group | -
Southwark | Alongside annual canvass | 5,829 | 2,545 | 43.7% | 648 | 42.0%
Stratford-on-Avon | Before and during the first canvass forms | 1,035 | 10 | 1.0% | 141 | 72.8%
Sunderland | After annual canvass | 2,408 | 297 | 12.3% | No control group | -
Teignbridge | No follow-up | 0 | 0 | - | 0 | 0

19 This figure includes electors added as a result of follow up from the National Pupil Database (NPD) match as well. The pilot was unable to separate the results.

Table 4: DWP Centric data: Numbers followed up and registered and control groups (continued)

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to electoral registers | New electors as a % of those followed up | Control group | % of control group registered
Tower Hamlets | - | - | - | - | - | -
Wigan | Between first and final stage of annual canvass | 5,012 | 187 | 3.7% | 1,138 | 26.8%
Wolverhampton | Majority of letters sent just before annual canvass | 3,868 | 723 | 18.7% | 6,992 | 12.7%
Total | | 49,736 | 6,573 | 13.2% | 28,418 | 22.0%
Average | | | | 14.0% | | 36.6%

Note: Where a cell contains '-' the required data was not provided to the Electoral Commission before publication of this report.


Follow-up on register inaccuracies


5.16 The majority of the follow-up activity focused on names not found on the registers (i.e. on improving the completeness of the registers). A few pilots did use the data to explore potential inaccuracies on their registers but only one pilot supplied data on this element of their activities.

5.17 Southwark sent letters to 6,773 names that appeared on their register but could not be found on the DWP Centric database (and subsequently sent canvassers to 3,742 of them). A total of 2,137 names were subsequently deleted from the register in response to the letters and 349 in response to canvassers (a total of 31.6% of the total number followed up).

5.18 It is difficult to draw any meaningful conclusions from such a limited test but given the results from Southwark it is a pity that they were the only authority to provide this data.

Data issues
5.19 The pilots highlighted important issues that need to be overcome if the matching process is to be refined into an efficient method for helping to maintain the registers. Most of these issues relate to more than just the DWP Centric database (and are referenced where relevant in the chapters below) but are mentioned in detail here as DWP data was used by the majority of pilots and much of the feedback from pilots on the data and the matching process related to it.

Name and address format compatibility


5.20 People do not always use exactly the same name when providing information to different organisations. These differences can range from the use of abbreviations on one form and not on another, to significantly different spellings of the same name. This poses a challenge for matching exercises and many matching processes are pre-programmed to recognise common differences.

5.21 The sophistication of the matching process is the main method of tackling this problem. In these pilots the processes used did allow for some of these common differences to be picked up but there is room for further development. However, this factor is difficult to mitigate entirely and will remain an issue for data matching exercises in the future.


5.22 Perhaps a more significant issue highlighted by these schemes is that of the different address formats used by the electoral registers and the national databases.

5.23 The electoral registers are address-based databases. EROs therefore make significant efforts to ensure that their property lists are up to date with standardised address formats and Unique Property Reference Numbers (UPRNs) which allow for clear referencing of different addresses and for individuals to be assigned to those properties.

5.24 However, the other databases are primarily databases of people where the address, although important, is secondary. They are therefore not managed in the same way as the electoral registers. For example, while the electoral registers commonly use the National Land and Property Gazetteer, other databases tend to use other references, e.g. the postcode address file (PAF).

5.25 Consequently, the same address can be listed differently on two databases and a matching process may fail to connect them, even as a fuzzy match, e.g. Flat A, 1 Acacia Road and 1A Acacia Road. Examples of problems stemming from this addressing issue include:

- Abbreviations: even the simplest of differences could lead to mismatches; for example, Acacia Road and Acacia Rd would not match cleanly or at all.
- Duplicates: often resulting from the same address being written in two different ways on a database.
- Incomplete addresses: this is a particular issue for some areas with more complicated addressing, for example, where many houses had been converted into smaller flats the specific flat number was sometimes missing. Where an address was incomplete it was impossible to match details between the databases.

5.26 Consequently, as local authorities found during the course of the pilots, if the address fails to match it is more difficult or impossible to match the names of individuals held on the two databases. As one authority explained:

'…the idea of data matching with Government databases, it is essential that there is a common link, i.e. UPRN or property identifier, to enable accurate matching.'

5.27 The DWP have indicated to us that the problems encountered by the pilots due to address differences could have been significantly reduced if address cleansing software had been used on both sets of addresses. This is the process they would commonly use for other matching exercises they participate in. However, the timetable for these pilots did not allow this to take place.

Single national address file

5.28 The absence of UPRNs should be addressed by the Government's current plan to create a single national address file by integrating existing information. Indeed one of the aims of this project is to allow central government users 'to pass better quality definitive address details between themselves and to other organisations in an efficient and cost effective manner'.20

5.29 However, the development of the single address database would only be the start of the process as it would then need to be taken up and integrated across the existing public databases. The DWP has begun this process and expects to have integrated UPRNs into its data warehouse by mid-2012.
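The address 'cleansing' the DWP describe can be pictured as normalising both databases' addresses into one canonical form before any name matching takes place. The abbreviation table and flat-number rule below are assumptions made for illustration, not the software the DWP would actually use.

```python
# Minimal sketch of address cleansing: normalise both sides before
# matching. The abbreviation table and flat-number rule are illustrative
# assumptions, not the DWP's actual cleansing software.
import re

ABBREVIATIONS = {"rd": "road", "st": "street", "ave": "avenue"}

def normalise_address(addr: str) -> str:
    tokens = re.split(r"[\s,]+", addr.lower().strip())
    tokens = [ABBREVIATIONS.get(t.strip("."), t.strip(".")) for t in tokens if t]
    joined = " ".join(tokens)
    # Fold "flat a 1 ..." and "1a ..." into the same form
    return re.sub(r"^flat ([a-z]) (\d+)", r"\2\1", joined)

print(normalise_address("Flat A, 1 Acacia Road"))  # 1a acacia road
print(normalise_address("1A Acacia Rd"))           # 1a acacia road
```

With both databases reduced to the same canonical string, the matching step has a reliable address key to work with – which is also what a shared UPRN would provide directly, without any string manipulation at all.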

Multiple matches
5.30 The match with DWP Centric also returned results which related to address matches only (see paragraph 3.13 for details on the match scores). In theory this should have been useful information – the databases recognise the same addresses but have different people living there. However, it was not a straightforward process to use this data. As the example below shows, the matching process checked all the register entries against all the DWP Centric data and returned all the results even if a strong match had been found.

Multiple match example

Register entries for Mr John Smith and Mr Edward Jones at 1 Acacia Avenue, LL11 2XX are sent for matching.

The DWP database contains a record for a John Smith at 1 Acacia Ave, LL11 2XX and the match between the two is scored 99.

The DWP database does not contain a record for an Edward Jones but the register entry is matched against the DWP entry for John Smith and scores 10 as the postcode matches and there is a fuzzy match with the first line of the address.

The ERO therefore receives two results against the same DWP record: one with a score of 99, and one with a score of 10.

5.31 The example above is relatively straightforward and easy to understand in hindsight but it added to the confusion among the pilots trying to interrogate the data and identify names for following up.

20 On 3 December 2010, Eric Pickles MP, the Secretary of State for Communities and Local Government, announced the formation of a single national address gazetteer, replacing several separate addressing databases. This project brings together local government's address and streets gazetteers; the existing National Land and Property Gazetteer (NLPG) and the National Street Gazetteer (NSG), with all of Ordnance Survey's addressing products. www.nlpg.org.uk/nlpg/link.htm?nwid=19
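The worked example translates directly into the cross-product behaviour that caused the confusion: every register entry at an address is scored against every DWP entry there, and all scored pairs are returned. The scorer below is an invented stand-in that simply reproduces the 99 and 10 of the example.

```python
# Sketch of why one DWP record can come back multiple times: the hub
# scored every register entry against every DWP entry at the address and
# returned ALL pairs. score() is an invented stand-in reproducing the
# 99 (full name match) and 10 (postcode/fuzzy-address only) scores above.

register = ["John Smith", "Edward Jones"]  # 1 Acacia Avenue, LL11 2XX
dwp = ["John Smith"]                       # 1 Acacia Ave, LL11 2XX

def score(reg_name: str, dwp_name: str) -> int:
    return 99 if reg_name == dwp_name else 10

results = [(r, d, score(r, d)) for r in register for d in dwp]
for r, d, s in results:
    print(f"{r} vs {d}: score {s}")
# The single DWP record for John Smith appears in two returned results,
# which the ERO has to recognise and discount by hand.
```

A process that kept only the best result per DWP record (or per register entry) would have avoided this, but – as paragraph 5.30 notes – the hub returned everything, leaving the filtering to each local authority.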

Data quality: currency and nationality


5.32 A number of pilots raised concerns about when the databases were last updated. Whether or not the information held by national databases is up to date was an important question for the pilot areas.

5.33 Several pilots also indicated that the absence of nationality information from the matching process was disappointing, as nationality is a key element in determining eligibility to register to vote.

5.34 In Greenwich, 11% of those contacted in the pilot follow-up indicated that the person written to was not a current resident at the property, and 4% indicated that they were not eligible to register due to their nationality. Similarly in Camden, in nearly 14% of cases where follow-up letters were sent, a response came back indicating the person was not a current resident. A further 3% indicated the person contacted was not eligible.

5.35 The same pattern is also evident in data provided by Wigan. The authority issued 5,012 letters and nearly 4% registered (as shown in Table 4), but 8% indicated the named individual was not a current resident at the address. A further 11.5% indicated that in fact they were already on the register: the match had been missed due to variations in spelling, for example, between databases.

5.36 These issues with the accuracy of matches or mis-matches relate primarily to the matching process used rather than the data held by DWP. The matching process was designed by Cabinet Office to allow for a wide range of possible matches and it was therefore inevitable that some apparent matches would prove to be false.

5.37 However, it is also the case that the responses to the pilot follow-up activities are likely to understate the inaccuracies in the data, as relatively few people who were written to responded either to register or to say the name

was incorrect. Indeed the fairly low numbers registering from the control groups compared with the overall canvass response (excluding the attainer-focused pilots) suggests that the level of inaccuracies in the data is high. 5.38 The detailed results provided by Wigan on their control group are revealing as these names were subject to the full canvass. Of 1,138 names tracked through the canvass, 58% of responses resulted in registrations that were not for the person named by the DWP at that address. 5.39 The Colchester pilot used canvassers to follow up names suggested by matching with DWP and found similar results. The canvassers achieved 936 responses to their enquiries. Of these, 54% indicated that the person had moved out, was unknown at the address or was deceased. 5.40 DWP have indicated to us that they are aware of issues with the currency of some of the data they hold. Specifically, that the likelihood of someone having an up to date address on the DWP Centric database is related to how often they interact with DWP or another agency that feeds data into the DWP data warehouse. For example, DWP think it is more likely that people claiming some form of benefit will have an up to date address, as this is required in order to receive the benefit. On the other hand, people do not need to update their addresses in order to continue to receive their pension and many may fail to do so. DWP have also indicated that at the outset they highlighted the issue of variable data currency to the Cabinet Office as a possible issue for these pilots.

5.41 The issue of data currency could have been, at least partially, addressed by the provision of change date information for each record, i.e. the date on which the most recent amendment was made to the record (not simply when the individual had last contacted, for example, a benefits office, but when they had last changed some part of their details). With this information EROs would be in a position to evaluate the usefulness of the data provided, by comparing when a record was last updated with when information for that address was last changed on the register or on a local dataset. This was not included in the list of information that could be legally shared under these schemes, although DWP indicated they would have been able to provide it if requested.

5.42 The solution employed for these pilots was for DWP to provide pilot authorities only with names they held which did not appear on the register, and only where the


relevant DWP record had been amended within the last two years.21 However, this was not enough reassurance for some areas, such as Manchester, who were concerned about the information provided being less up-to-date than local data and consequently decided not to contact names suggested by the check with the DWP Centric database as had originally been envisaged.
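The two-year cut-off described in paragraph 5.42 amounts to a simple date filter. The sketch below is illustrative only: change dates were not in fact shared with pilot authorities, so the `last_amended` field is an assumption about what such a filter would operate on.

```python
from datetime import date, timedelta

def recent_records(records, as_of, max_age_days=730):
    """Keep only records amended within roughly the last two years,
    mirroring the cut-off DWP applied for these pilots.
    The "last_amended" field is hypothetical."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [r for r in records if r["last_amended"] >= cutoff]

sample = [
    {"name": "A N Elector", "last_amended": date(2011, 6, 1)},
    {"name": "B Stale",     "last_amended": date(2008, 1, 15)},
]
current = recent_records(sample, as_of=date(2011, 10, 1))
```

With change dates available to EROs directly, the same comparison could be made against local data rather than applying a blanket cut-off at source.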

Using DWP data to confirm identity


5.43 An unanticipated finding of the pilots was the high level of match between electoral registers and the DWP Centric database. Since local authorities varied in the match score they accepted, the Cabinet Office undertook some additional analysis using a consistent match score threshold of 65. The findings are shown in the table below.

21. An initial matching exercise was carried out with no two-year time limit. When this exercise returned an extremely large amount of data to the local authorities, much of which was out of date, it became apparent that a time limit was necessary.


Table 5: Match levels

Local authority    No match   Address match only   Weak matches   Strong matches
Blackpool          22.8%      11.9%                3.0%           62.3%
Camden             31.6%      11.8%                4.0%           52.6%
Colchester         14.9%      12.2%                2.9%           70.0%
Forest Heath       23.0%      9.1%                 3.1%           64.9%
Forest of Dean     45.4%      14.5%                0.4%           39.6%
Glasgow            43.8%      11.4%                2.1%           42.7%
Greenwich          16.2%      14.2%                4.3%           65.2%
Lothian            21.3%      6.8%                 2.4%           69.5%
Manchester         21.8%      16.7%                5.0%           56.5%
Newham             18.9%      24.8%                6.7%           49.6%
Peterborough       13.0%      20.6%                9.3%           57.1%
Southwark          20.1%      17.2%                5.6%           57.2%
Stratford          13.8%      6.8%                 2.3%           77.1%
Sunderland         26.5%      12.9%                3.4%           57.3%
Teignbridge        13.9%      8.2%                 2.4%           75.5%
Tower Hamlets      22.2%      22.3%                6.0%           49.6%
Wigan              12.0%      6.6%                 3.0%           78.4%
Wolverhampton      14.3%      9.5%                 3.8%           72.3%
Overall            19.6%      11.7%                3.7%           65.0%
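For illustration, the four categories in Table 5 can be expressed as a score-banding function. The boundaries used here are assumptions inferred from this report (a score of 10 for an address-only match, 65 as the strong-match threshold used in the Cabinet Office analysis); the pilots' real scoring rules were more detailed.

```python
def classify(score, strong_threshold=65):
    """Band a match score into the four categories of Table 5.
    Boundary values are assumptions, not a published specification."""
    if score is None:
        return "no match"
    if score <= 10:
        return "address match only"
    if score >= strong_threshold:
        return "strong match"
    return "weak match"
```

Under this banding, the John Smith example earlier in the chapter (score 99) is a strong match, while the Edward Jones result (score 10) falls into the address-only category.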

5.44 In the UK Government's response to pre-legislative scrutiny,22 this finding is interpreted as showing that the evidence so far suggests that comparing entries on an electoral register with information held by the DWP allows a significant proportion of register entries to be confirmed as accurate.

5.45 While it is true that the results show an average match rate between the electoral registers and the DWP Centric database of around 65%, it is important to note that these pilots used a new, previously untested, matching process. In addition, very few of the pilots set out to confirm the accuracy of entries which matched; rather, they aimed to see whether those records which did not match were people missing from the electoral register who could be encouraged to register.

22. HM Government (2012) Government Response to pre-legislative scrutiny and public consultation on Individual Electoral Registration and amendments to Electoral Administration law, Cm 8245.


5.46 There was limited evidence, from an analysis carried out by Colchester, that the match scores did not always reflect the quality of the match assigned, e.g. the matches scoring 65 could all be correct but some scoring 75 would, upon manual inspection, prove to be false matches. In addition, some authorities did investigate the quality of the matched records by interrogating local data. They found a number of matches where the person on the register was not the same as the person on the DWP Centric database.

5.47 However, there was no systematic follow-up of entries which matched between the electoral registers and DWP (for example, contacting a sample of matched entries to ascertain whether the person was still resident at the property, or whether they were indeed the same person). Such activity would have provided evidence about the accuracy of the records which matched on both the register and the DWP Centric database.

5.48 There is evidence from follow-up activity conducted as part of these pilots that the matching process used this time also failed to match some people who were present on both the register and the DWP Centric database (due to the use of spelling variations or abbreviated names on each database). There is clearly therefore scope to improve the sophistication of the matching process.

5.49 The variation in match rates across the pilots is also an indication that differences between local authorities, e.g. demographic differences, affect the likelihood of a match. But more evidence on this aspect of the matching is required.

5.50 Given the significance of the proposal to use data matching to verify register entries (and the timetable for the implementation of individual electoral registration (IER)), we recommend that further piloting take place urgently to assess the strengths and weaknesses of such an approach. At this stage, in our view, the evidence from these pilots is not sufficient to support such a significant change to the registration system.

5.51 We recommend that further piloting allows for analysis of:

- the accuracy of both matched and non-matched records (and therefore an assessment of the effectiveness of the matching process and the validity of the scoring system)
- the potential variation in match levels which could occur due to variations in data currency, e.g. due to demographic and other factors

5.52 Additionally, given the aim of IER is to improve the security of the electoral register by verifying identity, we recommend that the Electoral Registration

Transformation Programme remains abreast of developments to verify identity across Government, so that appropriate lessons can be learnt.

Conclusions
5.53 Overall, these results suggest that there is a substantial degree of overlap between the electoral registers and the DWP Centric database, and thus names which do not match should offer the potential to find new people to register.

5.54 However, the issues with the currency of the data and different address formats mean that the match scores should be treated with caution, particularly given the example of matches between different people, and the number of records followed up where the person was no longer resident.

5.55 While it is not possible to draw definitive conclusions, it appears that where DWP data is used to target a more specific area or audience, it may be of more use. However, in general the combination of the matching process used in these pilots and the often out of date data held on the DWP Centric database meant that within each group of names followed up there was a lot of wasted time and resource.

5.56 The results from these pilots do not (and were not designed to) provide sufficient evidence to support the proposal to use data matching in this way to verify the identity of electors. One of the significant benefits expected from the change to IER is assurance about the accuracy of the entries on the registers, and further testing is required to explore the advantages and disadvantages of the approach of not requiring everyone to provide identifiers.


6 Data matching results: Driver and Vehicle Licensing Agency


Introduction
6.1 Six pilots accessed Driver and Vehicle Licensing Agency (DVLA) data and this chapter sets out the results reported by five of them. Glasgow and Manchester did not provide separate DVLA results, so they are not included here.

Key points

- Match levels between the DVLA driver database and the electoral registers were lower than between the registers and the Department for Work and Pensions (DWP) database, partly because the DVLA database's coverage is smaller and partly because the match process used stricter criteria. The match levels varied from 51.7% to 67.3%.

- A large volume of names was found on the DVLA driver database but not on the registers; this was more likely to reflect poor data currency rather than significant under-registration.
- 208 people were added to the register as a result of pilot follow-up activity. This was 4.1% of all names followed up.
- Many of the responses to follow-up activity indicated the person written to was no longer resident, suggesting that much of the address information on the DVLA Driver database is not current.
- The DVLA data was more effective at targeting 16- and 17-year-olds as opposed to the population as a whole.


The matching process


6.2 Table 6 sets out the levels of match found between the electoral registers and the DVLA driver database, as well as the volumes of names found solely on either the DVLA driver database or the electoral registers.

6.3 These results show:

- The level of match between the registers and the DVLA driver database was lower than that for the same registers against the DWP Centric database. The overall match level was 60%.
- 33.6% of register entries sent for matching did not appear on the DVLA driver database.
- The percentage of electors found only on the DVLA driver database and not on the registers varied between 35 and 51% (of the total size of the register) for those pilots which matched the whole register.


Table 6: Driver and Vehicle Licensing Agency Driver database matching: data analysis

Local authority  Register sent           ERO records  ERO records  % match  On register /  % on register  On DVLA /        % on DVLA
                 for matching            sent         matched               not on DVLA    only           not on register  only
Forest of Dean   16/17 year-olds         1,143        650          56.9%    493            43.1%          163              14.3%
Greenwich        Whole                   184,438      95,437       51.7%    86,366         46.8%          94,324           51.1%
Teignbridge      Polling districts
                 with a 95% or lower
                 2010 canvass response   37,000       20,846       56.3%    -              -              13,030           35.2%
Wigan            Whole                   250,710      168,754      67.3%    71,960         28.7%          2,433            1.0%
Total                                    473,291      285,687      60.4%    158,819        33.6%          109,950          23.2%


6.4 The lower level of match recorded, compared to DWP, is likely to be the result of two main factors. Firstly, the DVLA driver database does not have the coverage of the DWP Centric database: there are more people with a DWP Centric entry than hold a driving licence registered with DVLA (see paragraph 3.2 for more details). Secondly, as mentioned in Chapter 3, a more restrictive matching process was used for the DVLA matching exercise than was employed in the DWP match. So entries which may have matched in the DWP matching process would not have matched in this stricter process.

6.5 The most significant difference between the results from the match with the DVLA driver database and those from the DWP Centric database is the volume of new names suggested by the matching process, i.e. people the DVLA data suggests live in the area but are not on the registers. In the authorities that matched against both the DWP and DVLA databases, the DWP database suggested new names equivalent to between 4 and 10% of the number of register entries sent for matching. But the DVLA database, excluding the Forest of Dean and Wigan pilots that only looked at attainers, suggested possible new electors equivalent to 35–51% of the electoral register in these areas.

6.6 It is extremely unlikely that the majority of these names are genuinely resident in the area. For example, the DWP Centric database indicated that nearly 18,000 people may live in Greenwich and not be on the electoral register. Regardless of how accurate that number is, it is striking that DWP Centric, a database containing details for the majority of the adult population, would suggest this figure while a database of drivers would suggest a figure of around five times the size: 94,000.

6.7 Interestingly, Greenwich matched the records provided by DVLA and DWP in order to identify names that appear on both databases. They encountered problems with this process due to differences in naming and addressing standards, so the results should be treated as indicative rather than exact. However, despite the 94,000 unregistered names offered by the DVLA driver database, and taking into account the relatively high level of population movement within a London borough, they could find only 5,853 names which agreed on both databases.

6.8 These figures support the view, expressed by the DVLA during this evaluation, that their database does not represent an up to date record of names at current addresses. Although it is a legal requirement to update the address on your driving licence when you move house, DVLA do not believe that the majority of people do so at once (nor can they estimate how quickly updates are in fact made).23
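Greenwich's cross-check of DVLA names against DWP names (paragraph 6.7) amounts to intersecting two datasets on a normalised name-and-postcode key. The sketch below illustrates why this is fragile: the normalisation is crude, and the pilots found that differing naming and addressing standards made the results only indicative. The data layout here is an assumption for illustration.

```python
def normalise(name, postcode):
    """Crude matching key: collapse whitespace and case in the name,
    strip spaces from the postcode. Real cross-matching was harder."""
    return (" ".join(name.lower().split()),
            postcode.replace(" ", "").upper())

def on_both(dwp, dvla):
    """Return DVLA entries whose normalised key also appears in DWP data."""
    dwp_keys = {normalise(n, p) for n, p in dwp}
    return [(n, p) for n, p in dvla if normalise(n, p) in dwp_keys]

dwp_names  = [("John Smith", "SE10 8EW"), ("Ann Other", "SE10 9LS")]
dvla_names = [("JOHN  SMITH", "se10 8ew"), ("C Driver", "SE7 7DZ")]
overlap = on_both(dwp_names, dvla_names)
```

Even this toy key fails on abbreviated first names or house-number variations, which is consistent with Greenwich finding only 5,853 names agreeing across the two databases.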

Follow-up activity
6.9 Three authorities conducted follow-up activity based on the match with DVLA. The number of individuals followed up by these pilot authorities ranged from 3.6% to 69.9% of the total number of potential electors identified (i.e. those names that appeared on the DVLA driver database but not on the electoral register).

6.10 The results of the follow-up of names suggested by the DVLA driver database confirmed the view that the data was not up to date and contained many redundant entries for the pilot areas. For example in Greenwich, of the total number of new identities (3,480) suggested by the DVLA driver database that could be loaded into the electoral management software (and therefore tracked), only 177 (5.1%) were subsequently added to the register, either through the canvass or in response to a direct follow-up invitation to register through the pilot.

6.11 From all the follow-up invitations to register issued by Greenwich (to names suggested by all the databases accessed), 1,442 responses indicated the person was no longer resident. The DVLA data accounted for 60% of those responses.

6.12 More positively, for the set of names which Greenwich identified as appearing on both DWP and DVLA databases, there was a better response: nearly 19% of the total were added to the register, either through the canvass or in response to a follow-up letter through the pilot.

6.13 Wigan's follow-up activity based on DVLA data was slightly more positive. The authority focused their activity on attainers. Wigan sent out 1,701 letters to potential new registrants, resulting in 161 people (9.5%) being added to the register. On the other hand they received 237 responses indicating that the
23. They believe that in many cases updates are made some time later, when the driver needs to use their licence for identification or is involved in a road traffic incident where the police will check the licence. In the case of removing entries after death, the DVLA is either updated by the next of kin or via the Tell Us Once system. However, the approximate numbers coming from Tell Us Once (800–2,000 a week) are dwarfed by the average weekly deaths in England and Wales of approximately 9,000. The data reported by the pilots indicates that many of the records held by the DVLA are not accurate in terms of address and do contain records for deceased people.


person was already registered, which suggests the strict matching process used for the DVLA data (see paragraphs 3.15–3.16) meant that many genuine matches were missed.

6.14 The Forest of Dean pilot also focused on using the DVLA data to identify attainers. Again, their results were more positive than in those pilot authorities that followed up all names, although the numbers involved in this pilot were small. Of the 34 names identified for follow-up by letter, nine registered. Additional evidence suggests that the data itself was very accurate, as an additional 94 names, identified as potential new electors, were all found to have registered through the canvass.

6.15 Data on attainers might be expected to be more up to date as they will have only recently appeared on the database (when they first apply for a provisional licence) and will be less mobile (as they are likely to still be living with parents) than many other groups.


Table 7: Driver and Vehicle Licensing Agency Driver database: numbers followed up and registered, and control groups

Local authority  Stage of annual canvass        Total number  Total added  New electors as    Control group  % of control
                 at which follow-up started     followed up   to ER        % of followed up                  group registered
Forest of Dean   After annual canvass           34            9            26.5%              94             100.0%
Greenwich        After annual canvass           3,399         38           1.1%               3,505          3.9%
Wigan            Between first and final
                 stage of annual canvass        1,701         161          9.5%               No control group
Total                                           5,134         208          4.1%               3,599          6.4%
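As a quick arithmetic check, the percentage column of Table 7 can be recomputed from the raw counts it reports:

```python
# Recompute "new electors as % of followed up" from Table 7's raw counts.
followed_up = {"Forest of Dean": 34, "Greenwich": 3399, "Wigan": 1701}
added       = {"Forest of Dean": 9,  "Greenwich": 38,   "Wigan": 161}

rates = {la: round(100 * added[la] / followed_up[la], 1) for la in followed_up}
total_rate = round(100 * sum(added.values()) / sum(followed_up.values()), 1)
```

The recomputed values (26.5%, 1.1%, 9.5%, and 4.1% overall) agree with the published table.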


Conclusions
6.16 These results suggest that the DVLA driver database is not sufficiently up to date for useful matching against the whole electoral register. However, the results for those pilots focused on attainers were significantly more positive and there appears to be potential, subject to more robust costings, for this specific data to be of future use to Electoral Registration Officers (EROs) wishing to target attainers.


7 Data matching results: Education databases


Introduction
7.1 Several databases relating to education were used in these pilots. These included the Student Loans Company (SLC) database, the National Pupil Database (NPD) and the Individual Learner Record (ILR). The results for each separate database are considered below.

Key points

- There were very few registrations from data matching with the SLC database. This, and responses to the follow-up activity, support the view expressed by the SLC that the data used for these pilots (at the end of the academic year) was quite out of date.
- The NPD and ILR proved effective at identifying attainers24 in these pilots.
- However, while the NPD and ILR identified attainers successfully, the majority of registrations were achieved through the annual canvass, which was taking place alongside the pilots, and not in response to follow-up activity through the pilots.
- Under individual electoral registration (IER), unlike in the current household system, individual attainers might need to complete their own form (rather than being registered by adults in the household). It is therefore possible that the number of registered attainers will fall. The ability to use data in order to target them in this way may therefore be a more useful tool for electoral registration officers in the future.

Student Loans Company


7.2 The SLC database was accessed by three pilots, although final data is only available for two areas, Colchester and Glasgow, as one pilot was not able to
24. An attainer is a 16 or 17 year old who will reach voting age (18 years old) during the life of a current electoral register.


disaggregate the consolidated results received from Cabinet Office into figures referring to the individual databases.

7.3 The results from the matching with this database are not comparable with those from matching with the Department for Work and Pensions (DWP) and Driver and Vehicle Licensing Agency (DVLA) databases, as the SLC data only includes information on students with a loan or grant, which is a small part of the population.

7.4 In Colchester, the matching exercise identified 1,236 potential new electors, which is approximately 4% of the 18–25 year old population.25

7.5 The Colchester pilot followed up only a small number of names suggested by the SLC data. This was partly because they identified significant duplication between the SLC data and that provided from the DWP Centric database. This is likely because SLC data on loan applications is shared with DWP, resulting in extensive agreement between the two databases.

7.6 Of the 39 names followed up, only two were added to the register, even after visits by a canvasser. These results call into question the accuracy of the data, as canvassing should have resulted in a higher response rate than this if those followed up were in fact eligible new electors.

7.7 Glasgow focused on two predominantly student wards in the city and identified 102 potential new electors as a result of the matching. However, in response to the follow-up letters issued only three people registered and a further 12 responses indicated the named person had moved.

7.8 These results do not suggest that the SLC database is of significant value in updating the register. However, in an interview for this evaluation, the SLC indicated that the data used by the pilot was the least accurate they hold, as it was taken at the end of the academic year when many students either have changed, or will be about to change, their address.

National Pupil Database


7.9 The NPD, provided via the Department for Education (DfE), was accessed by five pilots, although final data is only available for four of the pilot areas; one pilot was not able to disaggregate the consolidated results received from Cabinet Office into figures referring to the individual databases.
25. Figures for the population of Colchester are from the 2010 mid-year population estimates.


7.10 As with the SLC database, the NPD covers a subsection of the population, so data on match levels between the database and the registers is not relevant.

7.11 The results from follow-up exercises conducted on names suggested by the NPD are more positive than from the other databases discussed so far. For example, in Wolverhampton, letters were sent to 560 names suggested by the matching exercise as resident in the area but not registered. Nearly 60% (331) subsequently registered.

7.12 In Greenwich, a group of 244 names of 17–18 year olds from the NPD was tracked. Of these, 126 (52%) were found to have registered either as a result of the canvass (108) or in response to a direct mailing through the pilot (18). The control group recorded a similar result, with 142 of the names suggested by DfE being registered through the canvass process.

7.13 The results from Forest of Dean also suggest that this data is accurate, with four people registering from a total of 12 letters issued, and a further 35 (out of a total of 44 names) being picked up through the canvass process.

7.14 It is clear from these results that the focused nature of this database, which is updated at several points in the year, makes the data more reliable for the purposes of updating the register. However, in several cases the majority of registrations came through the canvass and not in response to follow-up letters through the pilots. So although the results prove that the data is accurate, under the current household system they do not suggest that matching against these records would be an effective replacement for the annual canvass. It is quite possible that most of these registrations would have been achieved without any pilot activity.

7.15 However, under IER, unlike in the current household system, individual attainers might need to complete their own form (rather than being registered by adults in the household). It is therefore possible that the number of registered attainers will fall.
The ability to use data in order to target them in this way may therefore be a more useful tool for EROs in the future.

Individual Learner Record


7.16 The Individual Learner Record (ILR), provided via the Department for Business, Innovation and Skills, was accessed by three pilots, although final data is only available for two pilots. As with the two databases above, ILR records were merged with SLC and NPD data, so one pilot was unable to provide figures for each database individually.

7.17 The results for the follow-up of names suggested by the ILR are similar to those reported for the NPD and again suggest that the information provided by these databases is effective at identifying attainers.

7.18 Greenwich took two approaches with the ILR data provided to them. They identified one general group of potential new electors to follow up and one group of 17–18 year olds. Of the 681 names in the general group, 86 (13%) were subsequently registered through the canvass or in response to a follow-up letter. But of the 169 names of 17–18 year olds, 97 (57%) registered through one of these routes.26

7.19 However, in both cases (and similarly to the NPD data discussed above) the majority of registrations were made through the canvass.

7.20 The results from Forest of Dean also suggest that the data is very accurate, but again the majority of registrations came through the normal canvass process (73 out of 75 names) rather than in response to the letter issued through the pilot (six out of 31 names).

7.21 As with the NPD results, it is clear that this data is accurate for attainers, but the majority of actual registrations were the result of canvass activity and would have happened anyway, regardless of these pilots.

Conclusions
7.22 The results from these databases suggest that this data could be used to target attainers, as the NPD and the ILR appear to hold accurate data on this specific group. However, as with other databases, the effect of the annual canvass is difficult to separate from the pilot activity and there is limited evidence that, in the absence of the canvass, writing to a list of names identified would generate large numbers of registrations. 7.23 In the case of the SLC database, it is clear that the timing of this exercise was wrong and better results may have been seen if the pilots had been run at a point earlier in the academic year, when the data is more current.

26. These figures may not reconcile with the total results presented for the Greenwich match with the ILR, as the total figures were updated close to publication while the breakdowns were not.


8 Data matching results: Ministry of Defence


Introduction
8.1 Five pilots planned to use the data provided by the Ministry of Defence (MoD), but final data is only available from four areas, as Colchester were unable to undertake any analysis of the data due to delays in receiving the information.

Key points

- The MoD provided limited data for these pilots. They were able to confirm that existing service voters were still resident but did not provide details of potential new service voters. They also provided details of addresses occupied by service personnel in the area, but this excluded barracks. There was therefore no real prospect of addressing the completeness of service voter registrations in the pilot areas.
- Two pilots were able to use the MoD data to improve the accuracy of their register and amended or deleted a number of their records (9.6% and 13.2% of the total number of service voters held on the register).

Overview
8.2 The MoD provided two types of information:

- Information on service voters: the MoD confirmed whether the name and address of the service voters on the electoral registers matched with the name and address held on the Joint Personnel Administration database. They did not provide details or numbers of potential new service voters.27

27. Note: the MoD were not matching the electoral registers against all service personnel in an area, but all service voters their data indicated were still resident in an area. Only 22% of all service personnel who are registered to vote are registered as a service voter.


The results show an average match of 40.2%, indicating that 59.8% of the register entries for service voters in these local authorities were inaccurate.

- The MoD also confirmed the list of its properties and informed the local authorities about any properties missing from their address database. However, this did not include barracks. Overall, 85% of MoD properties held on the electoral registers were matched with the ones on the MoD database. The MoD data also identified 233 new properties, 2.7% of the MoD properties previously held on the electoral registers.

Service voters
8.3 As Table 8 shows, Shropshire reported a considerably lower match rate for service voters, at 28.9%, while the figure for the other two local authorities ranged from 44% to 59.4%. Wiltshire did not receive information on service voters and therefore only property data is available for this local authority.

Table 8: Data from Ministry of Defence: service personnel

Local authority      Number of service voters   Number of entries       % match
                     on register before pilot   matched with MoD data
Rushmoor             500                        220                     44.0%
Shropshire           384                        111                     28.9%
Stratford-upon-Avon  128                        76                      59.4%
Total                1,012                      407                     40.2%

8.4 These match rates are all fairly low but, as outlined above, the MoD did not provide details of any potential new service voters. Local authorities could therefore only delete or amend records held on their register, thus improving its accuracy but not its completeness. 8.5 As Table 9 shows, two local authorities, Rushmoor and Shropshire, followed up the results received from the MoD and amended or deleted a significant number of service voter records on their register.


Table 9: Follow-up from MoD data

Local authority  Number of service voters'  % amended as total of      Number of service  % deleted as total of
                 details amended            service voters registered  voters deleted     service voters registered
Rushmoor         57                         11.4%                      83                 16.6%
Shropshire       28                         7.3%                       34                 8.9%
Total            85                         9.6%                       117                13.2%

8.6 The results show that MoD data can have a positive impact on the accuracy of the electoral register. However, the high level of inaccurate entries raises questions about the effectiveness of canvassing MoD properties (a problem highlighted by several pilots during interviews for this evaluation) and suggests a potentially high level of incompleteness.

Properties
8.7 The match rate for properties was higher than that for service voters, as shown in Table 10 below. As with the match of service voters, Shropshire recorded a much lower match rate on properties (62%) than the other authorities (82% to 99%). In addition, Shropshire received the largest percentage of new addresses (as a percentage of the total number of military addresses they held originally), although they already had a record of the vast majority (51 of 56) of these properties, which had simply not been identified as MoD properties. For both Rushmoor and Stratford, the number of new addresses provided was very low.


Table 10: Data from Ministry of Defence: MoD properties

| Local authority | MoD addresses provided for checking | MoD addresses matched | % match | New MoD addresses identified | New MoD properties as % of addresses sent |
|---|---|---|---|---|---|
| Rushmoor | 1,760 | 1,748 | 99.3% | 4 | 0.2% |
| Shropshire | 1,169 | 719 | 61.5% | 56 | 4.8% |
| Stratford-upon-Avon | 124 | 102 | 82.3% | 2 | 1.6% |
| Wiltshire | 5,642 | 5,471 | 97.0% | 171 | 3.0% |
| Total | 8,695 | 8,040 | 85% | 233 | 2.7% |

Conclusion
8.8 The MoD data was of some use to electoral registration officers (EROs) and, in the case of Shropshire and Rushmoor, did allow for some amendments and deletions to be made to the service voter list. However, due to the limited data provided by the MoD, they were unable to add any service voters to the register.

8.9 The check of properties was of less practical use, although for Shropshire and Wiltshire it did provide some updates to the address information they held.

8.10 Several of the pilot authorities indicated that they felt access to this data could be more useful if more information could be shared (e.g. names of unregistered military personnel in an area). However, they thought that establishing good relations with the commanding officers on military bases, in order to arrange access, was likely to be a more immediately effective step.


9 Data matching results: Citizen Account


9.1 This section sets out the key data produced by Renfrewshire Valuation Joint Board (VJB) using Citizen Account (CA) data. They were the only pilot to test this data.

Key points

- The CA database is administered by the Improvement Service in Scotland28 and is intended to be a record of all residents within a participating local authority area. However, the CA database is not as comprehensive as the pilot authority originally anticipated: the total number of records provided by CA represented only 27% of the Renfrewshire electorate.
- The level of match between the CA data and the electoral register was high, with 88.8% of the CA records also found on the register.
- The matching exercise suggested a small number of potential new electors (1.7% of the size of the register after local matching).
- Follow-up activity was still under way at the time of publication.

9.2 The CA allows the 32 local authorities in Scotland to share data on their residents and is managed through the Improvement Service. The information is updated online by citizens or through local authorities. In theory, the database covers the entire population of any local authority area signed up to the system.

9.3 The results from the matching exercise between the Renfrewshire register and CA show:

28 The Improvement Service is a partnership between the Convention of Scottish Local Authorities (COSLA) and the Society of Local Authority Chief Executives (SOLACE). It is a company limited by guarantee.


- The CA database is not as comprehensive as the pilot authority originally anticipated: the total number of records provided by CA represented only 27% of the Renfrewshire electorate. However, the level of match between the CA data and the electoral register was high, with 88.8% of the CA records also found on the register (this rose to 90.9% after local matching and data cleansing).
- The matching found 3,997 individuals on the CA dataset who were not on the electoral register, corresponding to 11.2% of the total number of entries on the register. After checking the data received against information held locally and resolving fuzzy matches, the number of mismatches fell to 628: 1.7% of the total number of entries originally sent for matching.
- Figures on the number of individuals on the register but not on the external database were not provided.

9.4 Several of the issues raised in relation to other databases, such as address formats that differed from those used on the register, also applied to this database. On the question of data currency, the evaluation report from Renfrewshire VJB stated: "The currency of the data was quickly found to be questionable when a check identified that many of the names on the list supplied were no longer resident at the address shown or, in a number of instances, were known to have died several years earlier."

9.5 At the time of this report, the local authority was still awaiting the results from its follow-up activity.
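The local resolution of "fuzzy matches" described above can be sketched in code. This is purely illustrative: the matching software actually used by the pilots is not documented in this report, and the similarity threshold (0.85) and the normalisation steps shown here are assumptions, not the pilots' method.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalised name/address strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def classify(external_record: str, register_entries: list[str],
             exact: float = 1.0, fuzzy: float = 0.85) -> str:
    """Classify one external record against the register entries:
    'match' (identical after normalisation), 'fuzzy' (near-match needing
    manual review), or 'no-match' (a potential new elector to follow up)."""
    best = max((similarity(external_record, entry) for entry in register_entries),
               default=0.0)
    if best >= exact:
        return "match"
    if best >= fuzzy:
        return "fuzzy"
    return "no-match"

register = ["john smith, 1 high street", "mary jones, 2 mill lane"]
print(classify("John Smith, 1 High St", register))    # prints: fuzzy
print(classify("mary jones, 2 mill lane", register))  # prints: match
```

Records classified as "fuzzy" are the ones that, in Renfrewshire's case, reduced the apparent mismatches from 3,997 to 628 once checked against locally held information.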

Conclusion
9.6 The CA database is not as comprehensive as the pilot authority originally anticipated.

9.7 The number of new potential electors identified is relatively low (1.7%). Renfrewshire VJB speculated that this could be due to the high registration rate in the area (they estimate non-registration in Renfrewshire at 2.5%).

9.8 More testing of this dataset is recommended for several reasons:

- This dataset is updated directly by citizens or through the local authorities.


- The CA database is not only a database but a system that grants residents access to local services, and is therefore potentially more likely to be kept up to date.


10 Pilot costs
10.1 This chapter sets out information on the initial budgets for each of the pilot schemes and the actual expenditure recorded. It also details the most significant costs in the set-up and administration of the pilots as a whole.

10.2 The figures in this chapter are based on the information reported by local authorities in their reports on the pilot and additional information provided to us by the Cabinet Office. At the time of this report, much of the expenditure was not finalised, as the Cabinet Office had not been invoiced by many pilot authorities, so the figures should be considered estimates.

Key points

- The overall cost of the pilots is estimated at around £425,910, against an original budget of £1.2m. These figures exclude staff costs for the Cabinet Office.
- The under-spend is largely explained by initial budgeting over-estimates by both the Cabinet Office and the pilot authorities, due to a lack of clarity about what the pilot process would entail, and by many local authorities not completing some of the activities (e.g. follow-up work) which they originally planned for. Given the over-estimates, it seems unlikely that the pilots would ever have cost the full amount budgeted.
- The main item of expenditure reported by local authorities is the cost of additional staff, which accounts for about 50% of the total spent by local authorities. They reported that the process was labour intensive and that much of this cost had to be incurred before they could begin contacting potential new electors.
- Staff costs could be reduced by improving the quality of the data matched and automating more of the process, but we cannot conclude, from the information gathered in these pilots, what the cost of any national data matching roll-out would be.
- There was some limited expenditure on databases which were not used in the pilots and so did not deliver any benefit.
- While the costs of these pilots appear high in terms of numbers of people added to the registers, this does not mean that data matching could not be cost effective if implemented differently.
- In order to assess the potential scalability of data matching, it would be necessary to have more consistent information than is available about the costs incurred, and this information should include the additional internal costs incurred by both local authorities and data-holding organisations.

Overview
10.3 Ideally, we would offer an assessment of the cost effectiveness and value-for-money of the pilots, including an analysis of any cost savings that data matching might allow for. However, this has not been possible due to the issues, discussed in previous chapters, related to the scalability of the pilots, the comparability of the data provided and the problems encountered by several pilots in delivering the work. In order to assess the potential scalability of data matching, it would be necessary to have more consistent information than is available about the costs incurred. This information should include the additional internal costs incurred by both local authorities and data-holding organisations.

10.4 As Table 11 shows, the overall spending for the pilots is estimated at £425,910 (excluding VAT) against an overall budget of £1.2m (including VAT).

10.5 The expenditure fell under two main headings: expenditure incurred by data-holding organisations and the Cabinet Office to ensure the necessary processes were in place for the secure transfer and matching of data, and expenditure incurred by local authorities to work on the data and to conduct their follow-up registration activity.

Table 11: Total budget and expenditure for pilot schemes

| | Initial budget29 | Actual expenditure |
|---|---|---|
| Local authorities (including electoral register management system) | £696,933 | £317,875 |
| Other organisations | £317,893 | £108,035 |
| Contingency30 | £200,000 | £0 |
| Grand total | £1,214,826 | £425,910 |

29 The total budget for the programme includes VAT. The figures given for actual expenditure exclude VAT.
30 The contingency budget included £100,000 for local authority contingency and £100,000 for data-holding organisation contingency.

10.6 Given that the process for data matching on a national scale would not be comparable with that used for the pilots, it is not possible to scale up the costs to give an indicative figure. Nevertheless, the pilots did identify some particularly high-cost aspects and options for improvement.

Local authority costs


10.7 The initial budget figures reported in this section are based on the costs provided by the Cabinet Office at the beginning of the pilot process. The figures on actual expenditure (or estimated expenditure) have either been reported by local authorities in their evaluation reports or, where an invoice has been submitted, are the final amount paid to the local authority by the Cabinet Office.

10.8 The amount spent by all local authorities that took part in the pilot scheme is estimated at £317,875 against a budget of £696,933 (45.6% of the budget).31

10.9 The total amount spent by each local authority varied considerably, ranging from £305 (Peterborough) to £59,133 (Tower Hamlets).32 The differences in costs were due to:

- Methodology: the pilots varied in their planned scale. For example, some pilots focused on specific areas, such as electoral wards, while others looked across the whole authority area. This had a clear impact on the resources they required for delivery. In addition, some pilots did not carry out some of the activities they had originally budgeted for. For example, Peterborough did not follow up any of the individuals identified through the matching process.
- Internal resources: some local authorities hired extra staff while others managed the process using their internal resources. Some local authorities accounted for the costs of internal staff who worked on the pilots, while others did not.

10.10 Table 12 shows the budget and costs incurred by each local authority, broken down by key categories: staff,33 IT and software costs, outreach work, and any other expenditure.

31 The initial budget included VAT, while the expenditure does not. If this is accounted for, the expenditure is approximately 57% of the initial budget.
32 Tower Hamlets accounted for internal staff costs.
33 Staff costs include both internal and external staff. Separate figures are not available.


10.11 The under-spend was in part due to many local authorities not completing all planned steps of the data matching exercise, especially the follow-up activities. However, given that the pilots spent only 45% of their budget, it seems unlikely that the total amounts would ever have been spent. While it is not possible to tell for certain, the initial over-estimates of the budgets required may in part be due to a lack of clarity, at the time when the budgets were produced, about the nature of the process to be undertaken.

10.12 The main item of expenditure reported is additional staff costs, which account for more than 50% of the overall spending.34 Staff costs were mainly driven by the work needed to check the data received from the Department for Work and Pensions (DWP) or the Cabinet Office.

10.13 Staff capacity was an issue for the pilots, as raised in Chapter 4, and the need for large numbers of additional staff was at least partly driven by the lack of automation in the process. It was time consuming, particularly given the large volumes of data received. As one authority put it: "We feel that wherever possible processes should be automated (e.g. processing of information, analysis and reports) in order for costs to be reduced."

10.14 Three electoral register management software companies (Xpress, Halarose and Strand) were involved in the pilots and provided support to local authorities by introducing changes to their software to facilitate the work at local level. The Cabinet Office has estimated the cost of this work at approximately £81,000 across the pilot schemes.

10.15 Expenditure on follow-up activity ("Outreach work" in Table 12) was low, at 18% of the total expenditure compared to 51% for staffing and 27% for IT, because only some (and in some cases none) of the potential new electors identified through data matching were in fact followed up.

10.16 The cost per elector added to or removed from the register varied considerably from area to area, ranging from £7 to £811. However, due to the issues described in previous chapters with the pilot methodologies and timings, it is not possible to draw valid conclusions from these figures.

10.17 Where authorities received and cleansed the data but only followed up a small number of electors (or none), the costs were higher; and where they started the follow-up activities after the annual canvass, the number of people added to the register was inevitably low.

34 This figure was calculated by excluding from the total those pilots which did not provide broken-down figures.
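The "cost per elector added/removed" figures discussed above are simply each pilot's total recorded spend divided by the number of electors added to or removed from its register. As a minimal sketch of that calculation, using the Blackpool and Southwark rows from Table 12:

```python
# Total spend and electors added/removed, taken from Table 12.
pilots = {
    "Blackpool": {"total_spend": 49_667, "electors_changed": 727},
    "Southwark": {"total_spend": 18_000, "electors_changed": 2_545},
}

for name, figures in pilots.items():
    cost = figures["total_spend"] / figures["electors_changed"]
    # Blackpool works out at roughly £68.3 per elector, Southwark at roughly £7.1,
    # matching the rightmost column of Table 12.
    print(f"{name}: £{cost:.1f} per elector added or removed")
```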


Table 12: Costs incurred by local authorities (VAT not included)

| Local authority | Initial budget | Staff | IT and software | Outreach work | Other | Total | Electors added/removed | Cost per elector added/removed35 |
|---|---|---|---|---|---|---|---|---|
| Blackpool | £67,000 | £40,348 | £2,779 | £6,540 | £0 | £49,667 | 727 | £68.3 |
| Camden | £40,000 | £3,780 | £8,882 | £6,434 | £0 | £19,096 | 404 | £47.3 |
| Colchester | £20,000 | £7,701 | £5,566 | £1,614 | £0 | £14,881 | 171 | £87.0 |
| Forest Heath | £9,750 | £4,006 | £815 | £5,997 | £0 | £10,818 | N/A | N/A |
| Forest of Dean | £18,000 | £1,776 | £4,100 | £100 | £360 | £6,336 | 24 | £264.0 |
| Glasgow | £8,000 | £4,481 | £3,000 | £250 | £269 | £8,000 | 97 | £82.5 |
| Greenwich | £70,730 | £10,809 | £8,287 | £5,135 | £0 | £24,230 | 597 | £40.6 |
| Lothian | £3,500 | £10,000 | £2,185 | £3,575 | £0 | £15,760 | 1,139 | £13.8 |
| Manchester | £70,000 | £0 | £2,880 | £0 | £0 | £2,880 | N/A | N/A |
| Newham | £16,000 | £1,500 | £10,320 | £1,282 | £0 | £13,102 | 79 | £165.8 |
| Peterborough | £20,739 | £159 | £0 | £0 | £145 | £305 | N/A | N/A |
| Renfrewshire | £50,000 | £7,500 | £3,200 | £3,700 | £200 | £14,600 | N/A | N/A |
| Rushmoor | £11,300 | £0 | £980 | £0 | £0 | £980 | N/A | N/A |
| Shropshire | £15,000 | £2,783 | £979 | £0 | £784 | £4,547 | N/A | N/A |
| Southwark | £54,500 | £0 | £5,000 | £8,000 | £5,000 | £18,000 | 2,545 | £7.1 |
| Stratford-on-Avon | £3,238 | £2,184 | £750 | £5,175 | £0 | £8,109 | 10 | £810.9 |

35 Where figures are not provided, there was either no pilot follow-up activity or insufficient data was provided to us to produce a figure.


Table 12: Costs incurred by local authorities (VAT not included) (continued)

| Local authority | Initial budget | Staff | IT and software | Outreach work | Other | Total | Electors added/removed | Cost per elector added/removed |
|---|---|---|---|---|---|---|---|---|
| Sunderland | £70,000 | £190 | £10,484 | £2,054 | £571 | £13,300 | 297 | £44.8 |
| Teignbridge | £19,000 | £2,687 | £979 | £0 | £0 | £3,666 | N/A | N/A |
| Tower Hamlets | £55,000 | £50,841 | £8,292 | £0 | £0 | £59,133 | N/A | N/A |
| Wigan | £43,646 | £4,716 | £5,300 | £6,047 | £956 | £17,019 | 348 | £48.9 |
| Wiltshire | £15,000 | £3,000 | £0 | £0 | £0 | £3,000 | N/A | N/A |
| Wolverhampton | £16,530 | £4,069 | £1,066 | £1,707 | £3,605 | £10,447 | 1,054 | £9.9 |
| Total | £696,933 | £162,531 | £85,843 | £57,610 | £11,891 | £317,875 | 7,492 | £42.4 |


Other costs
10.18 The Cabinet Office also spent £108,035,36 from a budget of £317,893, to fund the work of DWP and other data-holding organisations.

10.19 A total of £93,915 was paid to the DWP for the work carried out. This included the cost of legal support to develop an agreement for data exchange; the cost of meeting security requirements; travel expenditure; staff costs involved in developing the business case for the release of data; and various other costs. The Cabinet Office's negotiations with DWP meant that final costs were substantially different from their initial estimates.

10.20 The Cabinet Office also incurred expenditure of £1,227 to introduce a standalone desktop for data matching with the Higher Education Statistics Agency (HESA) data provided by the Higher Education Funding Council for England (HEFCE). HEFCE and Royal Mail also received £4,893 and £8,000 respectively for preparatory work, even though their datasets were not fully tested. No costs were claimed from the Cabinet Office by the other data-holding organisations.

10.21 It is unclear from these pilots how these costs would change if data matching was rolled out across Great Britain. Much of the cost to data holders of any national system of data matching will be driven by the approach to the matching process and the IT systems used to deliver it, i.e. which organisations do the matching work, how much of that process is automated and how much requires regular staff input. It is not yet clear how these processes would work in the future, partly because a detailed implementation plan for individual electoral registration (IER) is not yet available. But it is clear that the processes would be very different to those employed (and therefore costed for) in these pilots.

36 The initial budget includes VAT, while the expenditure does not.


Table 13: Other costs

| Organisation | Initial budget | Actual expenditure |
|---|---|---|
| DWP | £300,000 | £93,915 |
| Cabinet Office | £5,000 | £1,227 |
| Royal Mail | £8,000 | £8,000 |
| HESA Limited | £4,893 | £4,893 |
| Total | £317,893 | £108,035 |

Conclusion
10.22 In total, the pilots are estimated to have cost around £425,910 against an initial budget of £1.2m. Given the inconsistency in the cost information returned, we would recommend more prescriptive reporting and allocation of costs in future.

10.23 Overall, the costs of the pilots were high compared with the numbers of people added to the register by local authorities. Clearly, data matching, when undertaken in this way, is not a cost-effective method of improving the completeness and accuracy of the electoral register. However, this does not mean that data matching could not be a cost-effective tool in the future if the process was refined and improved.

10.24 The most significant local authority costs were due to the volume of work that was undertaken and the resources required to support this, in particular the amount of data that required cleansing at a local level. There is potential to reduce these costs by:

- improving the quality and clarity of the data provided to local authorities, and therefore reducing the volume of data supplied to them
- providing an IT infrastructure that would facilitate and possibly automate the matching at local level; such a system, although potentially expensive, could then be implemented across all local authorities

10.25 A total of £108,035 was spent to fund the work of DWP, the Cabinet Office and other data-holding organisations in setting up the legal framework and infrastructure necessary to exchange and match data. However, it is not possible to tell how these costs would change if data matching was rolled out across Great Britain.

10.26 There was also a small budget for matching Royal Mail and HEFCE data; however, this delivered limited benefit as the data was not used by the majority of pilots. A clearer framework for the data matching schemes might help to avoid such a situation in future.


11 Conclusions and recommendations


11.1 This chapter outlines our conclusions and recommendations for data matching, which are based on our analysis of the evidence provided by participating local authorities, data holders and others involved in the data matching schemes. The chapter includes an assessment of whether the pilots have met the registration objectives set out in the Political Parties and Elections Act 2009 (PPE Act).

11.2 However, for the reasons outlined in previous chapters, the results from the 22 data matching schemes undertaken in 2011 do not provide sufficient evidence to decide whether data matching could be an effective and cost-efficient approach to improving the completeness and accuracy of the electoral registers.

11.3 There was nevertheless a strong degree of consensus about the benefits or otherwise of the initial pilots and any future data matching. Local authorities and others were also in broad agreement about what needs to be in place before any further data matching is undertaken. This broad consensus is reflected in the recommendations set out below.

11.4 These recommendations are based on how this set of pilots performed and what this tells us about the prospects of data matching for maintaining electoral registers. However, a number of aspects of the new individual electoral registration (IER) system have yet to be finalised, including the start date for IER and the role that data matching may play in the short- and long-term future of the registration system. In particular, the Government has announced that it is minded to simplify the transition process by confirming as accurate entries that match between the Department for Work and Pensions (DWP) database and the electoral registers. This was not the aim of the current round of pilots and will require further investigation.

Conclusions
11.5 We are required to evaluate the data matching schemes against a number of statutory criteria, which are set out in Sections 35 and 36 of the PPE Act 2009. Our conclusions are set out below. In broad terms, these criteria concern the degree to which data matching schemes assisted electoral registration officers (EROs) in improving the completeness and accuracy of their registers; resulted in any issues around administration, time and costs; or prompted objections to the schemes.

The registration objectives: completeness and accuracy


11.6 The first registration objective is that persons who are entitled to be registered on a register are registered on it. On the whole, these pilots did not prove very effective at getting people on to the register. Despite the efforts invested by authorities in the pilots, very few additions (only 7,917) were subsequently made to the registers.

11.7 However, better results were achieved where the local authority was able to begin its pilot follow-up activity before, or at a very early stage of, the annual canvass. This was largely because, where the follow-up did not begin until later, many people had already registered through the canvass.

11.8 In these pilots, the most useful databases in terms of adding people to the registers were those which targeted specific under-registered groups (e.g. 16–17 year olds), such as the National Pupil Database and the Individual Learner Record (ILR).

11.9 The issues surrounding the currency of address information on some of the other databases would need to be addressed in order to improve their effectiveness at finding new electors.

11.10 The low number of registrations does not mean that the principle of data matching is without merit, and many local authorities were clear that they still see potential in it. However, refinements to the matching process, such as an improvement in the currency, quality and compatibility of the data provided, would need to be in place before this objective could be fully tested.

11.11 The second registration objective is that persons who are not entitled to be registered on a register are not registered on it.

11.12 The Ministry of Defence (MoD) data was useful, up to a point, in helping EROs to amend or delete the records of service voters in their area. Two local authorities amended or deleted a high number of their records (9.6% and 13.2% of the total number of service voters held on the register).
However, the pilots were unconvinced that it was more effective than developing good relationships with senior military personnel in the area, and they had hoped to receive more data, e.g. on service personnel not already registered. Although the other authorities had been less focused on this objective at the outset of the pilots, the scale of the data returned to them, together with concerns about the currency of that data, meant that, aside from the pilots using MoD data, only Southwark ran a limited test. A total of 2,137 names were subsequently deleted from the register, suggesting that there is merit in investigating further.

11.13 The third registration objective is that none of the information relating to a registered person that appears on a register or other record kept by a registration officer is false. As with the above objective, most of the pilot schemes did not test this objective. This was in part due to a primary focus on identifying missing names and in part due to the volume and currency of the data received.

11.14 However, the MoD pilots did find that the data allowed them to amend some records, and the Stratford-upon-Avon pilot reported making changes to information held on some over-70s in order to correct the age marker on the register.37

11.15 Not all the datasets included in the current scheme were tested to the same degree. As set out earlier in the report, there was no actual testing of Higher Education Funding Council for England (HEFCE) or Royal Mail data, and we are unable to draw any conclusions about the usefulness of this data in addressing the registration objectives.

11.16 While it may be desirable to include HEFCE and Royal Mail in any future data testing, it is essential that the issues faced during the lifetime of this scheme are addressed before any matching is due to begin. It should also be a prerequisite that any data intended to improve the accuracy or completeness of the registers is ultimately provided to the participating authorities; this was not the case for HEFCE data this time round.

Objections to the schemes


11.17 At the outset there were concerns that the use of public data in this way could generate objections from the public. However, where data was provided to us, local authorities indicated that they received few objections to the schemes. Where local authorities did receive queries, the vast majority of people were content with the use of the data once the purposes of the schemes were explained to them.

37 In England and Wales, all electors over the age of 70 must indicate this fact on the annual canvass form or rolling registration form by which they apply to be registered.


11.18 For example, Camden indicated that they received 440 calls from the public asking about the pilot, but the majority of callers were happy with the response they received. On the other hand, Blackpool indicated that their canvassers did receive some feedback on the doorstep from people concerned about personal information being shared between organisations, but this was extremely limited.

11.19 This indicates that the data matching pilots did not generate any substantial level of concern amongst the public. However, any future testing or roll-out of data matching would need to be well implemented in order to maintain public support.

Ease of administration
11.20 Overall, many authorities struggled to deliver the pilots, and several raised concerns that in its current format the process of data matching was too labour intensive for regular use. As one authority explained: "considerable ICT staff resource was used to interpret and manipulate the returned matched data."

11.21 Additional staff resource was required by many of the authorities. In the main, this tended to be due to the large volumes of data received, issues with data compatibility and the workload involved in sorting the data for use. The level of input was not proportional to the benefits. As one authority said: "I could not agree to the same level of staff input on a regular basis in the future, therefore the data match process must be improved."

11.22 Many authorities also emphasised the need to understand the skill sets required for this kind of activity and highlighted that in many cases these skills were not held by those currently working on registration activities. The data provided to local authorities required further analysis before it was ready for use, and many authorities were reliant on software suppliers or other internal expertise to provide this support. Other authorities did not have the expertise available and were therefore not able to fully utilise the data provided.

11.23 In order for data matching to be rolled out nationally without the need for significant re-skilling of staff, the process would need to operate much more like an automated provision of lists of potential new electors (which can be easily integrated into the software used to manage the registers). In the interim, developing the process of local data matching would not only be useful to EROs


in maintaining the register but would also help to build skills which could be used to understand and manipulate data provided from national databases.

Time and costs


11.24 The pilot schemes proved to be both time consuming and costly. Some of the resources required to run the pilots were more extensive than would be required if data matching were rolled out permanently, but on the basis of these schemes there is little evidence that data matching is a cost-effective and manageable alternative to current registration processes.

11.25 The pilots were resourced by the Cabinet Office, with local authorities required to bid for the financial support needed to use the national data provided. The scale of financial support received by local authorities varied; however, it is doubtful that many authorities would have had the resources available to undertake data matching without this additional finance.

11.26 Moreover, many indicated in their feedback to us that they would not be able to allocate funding to national data matching without substantial changes to the approach.

Recommendations
11.27 This section sets out our recommendations for future data matching activities.

Pilot processes
11.28 Further testing of national databases by local authorities would need to be undertaken in order to establish whether data matching should be made available for use by all local authorities.

11.29 Any further testing needs to be set up in a way that addresses the limitations set out in this report, in order to ensure that meaningful data can be collated. The Electoral Commission would encourage the Government to consult us in detail in order to achieve this.

11.30 We recommend that any further piloting (with a focus on improving accuracy and completeness):

- takes place outside the annual canvass period and avoids other significant electoral events. Piloting data matching alongside the annual canvass added a layer of complexity to the testing process and meant it was harder for local authorities to isolate the impact of the data matching as opposed to canvass response rates. It also had consequences for local authority capacity to utilise the data when it was available to them. Several EROs thought that data matching could have more use following the canvass, to pick up new registrants in the run-up to elections.
- has a clear framework for the use of data which all participating authorities can follow. The current scheme allowed local authorities to adopt varying approaches to piloting the data they received. The differing methodologies made it harder to draw conclusions about the effectiveness of the data and thus its place in the future of the registration system. A clear framework would help to ensure comparability between the pilots while still allowing for some local differences, for example targeting particular groups and making use of local databases.
- tests, as closely as possible, the process which would be made available to all local authorities if data matching was rolled out nationally.
- ensures that participating areas are sufficiently staffed and have appropriate expertise to complete the pilot and test the data provided.
- allows for a better understanding of the benefits of access to national data compared to existing local databases.
- allows for a clearer analysis of the cost of data matching through more informed budgeting and prescribed reporting of costs incurred.
- ensures that good communication between the pilots, the data holders and the Cabinet Office is maintained throughout the process.

Databases
11.31 In relation to the specific databases included in these schemes, there is merit in re-testing nearly all of the databases included in these pilots, provided the specific issues identified in this evaluation are addressed, namely that:

- address format compatibility issues should be mitigated where possible. The planned inclusion of Unique Property Reference Numbers (a unique identifier for each address held) on the DWP database will help with this issue, as will plans for a single national address file, but other mitigating steps could be taken for matches with other databases, for example using address cleansing software.
- data currency issues should be tackled by ensuring that, where possible, the information shared includes details of the dates on which database records were updated.

We would not recommend further testing of the MoD data unless the range of data which can be shared is increased. While the data supplied in these schemes was useful for the pilot authorities, it is likely to be of limited wider value.
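One possible form of the address cleansing step mentioned above can be shown as a minimal sketch. This is purely illustrative: the abbreviation table and matching rule below are assumptions for the example, not taken from any pilot system or commercial cleansing product.

```python
import re

# Common address abbreviations (an illustrative, not official, list).
ABBREVIATIONS = {"rd": "road", "st": "street", "ave": "avenue"}

def normalise_address(address: str) -> str:
    """Reduce an address string to a canonical form for comparison:
    lower-case, strip punctuation, expand known abbreviations."""
    tokens = re.sub(r"[^\w\s]", " ", address.lower()).split()
    tokens = [ABBREVIATIONS.get(t, t) for t in tokens]
    return " ".join(tokens)

def addresses_match(a: str, b: str) -> bool:
    """Two addresses match if their normalised forms are identical."""
    return normalise_address(a) == normalise_address(b)
```

In practice a matched Unique Property Reference Number avoids the formatting problem entirely, which is why the pilots placed such weight on its planned inclusion; string normalisation of this kind is only a partial mitigation.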

Proposals for verifying identity


11.32 As outlined earlier, the Government is also currently considering whether the results from the data matching exercise could be used to confirm the identity of individuals captured by the household canvass during the transition to individual electoral registration (IER). In relation to this, we recommend that:

- More evidence is gathered to support this proposal, given that it was not an objective of these pilots. Any future piloting which includes this as an objective should allow for an analysis of matched and non-matched records in order to check the accuracy of the matching process used. It is possible that this analysis could make use of the annual canvass process, and as a result the timing of these pilots may need to be slightly different to that followed for any focused on accuracy and completeness.
- These plans stay abreast of developments in the other initiatives within government on the processes that might be used in the future to verify identity. Learning lessons and adopting best practice from these other initiatives is important in order to ensure that the approach to verification followed under IER, and therefore the security of the registers, is as robust as possible.


Appendix A: Local authority profiles


This appendix contains brief profiles of the 22 local authorities that took part in the pilot schemes.

Contextual data
Information on the demography of each area is based on the 2001 Census, ONS neighbourhood statistics and the 2010 ONS mid-year population estimates. The data shown is relevant to electoral registration, e.g. we know that people living in private rented accommodation are less likely to be registered to vote. Figures on the total number of entries on the electoral register and the canvass return rate were provided by local authorities to the Electoral Commission as part of its performance standards monitoring role. Results from data matching and related activities were provided by local authorities.

How to read the profile


Description of approach: this describes the approach taken by pilot authorities with regards to:

- Databases: the external database(s) matched against the local electoral register;
- Parts of the register being matched: some pilots matched the entire register while others selected specific wards (e.g. wards with a low canvass return rate, or populated by groups which are more likely to be unregistered);
- Localised matching/data cleansing: whether the local authority verified the data returned by DWP or the Cabinet Office against information held locally in order to check the quality of the data received;
- Follow-up method/approach: how individuals identified from data matching were contacted;
- Approach during/after the canvass: at what stage of the annual canvass data matching follow-up began, and any other relevant changes made to the annual canvass process.

Key pilot data: mismatches between the external databases and the electoral register, and related follow-up activities. Please note that the number of individuals added to the electoral register (the 'Added to ER' figure) is the result of follow-up activities carried out as part of the data matching pilot, and excludes people who were registered during the normal annual canvass. Where pilot follow-up was carried out before or during the annual canvass, the figures can be compared to the ones in the control group [1]. Where people identified through data matching were contacted after the annual canvass had begun, the registration rate for the pilot will inevitably be lower, as the majority of individuals had already been contacted (and potentially registered) through the annual canvass. The figures are therefore to be treated with caution.

Summary table: a summary of our evaluation against the following criteria: assessment in meeting the registration objectives; objections to the scheme; ease of administering; and savings of time and costs.

[1] Running the pilots alongside the annual canvass added a complicating factor for assessing the value of data matching. For most areas it was not feasible to contact local residents before the annual canvass had begun across the area. To address this issue, the Commission encouraged pilots to create control groups of names identified from the national data, where no dedicated follow-up would take place and the names would subsequently be tracked in the annual canvass. This was intended to determine how many would have been registered anyway in the absence of the pilot. However, not all the authorities were able to put in place a clear process for separating out the canvass from the data matching activities, and often people identified to be followed up by letter were found to have already registered through the canvass. For the purposes of this evaluation this means that data on the response rates for those names followed up by pilot authorities has to be viewed in the context of how the authority was able to manage the two processes of the pilot and the annual canvass. For more information see Chapter 2 of our evaluation report.
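The derived percentages reported in the key pilot data sections are simple proportions of the counts supplied. As an illustration, a short Python sketch (using the figures from the Lothian profile in this appendix) of how they are calculated:

```python
def pilot_metrics(records_sent, records_matched, on_er_only,
                  on_db_only, followed_up, added_to_er):
    """Derived percentages as shown in the key pilot data sections,
    rounded to one decimal place as in the report."""
    def pct(n):
        return round(100 * n / records_sent, 1)
    return {
        "% match": pct(records_matched),
        "% on ER only": pct(on_er_only),
        "% on Database only": pct(on_db_only),
        # Registration rate among those actually followed up.
        "New electors as a % of followed up": round(100 * added_to_er / followed_up, 1),
    }

# Lothian (DWP Centric): 654,515 register records sent, 474,113 matched,
# 139,613 on the register only, 65,853 on the database only,
# 10,215 followed up, 1,139 added to the register.
metrics = pilot_metrics(654515, 474113, 139613, 65853, 10215, 1139)
```

Note that the 'on ER only' count is taken from the supplied figures rather than computed as records sent minus records matched: in several of the profiles below the two counts do not sum exactly to the total sent, presumably reflecting partial matches at the chosen match threshold.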


Blackpool
Local authority information
Population (16+): 114,354
Local government register entries (December 2010): 114,113
Canvass return rate (2010): 87.1%
Estimated movers in the last 12 months: 17.3%
Proportion of population who are BME: 1.5%
Private renters: 15.7%
Full-time students (18-74): 1.7%
Density (persons per acre): 40.7

Description of approach
Databases: DWP Centric and DfE (NPD).
Parts of register being matched: The six electoral wards with the lowest response rates to the 2010 canvass.
Target groups: Two-year non-responders, and properties in the six wards where no one is registered but where individuals have been identified.
Localised matching/data cleansing: None.
Follow-up method/approach: Personal visit.
Approach during/after canvass: Pilot follow-up run alongside the annual canvass.
Control group: A randomly selected 50% of the households included in the pilot.

Key pilot data


DWP Centric (match score: 30)
Number of ER records sent: 33,210
ER records matched: 25,314 (76.2%)
On ER/Not on Database: 7,896 (23.8%); no follow-up undertaken
On Database/Not on ER: 5,907 (17.8%)
Of which followed up: 2,467 properties [2]
Added to ER: 727
New electors as a % of followed up: 29.5%
Control group: 2,466 properties [3]
% of control group registered: 31.2%

DfE (NPD)
Number of ER records sent: 33,210
ER records matched: No usable data (% match: N/A)
On ER/Not on Database: 15,798 (47.6%); no follow-up undertaken
On Database/Not on ER: 5,167 (15.6%)

Summary table
Assessment in meeting the registration objectives: The response rate for the follow-up in this pilot is inflated, as the number followed up is for properties and not people. A total of 727 people were added to the register within the six pilot wards. The authority does not believe that the NPD data provided any useful additional information.
Objections to scheme: All queries were received from Council members; none were received from members of the public. However, some people, when visited for the pilot follow-up, did raise concerns about personal information being shared between different organisations.
Ease of administering: The authority felt that selecting a relatively small number of wards (six) and focusing only on two-year non-responders and empty properties meant that the pilot did not become unmanageable. However, the problems encountered during the early phase of the pilot in terms of data transfer and interpretation had a negative impact on the scheme. In addition, many of the processes were undertaken manually, so the process was labour intensive.
Savings of time and costs: Total cost: £49,667. The main item of expenditure was staff costs (£40,348), but it should be noted that Blackpool accounted for internal staff costs.

[2] This figure refers to the number of properties followed up rather than the number of individuals.
[3] This is the number of properties in the control group.


Camden
Local authority information
Population (16+): 199,438
Local government register entries (December 2010): 153,143
Canvass return rate (2010): 93.7%
Estimated movers in the last 12 months: 28.2%
Proportion of population who are BME: 26.8%
Private renters: 22.8%
Full-time students (18-74): 8.7%
Density (persons per acre): 90.8

Description of approach
Databases: DfE, DWP Centric, BIS, SLC and HEFCE (used only to confirm existing records).
Parts of register being matched: Whole register.
Target groups: Students, young people (18-24), and the mobile population (home movers and people in houses of multiple occupation).
Localised matching/data cleansing: The authority planned to conduct local matching but did not have time, as the data exchange took longer than expected.
Follow-up method/approach: Personalised letter, then visits from personal canvassers to a selection of non-responding addresses.
Approach during/after canvass: Due to delays, letters went out in October, after the first canvass reminder letter.
Control group: 1,234 for the DWP data; no control group for the education records.

Key pilot data


DWP Centric
Number of ER records sent: 153,290
ER records matched: 98,286 (64.1%)
On ER/Not on external Database: 38,630 (25.2%); no follow-up undertaken
On external Database/Not on ER: 30,978 (20.2%)
Of which followed up: 9,230
Added to ER: 387
New electors as a % of followed up: 4.2%
Control group: 1,234
% of control group registered: 15.9%

Education: NPD, SLC, ILR (match score: 60)
Number of ER records sent: 153,290
ER records matched: 11,207 (% match: N/A)
On ER/Not on external Database: 162,066 [4] (105.7%); no follow-up undertaken
On external Database/Not on ER: 402 (0.3%)
Of which followed up: 383
Added to ER: 17
New electors as a % of followed up: 4.4%
Control group: none

[4] This figure includes duplicates.

Summary table
Assessment in meeting the registration objectives: DWP: a high number of mismatches were identified, but the pilot authority had doubts that these records were up to date. Education: Camden also believes the education databases may not provide any information on attainers/students that is not already held on its own internal education records.
Objections to scheme: Around 440 calls were received about the follow-up letter, mainly where names had no current connection with an address; many callers were concerned that they might be under investigation (three people expressed particular concerns about fraud). Most people were happy with the response.
Ease of administering: The authority felt that the canvass period was not an appropriate time to conduct the exercise. The matching process was problematic due to the incompatibility of matching a property database (the electoral register) against a people database without matching fields such as UPRNs. There needs to be clarification of the rules for export and import of files, permissible secure email systems, file sizes and security protocols. Potential electors also needed to be asked about nationality, in the absence of information about nationality from the databases.
Savings of time and costs: Total cost: £19,096 (not including the costs of matching data). The authority felt that a better comparison would be possible if the exercise was conducted outside the canvass, when the response could be compared with rolling registration activity.


Colchester
Local authority information
The local authority is developing its own local data system, the Customer Index.
Population (16+): 149,145
Local government register entries (December 2010): 126,677
Canvass return rate (2010): 82.3%
Estimated movers in the last 12 months: 18.7%
Proportion of population who are BME: 3.8%
Private renters: 10.2%
Full-time students (18-74): 4.1%
Density (persons per acre): 4.7

Description of approach
Databases: DWP Centric and SLC.
Parts of register being matched: Whole register; once the DWP and SLC data were received and analysed, plans were scaled down to 12 polling districts (approximately 20% of households in the borough).
Target groups: University students.
Localised matching/data cleansing: Carried out localised matches using the Customer Index (centralised customer database). Chosen match level: 85%, with all matches below 85% checked.
Follow-up method/approach: Door-to-door visits to non-responders; personalised letter to those living in households that returned a canvass form.
Approach during/after canvass: Pilot follow-up started during the final month of the canvass (November).
Control group: Applied to all databases but not followed up.

Key pilot data


Match level: varied.

DWP Centric
Number of ER records sent: 126,983
ER records matched: 100,512 (79.2%)
On ER/Not on Database: 26,653 (21.0%); no follow-up undertaken
On Database/Not on ER: 20,119 (15.8%)
Of which followed up: 3,693
Added to ER: 169
New electors as a % of followed up: 4.6%
Control group: 1,678
% of control group registered: 4.2%

SLC
Number of ER records sent: 126,983
ER records matched: 1,971 (% match: N/A)
On ER/Not on Database: 122,632 (96.6%); no follow-up undertaken
On Database/Not on ER: 1,236 (1.0%)
Of which followed up: 39
Added to ER: 2
New electors as a % of followed up: 5.1%
Control group: none

Summary table
Assessment in meeting the registration objectives: There was minimal difference between registration rates in the pilot group targeted for action and the control group. The pilot authority believed that the optimal outcome would be for most of the matching work to take place in a national hub, with standardised results provided directly to local authorities together with a clear procedure for action.
Objections to scheme: The general view among members of the public who passed comment was that the Council should already be aware of the information they were providing. No objections were received from Council members.
Ease of administering: The start delay, the overlap between the pilot and the annual canvass, and the amount of manipulation the matched data required all caused problems. Furthermore, each record identified for action or control required a manual check to see whether individuals had been identified by the concurrent canvass activities, which required a lot of resource. Property identifiers, e.g. UPRNs, on the DWP database would have helped. There were also issues with the currency of the data.
Savings of time and costs: The local authority believes it would be unfeasible to run the exercise again in its current design. Estimated cost: around £14,881.


Forest Heath
Local authority information
Population (16+): 50,964
Local government register entries (December 2010): 38,980
Canvass return rate (2010): 98.6%
Estimated movers in the last 12 months: 21.1%
Proportion of population who are BME: 6.0%
Private renters: 15.7%
Full-time students (18-74): 1.3%
Density (persons per acre): 1.4

Description of approach
Database: DWP Centric.
Parts of register being matched: Whole register.
Target groups: Migrant workers from the EC, spouses of USA service personnel, horse racing industry workers, the mobile population and young people (18-24).
Localised matching: Used previous electoral registers and council tax records to check the accuracy of the data provided.
Follow-up method/approach: Letter and door-to-door.
Approach during/after canvass: Pilot to be run before the canvass.
Control group: No.

Key pilot data


Data was not provided in a format that can be compared to the other pilot areas.

Summary table
Assessment in meeting the registration objectives: The local authority acknowledged that they received registration forms from individuals from whom they had not had responses before, but felt that the data was also identifying people who were not resident at an address.
Objections to scheme: A couple of people asked where the data came from but made no objection. A few people made negative comments about politics and politicians.
Ease of administering: The local authority felt that the pilot was badly planned and should never have been run at the same time as the annual canvass.
Savings of time and costs: Total cost: £10,818.


Forest of Dean
Local authority information
Population (16+): 68,349
Local government register entries (December 2010): 65,947
Canvass return rate (2010): 98.4%
Estimated movers in the last 12 months: 13.1%
Proportion of population who are BME: 0.9%
Private renters: 6.3%
Full-time students (aged 18-74): 1.8%
Density (persons per acre): 1.5

Description of approach
Databases: DWP Centric, DfE (NPD), BIS (ILR) and DVLA.
Parts of register being matched: 16- and 17-year-olds across the whole register.
Target groups: Attainers (16- and 17-year-olds).
Localised matching/data cleansing: Data filtered to remove unnecessary records; local matching with council tax to verify mismatches. Data cleansing and checks were needed, as matching on date of birth did not work (the system would return a 99% match even if the dates were completely different).
Follow-up method/approach: Personalised letter including name and date of birth.
Approach during/after canvass: The pilot follow-up took place after the annual canvass, so personalised letters were sent only to those who had not registered. Letters and forms to register to vote were sent following the publication of the register on 1 December. Electors who returned registration forms were processed under monthly rolling registration.
Control group: Yes.

Key pilot data


Match score: 94.

DWP Centric
Number of ER records sent: 1,143
ER records matched: 828 (72.4%)
On ER/Not on Database: 315 (27.6%); no follow-up undertaken
On Database/Not on ER: 116 (10.1%)
Of which followed up: 33
Added to ER: 5
New electors as a % of followed up: 15.2%
Control group: 70
% of control group registered: 72.9%

DfE (NPD)
Number of ER records sent: 1,143
ER records matched: 838 (73.3%)
On ER/Not on Database: 305 (26.7%); no follow-up undertaken
On Database/Not on ER: 61 (5.3%)
Of which followed up: 12
Added to ER: 4
New electors as a % of followed up: 33.3%
Control group: 44
% of control group registered: 79.5%

BIS (ILR)
Number of ER records sent: 1,143
ER records matched: 760 (66.5%)
On ER/Not on Database: 383 (33.5%); no follow-up undertaken
On Database/Not on ER: 130 (11.4%)
Of which followed up: 31
Added to ER: 6
New electors as a % of followed up: 19.4%
Control group: 75
% of control group registered: 97.3%

DVLA
Number of ER records sent: 1,143
ER records matched: 650 (56.9%)
On ER/Not on Database: 493 (43.1%); no follow-up undertaken
On Database/Not on ER: 163 (14.3%)
Of which followed up: 34
Added to ER: 9
New electors as a % of followed up: 26.5%
Control group: 94
% of control group registered: 100.0%

Evaluation findings
Assessment in meeting the registration objectives: The pilot felt that data matching was not good for increasing the completeness of the register, but could be beneficial for online individual registration. Additional information (e.g. nationality) would be useful. Monthly updates on individuals who move into the area could be a more useful way of providing information.
Objections to scheme: One query from a member of the public in response to a pilot follow-up letter.
Ease of administering: The timetable slippage in the provision of data added to the pressure on staff. The pilot felt that unique property reference numbers would have assisted with the address matching process, and that the currency of the data was also questionable. The provision of the match results from some databases as one consolidated file added to the difficulties in assessing the data. They believe it would have been more beneficial to conduct the exercise following the publication of the 2011/12 register.
Savings of time and costs: Total cost: £6,336. Matching only resulted in a small number of electors being added to the register, and the authority considered the resources used to have outweighed the benefits gained.


Glasgow
Local authority information
Population (16+): 495,584
Local government register entries (December 2010): 457,638
Canvass return rate (2010): 90.3%
Estimated movers in the last 12 months: 13.7%
Proportion of population who are BME: 5.4%
Privately rented households: 6.8%
Full-time students (aged 18-74): 7.4%
Density (persons per acre): 32.9

Description of approach
Databases: DWP Centric, DVLA and SLC.
Parts of register being matched: Wards with the lowest response rates where students live.
Target groups: Students, private renters, the mobile population.
Localised matching/data cleansing: Data received from DWP were matched with council tax records to achieve increased accuracy.
Follow-up method/approach: Personal visit, only at properties where no response had been received from the annual canvass. Two individual enquiry forms plus an explanatory letter were left at each property where there was no initial response.
Approach during/after canvass: Data matching forms sent out after the first issue of the annual canvass forms.
Control group: No control group.

Key pilot data


Match score: 55.

DWP Centric
Number of ER records sent: 47,666
ER records matched: 21,762 (45.7%)
On ER/Not on Database: 21,612 (45.3%); no follow-up undertaken
On Database/Not on ER: 16,590 (34.8%)
Of which followed up: 331
Added to ER: 94
New electors as a % of followed up: 28.4%

Consolidated data (DVLA and SLC)
Number of ER records sent: 53,109
ER records matched: 14,085 (% match: N/A)
On ER/Not on Database: 33,553 (63.2%); no follow-up undertaken
On Database/Not on ER: 19,133 (36.0%)
Of which followed up: 102
Added to ER: 3
New electors as a % of followed up: 2.9%
Control group [5]: two polling districts
% of control group registered: 80.3%

Summary table
Assessment in meeting the registration objectives: It was difficult to separate the effects of the pilot from the annual canvass. There was a poor response to postal forms, and canvassers were more successful. The local authority therefore believes there was little benefit from data matching. It was not an ideal time to follow up students, who often go home for the summer holidays and return to a different residential location.
Objections to scheme: There were two complaints where electors were already registered.
Ease of administering: The pilot was relatively easy to administer and relatively inexpensive. However, there were issues with the currency of the data and the format in which it was supplied. The delay in starting the project had an impact and made it difficult to assess the impact of the pilot.
Savings of time and costs: The total cost was £8,000.

[5] Glasgow selected two polling districts as a control group. The total number of people in the control group is not available, and the registration rate is the average of the canvass return rates for the two districts.


Greenwich
Local authority information
Population (16+): 179,381
Local government register entries (December 2010): 170,845
Canvass return rate (2010): 92.1%
Estimated movers in the last 12 months: 18.0%
Proportion of population who are BME: 22.8%
Private renters: 9.5%
Full-time students (aged 18-74): 4.5%
Density (persons per acre): 45.2

Description of approach
Databases: DWP Centric, DfE (NPD), BIS (ILR) and DVLA.
Parts of register being matched: Whole register, with target wards for particular nationalities.
Target groups: Young people, BME groups and those under-registered for financial reasons.
Localised matching: The authority intended to carry out localised matches, but the start delay meant this was not possible.
Follow-up method/approach: Personalised letter.
Approach during/after canvass: Pilot letters sent after the annual canvass.
Control group: Yes, for each database.

Key pilot data


Match score: 40.

DWP Centric
Number of ER records sent: 184,438
ER records matched: 130,048 (70.5%)
On ER/Not on Database: 31,120 (16.9%); no follow-up undertaken
On Database/Not on ER: 47,384 (25.7%)
Of which followed up: 3,713 [7]
Added to ER: 211
New electors as a % of followed up: 5.7%
Control group: 4,176
% of control group registered: 13.0%

DVLA
Number of ER records sent: 184,438
ER records matched: 95,437 (51.7%)
On ER/Not on Database: 86,366 (46.8%) [6]; no follow-up undertaken
On Database/Not on ER: 94,324 (51.1%)
Of which followed up: 3,399
Added to ER: 38
New electors as a % of followed up: 1.1%
Control group: 3,505
% of control group registered: 3.9%

DfE (NPD)
Number of ER records sent: 184,438
ER records matched: 1,573 (% match: N/A)
On ER/Not on Database: consolidated [6]
On Database/Not on ER: 3,172 (1.7%)
Of which followed up: 244
Added to ER: 18
New electors as a % of followed up: 7.4%
Control group: 391
% of control group registered: 36.3%

BIS (ILR)
Number of ER records sent: 184,438
ER records matched: 9,210 (% match: N/A)
On ER/Not on Database: consolidated [6]
On Database/Not on ER: 4,528 (2.5%)
Of which followed up: 724
Added to ER: 24
New electors as a % of followed up: 3.3%
Control group: 849
% of control group registered: 16.3%

Consolidated (DVLA, BIS, DfE)
Number of ER records sent: 184,438
On Database/Not on ER: 3,292 (1.8%)
Of which followed up: 247
Added to ER: 4
New electors as a % of followed up: 1.6%
Control group: 17
% of control group registered: 23.5%

Summary table
Assessment in meeting the registration objectives: The authority found it hard to separate the effect of the pilot follow-up from the annual canvass. The pilot thought it likely that the personalised letter may have helped to generate the additional registrations. There were a range of findings across the different databases, including: DWP: 11% of those contacted were not resident and 4% were ineligible; DVLA: responses to invitations show that the records are not current; NPD: appears to be the most accurate of the databases; ILR: the majority of registrations were made through the canvass.
Objections to scheme: The mailing generated a large number of calls to the contact centre immediately after delivery, with several people expressing concern that the authority had written to relatives who had died, in some instances over 20 years previously.
Ease of administering: The pilot raised issues relating to the quality, quantity and currency of the data, in particular the DVLA data. Differences in naming and addressing standards across the databases made analysis difficult, and unique property reference numbers would help. Management of the received data was problematic and time-consuming. The authority felt that the process suggests a need to develop a range of data management knowledge and skills (which were not already available within the local electoral services team).
Savings of time and costs: Total cost estimated at £24,230. Just over 3% of those identified through data matching and contacted registered to vote, and the local authority felt it was therefore not as effective as household registration.

[6] The figures for DVLA, NPD and ILR were consolidated.
[7] Includes DWP data checked and combined with other databases (DVLA, BIS and DfE).


Lothian
Local authority information
The Lothian Valuation Joint Board is composed of the City of Edinburgh, East Lothian, Midlothian and West Lothian. The figures below are the sum or average of the four areas.
Population (16+): 695,404
Local government register entries (December 2010): 595,074
Canvass return rate: 88.6%
Estimated movers in the last 12 months: 14.5%
Proportion of population who are BME: 2.8%
Privately rented households: 7.7%
Students (aged 18-74): 6.8%
Density (persons per acre): 4.5

Description of approach
Database: DWP Centric.
Parts of register being matched: Whole register.
Target groups: Under-registered areas.
Localised matching/data cleansing: Intended to match with Council Tax Payers Lists and Council Tax & Non-Domestic Property Lists prior to follow-up, to provide a degree of refinement to the matched/updated data.
Follow-up method/approach: Personalised letter and door-to-door.
Approach during/after canvass: Data matching follow-up carried out prior to the commencement of the annual canvass.
Control group: Yes.

Key pilot data


DWP Centric (match score: 55)
Number of ER records sent: 654,515
ER records matched: 474,113 (72.4%)
On ER/Not on Database: 139,613 (21.3%); no follow-up undertaken
On Database/Not on ER: 65,853 (10.1%)
Of which followed up: 10,215
Added to ER: 1,139
New electors as a % of followed up: 11.2%
Control group: 9,875
% of control group registered: 31.0%

Summary table
Assessment in meeting the registration objectives: The registration rate for the pilot group (11.2%) is relatively high in comparison to other pilot areas. This is likely to be at least partly due to the timing of the follow-up, before the annual canvass. The pilot indicated that more information needs to be included in the national data (e.g. how recently the records were updated should be evident).
Objections to scheme: No response from the public in relation to letters inviting those named to register. During the door-to-door activity there were a small number of refusals to provide information.
Ease of administering: There were issues related to the quality and currency of the data. The lack of a unique property reference number also caused problems, and improvements need to be made to the national data to make the matching process viable. The VJB believe that the pilot was run too close to the annual canvass, which made subsequent follow-up activity and analysis more difficult.
Savings of time and costs: Total cost for the pilot was £15,760. Considerable ICT staff resource was needed to interpret and manipulate the returned matched data.

Ease of administering

Savings of time and costs


Manchester
Local authority information
Manchester carried out data matching with local data in 2010 and again in 2011.
Population (16+): 411,465
Local government register entries (December 2010): 360,802
Canvass return rate (2010): 93.0%
Estimated movers in the last 12 months: 23.9%
Proportion of population who are BME: 19.0%
Private renters: 16.4%
Full-time students (aged 18-74): 10.0%
Density (persons per acre): 33.9

Description of approach
Database: DWP Centric.
Parts of the register being matched: A random sample of around 10,000 properties (5%).
Target groups: Non-respondents, properties with no registered voters, and generally under-registered groups (students, BME groups, people in deprived areas).
Once it received the data, the local authority decided not to take any action, as it found the data not fit for purpose.

Summary table
Assessment in meeting the registration objectives: It is not possible to evaluate whether data matching would help, but Manchester felt that the amount of time and resources required to try to overcome the limitations in the data provided would not have justified the benefits. Manchester found that the currency of the data was not clear: any record could be as much as two years old, and the data was therefore of no use compared to current data available to the Council. The matching process returned over 130,000 results against the 10,000 properties provided, of which at least 100,000 were spurious. By the time the issue of the currency of the data was recognised and an attempt made to rectify it, the Council was well progressed with the annual canvass. Manchester carried out data matching with local data in 2010, which helped increase the registration rate, and found that approach easier and more effective; it reported that the continuation of this approach helped to improve registration in 2011 from 93% to 96%.
Objections to scheme: N/A.
Ease of administering: Manchester stopped the work because it found the data was not fit for purpose.
Savings of time and costs: Total cost of the pilot: £2,880, for the electoral register software.


Newham
Local authority information
Population (16+): 179,742
Local government register entries (December 2010): 198,724
Canvass return rate (2010): 87.1%
Estimated movers in the last 12 months: 18.7%
Proportion of population who are BME: 60.5%
Private renters: 17.6%
Full-time students (aged 18-74): 6.2%
Density (persons per acre): 67.3

Description of approach
Database: DWP Centric.
Parts of register being matched: Whole register.
Target groups: People in privately rented accommodation, young people, and people registered but not residing at the registered address.
Localised matching/data cleansing: Localised match after the initial centralised match, using a Citizens Index (project CIMBA, using Multivue from Visionware) which holds data from the electoral register, Housing Management, Revenues & Benefits and their Dynamics CRM.
Follow-up method/approach: Personalised letter.
Approach during/after canvass: Personalised letters were sent out at the third stage of the annual canvass (the door-to-door stage).
Control group: None.

Key pilot data


DWP Centric (match score: 55)
Number of ER records sent: 210,000
ER records matched: 121,000 (57.6%)
On ER/Not on Database: 89,000 (42.4%); no follow-up undertaken
On Database/Not on ER: 101,000 (48.1%)
Of which followed up: 1,902
Added to ER: 79
New electors as a % of followed up: 4.2%
Control group: none

Summary table
Assessment in meeting the registration objectives: 4% of those followed up were added to the register, although no control group was used for comparison. Newham felt that the national data would only be of use if authenticated against local databases, unless improvements are made (e.g. the currency of the data improved, and middle names and UPRNs included).
Objections to scheme: None noted.
Ease of administering: Newham reported issues with the amount, quality and currency of the external data supplied (e.g. duplicate entries had to be identified manually). The lack of a unique property identifier also made the process of identifying who to follow up difficult. The authority also felt that information sharing with the Cabinet Office was poor, and they did not know what to expect when they opened the files.
Savings of time and costs: Total cost: £13,102. More than 72% of the overall budget was spent on ICT and data source processing.


Peterborough
Local authority information
Population (16+): 137,067
Local government register entries (December 2010): 130,683
Canvass return rate (2010): 96.0%
Estimated movers in the last 12 months: 16.4%
Proportion of population who are BME: 10.2%
Private renters: 8.8%
Full-time students (aged 18-74): 1.5%
Density (person per acre): 4.5

Description of approach
Database: DWP Centric.
Parts of register being matched: Central Ward, which comprises 4,369 properties.
Target groups: Those for whom English is not their first language, those who are not aware of the entitlement to vote, transient population.
Localised matching/data cleansing: Local matching using council tax and benefits records.
Follow-up method/approach: No follow-up due to timing of pilot and resource issues.
Approach during/after canvass: N/A.
Control group: No.

Key pilot data


Database: DWP Centric (match level: 99)
Number of ER records sent: 8,009
ER records matched: 4,379
% match: 54.7%
On ER/not on Database: 1,795
% on ER only: 22.4%
Of which followed up: No follow-up undertaken
On Database/not on ER: 5,886
% on Database only: 73.5%
Of which followed up: No follow-up undertaken

Evaluation findings
Assessment in meeting the registration objectives: The work identified 5,886 potential electors in the targeted ward, but the pilot did not contact these people to invite them to register. The authority was therefore unable to verify whether they were still living in the area and eligible to be on the electoral register. This was due to the timing of the project and resource issues: Peterborough joined the data matching project 3-4 months after the pilot started, and the lack of time and staff meant that they could not complete the follow-up of individuals identified by the data matching.
Objections to scheme: None.
Ease of administering: A large volume of data was received (almost 25,000 records, while the council expected around 6,500). The data required a great deal of refinement and there were problems such as the repetition of names. The absence of UPRNs on the DWP database was also a problem.
Savings of time and costs: If the pilot had been delivered outside the canvass period, the local authority believes the project would have been more successful. The total cost for this pilot was £305.


Renfrewshire Valuation Joint Board


Local authority information
The data transferred for matching relates only to Renfrewshire Council.
Renfrewshire population (16+): 139,976
Local government register entries (December 2010): 130,553
Canvass return rate (2010): 94.3%
Estimated movers in the last 12 months: 9.2%
Proportion of population who are BME: 1.2%
Privately rented households: 3.5%
Students (aged 18-74): 3.8%
Density (person per acre): 6.6

Description of approach
Database: Improvement Service - Citizen Account.
Parts of register being matched: The number of records supplied was 35,809.
Target groups: Focus on young people and students in the 18-25 age group and individuals living in areas with multiple deprivation.
Localised matching/data cleansing: Local matching with council tax.
Follow-up method/approach: Letter, and door-to-door if no reply. Due to delays in the process elsewhere, they missed the door-to-door canvass; non-returns will be followed up in the spring.
Approach during/after canvass: Data matching letters were sent out at the end of the annual canvass and follow-up activities are still underway.
Control group: None.

Key pilot data


Database: Citizen Account
Number of ER records sent: 35,809
ER records matched: 31,812
% match: 88.8%
Of which followed up: No follow-up undertaken
On Database/not on ER: 3,997
% on Database only: 11.2%
Of which followed up: 628
Added to ER: Follow-up is still underway
Control group: -
% of control group registered: -

Summary table
Assessment in meeting the registration objectives: The CA data provided was not a comprehensive record of the area: in terms of entries it equated to only around a third of the total electorate. However, the match level between the electoral register and the CA was high at 88.8%. The data supplied was smaller than the VJB anticipated and did not assist with the sections of the population or housing stock where registration levels are lower.
Objections to scheme: None.
Ease of administering: Designing and agreeing the legal agreement to exchange data was time consuming and had an impact on the success of the pilot. There were problems with matching addresses due to the lack of UPRNs on CA data. The VJB also believed that the currency of the data was patchy.
Savings of time and costs: Total costs: £14,600.


Rushmoor
Local authority information
Population (16+): 74,237
Local government register entries (December 2010): 65,690
Canvass return rate (2009): 88.4%
Estimated movers in the last 12 months: 20.2%
Proportion of population who are BME: 4.4%
Private renters: 11.0%
Full-time students (aged 18-74): 1.8%
Density (person per acre): 23.3

Key pilot data


Service voters (SV)
Number of service voters on ER before pilot: 500
Number of entries matched with MoD data: 220
% match: 44.0%
Registered details amended: 57
% amended as total SV registered: 11.4%
Registered details deleted: 83
% deleted as total SV registered: 16.6%

Properties
MoD addresses provided for checking: 1,760
Number of MoD addresses matched: 1,748
% match: 99.3%
New MoD addresses identified: 4
% new properties as a total of addresses sent: 0.2%

Summary table
Assessment in meeting the registration objectives: The original aim was to improve the accuracy and completeness of the part of the electoral register that related to properties occupied by service personnel. However, the MoD only confirmed information about existing service voters. The properties included in the data exchange were specifically targeted as part of the canvass process to confirm that the data was correct and to obtain new registrations. The pilot did not help to improve the completeness of the register but, based on the number of records added/deleted, the accuracy of the register in relation to service voters improved by 28%. The council is of the view that more accurate data is held locally rather than nationally. The pilot did identify some useful learning points, and Rushmoor believe that with proper data matching arrangements and appropriate data, an effective system is still achievable.
Objections to scheme: None noted.
Ease of administering: Limited communication at various levels created issues and hindered success (e.g. little contact between the MoD and the council, or between the MoD and the local Garrison). The council noted that working arrangements with the local Garrison have developed considerably since the end of the pilot and a programme of activities has been developed.
Savings of time and costs: Only around 10% of the requested budget was spent (mainly on software). Total cost: £980.


Shropshire
Local authority information
Population (16+): 241,543
Local government register entries (December 2010): 230,190
Canvass return rate (2010): 86.6%
Estimated movers in the last 12 months: 14.8%
Proportion of population who are BME: 1.2%
Private renters: 9.8%
Full-time students (aged 18-74): 1.5%
Density (person per acre): 0.9

Key pilot data


Service voters (SV)
Number of service voters on ER before pilot: 384
Number of entries matched with MoD data: 111
% match: 28.9%
Registered details amended: 28
% amended as total SV registered: 7.3%
Registered details deleted: 34
% deleted as total SV registered: 8.9%

Properties
MoD addresses provided for checking: 1,169
Number of MoD addresses matched: 719
% match: 61.5%
New MoD addresses identified: 56
% new properties as a total of addresses sent: 4.8%

Summary table
Assessment in meeting the registration objectives: The MoD was only able to confirm whether the information held by the council matched their records, so the MoD property database was of limited use. The pilot did follow up the results received from the MoD and amended/deleted a number of service voter records on the register. The results from the matching exercise showed that the record of registered service voters was not entirely up to date, and the data provided helped in improving the accuracy of the register relating to service voters (approx. 16% of records were amended or deleted). The pilot authority would ideally like a system enabling the MoD to provide a list of service personnel movements between bases, along with contact addresses; this would enable EROs to write to service voters to confirm their change of details. They believe this would assist in maintaining the accuracy of the register, as the ERO is not always told of service voter movements. The five-year renewal of the declaration also means that there is a good possibility that a service voter will move within that timescale.
Objections to scheme: None.
Ease of administering: The late provision of the data from the MoD meant further analysis was not possible. There were also concerns about the accuracy and currency of the MoD data.
Savings of time and costs: Total cost: £4,547.


Southwark
Local authority information
Population (16+): 236,542
Local government register entries (December 2010): 197,326
Canvass return rate (2010): 92.4%
Estimated movers in the last 12 months: 22.3%
Proportion of population who are BME: 36.9%
Private renters: 13.2%
Full-time students (aged 18-74): 7.6%
Density (person per acre): 84.8

Description of approach
Database: DWP Centric.
Parts of register being matched: Three wards, one in each constituency, covering the three political groupings on the council.
Target groups: Young professionals in the north of the borough; BME groups in Peckham.
Localised matching/data cleansing: Matches against locally held data (benefits, council tax, LLPG, housing records) prior to, during and after the canvass. They matched the data received from DWP with local data in order to obtain valid matches to properties and then to the people linked to the property.
Follow-up method: A letter in pilot areas (excluding the control group) addressed to potential new electors; a second form addressed to the occupier as normal. The canvassers for properties that had not responded after the second post were provided with names to prompt for at the door-knocking stage.
Approach during/after canvass: Run alongside the annual canvass work.
Control group: 648 names were selected for a control group.

Key pilot data


Database: DWP Centric (match score: varied)
Number of ER records sent: 30,840
ER records matched: 18,204
% match: 59.0%
On ER/not on Database: 5,445
% on ER only: 17.7%
Of which followed up: 6,773
Deleted from ER: 2,137
Deleted electors as a % of followed up: 31.6%
On Database/not on ER: 8,849
% on Database only: 28.7%
Of which followed up: 5,829
Added to ER: 2,545
New electors as a % of followed up: 43.7%
Control group: 648
% of control group registered: 42.0%

Summary table
Assessment in meeting the registration objectives: There were no noticeable differences in the number of electors added/removed between the pilot wards and the control group (44% of those followed up from the national database but not on the register were added in the pilot wards, compared to 42% of the control group). 3% of those records followed up were confirmed as being of non-eligible nationality. The usefulness of the data was mixed, but the pilot provided a number of learning points. 50% of all new electors added to the register by the end of the canvass period for the three wards covered by the pilot were not present on DWP Centric as potentially missing electors. Southwark believe data matching could best be used to confirm existing electors as valid for inclusion in IER without further transactions.
Objections to scheme: Feedback from the public was limited as the exercise ended up occurring at the same time as the annual canvass. No complaints or comments were received.
Ease of administering: The initial process had to be amended due to the slippage in the start time of the pilot and the volume and quality of the match data returned. Poor data currency and the lack of a unique property identifier hindered the pilot.
Savings of time and costs: Costs were in the region of £18,000, which excludes some staffing costs.


Stratford-upon-Avon
Local authority information
Population (16+): 97,799
Local government register entries (December 2010): 95,301
Canvass return rate (2010): 87.8%
Estimated movers in the last 12 months: 14.7%
Proportion of population who are BME: 1.3%
Private renters: 7.9%
Full-time students (aged 18-74): 1.5%
Density (person per acre): 1.1

Description of approach
Database: DWP Centric and MoD (Joint Personnel Administration and Anite Housing).
Parts of register being matched: Whole register.
Target groups: Attainers, over-70s and service personnel.
Localised matching/data cleansing: Before and during canvass, to check accuracy of the register.
Follow-up method/approach: Personalised letter.
Approach during/after canvass: Data matching letters sent out before and during the issuing of the first canvass forms.
Control group: Random sample.

Key pilot data


| | DWP Centric - attainers (match score: 100) | DWP Centric - over-70s (match score: 95) |
| --- | --- | --- |
| Number of ER records sent | 1,078 | 16,264 |
| ER records matched | 454 | 14,333 |
| % match | 42.1% | 88.1% |
| On ER/not on Database | 470 | - |
| % on ER only | 43.5% | - |
| On Database/not on ER | 171 | 761 |
| % on Database only | 15.8% | 5.6% |
| Of which followed up | 181 | 854 |
| Added to ER | 4 | 6 |
| New electors as a % of followed up | 2.2% | 0.7% |
| Control group | 15 | 126 |
| % of control group registered | 13% | 80.0% |

The local authority also contacted individuals who were already on the register to confirm their details (register mark for those aged over 70). The pilot recorded the following response rates to these letters: 60% for over-70s (63 records needed changing) and 26% for attainers.

Ministry of Defence
Service voters (SV)
Number of service voters on ER before pilot: 128
Number of entries matched with MoD data: 76
% match: 59.4%
Follow-up: No follow-up undertaken

Properties
MoD addresses provided for checking: 124
Number of MoD addresses matched: 102
% match: 82.3%
New MoD addresses identified: 2
% new properties as a total of addresses sent: 1.6%

These figures include individuals who were already on the register who were contacted to confirm their details (register mark for those aged over 70), which accounts for the low registration rate.


Summary table
Assessment in meeting the registration objectives: The pilot authority believes that more work needs to be done to establish which data is more accurate, national or local. The pilot also aimed to verify the accuracy of entries on the register (the mark for those over 70) and had problems capturing this information in the reporting forms. The low registration rate reported is due to the way the follow-up was conducted (confirming details rather than simply inviting people to register); the pilot reported a good response rate to the letter for over-70s (60%) but a low one for attainers (26%). The MoD data did not identify any new electors. However, the authority would find it useful to receive a periodic release of MoD properties (particularly on military bases/barracks).
Objections to scheme: Approximately 20 queries about the nature and legitimacy of the project. There was no adverse feedback from either the press or from the public in general.
Ease of administering: It was difficult to administer. Most of the issues related to the resources and skills required by the process (the volume of work and the timing).
Savings of time and costs: There was a very small increase in the number of people on the register and the authority does not believe the pilot was cost effective. Total cost: £8,109.


Sunderland
Local authority information
Population (16+): 234,591
Local government register entries (December 2010): 217,436
Canvass return rate (2010): 90.0%
Estimated movers in the last 12 months: 12.4%
Proportion of population who are BME: 1.8%
Private renters: 5.1%
Full-time students (aged 18-74): 3.2%
Density (person per acre): 20.4

Description of approach
Database: DWP Centric.
Parts of register being matched: One ward (out of 25) with a high concentration of students and BME groups.
Target groups: People on benefits, to understand if they are under-registered.
Localised matching/data cleansing: Tested the effectiveness of the data supplied by comparing it with other council records (i.e. council tax and housing benefits).
Follow-up method/approach: By letter (not personalised) and then door-to-door canvassing.
Approach during/after canvass: The national data was matched to the canvass register in the final weeks of the canvass, and non-matches or names not included on a registration form were chased up by a personal canvasser.
Control group: None.

Key pilot data


Database: DWP Centric (match score: 45)
Number of ER records sent: 10,659
ER records matched: 5,927
% match: 55.6%
On ER/not on Database: 1,328
% on ER only: 12.5%
Of which followed up: No follow-up undertaken
On Database/not on ER: 2,619
% on Database only: 24.6%
Of which followed up: 2,408
Added to ER: 297
New electors as a % of followed up: 12.3%
Control group: -
% of control group registered: -

Summary table
Assessment in meeting the registration objectives: The delay in the provision of matched data resulted in a change of focus, to whether the information being provided from the national source could have been, or was, available from another internal council source. 297 electors were added to the register as a result of the pilot. The pilot also demonstrated that there are names available from a national database that are not included within a local authority system. However, it also showed that the quality of address data provided by national databases does not always match the quality of that contained on the electoral register or the local authority land and property gazetteer.
Objections to scheme: Minimal - 3 properties objected from over 1,000 approached.
Ease of administering: The quantity, quality and currency of the data were issues. The lack of unique property identifiers on the national database and inconsistencies in the data (e.g. abbreviations, particularly in multi-occupation addresses) meant it was hugely time consuming to match addresses and cleanse the records before a meaningful comparison could be made.
Savings of time and costs: The cost of the pilot was £13,299.
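Without a shared unique property reference number, the only way to compare two address lists is to normalise the free text first, which is what made the exercise so time consuming. A minimal sketch of the kind of cleansing involved (the abbreviation table and function are illustrative, not the pilot's actual process):

```python
import re

# Illustrative expansion table for common address abbreviations (hypothetical;
# the pilots' cleansing was done manually and is not specified in the report).
ABBREVIATIONS = {"rd": "road", "st": "street", "ave": "avenue",
                 "fl": "flat", "apt": "apartment", "gdns": "gardens"}

def normalise(address: str) -> str:
    """Lower-case, strip punctuation, and expand known abbreviations."""
    tokens = re.findall(r"[a-z0-9]+", address.lower())
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

# Two renderings of the same multi-occupation address now compare equal:
print(normalise("Flat 2, 10 High St."))   # flat 2 10 high street
print(normalise("FL 2, 10 HIGH STREET"))  # flat 2 10 high street
```

Even a rule like this fails on genuinely ambiguous tokens ("St" as Street versus Saint), which is one reason the manual cleansing described by the pilots did not scale.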


Teignbridge
Local authority information
Population (16+): 105,776
Local government register entries (December 2010): 102,469
Canvass return rate (2010): 96.8%
Estimated movers in the last 12 months: 15.7%
Proportion of population who are BME: 1.0%
Private renters: 10.6%
Full-time students (aged 18-74): 1.8%
Density (person per acre): 1.7

Description of approach
Database: DVLA and DWP Centric.
Parts of register being matched: Planned to match 31% of the register but narrowed it down to 5%.
Target groups: Attainers, mobile population, young people (18-24) and older people.
Localised matching/data cleansing: Cross-referenced with council tax records.
Follow-up method/approach: N/A.
Approach during/after canvass: N/A.
Control group: No.

Key pilot data


| | DWP Centric (match score: 99) | DVLA |
| --- | --- | --- |
| Number of ER records sent | 37,000 | 37,000 |
| ER records matched | 30,972 | 20,846 |
| % match | 83.7% | 56.3% |
| Of which followed up | No follow-up undertaken | No follow-up undertaken |
| On Database/not on ER | 3,368 | 13,030 |
| % on Database only | 9.1% | 35.2% |
| Added to ER | - | - |
| Control group | - | - |

Evaluation summary
Assessment in meeting the registration objectives: Delays in the initial delivery of the data and the start of the annual canvass meant insufficient staff resources were available to conduct follow-up and fully participate in the data matching. The authority also thought electors would find it confusing to receive so many conflicting communications. They were not confidently able to identify anyone who was not already registered or who did not make an application via the canvass form.
Objections to scheme: No comments or feedback in relation to information on the council website or press releases about the pilot.
Ease of administering: Teignbridge were unable to conduct a follow-up assessment and write to mismatched electors due to the delays in providing the data. However, they did find issues with the quality, quantity and format of the data supplied. Some of the issues related to poor communication; for example, data was supplied as an Access file, which Teignbridge electoral staff had no experience of dealing with, as they expected the data in a different format.
Savings of time and costs: Teignbridge believe that in areas where high canvass returns are achieved anyway, data matching on this scale may not be an appropriate use of resources. The cost of the pilot was limited purely to staff time; a total of £3,666.


Tower Hamlets
Local authority information
Population (16+): 190,655
Local government register entries (December 2009): 160,278
Canvass return rate (2009): 81.8%
Estimated movers in the last 12 months: 22.5%
Proportion of population who are BME: 48.5%
Private renters: 14.4%
Full-time students (aged 18-74): 6.3%
Density (person per acre): 99.2

Description of approach
Database: DWP Centric.
Parts of register being matched: Whole register.
Target groups: Generally under-registered groups and houses with more than 8 occupants.
Localised matching/data cleansing: They carried out localised matches after the initial match to confirm information.
Follow-up method/approach: Only checked part of the data received (approx. 7%) against responses from the annual canvass.
Approach during/after canvass: N/A.
Control group: N/A.

Key pilot data


Data were not provided in a form that can be compared to the other pilot areas.

Summary table
Assessment in meeting the registration objectives: Their original plan could not be followed due to the delay in receiving the data. They verified some of the new potential electors identified from data matching (2,700 of the 39,863 mismatches) and 494 were confirmed as living at that address (18%). The pilot identified some potential electors, but this included people who were not eligible. It did not assist in helping identify fraudulent entries.
Objections to scheme: None established.
Ease of administering: There were issues related to the delays in the provision of the data. The currency, quality and quantity of the data were also an issue (they did not expect to receive both matches and mismatches and found there was too much data to check). The process was very resource intensive. Identifying non-matches was problematic because they were comparing a property database against a people database, without matching fields such as UPRNs. There were also problems because some data (e.g. abbreviations/middle names) were inconsistently recorded.
Savings of time and costs: Total cost of £59,132.80 (including internal staff costs). Tower Hamlets feel that data matching was not a success and should be conducted using local records (otherwise cross-checking national and local data will be essential to verify accuracy and currency).


Wigan
Local authority information
Population (16+): 249,664
Local government register entries (December 2010): 241,145
Canvass return rate (2010): 92.3%
Estimated movers in the last 12 months: 11.3%
Proportion of population who are BME: 1.2%
Private renters: 4.5%
Full-time students (aged 18-74): 1.6%
Density (person per acre): 16.0

Description of approach
Database: DWP Centric, DVLA.
Parts of register being matched: Whole register.
Target groups: Attainers, young people (18-24), generally under-registered groups.
Localised matching/data cleansing: Internal matching against council tax was carried out on all DWP records. They were unable to review fuzzy matches due to the time required.
Follow-up method/approach: Personalised letter.
Control group: About 25% of those identified from the data match.
Approach during/after canvass: Letters issued between the first and final stages of the annual canvass (after the initial delivery of forms but before the final door-knock stage).
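Several pilots quote a "match score" threshold (45 here; 55, 65 or 99 elsewhere) below which a pairing was treated as a non-match, with borderline "fuzzy" scores needing manual review. The report does not describe the actual scoring algorithm used by the matching service; the sketch below only illustrates the general threshold idea, using Python's standard-library SequenceMatcher as a stand-in similarity measure (the names and threshold are hypothetical):

```python
from difflib import SequenceMatcher

def match_score(a: str, b: str) -> int:
    """Crude 0-100 similarity between two name strings (illustrative only)."""
    return round(SequenceMatcher(None, a.lower(), b.lower()).ratio() * 100)

def classify(register_name: str, database_name: str, threshold: int = 45) -> str:
    """Partition pairings into exact matches, fuzzy matches and non-matches."""
    score = match_score(register_name, database_name)
    if score == 100:
        return "exact match"
    if score >= threshold:
        return "fuzzy match - manual review"
    return "no match"

print(classify("SMITH, JOHN", "SMITH, JOHN"))    # exact match
print(classify("SMITH, JOHN A", "SMITH, JOHN"))  # fuzzy match - manual review
```

Wigan's comment that fuzzy matches could not be reviewed corresponds to the middle band here: every score between the threshold and 100 needs a human decision, which is where the staff time went.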

Key pilot data


| | DWP Centric (match score: 45) | DVLA |
| --- | --- | --- |
| Number of ER records sent | 250,710 | 250,710 |
| ER records matched | 206,678 | 168,754 |
| % match | 82.4% | 67.3% |
| On ER/not on Database | 31,095 | 71,960 |
| % on ER only | 12.4% | 28.7% |
| Of which followed up | No follow-up undertaken | No follow-up undertaken |
| On Database/not on ER | 10,025 | 2,433 |
| % on Database only | 4.0% | 1.0% |
| Of which followed up | 5,012 | 1,701 |
| Added to ER | 187 | 161 |
| New electors as a % of followed up | 3.7% | 9.5% |
| Control group | 1,138 | - |
| % of control group registered | 26.8% | - |

Summary table
Assessment in meeting the registration objectives: DWP - 5,012 people were identified through data matching. 4% of those contacted as a result of data matching went on to register, compared to 27% of the control group who were added or found to be already registered as part of the normal canvass. DVLA - 1,701 eligible attainers were identified through data matching; 10% of those contacted as a result of data matching went on the register. Both databases also identified electors no longer living at the address. However, the council considered the process a lot of work for a very low success rate.
Objections to scheme: Two strong complaints were received from the public about incorrect information about residents on the pilot letter. Phone calls were also received from people who had been sent a letter but no longer lived at the address.
Ease of administering: The size of the data files (which included duplicates) caused IT problems, as did the time taken to understand and to manipulate them. The ERO would have preferred to receive just unmatched data: receiving both matches and mismatches added to the work required to sort through the data. The currency of the data could also be improved. Matching was also problematic due to the incompatibility of matching a property database (the electoral register) against a people database, without matching fields such as UPRNs (the inconsistent use of initials and middle names compounded difficulties).
Savings of time and costs: Total cost of the pilot was £17,019.


Wiltshire
Local authority information
Population (16+): 370,538
Local government register entries (December 2010): 354,589
Canvass return rate (2010): 88.7%
Estimated movers in the last 12 months: 16.9%
Proportion of population who are BME: 1.6%
Private renters: 10.8%
Full-time students (aged 18-74): 1.3%
Density (person per acre): 1.3

Key pilot data


Service voters (SV)
No service voter information was received from the MoD.

Properties
MoD addresses provided for checking: 5,642
Number of MoD addresses matched: 5,471
% match: 97.0%
New MoD addresses identified: 171
% new properties as a total of addresses sent: 3.0%

Summary table
Assessment in meeting the registration objectives: They did not receive information on service voters. Wiltshire found the information on the properties useful, as it helped to check the accuracy of service voter properties on the register and to identify duplicate addresses. However, they believe that the pilot suggests that the value of data matching with the MoD is severely compromised if MoD personnel records cannot be provided to authorities.
Objections to scheme: None, but they did not get the opportunity to test letters and systems.
Ease of administering: Data was well presented and easily imported into Excel. However, more IT skills (e.g. use of secure email systems) and resource (managing a large volume of communications) were required.
Savings of time and costs: Other than software costs, the only costs were staff time to set the project up and then to analyse the property file against council records. Total cost: £3,000.


Wolverhampton
Local authority information
Population (16+): 192,795
Local government register entries (December 2010): 175,266
Canvass return rate (2010): 90.2%
Estimated movers in the last 12 months: 12.8%
Proportion of population who are BME: 22.2%
Private renters: 6.3%
Full-time students (aged 18-74): 3.3%
Density (person per acre): 34.0

Description of approach
Database: DWP Centric and DfE (NPD).
Parts of register being matched: Whole register.
Target groups: Attainers, BME groups, private renters (mobile population), young people (18-24), ethnic minorities.
Localised matching/data cleansing: Checked a sample of individuals potentially missing from the ER against council tax and benefits records.
Follow-up method/approach: Personalised letter.
Control group: DWP only.
Approach during/after canvass: The majority of data matching letters were sent just before the start of the annual canvass. Other changes were made to the annual canvass.

Key pilot data


| | DWP Centric (match score: 65) | DfE (NPD) |
| --- | --- | --- |
| Number of ER records sent | 192,741 | 192,741 |
| ER records matched | 142,510 | 5,517 |
| % match | 73.9% | - |
| On ER/not on Database | 36,566 | - |
| % on ER only | 19.0% | - |
| Of which followed up | No follow-up undertaken | - |
| On Database/not on ER | 30,788 | 622 |
| % on Database only | 16.0% | 0.3% |
| Of which followed up | 3,868 | 560 |
| Added to ER | 723 | 331 |
| New electors as a % of followed up | 18.7% | 59.1% |
| Control group | 6,992 | - |
| % of control group registered | 12.7% | - |

Summary table
Assessment in meeting the registration objectives: DWP - matching of the unmatched DWP records against new electors added during the canvass showed 723 registered (plus 1,857 added as a result of the annual canvass) out of a possible 3,868 for the 10 matching wards (versus 886 electors who registered for the 10 control wards). DfE - of the 622 unmatched records, 560 were not found on the register and were sent an application form; 331 of these subsequently registered. Overall, Wolverhampton believe that the changes made to their annual canvass, along with the data matching, resulted in an increase in registration in Wolverhampton of 1.9% compared with 2010.
Objections to scheme: Feedback from members of the public was not overwhelmingly for or against. About 30 telephone calls were received, mainly in relation to incorrect information on the follow-up letter (e.g. people no longer lived at the property or the address was only used for correspondence).
Ease of administering: The main issue was the quantity of data; data returns should only include the unmatched records. However, they had no significant problems with address formats, and the lack of unique property identifiers was not a major issue for them.
Savings of time and costs: Overall costs: £10,447.


Appendix B: Data tables


Summary table
Table B1: The table below provides a summary of the key results for each database (other than MoD). The middle columns relate to accuracy, the right-hand columns to completeness.

| Database | Match rate | Potential inaccurate entries | Inaccurate as % of ERO entries sent | Electors removed | Deleted as % of followed-up | Potential new electors identified | New electors as % of ERO entries sent | Electors added | Added as % of followed-up |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| DWP | 71.2% | 438,970 | 22.8% | 2,137 | 31.6% | 350,414 | 18.2% | 6,573 | 13.2% |
| DVLA | 60.4% | 158,819 | 33.6% | 0 | 0.0% | 109,950 | 23.2% | 208 | 4.1% |
| NPD | N/A | 111,985 | 27.2% | 0 | 0.0% | 9,022 | 2.2% | 1,080 | 32.9% |
| ILR | N/A | 86,749 | 46.7% | 0 | 0.0% | 4,658 | 2.5% | 30 | 4.0% |
| SLC | N/A | 122,632 | 96.6% | 0 | 0.0% | 1,236 | 1.0% | 2 | 5.1% |
| Citizen Account | 88.8% | - | - | 0 | 0.0% | 3,997 | 11.2% | - | - |
| Consolidated data | N/A | 281,985 | 72.1% | 0 | 0.0% | 22,827 | 5.8% | 24 | 3.3% |
| Total | 69.3% | 1,201,140 | 34.2% | 2,137 | 31.6% | 502,104 | 14.3% | 7,917 | 13.3% |
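The additive columns of Table B1 can be cross-checked by summing the per-database rows (a quick verification of the figures as printed; gaps the report leaves blank, such as the Citizen Account entries, are simply omitted):

```python
# Per-database figures from Table B1, in row order (DWP, DVLA, NPD, ILR,
# SLC, Citizen Account, Consolidated data); blank cells are omitted.
inaccurate = [438_970, 158_819, 111_985, 86_749, 122_632, 281_985]
new_electors = [350_414, 109_950, 9_022, 4_658, 1_236, 3_997, 22_827]
added = [6_573, 208, 1_080, 30, 2, 24]

print(sum(inaccurate))    # 1201140 - matches the Total row
print(sum(new_electors))  # 502104  - matches the Total row
print(sum(added))         # 7917    - matches the Total row
```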

The matching process


The tables below provide data on the matching results between the electoral registers and the external databases.


Table B2: Student Loans Company (SLC) data.

| Local authority | Register sent for matching | ERO records sent | ERO records matched | % match | On register / not on SLC | % on register only | On SLC / not on register | % on SLC only |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Colchester | Whole | 126,983 | 1,971 | N/A | 122,632 | 96.6% | 1,236 | 1.0% |
| Total | | 126,983 | 1,971 | N/A | 122,632 | 96.6% | 1,236 | 1.0% |

Table B3: National Pupil Database (NPD) data.

Local authority | Register sent for matching | ERO records sent | ERO records matched | % match | On register / not on NPD | % on register only | On NPD / not on register | % on NPD only
Blackpool | 6 out of 21 wards | 33,210 | 7,896 | N/A | 25,314 | 76.2% | 5,167 | 15.6%
Forest of Dean | 16-17 year olds | 1,143 | 838 | 73.3% | 305 | 26.7% | 61 | 5.3%
Greenwich | Whole | 184,438 | 1,573 | N/A | 86,366 | 46.8% | 3,172 | 1.7%
Wolverhampton | Whole | 192,741 | 5,517 | N/A | | | 622 | 0.3%
Total | | 411,532 | 15,824 | 73.3% | 111,985 | 27.2% | 9,022 | 2.2%


Table B4: Individual Learner Record (ILR) data.

Local authority | Register sent for matching | ERO records sent | ERO records matched | % match | On register / not on ILR | % on register only | On ILR / not on register | % on ILR only
Forest of Dean | 16-17 year olds | 1,143 | 760 | 66.5% | 383 | 33.5% | 130 | 11.4%
Greenwich | Whole | 184,438 | 9,210 | N/A | 86,366 | 46.8% | 4,528 | 2.5%
Total | | 185,581 | 9,970 | 66.5% | 86,749 | 46.7% | 4,658 | 2.5%

Table B5: Consolidated education data.

Local authority | Databases | Register sent for matching | ERO records sent | ERO records matched | % match | On register / not on external data | % on register only | On external data / not on register | % on external data only
Camden | Education: NPD, SLC, ILR | Whole | 153,290 | 11,207 | N/A | 162,066 | 105.7% | 402 | 0.3%
Glasgow | DVLA and SLC | Wards with lowest response rate where students live | 53,109 | 14,085 | N/A | 33,553 | 63.2% | 19,133 | 36.0%
Greenwich | DVLA, BIS-ILR, DfE-NPD | Whole | 184,438 | N/A | N/A | 86,366 | 46.8% | 3,292 | 1.8%
Total | | | 390,837 | 25,292 | N/A | 281,985 | 72.1% | 22,827 | 5.8%


Table B6: Citizen Account (CA) data.

Local authority | Part of the register matched | ERO records sent | ERO records matched | % match | On register / not on CA | % on register only | On CA / not on register | % on CA only
Renfrewshire | 30% of ER | 35,809 | 31,812 | 88.8% | 3,997 | 11.2% | | 

Follow-up activities
The tables below present the results from follow-up activities on individuals identified through data matching.

Table B7: Results from Student Loans Company (SLC) data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Colchester | Final month of annual canvass (November) | 39 | 2 | 5.1% | No control group | 

Table B8: Results from National Pupil Database (NPD) data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Blackpool | Alongside annual canvass | 2,467 | 727 | 29.5% | 2,466 | 31.2%
Forest of Dean | After annual canvass | 12 | 4 | 33.3% | 44 | 79.5%
Greenwich | After annual canvass | 244 | 18 | 7.4% | 391 | 36.3%
Wolverhampton | Majority of letters sent just before annual canvass | 560 | 331 | 59.1% | No control group | 
Total | | 3,283 | 1,080 | 32.9% | 2,901 | 32.6%

Table B9: Results from Individual Learner Record (ILR) data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Forest of Dean | After annual canvass | 31 | 6 | 19.4% | 75 | 97.3%
Greenwich | After annual canvass | 724 | 24 | 3.3% | 850 | 16.3%
Total | | 755 | 30 | 4.0% | 925 | 22.8%


Table B10: Results from consolidated education data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Camden | After the first canvass reminder letter | 383 | 17 | 4.4% | No control group | 
Glasgow | After the first canvass reminder letter | 102 | 3 | 2.9% | Two polling districts | 80.3%
Greenwich | After the first canvass reminder letter | 247 | 4 | 1.6% | 17 | 23.5%
Total | | 732 | 24 | 3.3% | 17 | 23.5%

Table B11: Results from Citizen Account (CA) data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Renfrewshire | End of annual canvass | 628 | Follow-up under way | | No control group | 


Table B12: Blackpool results from combined DWP Centric and National Pupil Database data.

Local authority | Stage of annual canvass at which follow-up started | Total number followed up | Total added to ER | New electors as a % of followed up | Control group | % of control group registered through normal canvass
Blackpool | Alongside annual canvass | 2,467 | 727 | 29.5% | 2,466 | 31.2%


How to contact us

The Electoral Commission
3 Bunhill Row
London EC1Y 8YZ
Tel: 020 7271 0500
Fax: 020 7271 0505
Textphone: 18001 020 7271 0500
info@electoralcommission.org.uk

The Electoral Commission Scotland Office
Lothian Chambers
59-63 George IV Bridge
Edinburgh EH1 1RN
Tel: 0131 225 0200
Fax: 0131 225 0205
Textphone: 18001 0131 225 0200
infoscotland@electoralcommission.org.uk

The Electoral Commission Wales Office
Caradog House
1-6 Saint Andrews Place
Cardiff CF10 3BE
Tel: 029 2034 6800
Fax: 029 2034 6805
Textphone: 18001 029 2034 6800
infowales@electoralcommission.org.uk

The Electoral Commission Northern Ireland Office
Seatem House
28-32 Alfred Street
Belfast BT2 8EN
Tel: 028 9089 4020
Fax: 028 9089 4026
Textphone: 18001 028 9089 4020
infonorthernireland@electoralcommission.org.uk

The Electoral Commission Eastern and South East Office
3 Bunhill Row
London EC1Y 8YZ
Tel: 020 7271 0600
Fax: 020 7271 0505
Textphone: 18001 020 7271 0600
easternandsoutheast@electoralcommission.org.uk

The Electoral Commission London Office
3 Bunhill Row
London EC1Y 8YZ
Tel: 020 7271 0689
Fax: 020 7271 0505
Textphone: 18001 020 7271 0689
london@electoralcommission.org.uk

The Electoral Commission Midlands Office
No 2 The Oaks, Westwood Way
Westwood Business Park
Coventry CV4 8JB
Tel: 02476 820086
Fax: 02476 820001
Textphone: 18001 02476 820086
midlands@electoralcommission.org.uk

The Electoral Commission North of England Office
York Science Park
IT Centre, Innovation Way
Heslington
York YO10 5NP
Tel: 01904 567990
Fax: 01904 567719
Textphone: 18001 01904 567990
north@electoralcommission.org.uk

The Electoral Commission South West Office
Regus, 1 Emperor Way
Exeter Business Park
Exeter EX1 3QS
Tel: 01392 314617
Fax: 01392 314001
Textphone: 18001 01392 314617
southwest@electoralcommission.org.uk

The Electoral Commission 3 Bunhill Row London EC1Y 8YZ Tel 020 7271 0500 Fax 020 7271 0505 info@electoralcommission.org.uk www.electoralcommission.org.uk To contact our offices in Scotland, Wales, Northern Ireland and the English regions, see inside back cover for details.

We are an independent body set up by the UK Parliament. Our aim is integrity and public confidence in the democratic process. We regulate party and election finance and set standards for well-run elections.

Democracy matters
