
Bulletin of the Atomic Scientists

ISSN: 0096-3402 (Print) 1938-3282 (Online) Journal homepage: https://www.tandfonline.com/loi/rbul20

Facial recognition and the future of privacy: I always feel like … somebody's watching me

Brenda Leong

To cite this article: Brenda Leong (2019) Facial recognition and the future of privacy: I always
feel like … somebody’s watching me, Bulletin of the Atomic Scientists, 75:3, 109-115, DOI:
10.1080/00963402.2019.1604886

To link to this article: https://doi.org/10.1080/00963402.2019.1604886

Published online: 26 Apr 2019.

BULLETIN OF THE ATOMIC SCIENTISTS
2019, VOL. 75, NO. 3, 109–115
https://doi.org/10.1080/00963402.2019.1604886

Facial recognition and the future of privacy: I always feel like . . . somebody’s
watching me
Brenda Leong

ABSTRACT
In the 21st century, we live in a world packed with closed-circuit video cameras, facial recognition systems, radio frequency identification chips, electronic toll collectors, smartphones with location tracking, and widespread monitoring of our electronic communications. As deeply as the industrial revolution upended 20th century social norms and political structures, so too has modern information technology been a revolution, giving governments and large private corporations vast power to keep track of, manipulate, and potentially repress entire populations. China offers some examples in this area, but even democratically elected governments have shown a tendency to want to digitally profile and analyze their citizens without sufficient respect for individual privacy. And just as rampant industrialization had to be reined in to protect human rights and individual dignity, information technology and digital systems must be controlled to prevent abuse and exploitation. This "boundary setting" can come none too soon, as rapid advances in artificial intelligence allow an even greater ability to process and categorize the vast amounts of data generated by all our electronic devices.

KEYWORDS
Artificial intelligence; AI; privacy; tracking; facial recognition; surveillance

There is no more Orwellian thought than "Big Brother is Watching." Even without specifying exactly how people are being watched – whether from tracking cell phones or profiling online browsing behavior – it's a sobering thought to imagine constant surveillance. When the concept becomes literal, and people believe that cameras are, in fact, watching them everywhere they go, possibly even in their homes, certainly in their cars, the result is a chilling effect on individuals who may no longer feel free to live their lives the way they'd like. There's also the opportunity for extreme abuse, by governments, corporations, bad actors, international officials – and even by fellow citizens.

So what does that mean for people dealing with the facial recognition systems that are popping up all over? Individuals can use facial recognition to open their phones, access bank accounts, authorize payments and other online activity, or organize their photos. Organizations use these systems for managing facility access, crowd control, and hospitality functions, and governments use them for terrorist tracking, border security, and criminal investigations (Cullum 2018). New uses are being imagined and developed all the time (Bogle 2015). Is this good? Is it terrifying? Should we stop using it altogether?

While some leading thinkers have publicly advocated for halting all use of facial recognition systems until standards and rules can be put in place, that seems unlikely to happen. Woodrow Hartzog, a Northeastern University professor of law and computer science, and Evan Selinger, a professor of philosophy, do make a strong case for a complete ban on such systems, and it is certainly possible that individual states or localities may pilot such restrictions (Hartzog and Selinger 2018). But I believe a general, national ban is unlikely – and if a ban does occur, the same concerns outlined here will continue to exist and continue to need to be addressed, but perhaps with more time available to do so. What's more likely is that society will continue to debate these applications "on the fly" as use cases abound.

"Say Cheese!"

When is your face just a photo, and when is it a "biometric identifier"? Understanding how these systems work is a critical foundation for effectively understanding and evaluating the risks of facial recognition. (While this discussion is focused on facial recognition, there are many options for biometric systems, essentially all of which follow the same process. Facial recognition, finger/palm scans, and iris/retinal scans all have different pros and cons, so for one to be "better" would very much depend upon the use to which it was being put.)

CONTACT Brenda Leong bleong@FPF.org


© 2019 Bulletin of the Atomic Scientists
A photo of a face is an image – possibly physically printed ink on paper, or perhaps displayed digitally. It is the actual two-dimensional picture of the person, which another human would look at and say: "That's my friend, Parva!"

In contrast, a facial recognition system does not create photos. It creates templates. This means that when the face is scanned, the system isn't taking your picture; it is creating a web design based on your facial structure, generated by proprietary software. (That is, every company's system will do it differently, and they will not be interchangeable.) That web overlay is a template that is then turned into a series of 0's and 1's – which can then also be encrypted, if desired. To "enroll" someone in a database for facial recognition purposes, a scan is made of the real person's live face, and the created template is then stored, along with a tag or code to connect it to the full record of any other personal information collected or retained. If an actual picture of the person is part of the record, it is normally taken separately, and should be stored completely separately from the template.

When the person returns to be identified (or an image is selected from a video or photo for identification to be attempted), the system scans the person or image, creates a new template, runs that template through the algorithmic math, and then compares the binary number against the enrolled file or files for a match. When a matching template is found, the person is identified. This process likely takes less than a second, and for any reputable system, no image is ever stored with the template, and no face can be recreated from the template. (By "reputable system," I mean that the National Institute of Standards and Technology evaluates, grades, and publicly lists the results for most systems. It is to the industry's competitive advantage to be extremely accurate across the broadest demographic variations possible. Systems have been getting better at an accelerated pace in recent years, and leading providers are achieving extremely high accuracy levels.)

But not all cameras running some kind of facial software are "recognition" systems. There are, in fact, at least four levels of facial image software, each with different use cases, benefits, risks, and privacy implications. These levels are, in ascending order of complexity: facial detection; characterization; verification; and identification. They are described in more detail in Figure 1.

Figure 1. Understanding facial detection, characterization, and recognition technologies. Image courtesy FPF.

The most basic is facial detection, such as what you might see through your camera – the small square overlay that moves around to frame the faces of the people in your field of vision. This technology finds what is a human face versus what is not-a-face, and marks it to allow the camera to focus, or perhaps to count people passing a certain spot, or for other completely non-personalized applications that simply need to know when a person is present.

The next level is called facial characterization. In this case, the camera is still not creating a template or other personal, individualized record, but is collecting and observing more detailed information than under facial detection. Examples include an interactive billboard at a bus stop, or a screen mounted above a product display that might be used to collect information such as gender, approximate age, and potential emotional indicators ("smiling," "sad"); this material can then be combined with other data such as how long the person looked at the screen, or where else they went within the store. Such data can give advertisers useful information about shoppers' reactions, based on type: young women responded favorably; older men glanced away quickly. Similar technology can also benefit visually disabled individuals by describing on-screen images to them: "A man and a woman seated on a towel on the beach, laughing and sharing a drink."

Neither detection nor characterization systems create or collect personally identifiable information, and consequently they carry very low privacy risks.

The next two levels reflect how the term "facial recognition" is often used by a lay audience: verification and identification.

Verification is a one-to-one matching system, like what happens when you access your phone: The screen scans your face and tries to match you to the template saved on your phone. Either it matches, or it does not. Verification can be summed up as, "Is this person who they are claiming to be?"

In another example of verification, a building may hold a database with all the templates of the building's employees created and stored ("enrolled") in it. When an employee then attempts to enter the building, the camera scans their face, and a reader checks their ID card. The ID card says who the person is claiming to be, and the scan is matched against that template, and that template only. If it matches, the person is allowed into the building. If it does not, they are rerouted to a receptionist or other alternate system for evaluation. This is also the type of system being tested at airports now for international boarding processes – it compares the traveler's presented identity (their face, along with a government-issued ID) against the enrolled template tied to photos of passengers on the flight manifest. In verification, the system does not look at any other templates or attempt to identify the person as anyone else. The output is a simple "yes" or "no" to validate a claimed identity.

The final category of facial recognition is identification, also known as a one-to-many matching process. This case can be summed up as: "Can software determine who this unknown person is?" This type of system is what law enforcement uses, running a collected image against a database filled with enrolled criminals, or driver's license holders, or other pre-selected data sets. The system scans the image – possibly from a videotape at a public venue, or an image from a camera on-scene – and attempts to match the template within whatever dataset is available to it. In this case, the system is most likely to come back with multiple possible matches. It can be set to varying levels of sensitivity (based on presets that default toward false positives or false negatives), and ultimately a human will review any suggested matches for a final decision on whether this person has been successfully identified.

Identification is most commonly used in the context of security – such as identifying someone shoplifting against a data set of known shoplifters, or individuals present in a restricted area at a sports arena against a database of season ticket-holders. And of course, this is the type of system in use in criminal investigations. In order for law enforcement to have access to such images, a warrant, subpoena, court order, or some other legal process needs to be fulfilled, based on context and local laws.

Once the legal standard has been met, it still takes some time to sift through the imagery. Unlike what happens on some popular television shows, a firm ID cannot be made in a matter of seconds. Instead, the reality is that the government must first obtain the videos, and then use an initial software program to run through them all and create a raw image gallery of all the people they contain. These images will be lower quality, isolating images of people in hats, in shadows, or captured from odd angles. Then the templates of those images must be run against whatever dataset the police feel is applicable, such as mugshots, local Department of Motor Vehicle records, or others. Any potential match or matches will almost certainly be at low levels of certainty, and a human official will need to review them to see if any seem sufficiently reliable to follow up with further action.

The level of accuracy required in any system varies based on application and context. For an iPhone, Apple's system verifies a user against a gallery of one, stored locally on the device. Apple's method, called Face ID, uses an infrared camera, a depth sensor, and a dot projector to map 30,000 points on a face and create a three-dimensional scan. (The 3D technology is one of the ways to prevent access by someone simply holding up a picture of the phone's owner.) The level of detail required in the template, and the level of certainty in the match, correspond to a false positive rate of roughly 1-in-10 million. This is an entirely acceptable standard for phone access, but far below the standards that would be required for terrorist watchlists. Face ID on a personal digital device is secure enough to use for digital payments through Apple Pay. It would not be sufficient for criminal prosecution (La 2018).

"But it's my face!"

One of the greatest areas of angst for those concerned about facial recognition systems is the security of the data. There are at least a couple of ways to consider that security. One is "How close is it to perfect?" – can it be spoofed (access gained without the actual person's face present) or hacked (accessing the stored file of templates)? High-quality facial recognition systems rate very well on such a scale, but no system is perfect, and critics have argued that even the small percentage of incorrect outputs of such a system makes it unreasonably risky. Perhaps, however, a better way of ranking these systems is to compare them to available alternatives, such as passwords.

Biometric data, contained in a database of enrolled individuals, is almost certainly a more secure option than passwords. Passwords can be cracked fairly easily by "brute force" methods (such as running software that attempts patterned combinations of numbers and letters) if they're not strong – which most are not. And people tend to re-use them, so having someone's password from one account is likely to provide access to other accounts as well, which results in the password only being as safe as the weakest system on which it's stored. So if a server file of passwords and a server file of biometric templates are each breached, what are the risks?

Access to passwords yields immediately usable information to directly access individual accounts. And, as mentioned, each password might be useful to access other accounts as well. In contrast, a breach of a database of biometric data will yield only those binary numbers, which cannot easily – if ever – be "back-engineered" into the template or the original image. This information cannot be used to access the associated account, nor is it likely to match the system used on any other account, since each platform is probably using a different vendor and the algorithms are not interoperable. Actually breaching a database of biometric data may or may not be harder, depending on general network security, but if it were breached, there is a much higher likelihood that the data would be much harder to exploit in any systematic way. Finally, biometrics are almost always part of a two-factor system – meaning they are only one piece of a multiple-step access process – and therefore just having the biometric isn't enough to gain access.

"I'd like a little privacy, please!"

A core privacy principle is known as "Data Quality and Integrity" – the requirement that individual data and data sets be "accurate, relevant, timely, and complete" (Homeland Security 2008). The challenge for current facial recognition systems is to achieve sufficient accuracy across demographic variations such as race, ethnicity, and gender, although they are improving all the time. Nevertheless, the ways in which facial recognition systems have fallen short, sometimes in rather spectacular fashion, have led to significant reservations about their reliability from the standpoints of privacy and civil liberties.

Privacy concerns cross into both the governmental and the commercial spheres. Government misidentification can lead to innocent people on watch lists, with an increased risk of bad results for minorities and other at-risk populations. Likewise, commercial companies may use facial recognition to unfairly or illegally discriminate. For example, a retail chain might create its own data set of "known" offenders without any clear standards for who is targeted, and no practice by which those people are notified or can appeal their inclusion. What's more, retail chains may share such lists with other companies, potentially resulting in individuals being broadly denied service without any due process.

The ethical considerations of where and how to use facial recognition systems even exceed the boundaries of traditional privacy considerations. Because machine learning programs are their underlying foundation, these systems are built on existing data that reflect human biases, and they automate those biases. Having "humans in the loop" will not correct this unless those humans include trained programmers who can test and audit systems for systemic bias and recommend corrective measures. The social impacts of this are only beginning to be understood.

Having said all this, it should be noted that there are certainly many beneficial use cases for facial recognition systems. For individual users, there are many convenient services, some as simple as tagging people to help with sorting and organizing photos, all the way to "smart" glasses for visually impaired people, who use them to identify the people around them and navigate with increased independence. Hotels and conferences are working to create a seamless experience based on facial recognition systems for their members; registrants who have opted in to such a service can theoretically travel from taxi, to lobby, to room, or check into a conference, with no delays, lines, or frictions along their path.

The law enforcement and national security benefits are likewise real. Facial recognition systems have already been key in identifying suspected terrorists or criminals. They can generally do so with decreased costs, increased efficiencies, and consistently greater accuracy than humans.

Some uses may not clearly be either good or bad. Profiling shoppers, tracking online preferences, and personalizing recommendations or experiences are features some consumers value, but others may not. Tying these options closely to the appropriate consent level is important.

Less beneficial uses, of course, abound. Many concerns involve government applications. Academics have done in-depth reviews of the impact of surveillance technology on minorities, religious groups, and other traditionally targeted or vulnerable populations (Georgetown Law 2017). They have likewise criticized the decision to implement facial recognition at airports (Nixon 2017) and in police operations.

Consider public demonstrations, especially with the increased occurrence of large-scale marches in recent years. Those who march for or against a cause understand that they're in public. They realize that people will see them and possibly recognize them, and that other attendees – including those they don't know, or don't see – may take their photo, and may post it online in places they'll never know about. Nevertheless, what they probably do not expect is that the government will have cameras or coverage that enable it to later collect and identify the images of many or most people present, and keep a file designating those individuals with some kind of code or tag for future reference.

In some countries, there are few legal protections from such omnipresent surveillance of the population. China, notably, seems eager to use facial recognition for everything from identifying jaywalkers (Zhao 2018) to dispensing toilet paper (Associated Press 2017). A government's ability to track its citizens has historically been shown to enable discrimination against targeted groups or individuals. When this is paired with cultures that espouse and protect fewer civil liberties to start with, the outcomes seem ominous.
But commercial entities have shown that their privacy practices may not be much better, collecting and exploiting the personal data of individuals, groups, and entire countries (The Guardian 2018). And the line between commercial and governmental access and use is frequently blurred and likely to become more so. The expectation that facial recognition systems will simply heighten these risks seems well-founded. While there are some examples of legal frameworks put in place to push back against privacy challenges (EU GDPR 2018), these efforts number only a few so far, and they affect limited areas – such as the European Union or individual states in the United States.

The use of facial recognition systems is not solely responsible for the world's ethical and social privacy problems, of course. Instead, these systems are being implemented in a world already facing the problems of biased human systems – and even at their worst, facial tracking systems are still just one more tool in the tech box for governments that are already able to identify, target, and track individuals beyond anything imaginable in the past.

The same battles that have been fought in the United States for decades about national identification cards are raging over facial recognition systems. The problem of whether private citizens should be required to have government-issued documentation verifying their personal identities in order to access goods and services, seek employment, travel, or obtain government benefits long predates the current discussions of digital identity systems and the use of facial recognition.

Whether past or present, these challenges are all based on the question of how to balance government efficiencies and national security against protections for individual freedoms and liberty. Running through this conversation is the underlying conception of privacy. Is it a fundamental right? What does it mean? Who gets to decide which conveniences are worth the tradeoffs they require? Are the protections for personal data offered by policy and law sufficient, or should technical and security protections always be required? Are some systems simply too high a risk to implement, regardless of perceived benefits?

The past may offer a clue as to what to expect. When they were first implemented, passport photos were shockingly contested and denounced (Holmes 2015). They were not commonly required until after World War One. In the short 100 years since, technology has only accelerated the practice of government identification and tracking of people's movements, until today many societies face the realistic possibility of almost ubiquitous surveillance. How societies and cultures face these challenges will determine whether Orwell was entirely prescient, or whether we can choose a different path, embracing the continued ideal of personal liberty and freedom.

Acknowledgments

The author would like to thank the Future of Privacy Forum for its support.

Disclosure statement

No potential conflict of interest was reported by the author.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Notes on contributor

Brenda Leong is senior counsel and director of strategy at the Washington, DC-based think tank Future of Privacy Forum. She addresses the ethics and privacy issues around artificial intelligence, as well as biometrics, particularly facial recognition. She wrote "A Privacy Expert's Guide to Artificial Intelligence and Machine Learning" last fall, and co-authored "Beyond Explainability: A Practical Guide to Managing Risk in Machine Learning Models."

References

Associated Press. 2017. "China Introduces Facial Recognition Technology to Dispense Toilet Paper." https://www.cbc.ca/news/technology/china-facial-recognition-toilet-paper-1.4052888
Bogle, A. 2015. "You Can Only Read This High-Tech Book if It Likes Your Facial Expression." Slate, February 2. https://slate.com/technology/2015/02/this-book-only-opens-if-its-facial-recognition-software-decides-you-are-nonjudgmental.html
Cullum, J. 2018. "Facial Recognition at Border Nets More Impostors than at Airports." Homeland Security Today, November 20. https://www.hstoday.us/federal-pages/facial-recognition-at-border-nets-more-impostors-than-at-airports/
EU GDPR. 2018. "EU General Data Protection Regulation." https://eugdpr.org
Georgetown Law. 2017. "The Color of Surveillance." https://www.law.georgetown.edu/privacy-technology-center/events/color-of-surveillance-2017/
The Guardian. 2018. "The Cambridge Analytica Files." https://www.theguardian.com/news/series/cambridge-analytica-files
Hartzog, W., and E. Selinger. 2018. "Facial Recognition Is the Perfect Tool for Oppression." Medium, August 2. https://medium.com/s/story/facial-recognition-is-the-perfect-tool-for-oppression-bc2a08f0fe66
Holmes, T. T. 2015. "Passports Were Once Considered Offensive. Perhaps They Still Are." Atlas Obscura, December 9. https://www.atlasobscura.com/articles/passports-were-once-considered-offensive-perhaps-they-still-are
Homeland Security. 2008. "Privacy Policy Guidance Memorandum." December 29. https://www.dhs.gov/sites/default/files/publications/privacy_policyguide_2008-01_0.pdf
La, L. 2018. "10 Best Phones with Facial Recognition: iPhone X, Note 9, LG G7, and More." CNET, August 22. https://www.cnet.com/news/10-best-phones-with-facial-recognition-iphone-x-note-9-galaxy-s9-lg-g7/
Nixon, R. 2017. "Facial Scans at US Airports Violate Americans' Privacy, Report Says." New York Times, December 21. https://www.nytimes.com/2017/12/21/us/politics/facial-scans-airports-security-privacy.html
Zhao, C. 2018. "Jaywalking in China: Facial Recognition Surveillance Will Soon Fine Citizens via Text Message." Newsweek, March 27. https://www.newsweek.com/jaywalking-china-facial-recognition-surveillance-will-soon-fine-citizens-text-861401
