Krista Kennedy RSA Presentation
6/2/18, 2:00 – 3:15, Marquette 1 3rd FL
Today, I’m going to talk about pervasive surveillance in medical wearables, particularly
in the Halo smart hearing aid. This is based on ongoing research that I’ve been doing with Noah
Wilson here at Syracuse as well as Charlotte Tschider at the DePaul University School of Law in
Chicago. We’ve been thinking together about the ways that surveillance is and isn’t addressed in
patient education and disclosure materials and what that means for patient agency in terms of
consent. Our central case study has been the Starkey Halo smart hearing aid, which I’ll tell you
more about in a minute. We’ve examined promotional materials that Starkey published for
patient and practitioner audiences as well as traced the Halo’s surveillance ecology.
But before we get to that, let’s talk a little about algorithmic surveillance. Modern
medical wearables like pacemakers, insulin pumps, bedside monitors, and hearing aids are right
in the mix of big-data surveillance that is all over the news lately. It used to be that medical
devices were stand-alone technologies that had to be manually calibrated or downloaded. You might remember an elderly relative with a pacemaker who dialed
in on a landline to have their pacemaker calibrated, for instance. Today, medical wearables are
increasingly Internet-tethered and mobile-enabled. They are also reliant on dynamic algorithms.
In order to function effectively, these devices require big data collection. That
collection naturally triggers transparency concerns related to the amount and type of data
collected, its use, and its transmission. Medical device regulations in both the US and EU have
not effectively created a framework for ensuring transparency. Although many countries
comprehensively regulate medical devices for safety, there are very few international regulations
that sufficiently address privacy and cybersecurity issues. The result is that these technologies
are increasingly opaque to users. Further, surveillance algorithms are purposefully made opaque
to consumers because they are often protected as trade secrets, and manufacturers risk losing this protection if those details are disclosed.
Smart hearing aids are an interesting site for exploring this because they rely heavily on
multiple forms of data and are both an opt-in and essential medical wearable, depending on the
population. The largest market demographic is late-deafened wearers, people who lose their
hearing later in life. As wearers move toward the sale, fitting, and adoption of the aid, they
receive all sorts of patient education materials about how to learn to work with it, how to take
care of it, and what to expect. But what they’re not necessarily made aware of is the amount of
data they’ll generate and the ways that data will be used.
{SLIDE}Now I’m going to tell you a bit about the Halo, which was introduced in 2014. Smart
hearing aids like it work to rhetorically solve the problem of low adoption rates not by redesigning the
hearing aid itself, which has been tried at other times with miniaturization, but instead by adding
another device that is not just socially acceptable but also a status symbol. Adding a smart phone
to the aid opens up a whole different range of possibilities. Older, stand-alone, analog aids
simply amplified sound within the wearer’s range and toggled to a telephone setting. Now, aids
like the Halo offer a combination of automated and wearer-driven functions that filter sound, pull
in sounds through directional mic switching, stream Bluetooth audio from your smart phone, and a lot
more. {SLIDE}The Halo is controlled through an iPhone-based app called Trulink, which
controls all those aspects as well as satellite-based geolocation and automated adjustment of sound settings based on location.
I’m not going to take you through the whole interface because we just don’t have time.
But the thing to concentrate on here is the fact that you can geotag soundscape memories.
{SLIDE}As you can see up here, I can select from several custom geo-tagged location settings if
for some reason the aid hasn’t automatically switched to them based on telemetry or I need to
tweak them: the grocery store, a couple of local restaurants, or our veterinarian’s office with its
very odd acoustics. {SLIDE} Individual location memories can also be further customized to
decrease, mute, or turn off streaming from phone conversations while the aid is operating in a
specific memory. The same changes can be made for audio streaming, and the car setting can be configured to engage automatically.
{SLIDE}As you can probably guess, in order to do things like this, the system needs to
capture data and rely on a complex network of hardware, data servers, satellites, operating
systems, and algorithms--along with variable involvement from a human wearer and audiologist.
Consequently, the wearer’s body is immersed in a network of algorithms and technologies, with
their iPhone acting as a central nexus between data streams. {SLIDE} Let’s look closely at the
phone side of this ecology. To triangulate position and facilitate movement tracking, the wearer’s
smartphone combines GPS data broadcast from satellites with location data obtained from cell phone towers
and from public Wi-Fi networks, and it algorithmically calculates anticipated wearer
movement. For example, if you’re traveling at more than 5 mph, it assumes you’re in a car and starts
filtering out car noise. The data generated by these calculations is then shared with the hearing
aid, and algorithms fragment this user-generated data into processed data for further calculations by other
algorithms. In this close human/machine collaboration, wearers surrender privacy and control of
their data in order to access the algorithmic features necessary for nuanced navigation of
soundscapes. This is downloaded by the audiologist during visits for in-house adjustments and
transmitted to Starkey as anonymized data that is used to better understand how wearers are
using their aids across soundscapes. Several distinct data streams circulate in this ecology:
● Geolocation data that tracks wearer location and movement. This data is collected by Apple through iPhone telemetry coupled with the Apple
Maps application.
● Transmitted data that tracks wearer rate of speed. This data automatically toggles the
hearing aid to filter car noise. It is likely tracked by Apple and definitely tracked as setting use
by Starkey.
● Data collected from registration and mobile device use. This data, although not collected
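To make that machinery concrete, the telemetry-driven memory switching described above (the roughly 5 mph car heuristic plus geo-tagged location memories) might look something like the following sketch. Every name, threshold, and data structure here is a hypothetical illustration, not Starkey's or Apple's actual implementation:

```python
# Hypothetical sketch of telemetry-driven memory switching, as described
# above. All names, thresholds, and the simple geofence check are
# illustrative assumptions, not Starkey's actual algorithm.

CAR_SPEED_MPH = 5.0  # the talk cites roughly 5 mph as the toggle point

def select_memory(speed_mph, location, geotagged_memories, default="universal"):
    """Choose a hearing-aid sound memory from phone telemetry.

    speed_mph          -- speed estimated from GPS/cell/Wi-Fi triangulation
    location           -- current (lat, lon) pair
    geotagged_memories -- dict: memory name -> ((lat, lon), radius in degrees)
    """
    # Speed heuristic: above ~5 mph, assume the wearer is in a car and
    # switch to a memory that filters out road noise.
    if speed_mph > CAR_SPEED_MPH:
        return "car"

    # Otherwise, check whether the wearer is inside any geo-tagged
    # location memory (grocery store, restaurant, vet's office...).
    lat, lon = location
    for name, ((mem_lat, mem_lon), radius) in geotagged_memories.items():
        if abs(lat - mem_lat) <= radius and abs(lon - mem_lon) <= radius:
            return name

    return default

memories = {
    "grocery store": ((43.0481, -76.1474), 0.001),
    "veterinarian": ((43.0450, -76.1300), 0.001),
}

print(select_memory(30.0, (43.0, -76.0), memories))        # car
print(select_memory(2.0, (43.0481, -76.1474), memories))   # grocery store
print(select_memory(2.0, (42.0, -75.0), memories))         # universal
```

Notice that even this toy version has to consume a continuous stream of location and speed data in order to work at all, which is exactly the transparency problem at issue.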
Surveillance Studies scholar Gary Marx defines surveillance as the use of technological
systems for the extraction of individual and group data. He calls for a nuanced examination of
surveillance that accounts for its beneficial uses as well as its significant downsides. It is clearly
beneficial for an audiologist to access data that allows customized adjustment for their patient. It
is less clearly beneficial for that patient’s movements to be aggressively tracked as they rely on
algorithms for vital management of disability disclosure, or for that data to be retained for long periods of
time. This network of surveillance poses complications for the wearer’s performance of
rhetorical agency because it aggressively imposes surveillance through coercive systems. The
wearer can certainly turn off geolocation features, but at the cost of having to manually navigate every soundscape adjustment.
When we discuss agency, we’re relying on Miller’s notion of agency as kinetic energy, as
exchange, and also on Davis’s work on response-ability: the ability to respond as well as to take
responsibility. In order to perform agency, you have to have a basic understanding of the context
that makes response possible. In the case of the Halo, the wearer needs to have some basic understanding of the fact
that this collection exists. So, what do the patient materials tell us about any of this? Not a lot about
the technology proper, aside from emphasis on the iPhone as a central aspect of the system.
Starkey frames patient agency in connection to the Halo primarily in terms of sociality and
efficiency. As a result, we see lots of images of adults of various ages who have adopted hearing
aids in their daily lives in order to maintain social capital in multiple social spheres. We do not
see them actively interacting with technology. The professional brochure for audiologists says
little about data collection aspects of the system aside from the data available to audiologists for fitting and adjustment.
We also took a look at the patent filing because under the law, patents provide an
important notice function. They do this by granting a limited monopoly on the invention, usually
twenty years, in exchange for disclosing the invention to the public, specifically to
competitors. Although the average Halo user might not read patent documents, this information
sharing does enable others to critically review Starkey’s practices and device design. However,
Starkey has not included data collection and use details in the patents that have been issued to date.
These patents do not disclose data collection functionality except indirectly, such as mobile device app data transmissions and integration into a
common device data set. This reflects an increasingly common trend for smart devices:
algorithm use opacity. Organizations increasingly keep algorithmic details, including system
functionality, secret, in hopes of relying on trade secret protection. With an increase in trade
secret protection rather than patent disclosure, organizations are relying more often on opacity than on public notice.
And finally, nothing about data collection is disclosed in the User Agreement or Privacy
Statements. In the United States, the Health Insurance Portability and Accountability Act
(HIPAA) requires covered entities and their business associates (BAs) to abide by the Privacy
Rule, which requires disclosing collection and use and also receiving authorization for special uses.
When a manufacturer is not required to follow HIPAA, such as when Starkey does not receive
insurance reimbursement for devices, the Federal Trade Commission still requires notice and
consent to personal data collection and use. When the user agreement and privacy statements do
not include pertinent information, consent is not “informed,” and that, in the legal sense, deprives users of the ability to meaningfully consent.
It’s practically impossible for a Halo wearer to have sufficient information available to
exert the sort of agency that is necessary for meaningful choice. The conversations the field has
been having about algorithmic literacy and understanding the rhetoricity of algorithms
are out the window here, because in order to get to those steps you have to be aware that a)
multiple algorithms are adjusting your aid, b) those algorithms are generating significant data
streams, and c) those data streams are being collected, analyzed, stored, and monetized. The
typical late-deafened wearer is hardly uneducated about technology. But it also wouldn’t be unusual for any wearer to not be
fully prepared to consider the fact that with this hearing aid, you will live your life with no fewer
than three robots in your ear that actively process and adjust what you hear. That’s information that current patient materials simply don’t provide.
And therein lies the rhetorical catch-22 for Starkey. This is the part where we’d normally
make recommendations about patient education. It’s right and good to demand that patient
education about algorithmic data collection be included in advance patient materials, in point-of-
sale materials, in privacy agreements, and the like. In the EU, they will have to do that to some
extent because of new regulations that just went into effect, but that’s not the case here in the US.
But in order to undertake sufficient patient education that would help wearers gain just enough
algorithmic literacy to understand what data is being collected, how it is being collected, and the
implications of that collection, Starkey would have to undercut its own argument. That argument
is: hearing aids can be a naturalized part of your daily existence that don’t impose obtrusive
demands on that existence. That is the central argument in the patient materials – and it is one meant to drive adoption of a needed medical wearable.
So the real rhetorical problem here is how to position this technology in a naturalized way so that
people can accept it and gain agency in communicative situations while also educating them
about a hardcore technology sufficiently that they can effectively consent to data collection. And
that’s a very tricky thing to do. How do you rhetorically construct naturalized algorithms?
Naturalized surveillance by a device that is, in fact, rewriting the auditory pathways in your
brain? Rhetorical dimensions of technological intimacy are key here, and that is an element of