Annual Review of Sociology

Toward a Sociology of Privacy

Denise Anthony,1 Celeste Campos-Castillo,2 and Christine Horne3

1Department of Sociology, Dartmouth College, Hanover, New Hampshire 03755; email: denise.anthony@dartmouth.edu
2Department of Sociology, University of Wisconsin, Milwaukee, Wisconsin 53211; email: camposca@uwm.edu
3Department of Sociology, Washington State University, Pullman, Washington 99163; email: chorne@wsu.edu

Annu. Rev. Sociol. 2017. 43:249–69
First published as a Review in Advance on April 26, 2017
The Annual Review of Sociology is online at soc.annualreviews.org
https://doi.org/10.1146/annurev-soc-060116-053643
Copyright © 2017 by Annual Reviews. All rights reserved

Keywords
context collapse, digital divide, information and communication technologies, monitoring, privacy norms, surveillance

Abstract
What are, and what should be, the boundaries between self and society, individuals and groups? To address these questions, we synthesize research on privacy that is relevant for two foundational sociological issues: social order and inequality. By synthesizing work on a narrow yet fundamental set of issues, we aim to improve our understanding of privacy as well as provide a foundation for understanding contemporary privacy issues associated with information and communication technology. We explore the role of privacy in maintaining social order by examining the connections of privacy with social control and with group cohesion. We also discuss how inequality produces variation in privacy and how this variation in turn contributes to inequality. Throughout the review we identify potential directions for sociological research on privacy generally and in the context of new technologies. Our discussion highlights implications of privacy that extend beyond individual-level concerns to broader social, structural impacts.


INTRODUCTION
What are, and what should be, the boundaries between self and society, individuals and groups?
Answers to these questions have varied over time and across cultures. In hunter-gatherer soci-
eties families lived, slept, and conducted intimate activities (like having sex) within common living
spaces. In emerging industrial societies, poor families crowded together in tenements, and soli-
tude and quiet contemplation were considered luxuries for only the most learned and wealthy.
Changes in architecture and technologies have also affected these boundaries. The postcard and
party telephone lines enhanced opportunities for communication but also for eavesdropping by
mail carriers and neighbors. Photographic and wire-tapping technologies facilitated the moni-
toring of crime but also raised concerns about the role of media, other corporate actors, and the
nation-state in surveillance activities. Such tensions highlight the centrality of privacy for complex
questions about the connections between self and others in interpersonal relationships and in the
broader community, as well as the role of personal information in maintaining orderly and just
societies. Privacy concerns draw attention to the appropriate balance between centralized moni-
toring systems and individual well-being in healthy societies, between disclosure and concealment
of personal information in cohesive relationships, and between power and fairness in markets,
governments, and communities.
Recent innovations in information and communication technologies (ICTs), including the
increased availability of ubiquitous wireless devices and networks, inexpensive sensors, Big Data
analysis tools, and massive data storage, have made privacy a topic of increasing concern for
citizens, governments, businesses, and academics (e.g., Nippert-Eng 2010, Nissenbaum 2010,
Podesta et al. 2014, Rule 2007, Solove 2008). These technologies enable connections among
actors and facilitate the monitoring of individuals by multiple institutions and across multiple
spaces and interactions. Information about people is collected as they move throughout their day,
stored in long-term shareable databases, and is aggregated, analyzed, and disseminated. In this
age of ubiquitous ICTs, it is clichéd to say that privacy is dead. However, while it is true that
new technologies have created an unprecedented set of privacy challenges, people across cultures
and time periods have always sought to manage privacy (e.g., Altman 1977, Moore 1984, Murphy
1964, Roberts & Gregor 1971, Shils 1966, Westin 1970, Zureik et al. 2010).
Similarly, academic attention to privacy has a long history, although there is no recognized sin-
gle literature on privacy. Instead, privacy expertise spans multiple academic disciplines, including
law, anthropology, psychology, philosophy, political science, computer science, and criminology.
In sociology, the study of privacy occurs within multiple subdisciplines that do not always converse
with one another. In this review, we synthesize the research on privacy that is relevant for the
foundational sociological issues of social order and inequality, without attempting to cover the
vast multidisciplinary literature on privacy. We include work that, for the most part, has been
conducted by sociologists or published in sociology outlets. However, because privacy research
occurs in disparate fields, readers will also see citations of scholars in other disciplines when their
work addresses the sociological questions that are the focus here. By synthesizing work on a nar-
row yet fundamental set of issues, we aim to increase our understanding of privacy and to provide
a foundation for analyzing the contemporary privacy issues associated with ICTs.
We begin by clarifying our definitions of privacy and associated concepts. We then describe
the intersections between privacy and social order. We explore the role of privacy and monitoring
in maintaining control, a key element of social order. We also look at social order in terms
of the tensions between disclosure and concealment in cohesive relationships and communities.
We then discuss privacy and inequality, exploring how existing inequality produces variation in
privacy and how this variation in turn contributes to inequality. Throughout the review, we identify
opportunities for sociological research on privacy. We conclude by discussing the implications of
the sociological literature for understanding contemporary privacy issues not just for the individual
but for society as a whole.

DEFINING PRIVACY
Partly because of its interdisciplinary nature, privacy can be conceptualized in many ways (e.g.,
Margulis 1977, 2003). We define privacy as the access of one actor (individual, group, or orga-
nization) to another. Access is a potentially valuable resource. It includes, but is not limited to,
access to information (Shils 1966, Wilsnack 1980) as well as the way information is used. Access
varies across multiple dimensions, including level (e.g., amount), type (e.g., face-to-face, online),
and content (e.g., medical diagnosis, credit history). Defining privacy as access of one actor to
another means that privacy refers to what people conceal and reveal and what others acquire and
ignore.
Privacy is affected by a range of factors. Laws define the contents, levels, and types of access that
are legal and illegal. Social practices, such as levels of supervision or interaction patterns, also shape
accessibility (e.g., Acquisti et al. 2015, Anthony et al. 2007, Donath & boyd 2004, Nippert-Eng
2010). Technology—ranging from simple architectural elements such as doors or office cubicles
(e.g., Saval 2014) to complex smart technologies and systems such as social media platforms, smart
meters, wearable sensors, and driverless cars—affects ease of access and structures who has access
to whom, as well as the possibilities for storage, aggregation, manipulation, and dissemination of
personal information. Privacy norms may also affect access (Nissenbaum 2010).
Privacy norms identify the characteristics of access that are deemed appropriate within a con-
text. For example, privacy norms allow observing strangers’ nearly naked bodies at the beach but
not through a neighbor’s bedroom window. Similarly, privacy norms discourage taking and dis-
seminating pictures of a roommate’s sexual behavior but encourage taking and sharing pictures of
newborns—although there is much debate as to whether the dissemination of personal informa-
tion via social media sites should be viewed as appropriate or not (e.g., boyd & Hargittai 2010).
Thus, violations of privacy norms include levels of access that are too high or too low, access
to the wrong kind of information, access through inappropriate channels, and inappropriate use
of information. When privacy norms are followed, people feel that they have privacy (see Bates
1964 for a discussion of the feeling of privacy; also Nippert-Eng 2010). When norms are violated,
individuals may feel invaded or isolated, or they may feel that another actor is being too secretive
or exposing too much. In addition, individuals may have their own privacy preferences; that is,
they may prefer broader, stricter, or different modes of access than norms dictate (Acquisti et al.
2015, Nippert-Eng 2010).
Privacy norms vary depending on a variety of contextual factors (Bates 1964, Horne et al.
2015, Nissenbaum 2010), including the characteristics of the recipient, the relationships between
the actors involved, the purpose of the access (e.g., benefits to the subject, collectors, or group),
and the way information is used (e.g., collected, stored, aggregated, analyzed, and disseminated).
Privacy norms are affected by aspects of the broader environment—for example, demands the
group imposes on the individual (Moore 1984), the level of threat in a society (Brooks & Manza
2013), laws (Solove 2008), the “surveillant assemblage” of the state and corporations (Haggerty
& Ericson 2000), and physical infrastructures and technologies in general (Nissenbaum 2010).
Societal changes, such as the recent uptake of ICTs, have the potential to spur a renegotiation of
privacy norms across contexts. Laws typically cannot keep pace with rapid technological changes
and so are less able to regulate behavior during such periods (Cate 1997, Pasquale 2015, Podesta
et al. 2014, Solove 2008).

Privacy management is an actor’s control over access to the self as well as the ability to access
others. For organizations, this includes managing levels of organizational transparency as well
as seeking information about employees and other actors. Although the ability to manage both
aspects of privacy is necessary for interaction and the maintenance of social order, contemporary
concerns about privacy typically focus on limiting access to the self only. Our conceptualization
of management as the actor’s ability to regulate access to the self and others highlights the problem
of determining the appropriate balance of giving and seeking information. According to this view,
privacy management is the actor's effort to achieve such a balance by complying with privacy
norms and ensuring that others do not violate those norms, as well as implementing their own
privacy preferences. Thus, privacy management is more than simply limiting access to the self; it
involves a range of strategies for regulating access to the self and others that intersect with social
norms and actors’ preferences and structural positions.

PRIVACY AND SOCIAL CONTROL


Privacy, privacy norms, and privacy management have implications for social order, and order is
fundamental to functioning societies. Social order refers to the extent to which members of a social
group cooperate to achieve collective ends (Hechter & Horne 2009). Privacy and social order are
interconnected, in large part because visibility (a key component of access) is thought to be central to
social control: Violations can be punished only if others are aware that they occurred (Hechter
1987; see also Coser 1961). In addition, individuals who are aware of being monitored may control
their behaviors (Foucault 1979, Waldo et al. 2007). High levels of monitoring, however, can have
negative effects; in particular, when it violates privacy norms, monitoring may provoke resistance and
backlash. Thus, achieving social order requires managing privacy in a way that allows for an
optimal balance between revealing and concealing (Simmel 1950, p. 361; see also Etzioni 1999,
Shils 1956). Below we describe contexts in which monitoring facilitates control as well as situations
in which monitoring leads to resistance and backlash.

Monitoring and Compliance


Levels of monitoring have varied across time and place in ways that reflect, in part, the ease (or
costs) of collecting information (Hechter 1987). Thus, monitoring shifts with the structure of social
relationships and technologies (such as architectural elements) that are in place. In small, face-to-
face communities with extensive interpersonal interaction, monitoring is relatively straightforward.
People are able to observe and talk about each other as they go through their daily routines. In an
Israeli kibbutz, for example, where members eat in a collective dining hall, share shower facilities,
and live and work in close physical proximity, individual behaviors are publicly visible, and there
are many opportunities to gossip about shared acquaintances (Schwartz 1954). By contrast, in a
moshav, where many activities occur in the home rather than in public, information travels less
quickly. Repeat interactions also foster monitoring. When relationships are structured so that
individuals engage in repeated transactions over time, they learn about each other’s histories and
reputations (Bernstein 1992, Coleman 1990).
ICTs provide new sites of social monitoring and control. Individuals have always collected and
disseminated information about others. People watch each other, they gossip, and they react to the
behaviors they observe (Feinberg et al. 2014). Today, they are increasingly using ICTs for these
purposes—using social media for public criticism and shaming (e.g., Dewey 2014, Harrington
& Bielby 1995) and taking advantage of new venues that facilitate reputation formation among
large numbers of individuals who otherwise have no contact with each other (e.g., Diekmann et al.
2014).

Like individuals, governments have always sought to collect information to ensure compliance
with relevant rules (Beniger 1986). In addition to relying on direct observation and reports from
informants and undercover agents, governments use technology to collect information about
suspects. In the past, such tools aimed at monitoring behavior in the home (or business) included
wiretaps, telephone records, and utility records. Now, popular technologies include recordings
of behavior in public places and even greater capture of communications. Governments justify
the monitoring of communications with the need to fight terrorism (Brooks & Manza 2013), and
the recording of behavior with the need to increase safety and security. Yet, evidence for the
effectiveness of increased monitoring to enhance security is mixed (e.g., Goold et al. 2013, Welsh
& Farrington 2009). Security forces appear not to change their practices to take full advantage
of the information they collect (Goold 2004), and doing so may be very expensive. For example,
it took 100 officers four days to review tapes of London’s underground in an attempt to identify
the attackers in a suicide bombing (Doyle et al. 2012, p. 7). Similarly, a report by London’s
Metropolitan Police found that it takes 1,000 public cameras to catch one criminal (Bennett et al.
2014, p. 32). Despite a lack of evidence of their effectiveness and the significant costs of using
them, technological surveillance tools are increasingly popular with governments (Doyle et al.
2012, Hier 2010).
Monitoring also occurs in organizations. In the workplace, employers have an interest in watch-
ing employees to protect company assets, control public communication, and ensure that em-
ployees are as productive as possible (Attewell 1987, Ball 2010). UK Nissan, for example, traces
problems to work teams and individuals and gives weekly scores to each individual (Sewell 1998,
Sewell & Wilkinson 1992). When employers monitor, employees know that flaws in production
or service delivery can be traced back to them and lead to potential sanctions (e.g., withhold-
ing raises or bonuses, firing). Organizations may also harness the power of social pressure. For
example, trucking companies may post performance data where workers can see them, creat-
ing social pressure from peers and even families for low-performing workers to improve (Levy
2015).
However, although monitoring may enable employers or governments to appropriately sanc-
tion individuals, it can also lead to punishments that are poorly calibrated to the severity of the
deviance. For example, a substantial body of research shows that felony convictions and criminal
records generally disadvantage job applicants (Pager 2003, 2007). Something as minor as a single
arrest that was never prosecuted can create a disadvantage (Uggen et al. 2014). The fact that
records are permanent and easy to obtain potentially punishes people far beyond what their initial
deviant behavior warranted. Increases in storage and search capacities mean that arrest records are
now easily available and thus a tempting source of information. Records of possible past crimes
are not always accurate, however, and as such, employers may encounter information that is sug-
gestive rather than factual. For example, research shows that when employers search online for
background information about potential employees, they are more likely to come across insinu-
ations of criminal records for racial and ethnic minority candidates, even when such records do
not exist (Sweeney 2013). There is no consistency in how employers use this kind of information
(Lageson et al. 2015).
In addition, increased information about others’ deviance may paradoxically increase the likeli-
hood of future deviant behavior. Evidence suggests, for example, that visible deviance is contagious
(Diekmann et al. 2015) and undermines existing rules (Keizer et al. 2008), whereas ignorance of
violations maintains norms (Kitts 2003, Ostrom 1990). Such research suggests that privacy main-
tains ignorance about the extent to which individuals are engaging in deviant behavior, thereby
upholding social norms: “No doubt, a publication of all of the sins, crimes, and errors that take
place in a social unit would jeopardize its stability” (Schwartz 1968, p. 744).

In summary, research on social control suggests that although institutions may decide to install
new technologies due to their relatively low cost, they may not use them effectively. Moreover,
although monitoring may increase control and compliance, it may also produce unanticipated
consequences such as normalizing deviance and weakening the association between the seriousness
of a deviant act and the severity of the punishment. Thus, monitoring efforts are not always
commensurate with their returns in compliance.

Resistance and Backlash


In addition to these consequences, monitoring may lead to resistance and backlash (Haggerty
2006; Marx 2003, 2009), thereby disrupting rather than maintaining social order. Although there
is generally little organized resistance to contemporary surveillance in the United States (Huey
2010), individuals frequently resent what they view as privacy violations (Nippert-Eng 2010, Stark
2016). Organizations are often surprised by the backlash against their actions and may resort
to hasty backtracking (e.g., Tabuchi 2015). Part of the problem in predicting reactions is that
individuals are not always consistent; rather, their reactions appear to vary with privacy norms and
personal preferences.
Concerns about governments’ and institutions’ use of technology for surveillance have varied
across countries and over time (Anthony et al. 2015, Rule 1973, Westin 2003, Zureik et al. 2010).
In general, monitoring is more accepted when it is perceived as contributing to the collective
good. For example, people may be more willing to accept high levels of monitoring by utility
companies in their homes (e.g., through use of “smart meter” thermostats) if this is meant to
reduce environmental damage and the risk of blackouts for everyone (Horne et al. 2015) rather than
simply increase utility company profits. Similarly, people tend to accept government surveillance
if they believe that it contributes to collective safety and security (Brooks & Manza 2013, Goold
et al. 2013).
Monitoring also produces resentment when it conveys lack of trust (Ball 2010). Particularly
in the workplace, employer monitoring of employees suggests suspicion. The trucking industry,
for example, is increasingly replacing trucker logbooks with automated monitoring. Although the
employer may benefit from increased accuracy, such monitoring produces resentment because
employees feel that employers do not trust them to know how to do their job. As one trucker said,
“If you can’t trust me to go out there and be safe and honest, then take me out of the game and put
somebody in there that you think you can. Either that or put a robot in the truck!” (Levy 2015,
p. 169).
People are more accepting of monitoring that appears to target the “other”—that is, members
of an out-group rather than themselves. Thus, people are more willing to accept CCTVs that they
believe target criminals and monitoring of telephone and Internet activity that they believe targets
terrorists (e.g., Brooks & Manza 2013, Goold 2004). However, when monitoring seems directed
at them, they may be less sanguine. For example, monitoring by the Israeli government produces
resentment and frustration among Palestinians (Shalhoub-Kevorkian 2010); similarly, Muslims
in the United States have reason to be suspicious of government surveillance (Shamas & Arastu
2016). When people feel that they are being targeted, they may resist monitoring. For example,
the use of speed cameras in Quebec led people to use false license plate numbers (Bennett et al.
2014), and teenagers online speak in code to their friends to evade parental monitoring (boyd
2014).
People also dislike monitoring when they have something to hide, particularly when that
something is considered immoral or illegal. In such situations, individuals and groups attempt
to keep secrets (Warren & Laslett 1977). They may also keep secrets when engaged in illegal
or clandestine activities aimed at resisting, weakening, or overthrowing authorities. Because such
clandestine activities are risky, groups must work to ensure that individuals can correctly identify
fellow group members (e.g., Gambetta 2009) and that individuals do not betray the collective by
sharing secrets (Simmel 1906, 1950).
Conversely, the powerful may seek to keep secrets in order to maintain their positions
(Gibson 2014; Lowry 1972; Simmel 1950, p. 365; Zerubavel 2006) and their legitimacy
[Weber 1978 (1921)]. Bureaucracies frequently want to keep secrets from outsiders [Weber 1978
(1921), p. 992], and employers may want to keep information (for example, about company fi-
nancials) from employees (Rosenfeld & Denice 2015). But such efforts by the powerful often fail
(Gibson 2014, Simmel 1950), partly because keeping a secret requires great control over informa-
tion (Lowry 1972), and such control efforts may divert resources from more productive purposes.
In addition, secret keeping can never be perfect because people have reasons to leak information
to bolster their own power. Thus, secret keepers risk exposure. The implication is that secrecy
might increase social order in the short run but reduce it in the long run (Gibson 2014).

Future Research on Privacy and Social Control


Research provides evidence that monitoring has mixed effects. It may, under some conditions,
deter deviance or facilitate the capture of bad actors. However, it may also lead to negative out-
comes. Despite the mixed record concerning surveillance technologies, monitoring efforts and
associated reductions in privacy are proliferating without a full understanding of the implications
for compliance and resistance and, in turn, for social order.
We have already described how monitoring may lead to various forms of resistance and back-
lash. Changes in privacy may also have implications for other kinds of counterproductive behavior.
In his classic work on suicide, Durkheim (1951) noted in a footnote a rare type of suicide that
occurs when social control is very high. Since then, scholars have studied the negative effects of
total institutions, such as psychiatric hospitals and prisons, in which individuals have little pri-
vacy (e.g., Goffman 1961). The proliferation of ICTs raises the possibility of reduced privacy for
individuals outside of these specific contexts. Because monitoring in industrialized societies has
traditionally been costly, individuals have been able to maintain some level of privacy. But the in-
creased efficiency and reduced costs of ICTs have encouraged their widespread adoption in ways
that now threaten that privacy. Research should examine how institutions (including state and
nonstate entities) use these new technologies as well as how individuals manage their privacy in
response. Scholars should assess the effects of ubiquitous monitoring on individual and collective
dysfunctions and on social order in general.

PRIVACY AND COHESION


Social control is one aspect of social order; cohesion is another. Cohesion refers to the intimacy of
relationships (that is, people’s affective ties to each other) as well as the extent to which people are
dependent on each other (Lawler et al. 2011). Group cohesion and related concepts (e.g., social
capital and collective efficacy) have implications for both the quality of interpersonal relationships
and the ability of groups to produce collective goods. Privacy has implications for cohesion because
privacy norms dictate the kinds of disclosure that are appropriate for different types of relationships,
and because privacy management shapes and is shaped by the intimacy of relationships.

Interpersonal Relationships
In interpersonal relationships, disclosure (granting another access to oneself) varies with the actors
involved and the type of relationship (e.g., Balswick & Balkwell 1977, Rubin et al. 1980). In general,
people tend to disclose more in close relationships; in turn, disclosure can maintain or increase
closeness. Revealing personal information builds relationships, in part, because people need to
know something about each other in order to interact successfully and develop close bonds (e.g.,
Blumer 1969, Lawler et al. 2011). Sharing also encourages reciprocal disclosure, because when
one person takes a risk and discloses, the other is more likely to do the same (Cozby 1972).
Thus, “[t]here is both mutual revelation and mutual gratification” (Schwartz 1968, p. 751). Such
mutual disclosures may flatten status hierarchies and increase intimacy (Richardson 1988, Schwartz
1968).
In addition, disclosure both requires and increases trust. When individuals disclose personal
information, they make themselves vulnerable to the risk that others may react negatively or
reveal the information to third parties (Eder & Hallinan 1978, Nippert-Eng 2010, Petronio 2002,
Richardson 1988). At the same time, however, by assuming the risk of disclosure, individuals may
also signal commitment and thereby increase their likelihood of being accepted (Kanter 1977).
Disclosure also provides the recipient an opportunity to display trustworthiness by maintaining
confidentiality, thereby increasing the level of trust in a relationship (Petronio 2002).
Concealing information (limiting access to oneself) is just as essential to managing relationships
as disclosure, particularly for maintaining the absence of a relationship (Shils 1966). Privacy norms
dictate that in public places people should not be accessible to strangers. Accordingly, people often
assume that they will not be monitored or recorded in public (Grimshaw 1982). People also manage
their access to others. In streets, subways, and elevators, people limit the information they accept
by engaging in “civil inattention,” that is, by purposely ignoring available information about others
(Goffman 1966, Hintz & Miller 1995). In a taxi, for example, both the driver and the passenger(s)
may act as if the other were not present in order to create a veil of privacy (Zerubavel 2006).
Likewise, when one person accidentally makes eye contact with another in an elevator, both may
divert their eyes quickly (Hirschauer 2005), and people are uncomfortable if they inadvertently see
into a neighbor’s window (Nippert-Eng 2010). Thus, individuals work to manage privacy in ways
that adhere to privacy norms by limiting access to and by strangers (Campos-Castillo & Hitlin
2013; Goffman 1966).
The desire to conceal also occurs within relationships. Individuals need to withhold information
that might lead the other to see too clearly their differences and imperfections (Bates 1964):
“[T]he more one person involves himself with another on an emotional basis the more both will
need private facilities to conceal nasty habits and self-defaming information from each other”
(Schwartz 1968, p. 744). Stigmatized individuals, in particular, may seek to hide their discrediting
condition to maintain their standing in a relationship (Goffman 1963, Petronio 2002, Stablein
et al. 2015). There are also limits to people’s willingness to accept the disclosures of others:
Too much disclosure may make people uncomfortable (Cozby 1972; see also Argyle & Kendon
1967, Jourard 1964). To avoid learning too much about others, people may engage in “pretense
awareness,” pretending not to know information about the other (Glaser & Strauss 1964). Those
who disclose too much may even be subject to social sanctions (Merten 1999, Miall 1989). Thus,
research suggests that some inaccessibility may be necessary for the existence and longevity of
relationships (Simmel 1906).
Such privacy management efforts are relatively straightforward in societies in which social struc-
tures are concentric and communities are relatively homogeneous (Simmel 1955). Urbanization
complicates matters, however, because it allows both greater individuation and the segmentation
of relationships. In urban settings, individuals can have ties with people from different contexts
who do not know each other. Further, because of the increased anonymity in urban settings, indi-
viduals can share some aspects of themselves in some relationships (e.g., in the family) and other
aspects in other relationships (e.g., at work) (Nock 1993). One challenge associated with these
multiple networks is that each network has its own social expectations that may be inconsistent
with those of another context.
ICTs further complicate matters. Although in some ways people today have greater capacity to
manage their self-presentation through social media, in other ways the barriers between different
domains of their lives are breaking down, and social contexts are collapsing (Hampton 2016).
When there is a collapse across contexts that carry multiple, conflicting expectations, it may
be impossible for individuals to meet those expectations (Davis 2014, Davis & Jurgenson 2014,
Ellison et al. 2006, Marwick & boyd 2011). Such failures may change people’s perceptions of
each other, thus damaging relationships. Context collapse may be particularly problematic when
it is unintentional (Davis & Jurgenson 2014). Unintentional collapses may occur as a result of a
mistake by the individual, an ICT network breach, or a change in information-sharing features
of online social networking sites (Hoadley et al. 2010), or behavior by a friend or someone in
an individual’s network who uses the information in an unexpected way. In these cases, personal
information flows into domains the individual did not intend. Individuals can partially manage
competing expectations by manipulating the privacy settings available on social media platforms,
but those settings do not permit the fine-grained adjustments that are necessary to manage diverse
relationships (Vitak 2012). Further, algorithms used by social media sites do not necessarily match
individuals’ understandings of or preferences for the dissemination of information (Acquisti et al.
2015, boyd & Hargittai 2010).

Community
Privacy affects not only one-on-one interpersonal relationships, but also groups and communities
more broadly. It has implications for group boundaries, cohesion, and collective action.
Patterns of disclosure strengthen ties among group members and create stronger boundaries
between the group and outsiders. For example, adolescent girls may use secrets as a social currency
to enhance their status within peer groups at the same time as they keep information from their
families, thus strengthening the boundaries separating their friends and their families (Merten
1999; see also boyd 2014). Similarly, individuals may increasingly keep personal thoughts and
feelings from their friends and instead share them with their romantic partners to signify the
couple’s relationship status (Johnson & Leslie 1982). Thus, privacy management can increase
group cohesion.
In turn, because group cohesion fosters social norms and cooperation (Coleman 1990, Hechter
1987, Horne 2009, Lawler et al. 2011), changes in privacy may have implications for the provision
of collective goods, although this process is not clear. On the one hand, cohesive groups may
be in decline, replaced by social networks in which the individual is the focus of action (Rainie
& Wellman 2012). Without cohesive groups, the potential for collective action may decrease
(Horne 2009, Sampson et al. 1997). On the other hand, networked individuals may be a new
focus of collective action. Online venues may facilitate collective protest by enabling people to
take action anonymously, without putting themselves at risk (Earl 2012). Networked individuals
can also at times call on their networks for action (Davis & Jurgenson 2014). Thus there is some
reason to believe that networks, and the information that flows through them, may be effective for
producing cooperative behavior in some contexts. Given the potentially disparate effects of group
cohesion and networks, the implications of privacy for order are complex.

Privacy changes may also have implications for civic life. The fine-grained data collection and
social sorting that facilitate market discrimination (see below) are also used by political actors.
The result is that people who fall into different classifications are exposed to different kinds of
political information (Danna & Gandy 2001). Whereas too much privacy may damage civic life
(Etzioni 1999), too little privacy and the associated targeted dissemination of information may
have consequences that are not fully understood.
Finally, privacy violations have implications for people’s trust in each other and in essential
social institutions. Frequent media stories describe hacks of individuals’ data that companies and
governments are supposed to keep secure (e.g., Collins 2015, Davis 2015). Of course, people have
always gotten access to private and secret information; but reliance on ICTs for record keeping
may increase the frequency of such violations because of the large volumes of stored data and the
fact that digital data are never completely secure (Waldo et al. 2007). Continued news of breaches
may reduce confidence in institutions, and in response people may withhold information (Campos-
Castillo & Anthony 2015, Stablein et al. 2015). Revelations of bad behavior may also decrease
trust, both in institutions and individuals. For example, the 2016 leak of the Panama Papers
revealed secrets about how the wealthy—including government officials and public individuals—
hide their money. Greater transparency about leaders’ bad governance may change the costs and
benefits of taking action against a government, thus making such action more likely (Bennett &
Segerberg 2013, Howard 2015). In general, disclosures about government and other leaders may
have implications for the legitimacy of existing institutions and governance structures.

Future Research on Privacy and Cohesion


Although several scholars and pundits have described the rise of the networked individual (Bennett
& Segerberg 2013, Rainie & Wellman 2012), there is relatively little work on the implications
of this change for fundamental social behaviors such as cooperation. As discussed above, scholars
have highlighted the issue of context collapse associated with ICTs. Context collapse increases
access across relationships, and it might seem unproblematic given the high levels of transparency
in traditional face-to-face groups. However, expectations are more likely to be consistent in face-
to-face groups, which are more likely to be homogeneous, than across social networks that may
be larger and entail more variability in roles and other characteristics. Because social structures
have changed, transparency today does not necessarily have the same implications as in the past.
The implications of context collapse, not only for individuals and their relationships but also
for cooperation and community cohesion, are unclear. Similarly, scholars suggest that politicians
use the fine-grained classifications made possible through new technologies to target political
messages, but the consequences of such widespread systematic information differences for civic
life are not fully understood (Etzioni 1999, Rainie & Wellman 2012).
Much of the research on privacy focuses on micro-level outcomes—individual concerns, rela-
tionships, and disadvantages, for example. Changes in privacy, however, also affect the kinds of
information people receive about government and other significant institutions, and thus have
implications for trust and institutional legitimacy. Information flows also affect the relationships
that underlie at least some forms of collective action and challenges to authority. More research is
needed to explore the implications of shifts in network structures across social contexts for group
cohesion, social capital, and social order.

PRIVACY AND INEQUALITY


Privacy also intersects with inequality. As discussed above, we define privacy as one actor’s access
to another. Such access is a resource that can be exchanged (Schwartz 1968), and like other
resources, it is unequally distributed in society. Privacy is “a scarce social commodity . . . [whose]
possession reflects and clarifies status divisions” (prestige or esteem accorded by society) and power
differences (the ability to acquire resources despite others’ resistance) (Schwartz 1968, p. 744).
Thus, the distribution of privacy reflects inequality. In addition to being unequally distributed,
the production and management of privacy may also create inequality among social actors.

Unequal Distribution of Privacy


Because socioeconomic and moral status and power shape who has privacy, privacy is unequally dis-
tributed (Warren & Laslett 1977); and because privacy management requires skills and resources,
actors vary in their ability to limit access to themselves and to gain access to others.
In general, visibility (and more broadly, accessibility) is unequally distributed within
hierarchies—from traditional families to bureaucratic institutions like schools and hospitals.
Poorer and lower-status actors are less able than higher-status actors to manage “information
leakage” (Schwartz 1968), in part because social circumstances make their behaviors more ob-
servable. The size and the structure of lower-class homes, for example, provide less opportunity
to differentiate between front stage (more public) and back stage (more private) spaces (Goffman
1959, p. 123). Children have less privacy than adults (Warren & Laslett 1977), and the sick have
less privacy than the healthy (Fox 1959, Goffman 1961). In addition, lower-status actors may be
subject to bureaucratic institutions that require them to relinquish their privacy to receive services
from others, such as government officials, police, and social workers (Brayne 2014, Levine 2013):
“[I]nsulation from observability, and access to it, are just as important structural elements in a
bureaucracy as the distribution and delimitation of authority” (Coser 1961, p. 29).
People in different positions in a status hierarchy may also use different strategies for managing
privacy. In part, these differences in strategies reflect resource constraints. Lower-status individuals
generally have more difficulty both preventing others’ access and strategically sharing information
with others (Schradie 2012). Lower-status actors who experience stigma may simply withhold
information (e.g., Stablein et al. 2015); in contrast, higher-status individuals have a greater range
of options for managing access to themselves, including how audiences perceive them and interpret
their actions (Martin 2009, Phillips & Kim 2009, Sauder et al. 2012). For example, higher-income
and more educated households are more likely to sign up for “do-not-call” lists (Varian et al.
2005), and high-ranking officials use administrative assistants to screen and decide to whom access
is granted (Zerubavel 1979).
High-status actors also have greater access to lower-status individuals than the reverse: “[T]he
allocation of privacy . . . is a clear measure of one’s status and power in any given situation”
(Nippert-Eng 2010, p. 164; see also Schwartz 1968). Thus, one reason that privacy violations
cause negative and visceral reactions is that they convey a denial of status (Nippert-Eng 2010,
Schwartz 1968). In some contexts, access to others may even be a duty or privilege (e.g., of a pro-
fession) rather than a transgression of privacy norms (Parsons 1939). For example, it is not only
acceptable but also expected that social actors like physicians, lawyers, and psychoanalysts gain
intimate information about their clients. Similarly, welfare case workers have high levels of access
to applicants’ lives and affairs (Piven & Cloward 1971, Levine 2013). Such actors are conferred
higher status in part to enable a level of access to individuals that would otherwise be socially
unacceptable (Schwartz 1968).
Today, organizations, corporations, and governments, rather than individuals, have the greatest
ability to access others and potentially invade privacy (Beniger 1986, Pasquale 2015, Podesta et al.
2014). Decades ago, Shils (1966, p. 305) recognized the potential of these entities: “Privacy has
become a problem because it has become engulfed in the expansion of the powers and ambitions
of elites and in the difficulties that they encounter in attempting to govern and protect and please
vast collectivities.” Institutions now pose an even greater threat because of their increased ability to
gather, stockpile, and analyze information about individuals. Individuals may value the protection
and services provided by, for example, driver’s licenses, credit cards, grocery club cards, social
media accounts and the like: These seem not only benign but necessary to modern life. At the
same time, however, “the inability to keep information private” that is associated with these
amenities reflects a lack of power of citizens and consumers (Kasper 2007, p. 181; see also Westin
1970).

Privacy Creates Inequality


Not only is privacy unequally distributed across social groups and social conditions, but the con-
sequences of the differential ability to gather and use private information may in turn affect
inequality. That is, variation in privacy may weaken, perpetuate, or strengthen existing status and
power hierarchies and may create new ones.
One way that variation in privacy affects inequality is through combining sources of information
about people, via so-called Big Data techniques (boyd & Crawford 2012) that sort them into
categories. These “classifications and profiles . . . correspond with, and reinforce, differential levels
of access, treatment and mobility” (Haggerty & Ericson 2000, p. 618; see also Brayne 2014,
Giddens 1990, Lyon 2003). Social sorting assigns “worth or risk, in ways that have real effects
on [people’s] life chances” (Lyon 2003, p. 1; see also Foucault 1979, Gandy 1993, Haggerty &
Ericson 2000, Lyon 1994, Marx 2004, Rule 2007, Westin 1970).
Such social sorting enables statistical discrimination. Statistical discrimination occurs when
decision makers rely on objectively accurate correlations between group characteristics and out-
comes (such as risk of credit default or susceptibility to disease) and apply the correlation to all the
individuals within the group category. New ICTs and Big Data techniques enable fine-grained
classification of characteristics and calculation of correlations. According to the 2014 White House
report on Big Data (Podesta et al. 2014, p. 53), “big data could enable new forms of discrimination
and predatory practices.” These classifications may either reflect traditional categories of discrim-
ination (such as race, gender, or income) or generate new categories that become encoded into the
design and implementation of data-driven decision making (Barocas & Selbst 2016, Pasquale 2015,
Stuart 2003) and frame the interpretation of information (Lyon 2003, McCarthy 2016, Newman
2014). Thus, enhanced information regarding a greater range of individual differences (e.g., dis-
ease risk, default risk, consumption patterns) can lead to the identification of new categories, and
in turn to new hierarchies.
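To make the arithmetic of statistical discrimination concrete, the following minimal sketch (not drawn from the article; the groups, risk values, and pricing rule are hypothetical) shows how applying an accurate group-level default rate to every member of a group yields different offers for individuals with essentially identical risk.

    # Hypothetical illustration of statistical discrimination: a lender prices
    # every applicant by the average default rate of the applicant's group,
    # even though individual risk varies within each group.

    # Toy data: (group, individual default probability); all values invented.
    applicants = [
        ("A", 0.02), ("A", 0.05), ("A", 0.23),  # group A: lower average risk
        ("B", 0.03), ("B", 0.18), ("B", 0.24),  # group B: higher average risk
    ]

    # Step 1: compute the "objectively accurate" group-level correlation
    # (here, each group's mean default probability).
    risks_by_group = {}
    for group, risk in applicants:
        risks_by_group.setdefault(group, []).append(risk)
    group_rate = {g: sum(r) / len(r) for g, r in risks_by_group.items()}

    # Step 2: apply the group rate to every member of the group.
    BASE_RATE = 0.05  # hypothetical baseline interest rate
    for group, risk in applicants:
        offered = BASE_RATE + group_rate[group]  # premium keyed to group only
        print(f"group={group} individual_risk={risk:.2f} offered_rate={offered:.3f}")

    # The 0.03-risk applicant in group B is offered 0.200, while the comparably
    # low-risk (0.02) applicant in group A is offered 0.150, because the group
    # average, not individual risk, determines the offer.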
Some existing evidence suggests that statistical discrimination based on detailed distinctions
harms people who fall into traditional categories of disadvantage. There is evidence, for example,
that, although cameras watch everyone, the people who monitor the camera feeds do
not. Instead, they focus their attention on categories of people they believe are more suspect (e.g.,
young minority men or homeless people) (e.g., Goold 2004). Fine-grained classifications may also
be used to target individuals who are more likely to respond to deceptive or unfair offers. Banks,
for example, marketed subprime mortgages to racial and ethnic minorities and people from low
socioeconomic backgrounds (Consum. Watch. 2011, Williams et al. 2005).
Even if not purposefully predatory, classifications, especially those based on proprietary data
and algorithms, may create disadvantage (Barocas & Selbst 2016, Newman 2014, Pasquale 2015)
because “secret scores can hide discrimination, unfairness and bias” (Dixon & Gellman 2014,
p. 7). For example, online advertisers bundle and sell users’ browser search data to categorize
potential buyers for targeted pricing based on behavior, income, or location. However, unlike
price discrimination in the face of transparent pricing information, which economists argue can
be favorable to consumers (e.g., Varian 1985), when pricing information is obfuscated, as in online
targeted advertising, consumers are harmed (see, e.g., Ellison & Ellison 2009, Salop & Stiglitz
1977). Individuals have no way of knowing whether the price they are offered is different from the
prices offered to others, and they may as a result accept offers they would otherwise refuse.
However, detailed information about individuals may also decrease inequality. It may, for exam-
ple, increase the likelihood that no two individuals are members of the same categories, thereby
disrupting the association between broad categories and resources that underlies status hierar-
chies (e.g., the association between gender and leadership) (Lamont & Molnár 2002, Ridgeway
2011). Another possibility is that fine-grained classifications may loosen individuals’ attachments to
broader categories, in turn facilitating communication and interaction across traditional category
divisions (DiBenigno & Kellogg 2014).
The distribution of privacy may also affect inequality through mechanisms other than social
sorting. There is evidence that individuals who are heavily monitored may refrain from seeking
services they need (and are legally entitled to) from social welfare institutions and health care
providers, and may even avoid friends and family members. People who have had contact with the
criminal justice system have fewer interactions with other institutions that are required to mon-
itor behavior to execute their functions—for example, hospitals, formal employment, or schools
(Brayne 2014). Similarly, low-income families concerned with being deemed ineligible for ser-
vices avoid transactions at financial institutions (O’Brien 2008), and many immigrants avoid even
health services for fear of revealing information that may get them deported (Joseph 2011). In
neighborhoods with high incarceration rates, the fear of pervasive monitoring can lead people to
avoid any contact that might expose them to law enforcement: “[Y]oung men’s compromised legal
status transforms the basic institutions of work, friendship, and family into a net of entrapment.
Hospitals become dangerous places to visit, as do jobs. Their mother’s home becomes a last-known
address: the first place the police will look” (Goffman 2014, p. 196). The lack of privacy of poor
individuals leads to further disadvantages for them and damages the social fabric around them.

Future Research on Privacy and Inequality


Existing research provides empirical evidence of some of the micro- and macro-level effects of
unequal privacy (i.e., unequal observability or access to pricing information) as well as theoreti-
cal insights into its likely consequences. But much research is still needed to elucidate how the
fine-grained categorization and discrimination enabled by ICTs affect the structure of social hi-
erarchies, as well as the connections between privacy and inequality more broadly. Sociologists
studying the digital divide have contributed to our understanding of the connections between
technology, information, and inequality (Hargittai & Hinnant 2008, McCarthy 2016). More work
remains to be done to understand the role of privacy in increasing, maintaining, flattening, or even
creating new status and power hierarchies. New ICTs can enable actors with resources to have
greater access to those lower in the status hierarchy than the reverse. More generally, changes
in the distribution of access may shift who has power over whom, as well as how satisfied people
are with existing hierarchies. Sociological theories of status and power may produce insights into
these and other consequences of changes in levels and distribution of privacy.

CONCLUSION
Sociological research highlights the ways in which privacy intersects with social structures and
institutions to affect individuals, groups, and communities. Much of the research on privacy focuses
on what we might call first-order effects: the effects of privacy threats, such as those associated with
new ICTs and the rise of Big Data, on individuals. The work described here emphasizes second-
order effects: the consequences of (changes in) privacy for the organization and functioning of
society. This work shows that privacy has implications for achieving just societies that are able to
meet the collective needs of their citizens. And, just as privacy affects societies as well as individuals,
reactions to privacy threats not only are the result of individual dismay but also reflect the larger
social structures in which threats occur.
Our review of the literature shows that privacy issues arise across multiple social conditions
and contexts. Research highlights how institutional actors such as governments and businesses
manage privacy to achieve their ends—whether collective ends such as safety and security or
organizational ends such as profit maximization—but also shows that such efforts may not achieve
the desired outcomes, for example, when monitoring creates resistance and backlash. Similarly,
individuals seek to manage their privacy both in their interpersonal relationships and in their
interactions with institutions. Institutional and individual privacy management strategies may
complement or conflict with each other. Accordingly, the effects of monitoring on social order
are not straightforward; more monitoring may not necessarily lead to more order. Further, the
association between the sharing of personal information and social cohesion is complex, and the
context collapse that is exacerbated by ICTs complicates the management of relationships. Finally,
the relative ability of actors to manage privacy as well as the effects of privacy changes are unevenly
distributed, and these inequalities may have particularly negative consequences for individuals and
communities with low status and few resources.
These tensions are not new. The problem of finding the appropriate balance of disclosure
and monitoring on the one hand, and concealment on the other, did not emerge with new ICTs.
New technologies in some ways highlight (and perhaps exacerbate) existing social dynamics, but
they do not fundamentally alter actors’ concerns. However, these technologies may be reducing
the ability of actors to manage privacy, with unknown effects. Privacy management has become
increasingly complex, in part, because it is mediated by technology in (often) invisible and not
completely controllable ways. Further, because ICTs are rapidly evolving, privacy norms may
be shifting and ambiguous. These complexities and uncertainties, in conjunction with emerging
strategies for addressing them (such as policies that assign responsibility for privacy management
to organizations rather than individuals), may have unintended and unanticipated consequences
for social order and inequality.
Sociology has much to contribute to a scholarly understanding of these macro-level impli-
cations of privacy. Sociologists have produced a range of scholarship relevant for understanding
privacy issues (including research on privacy issues not included in this review, such as privacy at-
titudes, the public and private spheres, and so forth). However, systematic, theoretically informed,
and empirically rigorous research on the sociology of privacy remains underdeveloped. In part,
this is because privacy research has occurred in subdisciplines that rarely speak to each other.
Relevant literature has been produced, for example, in social psychology, criminology, media and
technology studies, political sociology, welfare and poverty studies, and elsewhere. Thus, insights
regarding the implications of privacy for macro-level outcomes are scattered, and the full range of
sociological expertise on inequality and social order has not yet been brought to bear on privacy
issues. Communication among disparate researchers with interests in privacy would contribute to
greater accumulation of knowledge.
Part of the enthusiasm for new ICTs seems to be driven by the faith that timely access to more
information will enable social actors and policy makers to solve a host of problems—ranging from
traffic jams and global warming to public safety and improved health—all while reducing costs and
increasing profits. Smart technology is being incorporated into the electric grid as well as into cars,
home management systems, wearable sensors, and smart phones. Databases are being integrated
across the domains of education, criminal justice, health care, public transportation, and utilities.
Smart cities and the Internet of Things are seen as the wave of the future. However, it is not at
all clear that increased information and the associated losses in privacy will solve pressing social
issues. Further, reductions in privacy may have unanticipated consequences. Without research on
the full range of consequences associated with changes in privacy, it is impossible to evaluate the
trade-offs between privacy and other desired goods, such as safety, health, and so forth. Evaluating
privacy concerns does not just require balancing individual privacy against collective goods (Etzioni
1999, Rule 2007); rather, it involves understanding the individual and collective goods and bads
associated with privacy changes. Sociologists have important and distinct contributions to make
to such understanding.

SUMMARY POINTS
1. We define privacy as access of one social actor to another, including what is revealed
or concealed as well as what is sought or ignored. Privacy norms define expectations
about appropriate access within a given context. Privacy management describes an actor’s
strategies and ability to regulate access to the self and others.
2. Privacy is related to social order through social control via the monitoring of actors,
which can produce cooperation but also resistance and backlash.
3. Privacy has implications for cohesion because privacy norms dictate appropriate disclo-
sure for different types of relationships and groups, and because privacy management
shapes and is shaped by the intimacy of relationships.
4. Privacy is a resource that is unequally distributed in society, and the production and
management of privacy may create inequality among social actors.
5. Much work on privacy focuses on first-order effects, that is, the impact of privacy on
individuals. Sociological work is needed to examine the second-order effects, that is, the
consequences of (changes in) privacy for the organization and functioning of society.

FUTURE ISSUES
1. How do the state and other institutions use ICTs to monitor actors, including individuals,
groups (e.g., employees, felons, immigrants), and organizations (e.g., nongovernmental
organizations)? What are the implications of changes in, and increasing levels of, moni-
toring for privacy norms, privacy management, and social order overall?
2. How do individuals and groups (attempt to) manage their privacy in response to changes
in institutional monitoring? What are the implications of changes in monitoring for
individual behavior and collective outcomes?
3. How does the spread and use of ICTs affect trust in social institutions?
4. How do changes in social network structures affect privacy norms and privacy manage-
ment? What role does the spread of ICTs play in changes to privacy norms and privacy
management within social relationships and networks? And what are the implications of
such changes for social capital, group cohesion, and social order?
5. What is the role of privacy in maintaining, increasing, or flattening status hierarchies? How does the spread and use of ICTs affect the creation of new status hierarchies?

DISCLOSURE STATEMENT
The authors are not aware of any affiliations, memberships, funding, or financial holdings that
might be perceived as affecting the objectivity of this review.
ACKNOWLEDGMENTS
The authors contributed equally to this review and are listed alphabetically. The authors thank
Jennifer Earl, Steve Hitlin, James Kitts, and anonymous reviewers for comments on previous
drafts.

LITERATURE CITED
Acquisti A, Brandimarte L, Loewenstein G. 2015. Privacy and human behavior in the age of information.
Science 347(6221):509–14
Altman I. 1977. Privacy regulation: culturally universal or culturally specific. J. Soc. Issues 33:66–84
Anthony D, Kotz D, Henderson T. 2007. Privacy in location-aware computing environments. IEEE Pervasive
Comput. 6(4):64–72
Anthony D, Stablein T, Carian EK. 2015. Big Brother in the Information Age: concerns about government
information gathering over time. IEEE Secur. Priv. 13(4):12–19
Argyle M, Kendon A. 1967. The experimental analysis of social performance. Adv. Exp. Soc. Psychol. 3:55–98
Attewell P. 1987. Big Brother and the sweatshop: computer surveillance in the automated office. Sociol. Theory
5(1):87–100
Ball K. 2010. Workplace surveillance: an overview. Lab. Hist. 51(1):87–106
Balswick JO, Balkwell JW. 1977. Self-disclosure to same- and opposite-sex parents: an empirical test of insights
from role theory. Sociometry 40(3):282–86
Barocas S, Selbst AD. 2016. Big Data’s disparate impact. Calif. Law Rev. 104:671–732
Bates AP. 1964. Privacy—a useful concept? Soc. Forces 42:429–34
Beniger JR. 1986. The Control Revolution: Technological and Economic Origins of the Information Society.
Cambridge, MA: Harvard Univ. Press
Bennett C, Haggerty KD, Lyon D, Steeves V, eds. 2014. Transparent Lives: Surveillance in Canada. Athabasca,
Can.: Athabasca Univ.
Bennett WL, Segerberg A. 2013. The Logic of Connective Action: Digital Media and the Personalization of Con-
tentious Politics. Cambridge, UK: Cambridge Univ. Press
Bernstein L. 1992. Opting out of the legal system: extralegal contractual relations in the diamond industry.
J. Leg. Stud. 21(1):115–57
Blumer H. 1969. Symbolic Interactionism: Perspective and Method. Berkeley: Univ. Calif. Press
boyd d. 2014. It’s Complicated: The Social Lives of Networked Teens. New Haven, CT: Yale Univ. Press
boyd d, Crawford K. 2012. Critical questions for Big Data. Inf. Commun. Soc. 15(5):662–79
boyd d, Hargittai E. 2010. Facebook privacy settings: Who cares? First Monday, Aug. 2. http://firstmonday.
org/article/view/3086/2589
Brayne S. 2014. Surveillance and system avoidance: criminal justice contact and institutional attachment. Am.
Sociol. Rev. 79(3):367–91
Brooks C, Manza J. 2013. Whose Rights: Counterterrorism and the Dark Side of American Public Opinion. New York:
Russell Sage Found.
Campos-Castillo C, Anthony DL. 2015. The double-edged sword of electronic health records: implications
for patient disclosure. J. Am. Med. Inform. Assoc. 22(e1):e130–40

264 Anthony · Campos-Castillo · Horne


SO43CH12-Anthony ARI 20 July 2017 13:53

Campos-Castillo C, Hitlin S. 2013. Copresence: revisiting a building block for social interaction theories.
Sociol. Theory 31:168–92
Cate FH. 1997. Privacy in the Information Age. Washington, DC: Brookings Inst.
Coleman JS. 1990. Foundations of Social Theory. Cambridge, MA: Belknap Press
Collins K. 2015. A quick guide to the worst corporate hack attacks. Bloomberg, March 18. http://www.
bloomberg.com/graphics/2014-data-breaches/
Consum. Watch. 2011. Liars and loans: how deceptive advertisers use Google. Consum. Watch. Inside
Google Rep., Consum. Watch., Washington, DC. http://www.consumerwatchdog.org/resources/
liarsandloansplus021011.pdf
Coser RL. 1961. Insulation from observability and types of social conformity. Am. Sociol. Rev. 26(1):28–39
Cozby PC. 1972. Self-disclosure, reciprocity, and liking. Sociometry 35(1):151–60
Danna A, Gandy OH Jr. 2001. All that glitters is not gold: digging beneath the surface of data mining. J. Bus.
Ethics 40:373–86
Davis J. 2015. Hacking of government computers exposed 21.5 million people. New York Times, July 9.
http://www.nytimes.com/2015/07/10/us/office-of-personnel-management-hackers-got-data-of-millions.html?_r=0
Davis JL. 2014. Triangulating the self: identity processes in a connected era. Symb. Interact. 37(4):500–23
Davis JL, Jurgenson N. 2014. Context collapse: theorizing context collusions and collisions. Inf. Commun. Soc.
17(4):476–85
Dewey C. 2014. The only guide to Gamergate you will ever need to read. The Washington Post,
Oct. 14. https://www.washingtonpost.com/news/the-intersect/wp/2014/10/14/the-only-guide-to-
gamergate-you-will-ever-need-to-read/
DiBenigno J, Kellogg KC. 2014. Beyond occupational differences: the importance of cross-cutting demo-
graphics and dyadic toolkits for collaboration in a U.S. hospital. Adm. Sci. Q. 59(3):375–408
Diekmann A, Jann B, Przepiorka W, Wehrli S. 2014. Reputation formation and the evolution of cooperation
in anonymous online markets. Am. Sociol. Rev. 79:65–85
Diekmann A, Przepiorka W, Rauhut H. 2015. Lifting the veil of ignorance: an experiment on the contagious-
ness of norm violations. Ration. Soc. 27(3):309–33
Dixon P, Gellman R. 2014. The scoring of America: how secret consumer scores threaten your privacy and your
future. Rep., World. Priv. Forum, San Diego, CA. http://www.worldprivacyforum.org/wp-content/
uploads/2014/04/WPF_Scoring_of_America_April2014_fs.pdf
Donath J, boyd d. 2004. Public displays of connection. BT Technol. J. 22(4):71–82
Doyle A, Lippert R, Lyon D, eds. 2012. Eyes Everywhere: The Global Growth of Camera Surveillance. London:
Routledge
Durkheim E. 1951. Suicide. New York: Free Press
Earl J. 2012. Private protest? Public and private engagement online. Inf. Commun. Soc. 15(4):591–608
Eder D, Hallinan MT. 1978. Sex differences in children’s friendships. Am. Sociol. Rev. 43(2):237–50
Ellison G, Ellison SF. 2009. Search, obfuscation, and price elasticities on the Internet. Econometrica 77:427–52
Ellison N, Heino R, Gibbs J. 2006. Managing impressions online: self-presentation processes in the online
dating environment. J. Comput.-Mediat. Commun. 11(2):415–41
Etzioni A. 1999. The Limits of Privacy. New York: Basic Books
Feinberg M, Willer R, Schultz M. 2014. Gossip and ostracism promote cooperation in groups. Psychol. Sci.
25(3):656–64
Foucault M. 1979. Discipline and Punish: The Birth of the Prison. New York: Vintage Books
Fox RC. 1959. Experiment Perilous: Physicians and Patients Facing the Unknown. New York: Free Press
Gambetta D. 2009. Codes of the Underworld. Princeton, NJ: Princeton Univ. Press
Gandy OH Jr. 1993. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview
Press
Gibson DR. 2014. Enduring illusions: the social organization of secrecy and deception. Sociol. Theory 32(4):283–
306
Giddens A. 1990. The Consequences of Modernity. Stanford, CA: Stanford Univ. Press
Glaser BG, Strauss AL. 1964. Awareness contexts and social interaction. Am. Sociol. Rev. 29(5):669–79
Goffman A. 2014. On the Run: Fugitive Life in an American City. Chicago: Univ. Chicago Press
Goffman E. 1959. The Presentation of Self in Everyday Life. Garden City, NY: Doubleday
Goffman E. 1961. Asylums: Essays on the Social Situation of Mental Patients and Other Inmates. New York: Anchor
Books
Goffman E. 1963. Stigma: Notes on the Management of Spoiled Identity. Englewood Cliffs, NJ: Prentice-Hall
Goffman E. 1966. Behavior in Public Places: Notes on the Social Organization of Gatherings. New York: Free Press
Goold BJ. 2004. CCTV and Policing. Oxford, UK: Oxford Univ. Press
Goold BJ, Loader I, Thumala A. 2013. The banality of security: the curious case of surveillance cameras. Br.
J. Criminol. 53:977–96
Grimshaw AD. 1982. Whose privacy? What harm? Sociol. Methods Res. 11(2):233–47
Haggerty KD. 2006. Tear down the walls: on demolishing the panopticon. In Theorizing Surveillance: The
Panopticon and Beyond, ed. D Lyon, pp. 23–45. Cullompton, UK: Willan Publ.
Haggerty KD, Ericson RV. 2000. The surveillant assemblage. Br. J. Sociol. 51(4):605–22
Hampton KN. 2016. Persistent and pervasive community: new communication technologies and the future
of community. Am. Behav. Sci. 60(1):101–24
Hargittai E, Hinnant A. 2008. Digital inequality differences in young adults’ use of the Internet. Commun. Res.
35(5):602–21
Harrington CL, Bielby DD. 1995. Where did you hear that? Technology and the social organization of gossip.
Sociol. Q. 36(3):607–28
Hechter M. 1987. Principles of Group Solidarity. Berkeley: Univ. Calif. Press
Hechter M, Horne C. 2009. Theories of Social Order. Palo Alto, CA: Stanford Univ. Press
Hier SP. 2010. Panoptic Dreams. Vancouver, Can.: Univ. British Columbia Press
Hintz RA Jr., Miller DE. 1995. Openings revisited: the foundations of social interaction. Symb. Interact.
18(3):355–69
Hirschauer S. 2005. On doing being a stranger: the practical constitution of civil inattention. J. Theory Soc.
Behav. 35(1):41–67
Hoadley CM, Xu H, Lee JJ, Rosson MB. 2010. Privacy as information access and illusory control: the case of
the Facebook news feed privacy outcry. Electron. Commer. Res. Appl. 9:50–60
Horne C. 2009. The Rewards of Punishment. Stanford, CA: Stanford Univ. Press
Horne C, Darras B, Bean E, Srivastava A, Frickel S. 2015. Privacy, technology, and norms: the case of smart
meters. Soc. Sci. Res. 51:64–76
Howard P. 2015. Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up. New Haven, CT:
Yale Univ. Press
Huey L. 2010. A social movement for privacy/against surveillance? Some difficulties in engendering mass
resistance in a land of Twitter and tweets. Case West. Res. J. Int. Law 42:699–709
Johnson MP, Leslie L. 1982. Couple involvement and network structure: a test of the dyadic withdrawal
hypothesis. Soc. Psychol. Q. 45(1):34–43
Joseph TD. 2011. “My life was filled with constant anxiety”: anti-immigrant discrimination, undocumented
status, and their mental health implications for Brazilian immigrants. Race Soc. Probl. 3:170–81
Jourard SM. 1964. The Transparent Self: Self-Disclosure and Well-Being. New York: Van Nostrand Reinhold
Kanter RM. 1977. Men and Women of the Corporation. New York: Basic Books
Kasper DVS. 2007. Privacy as a social good. Soc. Thought Res. 28:165–89
Keizer K, Lindenberg S, Steg L. 2008. The spreading of disorder. Science 322:1681–85
Kitts JA. 2003. Egocentric bias or information management? Selective disclosure and the social roots of norm
misperception. Soc. Psychol. Q. 66(3):222–37
Lageson SE, Vuolo M, Uggen C. 2015. Legal ambiguity in managerial assessments of criminal records. Law
Soc. Inq. 40(1):175–204
Lamont M, Molnár V. 2002. The study of boundaries in the social sciences. Annu. Rev. Sociol. 28:167–95
Lawler EJ, Thye SR, Yoon J. 2011. Social Commitments in a Depersonalized World. New York: Russell Sage
Found.
Levine JA. 2013. Ain’t No Trust. Berkeley: Univ. Calif. Press
Levy KEC. 2015. The contexts of control: information, power, and truck-driving work. Inf. Soc. 31(2):160–74
Lowry RP. 1972. Toward a sociology of secrecy and security systems. Soc. Probl. 19(4):437–50
Lyon D. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis: Univ. Minn. Press
Lyon D, ed. 2003. Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. New York: Routledge
Margulis ST. 1977. Conceptions of privacy: current status and next steps. J. Soc. Issues 33(3):5–21
Margulis ST. 2003. Privacy as a social issue and behavioral concept. J. Soc. Issues 59(2):243–61
Martin JL. 2009. Social Structures. Princeton, NJ: Princeton Univ. Press
Marwick A, boyd d. 2011. I tweet honestly, I tweet passionately: Twitter users, context collapse, and the
imagined audience. New Media Soc. 13(1):114–33
Marx GT. 2003. A tack in the shoe: neutralizing and resisting the new surveillance. J. Soc. Issues 59(2):369–90
Marx GT. 2004. What’s new about the “new surveillance”? Classifying for change and continuity. Knowl.
Technol. Priv. 17(1):18–37
Marx GT. 2009. A tack in the shoe and taking off the shoe: neutralization and counter-neutralization dynamics.
Surveill. Soc. 6(3):294–306
McCarthy MT. 2016. The big data divide and its consequences. Sociol. Compass 10:1131–40
Merten DE. 1999. Enculturation into secrecy among junior high school girls. J. Contemp. Ethnogr. 28(2):107–37
Miall CE. 1989. Authenticity and the disclosure of the information preserve: the case of adoptive parenthood.
Q. Sociol. 12(3):279–302
Moore B Jr. 1984. Privacy: Studies in Social and Cultural History. Armonk, NY: M.E. Sharpe
Murphy RF. 1964. Social distance and the veil. Am. Anthropol. 66:1257–74
Newman N. 2014. The costs of lost privacy: consumer harm and rising economic inequality in the age of
Google. William Mitchell Law Rev. 40(2):849–89
Nippert-Eng C. 2010. Islands of Privacy. Chicago: Univ. Chicago Press
Nissenbaum H. 2010. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford, CA: Stanford
Univ. Press
Nock S. 1993. The Costs of Privacy: Surveillance and Reputation in America. New York: Aldine de Gruyter
O’Brien RL. 2008. Ineligible to save? Asset limits and the saving behavior of welfare recipients. J. Community
Pract. 16:183–99
Ostrom E. 1990. Governing the Commons. New York: Cambridge Univ. Press
Pager D. 2003. The mark of a criminal record. Am. J. Sociol. 108:937–75
Pager D. 2007. Marked: Race, Crime, and Finding Work in an Era of Mass Incarceration. Chicago: Univ. Chicago
Press
Parsons T. 1939. The professions and social structure. Soc. Forces 17(4):457–67
Pasquale F. 2015. The Black Box Society. Cambridge, MA: Harvard Univ. Press
Petronio S. 2002. Boundaries of Privacy: Dialectics of Disclosure. New York: SUNY Press
Phillips DJ, Kim YK. 2009. Why pseudonyms? Deception as identity preservation among jazz record compa-
nies, 1920–1929. Organ. Sci. 20(3):481–99
Piven FF, Cloward R. 1971. Regulating the Poor. New York: Pantheon
Podesta J, Pritzker P, Moniz EJ, Holdren J, Zients J. 2014. Big data: seizing opportunities, preserving val-
ues. Rep., Exec. Off. Pres., White House, Washington, DC. https://bigdatawg.nist.gov/pdf/big_data_
privacy_report_may_1_2014.pdf
Rainie L, Wellman B. 2012. Networked: The New Social Operating System. Cambridge, MA: MIT Press
Richardson L. 1988. Secrecy and status: the social construction of forbidden relationships. Am. Sociol. Rev.
53(2):209–19
Ridgeway CL. 2011. Framed by Gender: How Gender Inequality Persists in the Modern World. New York: Oxford
Univ. Press
Roberts JM, Gregor T. 1971. Privacy: a cultural view. In Privacy, ed. JR Pennock, JW Chapman, pp. 189–225.
New York: Atherton Press
Rosenfeld J, Denice P. 2015. The power of transparency: evidence from a British workplace survey. Am. Sociol.
Rev. 80(5):1045–68
Rubin Z, Hill CT, Peplau LA, Dunkel-Schetter C. 1980. Self-disclosure in dating couples: sex roles and the
ethic of openness. J. Marriage Fam. 42(2):305–17
Rule JB. 1973. Private Lives and Public Surveillance. London: Allen Lane
Rule JB. 2007. Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and
Convenience. New York: Oxford Univ. Press
Salop S, Stiglitz J. 1977. Bargains and ripoffs: a model of monopolistically competitive price dispersion. Rev.
Econ. Stud. 44:493–510
Sampson RJ, Raudenbush SW, Earls F. 1997. Neighborhoods and violent crime: a multilevel study of collective
efficacy. Science 277(5328):918–24
Sauder M, Lynn F, Podolny JM. 2012. Status: insights from organizational sociology. Annu. Rev. Sociol.
37:267–83
Saval N. 2014. Cubed: A Secret History of the Workplace. New York: Doubleday
Schradie J. 2012. The trends of class, race, and ethnicity in social media inequality: Who still cannot afford to
blog? Inf. Commun. Soc. 14(4):555–71
Schwartz B. 1968. The social psychology of privacy. Am. J. Sociol. 73(6):741–52
Schwartz RD. 1954. Social factors in the development of legal control: a case study of two Israeli settlements.
Yale Law J. 63(4):471–91
Sewell G. 1998. The discipline of teams: the control of team-based industrial work through electronic and
peer surveillance. Adm. Sci. Q. 43(2):397–428
Sewell G, Wilkinson B. 1992. Someone to watch over me: surveillance, discipline, and the just-in-time labour
process. Sociology 26(2):271–89
Shalhoub-Kevorkian N. 2010. E-resistance and technological in/security in everyday life: the Palestinian case.
Br. J. Criminol. 52:55–72
Shamas D, Arastu N. 2016. Mapping Muslims: NYPD spying and its impact on American Muslims. Rep., Muslim
Am. Civ. Lib. Coalit., Creating Law Enforc. Account. Responsib., Asian Am. Leg. Def. Educ. Fund,
New York. http://www.law.cuny.edu/academics/clinics/immigration/clear/Mapping-Muslims.pdf
Shils EA. 1956. The Torment of Secrecy: The Background and Consequences of American Security Policies. Glencoe,
IL: The Free Press
Shils EA. 1966. Privacy: its constitution and vicissitudes. Law Contemp. Probl. 31(2):281–306
Simmel G. 1906. The sociology of secrecy and of secret societies. Am. J. Sociol. 11(4):441–98
Simmel G. 1950. The Sociology of Georg Simmel. New York: The Free Press
Simmel G. 1955. Conflict and the Web of Group Affiliations. New York: The Free Press
Solove DJ. 2008. Understanding Privacy. Cambridge, MA: Harvard Univ. Press
Stablein T, Hall JL, Pervis C, Anthony DL. 2015. Negotiating stigma in health care: disclosure and the role
of electronic health records. Health Sociol. Rev. 24:227–41
Stark L. 2016. The emotional context of information privacy. Inf. Soc. 32(1):14–27
Stuart G. 2003. Discriminating Risk: The U.S. Mortgage Lending Industry in the Twentieth Century. Ithaca, NY:
Cornell Univ. Press
Sweeney L. 2013. Discrimination in online ad delivery. Comm. ACM 56(5):44–54
Tabuchi H. 2015. $10 million settlement in Target data breach gets preliminary approval. New York Times, March 19. http://www.nytimes.com/2015/03/20/business/target-settlement-on-data-breach.html?_r=0
Uggen C, Vuolo M, Lageson S, Ruhland E, Whitham HK. 2014. The edge of stigma: an experimental audit
of the effects of low-level criminal records on unemployment. Criminology 52(4):627–54
Varian HR. 1985. Price discrimination and social welfare. Am. Econ. Rev. 75(4):870–75
Varian HR, Wallenberg F, Woroch G. 2005. The demographics of the Do-Not-Call list. IEEE Secur. Priv.
3:34–39
Vitak J. 2012. The impact of context collapse and privacy on social network site disclosures. J. Broadcast.
Electron. Media 56(4):451–70
Waldo J, Lin HS, Millett LI, eds. 2007. Engaging Privacy and Information Technology in a Digital Age.
Washington, DC: Natl. Acad. Press
Warren C, Laslett B. 1977. Privacy and secrecy: a conceptual comparison. J. Soc. Issues 33(3):43–51
Weber M. 1978 (1921). Economy and Society, Vol. 2, ed. G Roth, C Wittich. Berkeley: Univ. Calif. Press
Welsh BC, Farrington DP. 2009. Public area CCTV and crime prevention: an updated systematic review and
meta-analysis. Justice Q. 26(4):716–45
Westin AF. 1970. Privacy and Freedom. New York: Atheneum
Westin AF. 2003. Social and political dimensions of privacy. J. Soc. Issues 59(2):431–53
Williams R, Nesiba R, McConnell ED. 2005. The changing face of inequality in home mortgage lending. Soc.
Probl. 52(2):181–208
Wilsnack RW. 1980. Information control: a conceptual framework for sociological analysis. Urban Life
8(4):467–99
Zerubavel E. 1979. Private time and public time: the temporal structure of social accessibility and professional
commitments. Soc. Forces 58(1):38–58
Zerubavel E. 2006. The Elephant in the Room: Silence and Denial in Everyday Life. New York: Oxford Univ.
Press
Zureik E, Stalker LH, Smith E, Lyon D, Chan YE. 2010. Surveillance, Privacy, and the Globalization of Personal
Information: International Comparisons. Montreal: McGill-Queen's Univ. Press