Table of Contents
Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective.............................................. 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA
Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark
Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ..................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA
Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan
Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA
Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan
Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA
Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA
Detailed Table of Contents
Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion ......................................................... 1
Andrew Pauxtis, Quinnipiac University, USA
Bruce White, Quinnipiac University, USA
Search engines can log and timestamp each search made by end-users and use that collected data for an
assortment of business advantages. In a world where technology gives users many conveniences, one
must weigh the benefits of those conveniences against the potential intrusions on personal privacy.
With Google's eyes on moving into radio, television, print, and other technologies, one must step back
and examine the potential privacy risks associated with the technological conveniences being provided
by Google. This chapter gives an overview of Google's services and how they relate to personal
privacy online.
Chapter II
A Taxonomic View of Consumer Online Privacy Legal Issues, Legislation, and Litigation .............. 16
Angelena M. Secor, Western Michigan University, USA
J. Michael Tarn, Western Michigan University, USA
This chapter reviews online privacy issues, particularly consumer online privacy legislation and
litigation, the relationships among privacy issues, legal protections, and the risks of privacy
violations. A survey of the privacy literature provides insights into privacy protection and privacy concerns. The results show a need for stronger intervention by the government and the business community.
Consumers' privacy awareness is also key to successful protection online. The chapter concludes
with a call for consumer privacy education to promote privacy awareness, and for timely responses by government and businesses to privacy violations.
Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective.............................................. 33
Hy Sockel, DIKW Management Group, USA
Louis K. Falk, University of Texas at Brownsville, USA
Victims of online threats are no longer just individual consumers. Management must be
ready to neutralize, reduce, and prevent these threats if the organization is to maintain its viability in today's business environment. This chapter gives managers the overview needed for
a working knowledge of online privacy, vulnerabilities, and threats. The chapter also highlights
techniques that are commonly used to impede attacks and protect the privacy of the organization, its
customers, and its employees.
Section II
Frameworks and Models
Chapter IV
Practical Privacy Assessments .............................................................................................................. 57
Thejs Willem Jansen, Technical University of Denmark, Denmark
Søren Peen, Technical University of Denmark, Denmark
Christian Damsgaard Jensen, Technical University of Denmark, Denmark
This chapter proposes a privacy assessment model, called the operational privacy assessment model,
that includes organizational, operational, and technical factors for the protection of personal data stored
in an IT system. Each factor can be evaluated on a simple scale, so that a graphical depiction can easily
be created for one IT system and graphical comparisons can be drawn across multiple IT systems.
Examples of factors plotted on a Kiviat graph are also presented. This assessment
tool may be used to standardize privacy assessment criteria, making it less painful for management
to assess the privacy risks of their systems.
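The scoring idea behind such an assessment can be pictured with a small sketch. The factor names, the 0-4 scale, and the comparison logic below are illustrative assumptions, not the authors' actual model:

```python
# Illustrative sketch: score each privacy factor of an IT system on a simple
# 0-4 scale, so systems can be compared factor by factor (e.g., on a Kiviat
# chart). Factor names and scores are hypothetical.

FACTORS = ["access control", "data minimisation", "staff training",
           "audit logging", "encryption at rest"]

def assess(scores):
    """Validate a factor -> score mapping on the 0-4 scale."""
    assert set(scores) == set(FACTORS), "every factor must be scored"
    assert all(0 <= s <= 4 for s in scores.values()), "scores are 0-4"
    return scores

def compare(a, b):
    """Per-factor difference between two systems (positive: a scores higher)."""
    return {f: a[f] - b[f] for f in FACTORS}

system_a = assess({"access control": 3, "data minimisation": 2,
                   "staff training": 1, "audit logging": 4,
                   "encryption at rest": 3})
system_b = assess({"access control": 2, "data minimisation": 3,
                   "staff training": 2, "audit logging": 4,
                   "encryption at rest": 1})

print(compare(system_a, system_b))
```

The per-factor dictionaries are exactly what a radar-style plot would draw as one polygon per system, which is what makes cross-system comparison visual rather than numeric.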
Chapter V
Privacy and Trust in Online Interactions .............................................................................................. 85
Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA
Trust is an essential ingredient for a successful interaction or collaboration among different parties. Trust
is also built upon the belief that the privacy of the involved parties is protected before, during, and after
the interaction. This chapter presents different trust models, the interplay between trust and privacy, and
the metrics for these two related concepts. In particular, it shows how one's degree of privacy can be
traded for a gain in the level of trust perceived by the interaction partner. The idea and mechanisms of
trading privacy for trust are also explored.
Chapter VI
Current Measures to Protect E-Consumers' Privacy in Australia ..................................................... 123
Huong Ha, Monash University, Australia
Ken Coghill, Monash University, Australia
Elizabeth Ann Maharaj, Monash University, Australia
Australia uses regulation and legislation, guidelines, codes of practice, and the activities of consumer associations
and the private sector to enhance the protection of consumers' privacy. This chapter reports
Australia's experience in privacy protection. In particular, the four main areas of protection outlined
above are analyzed to draw implications. Recommendations address the coverage of legislation,
uniformity of regulations, relationships among guidelines and legislation, and consumer awareness.
Chapter VII
Antecedents of Online Privacy Protection Behavior: Towards an Integrative Model ........................ 151
Anil Gurung, Neumann College, USA
Anurag Jain, Salem State College, USA
This chapter proposes an integrated framework to model online privacy protection behavior. The factors
in this framework are drawn from recent Internet and online privacy studies. Although many possible
factors could be included, the authors took a conservative approach, admitting only those factors
that have been formally studied in the academic literature. The framework
serves as a basis for future extensions and empirical assessments.
Section III
Empirical Assessments
Chapter VIII
Privacy Control and Assurance: Does Gender Influence Online Information Exchange? ................ 165
Alan Rea, Western Michigan University, USA
Kuanchin Chen, Western Michigan University, USA
One main reason online users are wary of providing personal information is that they lack trust
in e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control to protect their personal data online. This chapter presents survey results on
how the two genders differ in the ways they control their private data on the Internet. The findings provide
guidelines for e-businesses to adjust their privacy policies and practices to increase information and
transactional exchanges.
Chapter IX
A Profile of the Demographics, Psychological Predispositions, and Social/Behavioral Patterns
of Computer Hacker Insiders and Outsiders ...................................................................................... 190
Bernadette H. Schell, University of Ontario Institute of Technology, Canada
Thomas J. Holt, The University of North Carolina at Charlotte, USA
Research on hackers is scarce, but the impact hackers have on privacy in the Internet world should
not be underestimated. This chapter looks at the demographics, psychological predispositions, and social/
behavioral patterns of computer hackers to better understand the harms they can cause. The results show that
online breaches, and online concerns regarding privacy, security, and trust, will require much more complex
solutions than currently exist. Teams of experts in fields such as psychology, criminology, law, and information technology security need to collaborate to bring about more effective solutions for the virtual world.
Chapter X
Privacy or Performance Matters on the Internet: Revisiting Privacy Toward a Situational
Paradigm ............................................................................................................................................ 214
Chiung-wen (Julia) Hsu, National Cheng Chi University, Taiwan
This chapter introduces a situational paradigm for studying online privacy. Online privacy concerns and
practices are examined within two contexts: technology platforms and users' motivations. The results show
a distinctive staging phenomenon under the theory of uses and gratifications, an a priori theoretical framework. The diffused audience was less concerned about privacy but did not disclose
personal information any more than the other groups. Users may act differently on different platforms or in different
environments, implying that treating Internet users as a homogeneous group, or assuming they act
the same way across environments, is a problematic assumption.
Section IV
Consumer Privacy in Business
Chapter XI
Online Consumer Privacy and Digital Rights Management Systems ............................................... 240
Tom S. Chan, Southern New Hampshire University, USA
J. Stephanie Collins, Southern New Hampshire University, USA
Shahriar Movafaghi, Southern New Hampshire University, USA
The business values of using the Internet for the delivery of soft media may be hampered when the owners
risk losing control of their intellectual property. Any business that wishes to control access to and use of
its intellectual property is a potential user of digital rights management (DRM) technologies. Managing,
preserving, and distributing digital content through DRM is not without problems. This chapter offers
a critical review of DRM and the issues surrounding its use.
Chapter XII
Online Privacy and Marketing: Current Issues for Consumers and Marketers ................................. 256
Betty J. Parker, Western Michigan University, USA
Certain marketing practices may sometimes cause privacy conflicts between businesses and consumers.
This chapter offers insights into privacy concerns arising from today's marketing practices on the Internet. Specifically, areas of focus include current privacy issues, the use of spyware and cookies, word-of-mouth
marketing, online marketing to children, and the use of social networks. Related privacy practices,
concerns, and recommendations are presented from the perspectives of Internet users, marketers, and
government agencies.
Chapter XIII
An Analysis of Online Privacy Policies of Fortune 100 Companies ................................................. 269
Suhong Li, Bryant University, USA
Chen Zhang, Bryant University, USA
This chapter examines the current status of online privacy policies of Fortune 100 companies. Results
show that 94% of the surveyed companies have posted an online privacy policy and 82% collect personal
information from consumers. Additionally, the majority of the companies only partially follow the four
principles (notice, choice, access, and security) of fair information practices. Few organizations have
obtained third-party privacy seals such as TRUSTe, BBBOnline Privacy, and Safe Harbor.
Chapter XIV
Cross Cultural Perceptions on Privacy in The United States, Vietnam, Indonesia, and Taiwan ....... 284
Andy Chiou, National Cheng Kung University, Taiwan
Jeng-chung V. Chen, National Cheng Kung University, Taiwan
Craig Bisset, National Cheng Kung University, Taiwan
This chapter studies concerns about Internet privacy across multiple cultures. Students from several countries were recruited for a focus group study to discover differences in their
privacy concerns. Collectivistic cultures appear to be less sensitive to violations of personal privacy,
while individualistic cultures are found to be more proactive in privacy protection. Implications are
provided.
Section V
Policies, Techniques, and Laws for Protection
Chapter XV
Biometric Controls and Privacy ......................................................................................................... 300
Sean Lancaster, Miami University, USA
David C. Yen, Miami University, USA
This chapter provides an overview of biometric controls for protecting individual privacy. Although much of
the discussion targets the protection of physical privacy, some of it may also apply to online consumer privacy.
The discussion focuses on four main areas: technological soundness, economic value, business applications, and legal/ethical concerns. Further insights are provided.
Chapter XVI
Government Stewardship of Online Information: FOIA Requirements and Other
Considerations .................................................................................................................................... 310
G. Scott Erickson, Ithaca College, USA
This chapter examines the issues surrounding the federal Freedom of Information Act and associated
state and local laws, and their implications for personal privacy. Despite the good intentions of these laws
to enable openness in government, confidential business information and private personal information
may be vulnerable when data are in government hands. The chapter offers readers a better understanding of several trends regarding the statutes and their interpretations.
Chapter XVII
The Legal Framework for Data and Consumer Protection in Europe ............................................... 326
Charles O'Mahony, Law Reform Commission of Ireland, Ireland
Philip Flaherty, Law Reform Commission of Ireland, Ireland
This chapter discusses the legal framework and the law of the European Union (EU) for consumer and
data protection. The creation of legal frameworks in Europe aims to secure the protection of consumers
while simultaneously facilitating economic growth in the European Union. This chapter outlines the main
sources of privacy protection law and critically analyzes the important provisions in these sources of law.
Gaps and deficiencies in the legal structures for consumer and data protection are also discussed.
Chapter XVIII
Cybermedicine, Telemedicine, and Data Protection in the United States ......................................... 347
Karin Mika, Cleveland State University, USA
Barbara J. Tyler, Cleveland State University, USA
This chapter provides an overview of the law relating to Internet medical practice, data protection, and consumer information privacy, including comprehensive coverage of federal (HIPAA) and state privacy
laws. Readers are given advice on the legal and data protection problems consumers will encounter in
purchasing medical and health services on the Internet. Furthermore, actual case studies and expert advice
are provided to offer a safer online experience. The authors also advocate that the United States must
enact more federal protection for the consumer in order to deter privacy violations and punish criminal,
negligent, and willful violations of personal consumer privacy.
Chapter XIX
Online Privacy Protection in Japan: The Current Status and Practices ............................................. 370
J. Michael Tarn, Western Michigan University, USA
Naoki Hamamoto, Western Michigan University, USA
This chapter reports the current status and practices of online privacy protection in Japan. It offers a
perspective of an eastern culture regarding the concept of privacy, its current practices, and how it is
protected. Following a discussion of the Japanese privacy law, the Act on the Protection of Personal
Information, Japan's privacy protection mechanisms to support and implement the new act are examined.
The authors also offer a four-stage privacy protection solution model, as well as two case studies, to show
readers the problems, dilemmas, and solutions for privacy protection from Japan's experience.
Preface
Privacy, the right to be left alone, is a fundamental human right. Risks of the contrary, privacy invasion, have increased in significant proportions in a world increasingly turning online. In today's networked
world, a fast-growing number of users hop on and off the Internet superhighways multiple times
every day, more often than they hop on and off physical expressways. Internet users are also doing more
diverse activities online, including browsing, shopping, communicating, chatting, gaming, and even
working. With so much online presence, users find themselves, in many situations, divulging information
that they otherwise might not because of privacy concerns. Users may even be wary of getting online
for fear of privacy invasion by the many prying eyes on the Internet. The issue is
not whether privacy should be protected; rather, it is how privacy should be protected in the vast
online world, where information can be intercepted, stolen, quickly transported, shared without the user's
knowledge, or even sold for profit. Compared to an offline environment, the Internet enables cost-effective
collection of more information from users, sometimes even without their consent. Thus, the Internet
poses a greater privacy threat to users, as their personal information is transmitted over the Internet when an
organization does not have a good security mechanism in place. Furthermore, the connectivity of the
Internet allows the capturing, building, and linking of electronic profiles and behaviors of users.
Online privacy is a multidimensional concept and thus has been addressed in research from a multiplicity of angles, albeit not equally thoroughly. Much research effort has focused on addressing privacy
as a technological factor and has hence proposed technical solutions to privacy protection. Although this is
an important dimension of online privacy, there are equally, if not more, important dimensions, such as
context, culture, perceptions, and legislation. Such softer (non-technological) aspects of privacy cannot be understood by looking only at the technological aspects. The human dimension is as
complex, and as important, for gaining a more complete understanding of privacy. At a micro level, not
only do individuals have varying requirements for privacy, but the same individual's requirements
may change over time or between situational contexts. Responses to privacy invasion may differ greatly
between individuals and situational contexts. There may also be a gap between individuals'
desired and actual behaviors in relation to their privacy concerns: individuals may have more stringent
privacy requirements than their actual online practice reflects. Online privacy researchers have offered
less coverage of these human factors, but understanding these factors, and many more, is key to gaining
a better understanding of online privacy: hence the human relativism.
At a macro level, privacy requirements and responses to privacy invasion may vary across cultures,
societies, and business situations. Organizational practices regarding privacy policies and responses to incidents of privacy invasion affect people's perceptions of the current state of privacy, and consequently
affect their trust in the organization. People are generally concerned about how personal information is
collected, used, and distributed beyond its original purpose and beyond the parties originally involved.
Breaches of how their information is collected, used, or shared, and the responses to such breaches, directly
impact their privacy concerns and their trust. There is still not sufficient empirical evidence to answer
many privacy questions at these macro levels, and many human aspects of online privacy in some social
and cultural settings have not yet received enough research attention. Consequently, our understanding
of the relationships between online privacy and dimensions such as culture, user characteristics, business
context, technology use, and education is still limited.
The world is increasingly turning online and there will be no reversal of this trend. To protect the
privacy of online users and to consequently achieve the full potential of transacting in an online world,
the issue of online privacy needs to be understood from multiple facets. The challenge is to minimize
constraints on online dealings without compromising users' privacy. Such delicate balancing cannot
be achieved without a broad understanding of online privacy. This book is an attempt to provide such
understanding by offering a comprehensive and balanced coverage of the various dimensions of online
privacy. Many previously published books either treat privacy as a sub-topic under a broader topic of
end-user computing or information systems or focus primarily on technical issues or managerial strategies. Many others focus on end users and offer only introductory material or general guidelines to
enhance personal online security and privacy. While this treatment of these important privacy topics
is appropriate for its intended use and audience, it does not allow for a broader and more extensive
examination of online privacy and how it guides practice.
Furthermore, many gaps in privacy, threats, and fraud theories have not yet been filled. The most
prominent such gaps include linking privacy theories to other established theories and frameworks in
information technology or related disciplines. For example, cultural, social, and behavioral issues in
privacy have not received enough attention. Research on human aspects, as well as empirical assessment of privacy issues, is lacking. Research linking privacy considerations to business practices,
educational curriculum development/assessment, and legislative impacts is also scarce. Many studies
have focused on technological advancements, such as security protection and cryptography to offer
technical tools for privacy protection and for assessing risks of privacy invasion. Although such focus
is a must to protect users from these risks, technology is not equivalent to protection. For a protection
scheme to work well, both technical and human aspects have to work in harmony. A major goal of this
book is to provide a view of privacy that integrates the technical, human, cultural, and legal aspects of
online privacy protection as well as risks and threats to privacy invasion.
The book aims to (a) promote research and practice in various areas of online privacy, threat assessment, and privacy invasion prevention, (b) offer a better understanding of the human issues in these
areas, and (c) further the development of online privacy education and legislation. The book goes
beyond introductory coverage and includes contemporary research on the various dimensions of online
privacy. It aims to be a reference for professionals, academics, researchers, and practitioners interested
in online privacy protection, threats, and prevention mechanisms. The book is the result of research
efforts from content experts, and thus it is an essential reference for graduate courses and professional
seminars.
There are 19 great chapters in the book, grouped into five sections: (1) background, (2) frameworks
and models, (3) empirical assessments, (4) consumer privacy in business, and (5) policies, techniques,
and laws for protection.
The background section provides an overview of privacy for those who prefer a short introduction
to the subject. In Chapter I, Pauxtis and White point out the serious privacy implications of online
searches. Search engines can log and stamp each search made by end-users and use that collected data
for an assortment of business advantages. In a world where technology gives users many conveniences,
one must weigh the benefits of those conveniences against the potential intrusions of personal privacy.
Nevertheless, end-users will always use search engines. They will always Google something on their
mind. The authors conclude that while the vast majority of casual Internet users either do not know
Google's data collection policies or simply do not care, at the end of the day it comes down to the
simple fact that we as a society must put our trust in the technological innovations that have become
commonplace conveniences.
In Chapter II, Secor and Tarn bring to the forefront the importance of legal protection and
privacy awareness, presenting a taxonomic view of the issues, legal protections, and the remedies and risks of not complying with legal requirements. The authors use two
survey studies to reinforce the vital need for a stronger role by the government and business community,
as well as for privacy awareness among online consumers themselves. The chapter concludes with a vital
call for consumer privacy education and awareness, for the attention of government and legislators, and for timely
responses with legislation that protects consumers against those who would misuse the technology.
In Chapter III, Sockel and Falk highlight the gravity of vulnerabilities to privacy: it is not
uncommon for employees to work offsite, at home, or out of a hotel room, often using less-than-secure
Internet connections (dial-up, cable, Internet cafés, libraries, and wireless). The chapter highlights the
relationship between vulnerability, threats, and action in what the authors term the "risk triangle". It
delves into techniques that are commonly used to thwart attacks and protect individuals' privacy, and
discusses how, in an age of unrest and terrorism, privacy has grown even more important as freedoms
are compromised for security. The chapter provides an overview of the various vulnerabilities, threats,
and actions to ameliorate them.
Section II consists of four chapters that offer frameworks or models to study various privacy issues.
In Chapter IV, Jansen, Peen, and Jensen turn attention to the claim that most current work
has focused on technical solutions to anonymous communications and pseudonymous interactions while,
in reality, the majority of privacy violations involve careless management of government IT systems,
inadequate procedures, or insecure data storage. The authors introduce a privacy assessment model,
called the Operational Privacy Assessment Model, that includes organizational, operational, and technical
factors. Each factor can be evaluated on a simple scale, so that a graphical depiction can easily be
created for one IT system and graphical comparisons can be drawn across multiple IT systems. Although their
method was developed in the context of government IT systems in Europe, they believe it may also
apply to other government systems, non-governmental organisations (NGOs), and large private companies.
In Chapter V, Lilien and Bhargava underline the strong relationship between privacy and trust. The
authors contend that the role of trust and privacy is as fundamental in computing environments as it is
in social systems. The chapter presents this role in online interactions, emphasizing the close relationship between trust and privacy, and shows how one's degree of privacy can be traded for a gain in the
level of trust perceived by one's interaction partner. The chapter explores in detail the mechanisms of
this core theme of trading privacy for trust. It also presents different trust models, the interplay between
trust and privacy, and the metrics for these two related concepts.
In Chapter VI, Ha, Coghill, and Maharaj offer an Australian perspective on measures to protect e-consumers' privacy and the current state of e-consumer privacy protection, and discuss policy implications
for the protection of e-consumers' privacy. The authors suggest that although privacy protection measures
in the form of legislation, guidelines, and codes of practice are available, their effectiveness is limited in
alleviating consumers' privacy and security concerns. The authors contend that protection of consumers'
personal information also depends on how e-retailers exercise their corporate social responsibility to
provide protection to e-consumers.
In Chapter VII, Gurung and Jain review the existing literature and analyze the existing online privacy theories, frameworks, and models to understand the variables that are used in the context of online
privacy protection. The authors develop an integrative framework to encapsulate the antecedents of
online privacy protection behavior.
Section III includes research studies that report empirical findings on various privacy topics. One
main reason that online users are wary of providing personal information is because they lack trust in
e-businesses' personal information policies and practices. As a result, they exercise several forms of
privacy control as a way to protect their personal data online. In Chapter VIII, Rea and Chen report
survey results of how the two genders differ in their ways to control their private data on the Internet.
Findings provide guidelines for e-businesses to adjust their privacy policies and practices to increase
information and transactional exchanges.
Discussion on privacy is incomplete without a glimpse into hackers and crackers, "the elite corps
of computer designers and programmers," according to Schell and Holt in Chapter IX. Schell and Holt
argue that it is vital that researchers understand the psychological and behavioral composition of network attackers and the social dynamics that they operate within. This understanding can improve our
knowledge of cyber intruders and aid in the development of effective techniques and best practices
to stop them in their tracks. Such techniques can minimize damage to consumer confidence, privacy,
and security in e-commerce Web sites and general information-sharing within and across organizations.
The authors discuss known demographic and behavioral profiles of hackers and crackers, psychological
myths, and truths about those in the computer underground, and how present strategies for dealing with
online privacy, security, and trust issues need to be improved.
In Chapter X, Hsu adds a perspective from communications to the ongoing debate on online privacy.
She examines why online privacy research has failed to explain why users who assert higher privacy
concerns still disclose sensitive information. The author argues that this is due to ignoring the social
context (what the author terms the situational paradigm) in the research on online privacy. The author seeks to
offer more support for the argument of the situational paradigm from the newly-emerging phenomenon
of online photo album Web sites in Taiwan.
Section IV focuses on consumer privacy in business and consists of four chapters. In Chapter XI,
Chan, Collins, and Movafaghi tackle the issue of online consumer privacy and digital rights management
(DRM) systems of protecting digitally stored content. This protection may be accomplished through
different strategies or combinations of strategies including: identifying authorized users, identifying
genuine content, verifying proof of ownership and purchase, uniquely identifying each copy of the
content, preventing content copying, tracking content usage and distribution, and hiding content from
unauthorized users. The authors argue that DRM systems may change the business model from a traditional buy-and-own to a pay-per-use model, but caution that this may pose great risks to consumers and society,
as DRM technologies may weaken the rights to privacy and fair use and threaten the freedom of expression.
The chapter discusses the conflict between the rights of content owners and the privacy rights of content
users, and explores several DRM techniques and how their use could affect consumer privacy.
In Chapter XII, Parker offers views on online privacy from a marketing perspective in the context
of consumer marketing. The chapter provides insights into the ways that online privacy has become a
balancing act in which the needs of businesses are oftentimes balanced against the needs of consumers.
A number of privacy issues that affect the marketing of products and services are presented, along with
recommended best practices. The issues discussed include: (1) consumer, marketer, and government
perspectives on data collection, ownership and dissemination; (2) online advertising and the use of
cookies and spyware; (3) word-of-mouth marketing and the use of blogs, sponsored chat, and bulletin
boards; (4) marketing online to children; and (5) privacy issues in social networks and online communities. The chapter represents one of the first analyses of online marketing practices and their associated
privacy issues.
In Chapter XIII, Li and Zhang offer analysis of online privacy policies of Fortune 100 companies
within the context of the four principles (notice, choice, access, and security) of fair information practices. The authors found that 94% of the surveyed companies posted an online privacy policy and 82%
of them collect personal information from consumers. The majority of the companies only partially
follow the four principles of fair information practices. In particular, organizations fall short on security
requirements: only 19% mention that they have taken steps to provide security for information both
during transmission and after their sites have received the information. The authors conclude that a well-designed
privacy policy by itself is not adequate to guarantee privacy protection; effective implementation is as important. Consumer education and awareness are also essential for privacy protection.
In Chapter XIV, Chiou, Chen, and Bisset focus attention on the important question of online privacy
across cultures by analyzing cultural perceptions on privacy in the United States, Vietnam, Indonesia,
and Taiwan. The authors point out clear differences between how personal information is viewed in the
United States and Asia. For example, an American in Taiwan might feel suspicious if asked to provide
his passport number by a community Web site, while a Taiwanese in the United States might be puzzled
and alienated by the fierceness with which people guard their private lives. The authors argue that such
differences should be considered in cross-cultural online privacy research and legislation. Furthermore,
because the various cultural differences and backgrounds shape privacy perceptions, great care and
sensitivity should be exercised when conducting privacy studies across cultures.
Section V deals with policies, techniques, and laws for privacy protection. In Chapter XV, Lancaster and Yen focus on the important linkage between biometric controls and privacy. Biometrics is an
application of technology to authenticate users' identities through the measurement of physiological or
behavioral patterns, and thus does not suffer from the shortcoming of external authentication techniques that
rely on items that can be lost, forgotten, stolen, or duplicated. The authors conclude that, with adequate
communication, users are likely to appreciate systems that allow them the ease of use and convenience
that biometric systems offer, and hence their use will continue to grow in the future.
In Chapter XVI, Erickson discusses the important issue of the tension between openness in government and personal privacy. The trend in the federal legislature has been to continually strengthen the
FOIA and openness by reaffirming a presumption that government records should be released unless
there is a compelling reason not to. Alternatively, the trend in agency practice and the courts has been
toward more privacy, allowing use of certain exemptions in the FOIA to deny records to individuals or
organizations seeking them. This balance has been clarified somewhat by legislation on electronic records,
agency practice, and a number of court cases suggesting agencies can limit releases to central purpose
activities and records not including individually identifiable information. The author also considers the
status and vulnerability of confidential business information passed on to governments and the status
and vulnerability of government databases concerning individual citizens. The main conclusion of the
chapter is that matters remain in flux in the legal aspects of privacy, and regardless of which way the
balance tips (openness vs. privacy), more certainty will help government, organizations, and individuals
better plan how and when to share their own information resources.
In Chapter XVII, O'Mahony and Flaherty discuss the legal framework for consumer and data protection in Europe, which seeks to secure the protection of consumers while simultaneously facilitating
economic growth in the European Union. The chapter outlines the main sources of law which protect
consumers and their privacy, critically analyzes the important provisions in these sources of law, and
points out the gaps and deficiencies in the consumer and data protection legal structures. The
authors argue that the creation of these legal rights and legal protections will only stem the misuse of
personal data if people know about the law and their rights and know how to access legal protections.
Thus, more needs to be done to ensure that citizens of the European Union are equipped with the necessary knowledge to ensure that their personal data is treated with respect and in accordance with the law.
The authors conclude that more focus needs to be put on ensuring greater compliance with the law,
particularly from businesses that have benefited from the free flow of data.
In Chapter XVIII, Mika and Tyler provide an overview of the law relating to cybermedicine and
telemedicine in terms of data protection and other legal complications related to licensing and a conflict of
state laws. The authors examine the laws applicable to Web sites where medical diagnosis or the purchase
of medical services (including prescriptions) is available. They discuss how the new methodology of
acquiring medical care is at odds with traditional notions of state regulation and how current laws, both
federal and state, leave many gaps related to any consumer protections or potential causes of action when
privacy is compromised. The authors offer some expert advice for consumers regarding using Web sites
for medical purposes as well as protecting their own privacy. Lastly, the authors advocate a federal law
more punitive than HIPAA; one that regulates and protects patient information, medical transactions,
and interactions on the Internet and deters violations of patient privacy by mandating significant fines
and imprisonment for negligent or criminal and willful violations of that privacy.
In Chapter XIX, Tarn and Hamamoto emphasize trans-border differences in the concepts of privacy; namely, that the concept of privacy in Japan is different from that in Western countries. They
explain how, after more and more privacy-related problems were revealed by the media, consumers
began to pay attention to the protection of their private information, and, in response, the Japanese government enacted legislation to protect consumers and regulate companies' business activities associated
with customers' private information. This exposed many weaknesses in companies' privacy protection
systems and revealed unethical uses of private data.
We cannot claim that this book covers online privacy, a broad and multidimensional concept, to perfection.
Nevertheless, we believe it fills a major gap in the coverage of privacy by providing a comprehensive
treatment of the topic: a single integrated source of information on a multitude of privacy dimensions,
including technical, human, cultural, personal, and legal aspects. Research on privacy
is still evolving, and the varied and broad coverage presented in this book makes it a valuable reference for
researchers, practitioners, professionals, and students.
Acknowledgment
We are grateful to numerous individuals whose assistance and contributions to the development of this
scholarly book either made this book possible or helped to make it better.
First, we would like to thank all chapter reviewers for their invaluable comments, which helped
ensure the intellectual value of this book. We would also like to express gratitude to our chapter authors
for their excellent contributions to this book.
Special thanks are due to the publishing team at IGI Global, in particular to our Managing Development Editor, Ms. Kristin Roth, who allowed her staff to provide invaluable support to keep the project on
schedule and of high quality, and to Dr. Mehdi Khosrow-Pour, whose vision motivated the development
of this pioneering project. This project would not have been successful without Ross Miller, Deborah
Yahnke, and Rebecca Beistline, who tirelessly offered their professional assistance during the development of this project.
Finally, we would like to give our heartfelt thanks to Kuanchin's wife, Jiajiun, and Adam's family
for their understanding and encouragement during the development of this book.
Kuanchin Chen and Adam Fadlalla
Section I
Background
Chapter I
Google: Technological Convenience vs. Technological Intrusion
Abstract
What began as simple homepages that listed favorite Web sites in the early 1990s have grown into some
of the most sophisticated, enormous collections of searchable, organized data in history. These Web
sites are search engines, the golden gateways to the Internet, and they are used by virtually everyone.
Search engines, particularly Google, log and stamp each and every search made by end-users and use
that collected data for their own purposes. The data is used for an assortment of business advantages,
some of which the general population is not privy to, and most of which the casual end-user is typically
unfamiliar with. In a world where technology gives users many conveniences, one must weigh the benefits of those conveniences against the potential intrusions on personal privacy. Google's main stream of
revenue is its content-targeted AdWords program. AdWords, while not a direct instance of a personal
privacy breach, marks a growing trend of invading personal space in order to deliver personalized
content. Gmail, Google's free Web-based e-mail service, marked a new evolution in these practices,
scanning personal e-mail messages to deliver targeted advertisements. Google has an appetite for data,
and its hundreds of millions of users deliver it every week. With Google's eyes on moving into radio,
television, print, establishing an Internet service provider, furthering the technology of AdWords, and
creating and furthering technology in many other ventures, one must step back and examine the
potential privacy and intrusion risks associated with the technological conveniences being provided.
Copyright © 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
millions each and every day. Nevertheless, end-users
will always use search engines. End-users
will always "Google" something on their mind.
Search engines are quick, convenient, and always
yield a precise or near-precise result. Who would
want to give that up?
A Brief History of Contemporary Search Engines
End users were not always as comfortable with the
Internet as they are today. Shopping on Amazon.com, running a Google search, and online banking
are second-nature to most Americans, but back
in the 1990s, they were not. In a 1992 Equifax
study, 79% of Americans noted that they were
concerned about their personal privacy on the
Internet. Moreover, 55% of Americans felt that
privacy and the security of personal information
would get worse in the new millennium (Salehnia,
2002). A March 1999 Federal Trade Commission
(FTC) survey of 361 Web sites "revealed that 92.8%
of the sites were collecting at least one type of
identifying information," explains Salehnia.
Examining this statement more closely, nine out of
every ten Web sites not only detected but stored a
visitor's name, address, IP, or something else that
Clearly marked in this simple string of information are the end-user's Internet protocol address,
date and time of access, the URL complete
with keywords (search?q=colleges), browser and
browser version, operating system, and unique
user-identifying cookie. Google generates and
stores over one billion of these strings a week,
which equals approximately fifty billion a year. If
Google has not deleted one of these queries since
they began logging, then one could estimate that
they currently store upwards of half a trillion of
these strings within their servers.
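The fields just described can be illustrated with a small sketch. The log line below is hypothetical, since Google does not publish its log layout; the separator, field order, and values are all assumptions made for illustration.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical log string containing the fields described above: IP address,
# date and time of access, requested URL (with the search keywords),
# browser/OS identification, and a unique user-identifying cookie.
log_line = ("123.45.67.89 - 25/Mar/2007 10:15:32 - "
            "http://www.google.com/search?q=colleges - "
            "Firefox 2.0.0; Windows NT 5.1 - 740674ce2123a969")

def parse_log(line):
    # In this illustrative layout, the five fields are separated by " - ".
    ip, when, url, agent, cookie = (f.strip() for f in line.split(" - "))
    # The search keywords sit in the "q" parameter of the request URL.
    keywords = parse_qs(urlparse(url).query).get("q", [""])[0]
    return {"ip": ip, "when": when, "keywords": keywords,
            "agent": agent, "cookie": cookie}

record = parse_log(log_line)
print(record["keywords"])  # colleges
```

Even this toy record shows why such strings are sensitive: the query terms, the machine's address, and a persistent cookie arrive bundled together.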
Google provides a rather by-the-book explanation of why they log every
search. In a CNET news article, Elinor Mills
asked Google executive Nicole Wong about the need
for such a collection of data. The answer: Google
uses the log information to analyze traffic in order
to prevent people from rigging search results, to
block denial-of-service attacks, and to improve
search services. This data is then stored on many
servers, for an unknown period of time. According
to David F. Carr of Baseline Magazine, Google is
believed to have anywhere from 150,000 to 450,000
servers around the world, with the majority being
in the United States.
Google's privacy policy, which clearly outlines
all of their data collection practices, is a mere two
clicks away from any site on their entire network.
Their search engine policy is divided into nine
clear, simple, easy-to-understand sections; it
spans no more than two screens and clocks in at
approximately 980 words. The policy makes it
clear that no personally identifiable information
AdWords
The value of the sheer amount of data Google
has amassed is priceless. It is data that can prove
trends, display top-ranking search queries, and illustrate which regions search for which items
more. The list is long, but not as long as the list
of marketing groups or advertisers that would
do anything to have such data. Should the data
Google has amassed ever be released, the damage
could be devastating. In the summer of 2006, AOL
released the search queries of 528,000 of its users
that used AOL's generic search engine. While not
nearly as detrimental as it could have been had
Google released its data, the damage was already
done. People were openly identified by their search
terms; several were even convicted of crimes
and sentenced to prison, as their search queries
either supported evidence of their wrongdoing
or put them in the spotlight and exposed
them for committing a crime.
Google uses much of their data to sculpt their
pride and joy: AdWords. AdWords accounts for
nearly $3.5 billion in yearly revenue and is the
main stream of Google's income. Google utilizes
their users' search queries to aggressively and
competitively price keywords. Advertisers then
bid on these keywords for positions on the top
and right side of the SERPs. The more relevant
the ad, the more likely a targeted searcher is
to click on it. While this is hardly an invasion
argues the possibility of Gmail compiling a database of which e-mail addresses use which keywords.
What further complicates Gmail's e-mail privacy
is the fact that people who do not even subscribe
to the Gmail service are having their e-mails
scanned as well. If a Gmail user receives an e-mail from a Comcast e-mail address, the person
with the Comcast e-mail is having their message
scanned for advertising opportunities by Google.
The sender does not agree to Google's invasive
scanning, nor do they agree to have their e-mail
infiltrated with ads pertaining to what they wrote
in private. Gmail furthers the notion of invasive
and targeted advertising, and merely builds some
framework for the forms of advertising Google
may be capable of in the future.
In Google We Trust
In a world where technology keeps creating conveniences, we must sometimes hit the brakes so we
can make sure nothing is being missed. Google
is amassing one of the largest collections of search
data in history. It remains to be seen whose hands
this data will end up in, or how Google will use
it to their maximum advantage. Services like AdWords and Gmail are slowly pushing the notions
References
Bankston, K. (2006, February 9). Press releases: February, 2006 | Electronic Frontier Foundation. Retrieved November 11, 2006, from http://www.eff.org/news/archives/2006_02.php

Brandt, D. (n.d.). Google as big brother. Retrieved November 11, 2006, from http://www.google-watch.org

Carr, D. (2006, July 6). How Google works. Retrieved November 17, 2006, from http://www.baselinemag.com/article2/0,1397,1985040,00.asp

Chester, J. (2006, March 26). Google's wi-fi privacy ploy. Retrieved November 14, 2006, from http://www.thenation.com/doc/20060410/chester
Additional Reading
Battelle, J. (2005). The search: How Google and its rivals rewrote the rules of business and transformed our culture. Portfolio Hardcover. ISBN 1-59184-088-0.

Brin, S., Motwani, R., Page, L., & Winograd, T. (1999). The PageRank citation ranking: Bringing order to the Web. Retrieved from http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf&compression=

Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Retrieved from http://dbpubs.stanford.edu:8090/pub/1998-8

Electronic Frontier Foundation (EFF): http://www.eff.org/issues/privacy

Google PageRank Patent: http://patft.uspto.gov/netacgi/nph-Parser?patentnumber=7058628

Google-Watch.org: http://www.google-watch.org
Chapter II
A Taxonomic View of
Consumer Online Privacy Legal
Issues, Legislation, and
Litigation
Angelena M. Secor
Western Michigan University, USA
J. Michael Tarn
Western Michigan University, USA
Abstract
In this chapter, consumer online privacy legal issues are identified and discussed. Following a literature review of consumer online privacy legislation and litigation, a relational model is presented to
explore the relationships among the issues, legal protections, and the remedies and risks of not complying
with the legal requirements. Two survey studies are used to reinforce the vital need for a stronger role
by the government and business community, as well as privacy awareness among online consumers
themselves. The chapter concludes with a vital call for consumer privacy education and awareness,
and for legislators' attention and timely responses with legislation that protects consumers
against those who would misuse the technology.
Introduction
Information privacy is defined as the right of individuals to control information about themselves
(Richards, 2006). As the Internet becomes more
popular and more people are using it as a daily
means of communication, information sharing,
entertainment, and commerce, there are more
opportunities for breaches of privacy and attacks with malicious intent. There have been numerous
bills introduced in the House of Representatives
and the Senate in recent years attempting to legislate protections for consumers regarding online
privacy. Many of these attempts at legislation fail
to become laws. This study aims to examine consumer online privacy legal issues, recent litigation
topics, and the present active legislation. The topic
will be of interest because some of the legislation
does not provide more consumer protection but
instead takes away consumer privacy, such as the
USA PATRIOT Act and the Homeland Security Act,
enacted after the terrorist attacks of September
11, 2001. These laws give government more access to private information instead of providing
consumers with increased protections.
Some relevant privacy issues are underage
consumer protections, health information privacy,
lack of consumer control over information stored
in databases, information security breaches, and
identity theft. Recent litigation in the United States
in the information security area has been over the
lack of protection over the information gathered
and stored by companies from consumers. The
Federal Trade Commission (FTC) has initiated
lawsuits against companies not providing the
level of information protection they should. The
FTC charged Petco with Web site security flaws
that allowed a structured query language (SQL)
injection attacker to gain consumer credit card
information (FTC File No. 032 3221, 2004). The
FTC also charged BJ's Wholesale Club with failing
to secure credit card magnetic stripe information
appropriately (FTC v. BJ's Wholesale Club, Inc., 2005).
2005). There was also a class action suit filed on
Year | Author | Issue | Contribution
2006 | Swartz, Nikki | Health Information Privacy | A survey of 1,117 hospitals and health systems conducted in January by the American Health Information Management Association (AHIMA) found that compliance with the three-year-old federal rules governing the privacy of patients' medical records declined in the past year. The survey results also suggest that patients are becoming more concerned about the privacy of their medical records; according to 30% of respondents, more patients are asking questions.
2006 | Vasek, S. | Information Privacy |
2004 | Milne et al. | Information Security Breaches | Directly hacking into company databases and stealing personal or financial data, such as consumer credit card or social security information
2004 | Milne et al. | Identity Theft |
2004 | Milne et al. | Spyware, Malware, Viruses & SPAM |
2003 | Bagner et al. | Underage Consumer Protection |
Table 2.

Year | Law | Applications | Contribution
2006 | | |
2003 | | |
2003 | CAN-SPAM Act | |
2002 | | |
2001 | | |
2001 | | |
2000 | Children's Online Privacy Protection Act | |
1996 | Health Insurance Portability and Accountability Act | |
1974 | Federal Privacy Act | |
Table 3.

Year | Case | Issues | Contribution
2007 | FTC v. Albert | Spyware: the code interfered with the functioning of the computer and was difficult for consumers to uninstall or remove. In addition, the code tracked consumers' Internet activity, changed their home page settings, inserted new toolbars onto their browsers, inserted a large side "frame" or "window" onto browser windows that in turn displayed ads, and displayed pop-up ads, even when consumers' Internet browsers were not activated. | Permanently bars him from interfering with consumers' computer use, including distributing software code that tracks consumers' Internet activity or collects other personal information, changes their preferred homepage or other browser settings, inserts new toolbars onto their browsers, installs dialer programs, inserts advertising hyperlinks into third-party Web pages, or installs other advertising software. It also prohibits him from making false or misleading representations; prohibits him from distributing advertising software and spyware; and requires that he perform substantial due diligence and monitoring if he is to participate in any affiliate program.
2006 | FTC v. Petco | |
2006 | FTC v. BJ's Wholesale Club | |
2006 | Banknorth, N.A. v. BJ's Wholesale Club | |
2005 | FTC v. Cartmanager International | |
an information security breach event on a business. Identity theft could result from most
of the categories, but it could also occur without
any interaction with an online entity, such as through
non-shredding of personal documents; therefore,
it is a category in and of itself.
The model illustrates a top-down view of
the consumer online privacy protection flow.
[Figure: Consumer online privacy protection flow. The figure maps issues to remedies: health information privacy (personally identifiable information being released), underage consumer protection (personal information being released by youth; cf. FTC v. Xanga.com, which collected and disclosed underage personal information without parental consent), information privacy (sensitive or confidential information released inappropriately), information security breaches (hacking, stealing personal information, or Internet or e-mail interceptions), and spyware, malware, viruses, cookies, & SPAM (tracking clicks or history, or installing malicious programs). Remedies include registering complaints at ftc.gov and the CAN-SPAM Act (controlling non-solicited e-mail against being misleading or deceptive).]
The following two survey studies examine consumer online behaviors and privacy concerns,
reinforcing the vital need for a stronger role by
the government and business community as well
as the privacy awareness from online consumers
themselves.
Survey Studies

Survey Study I: Consumers' Protection of Online Privacy and Identity
The study by Milne, Rohm, and Bahl (2004)
examined results from three consumer surveys
wherein attitudinal, behavioral, and demographic
antecedents that predict the tendency to protect
one's privacy and identity online were explored.
The research looks at online behaviors that increase or reduce the risk of online identity theft and
indicates that the propensity to protect oneself from
online identity theft varies by population. The
survey findings are summarized as follows:

- There was a positive, significant relationship between privacy concern and active resistance.
- Those who had bought online, provided e-mail, and registered for a Web site had higher rates of protection and a higher number of hours on the Web.
- Males were more likely to protect their information online than females.
- Protection behavior increased with years of schooling.
- Younger online adults were more vigilant than older adults in protecting information online.
Future Trends
The growth of the online community does not
seem to be slowing. There are new and inventive
ways to use the Internet and the degree to which
consumers can interact online is also growing.
There are now ways to collaborate online that
did not exist just a few short years ago. This new
interaction online allows anyone to voice opinions
or work together. The success of the auction Web
site eBay has been due mostly to the ability of
consumers to voice their opinions by rating their
online transactions. The rating system allows positive, negative, or neutral comments to be posted
regarding specific transactions. The rating and
comments are viewable by anyone and are used
Legislators need to be aware of what is happening in the technology area and respond with legislation that protects consumers against those who would misuse the technology. Each year many proposals are introduced to further define consumer privacy and protections, but most are never enacted into law. Consumer groups need to remain vigilant in educating legislators on what misuses of information are occurring and why it is imperative that they act.
- How can consumers protect their information? Finding ways to make opt-in and opt-out easier and giving consumers options on how a business stores and handles their information. Should consumers be able to control the storage time more specifically and be informed exactly what information is being stored?
- Investigating a new model for personal information online: creating a secure personal profile with your information so that, when interacting with online businesses, you provide an encrypted transaction number that authenticates to your profile and allows a transaction to be created and stored under your profile. You are identified to the online business by the transaction number rather than by name. While the business is still storing the transaction, its records do not include personally identifiable information.
- Gaps in legislative protections for consumers: recommendations for legislative actions on how to fill the gaps in current legislation to bring it up to current business model expectations. HIPAA was created before offshore outsourcing in healthcare began its upswing; there are currently no provisions in the rules for sending consumer information beyond U.S. borders.
- Investigation within different age groups of specific concerns about online privacy. Younger generations have concerns but still interact online because it is part of their culture. Other generations choose not to interact online for a couple of reasons: they may lack understanding of the technology, and their judgments are usually based more on personal interactions to create trust. How can they trust an online entity when there is no personal interaction? What would it take for other generations to trust online transactions?
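The secure personal profile idea above can be sketched in outline. This is a hypothetical illustration only: the class name, fields, and the HMAC-based token derivation are assumptions, not part of the chapter's proposal. The point is that the business records a transaction under an opaque token that only the profile service can link back to a person.

```python
import hmac
import hashlib
import secrets

class PersonalProfile:
    """Hypothetical profile service holding the consumer's PII."""

    def __init__(self, name, address):
        self.name = name                     # PII stays with the profile service
        self.address = address
        self._key = secrets.token_bytes(32)  # secret known only to the profile

    def issue_token(self, order_id: str) -> str:
        """Derive an opaque, per-transaction identifier from the profile key."""
        return hmac.new(self._key, order_id.encode(), hashlib.sha256).hexdigest()

    def verify_token(self, order_id: str, token: str) -> bool:
        """The profile service can later prove a token belongs to this profile."""
        return hmac.compare_digest(self.issue_token(order_id), token)

# The business records the transaction under the token only; no name or address.
profile = PersonalProfile("Alice Example", "123 Main St")
token = profile.issue_token("order-0001")
business_record = {"transaction": "order-0001", "customer": token}

assert profile.verify_token("order-0001", token)
assert "Alice" not in str(business_record)   # no PII in the business's record
```

A breach of the business's database would then expose transaction tokens rather than personally identifiable information.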
References
Ambrose, S., & Gelb, J. (2006). Consumer privacy
litigation and enforcement actions in the United
States. The Business Lawyer, 61, 2.
Ashworth, L., & Free, C. (2006). Marketing dataveillance and digital privacy: Using theories of justice to understand consumers' online privacy concerns. Journal of Business Ethics, 67(2), 107-123.
Baratz, A., & McLaughlin, C. (2004). Malware:
what it is and how to prevent it. Retrieved November 11, from http://arstechnica.com/articles/
paedia/malware.ars
BJ's Wholesale Club settles FTC charges. (2005).
Retrieved from http://www.ftc.gov/opa/2005/06/
bjswholesale.htm
Brooke, J., & Robbins, C. (2007). Programmer
gives up all the money he made distributing
The CAN-SPAM Act: Requirements for Commercial Emailers. (2004). Retrieved from http://www.
ftc.gov/bcp/conline/pubs/buspubs/canspam.htm
Additional Readings
37 States give consumers the right to freeze credit
files to prevent identity theft; consumers union
offers online guide on how to take advantage
of new state security freeze laws. (2007). PR
Newswire, July 16.
Anthony, B. D. (2007). Protecting consumer
information. Document Processing Technology,
15(4), 7.
Carlson, C. Poll reveals data safety fears. eWeek,
22(50), 29.
Chellappa, R., & Sin, R. (2005). Personalization
versus privacy: An empirical examination of the
online consumer's dilemma. Information Technology and Management, 6(2-3), 181-202.
de Kervenoael, R., Soopramanien, D., Hallsworth,
A., & Elms, J. Personal privacy as a positive experience of shopping: An illustration through the
case of online grocery shopping. International
Chapter III
Online Privacy, Vulnerabilities, and Threats: A Manager's Perspective

Abstract
There are many potential threats that come with conducting business in an online environment. Management must find a way to neutralize, or at least reduce, these threats if the organization is going to maintain viability. This chapter is designed to give managers an understanding of, and the vocabulary needed for a working knowledge of, online privacy, vulnerabilities, and threats. The chapter also highlights techniques that are commonly used to impede attacks and protect the privacy of the organization, its customers, and its employees. With the advancements in computing technology, any and all conceivable steps should be taken to protect an organization's data from outside and inside threats.
Introduction
The Internet provides organizations unparalleled
opportunities to perform research and conduct
business beyond their physical borders. It has
proven to be a vital medium for worldwide commerce. Even small organizations now rely on
Copyright © 2009, IGI Global. Distributing in print or electronic forms without written permission of IGI Global is prohibited.
Vulnerability

Software vulnerabilities are not going away; in fact, they are increasing. According to the Coor-
Vulnerabilities do not have to be broken program code; Norman (1983) indicated that errors in system design that provoke erroneous entries by users can also be considered vulnerabilities that can be intentionally exploited by attackers.
Individually and collectively, vulnerabilities can create major risks for organizations. Weak policies and protection can result in the release of personally identifiable information (PII). The release of PII is not the only problem; another issue is that hackers can obtain important data and modify it. Suddenly, there are additional names on the preferred lists, payroll, and accounts payable, and outsiders could be given authority or consideration that they are not entitled to. An organization's strategic plans could be compromised. Additionally, the release of PII can weaken the public's confidence in the organization and subject the organization to litigation, large fines and reparation costs, rigorous investigations, and oversight.
Threats

Threats are not the same as vulnerabilities; threats are things that can take advantage of vulnerabilities. Security threats, broadly, can directly or indirectly lead to system vulnerabilities (Im & Baskerville, 2005). An analogy might be an army fort surrounded by the enemy where someone accidentally left the fort's front gate wide open. The open gate is a vulnerability, and the threat is the opposing force. Translating this analogy to data and information, the vulnerability would be a poorly
Social Engineering

It is not always a technical issue: a perpetrator can use chicanery and/or persuasion to manipulate unsuspecting people into either revealing sensitive information (such as a logon and password) or compromising perimeter defenses by installing inappropriate software or portable storage devices (seeded with malware) on computer networks. For example, one phishing approach is to ask a user to fill out a simple fake online form. The form itself asks almost no personal informa-
Risk

Risk is involved in everything, every process, and every system. Operational risk is often defined as the risk of loss resulting from inadequate or failed internal processes, people, and systems, or from external events. Risk is one of those things that no one can escape, and it is hard to define. In general, risk is the probability of a negative outcome because some form of threat will be able to exploit vulnerabilities against an asset. Many define the value of a risk attack as the value of an asset, times the probability of a threat, times the probability of an undiscovered vulnerability, times some impact factor (representing reparations), times the possibility of the event.
Elimination of risk is categorically impossible; the best that can be hoped for is to get it under control. Even if it were possible, the cost and scalability issues of risk avoidance have to be weighed against the cost of the probable losses resulting from having accepted rather than having eliminated risk (Pai & Basu, 2007).
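That multiplicative definition can be written out directly. The numbers below are illustrative assumptions, not figures from the chapter:

```python
def risk_value(asset_value, p_threat, p_vulnerability, impact_factor, p_event):
    """Expected-loss style risk estimate as described in the text:
    asset value x threat probability x probability of an undiscovered
    vulnerability x an impact (reparations) factor x event probability."""
    return asset_value * p_threat * p_vulnerability * impact_factor * p_event

# Illustrative numbers only: a $500,000 customer database, 30% chance of a
# targeted threat, 10% chance an unpatched vulnerability exists, a
# reparations multiplier of 2.0, and a 50% chance the attempt succeeds.
exposure = risk_value(500_000, 0.30, 0.10, 2.0, 0.50)
print(round(exposure, 2))  # 15000.0
```

Even rough inputs like these let management rank which assets deserve protection resources first.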
Qualys, Inc. (2006) analyzed a global data pool of more than 40 million IP scans with their product QualysGuard. Data analysis revealed the six axioms of vulnerabilities. These axioms are important because they help management understand the nature of possible attacks and why and how their data could be at risk of being compromised. Qualys, Inc. (2006) believes that
IDC indicates that strong corporate governance is the foundation of successful protection
of corporate assets from a wide variety of threats
(CNET, 2004). To that end, organizations need to
establish, educate, and enforce their policies to
effectively ensure the protection they need.
Malware
The term malware (malicious software) is typically used as a catch-all for a variety of forms of hostile, intrusive, or annoying software designed to infiltrate or interrupt services on a single computer, server, or computer network without the owner's informed consent. The term includes all types of troublemakers, such as viruses, worms, kiddie scripts, Trojan horses, and macro (script-context) viruses. Malware seeks to exploit existing vulnerabilities on systems.
Malware can utilize communication tools to spread, and oftentimes it goes unnoticed. McAfee
Avert Labs (Bernard, 2006) has recorded more
than 225,000 unique computer/network threats.
In just 10 months between January and November
of 2006, they found 50,000 new threats. Google
researchers (as part of the Ghost in the Browser
research) warned that one in 10 Web pages is
hiding embedded malware (Provos, McNamee,
Mavrommatis, Wang, & Modadugu, 2007).
The term malware is often associated with the characteristic attributes of a virus: self-replicating, something that embeds itself into other programs, which in turn can infect other programs. The no-
Adware/Spyware
A particularly annoying and dangerous form of malware is adware/spyware. The two terms are commonly used interchangeably. The goal of this technology is to gather information without the target person's knowledge or permission. This type of software is used to watch and record which Web sites and items on the Internet the user visits, in hopes of developing a behavioral profile of the user that can later be exploited. The slight difference between the two terms is the intent of the software agent: adware has an advertising aspect in the information it collects, while spyware tracks and records user behavior (in the traditional sense of the word spy).
The problem with spyware is that users typically store all sorts of sensitive and personal information on their machines that should not be made public. Some of this information, such as trade secrets and financial data, is protected by law. The loss of personnel and customer information could wreak
Botnets
Vint Cerf, one of the founding fathers of the Internet, believes that one in four computers (approximately 150 million out of 600 million) connected to the Internet is compromised and likely to be an unwilling member of a botnet (Fielding, 2007). These machines are often used as proxies for illegal activities like spamming and credit card fraud. Botnets have been a growing problem on the Internet since at least 2002. A bot (short for robot) is a software agent released onto a computer connected to the Internet. The bot can download malicious binary code that compromises the host, turning it into a zombie machine. The collection of zombies is called a botnet. The servers hosting the bot binaries are usually located in countries unfriendly to the United States. The bots are transparent and run in the background. Bots can open a channel to a bot controller machine, which is the device used by the perpetrator (the bot herder) to issue commands to the bots (Baylor & Brown, 2006).
Bot herders typically use bot controllers to
harvest user accounts via screen-capture, packet-
Storm
Rbot
Rbot is generally considered the second-largest botnet. It employs an old-style communication structure using Internet Relay Chat (IRC). Because it
and the Botnet Task Force (a low-profile organization initiated by Microsoft in 2004 that acts as a means of building awareness and providing training for law enforcement).
In the process, the FBI has identified more than 1 million hijacked personal computers. The majority of victims are not aware that their computers have been compromised or their personal information exploited. The FBI said that because of the widely distributed capabilities of botnets, they not only harm individuals but are now considered a threat to national security, the information infrastructure, and the economy.
Search Engines
A problem with most search engines is that they are indifferent to content permissions. Certain individuals (such as the head of payroll) may have permission to view all of the company's information, while other individuals (such as the head of personnel) are limited in the type of data they are allowed to see. An employee may be given permission to see their own information but not that of the person working next to them. There may also be certain individuals who are not allowed to see any information at all. Because search engines typically cannot take data ownership and coordinate it with user permissions, problems can arise when responding to a request.
When implemented carelessly, search engines have the potential to uncover flaws in existing security frameworks and can either expose restricted content itself or verify the existence of hidden information to unauthorized users (Vivisimo, 2006). In this regard, poorly implemented search engines could release large amounts of personal identification information. Imagine typing the name of the CEO into a search engine and receiving a page that lists his personal phone number, salary, and home address.
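The permissions problem described above can be made concrete with a small sketch. The data model, file names, and role names below are invented for illustration; the point is that a search layer must intersect each hit with the requester's entitlements before returning, or even acknowledging, it:

```python
# Hypothetical document store: each document carries its own access list.
documents = {
    "payroll-2008.xls": {"text": "salary schedule", "allowed": {"head_of_payroll"}},
    "handbook.pdf":     {"text": "vacation policy", "allowed": {"everyone"}},
}

def search(query: str, user_roles: set) -> list:
    """Return only documents the requester is entitled to see; a filtered-out
    hit is silently dropped, so its existence is never confirmed."""
    hits = []
    for name, doc in documents.items():
        permitted = doc["allowed"] & user_roles or "everyone" in doc["allowed"]
        if query in doc["text"] and permitted:
            hits.append(name)
    return hits

# An ordinary employee's search neither returns nor reveals the payroll file.
print(search("salary", {"employee"}))         # []
print(search("salary", {"head_of_payroll"}))  # ['payroll-2008.xls']
```

A search engine that indexes everything but skips this filtering step is exactly the careless implementation the text warns about.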
Wireless Media
Organizations may think their mobile workers are safe with their new wireless notebooks, but recent WLAN tracking at the RSA security conference showed a multitude of vulnerabilities. One common fault was that many users were using hotspots but had no idea who was sponsoring the ports. In some cases, it was discovered that the users were actually talking to other local computers that also had their connections active (Shaw & Rushing, 2007).
Wireless devices often remember the last good site they were connected to and attempt to use it first. This means that if the user did not shut down the port (disconnect from a hotspot correctly), the computer will look for that spot first, even if a more secure connection is available. Another issue is that the port will continue to actively search for a signal. A critical situation can arise if the user forgets to disable the wireless card and then plugs his or her device into a wired network. A couple of things could happen: the network will see the other port and might adjust its routing information to accommodate it, in the process bypassing firewalls and border security. The device might also connect to another device via the wireless port, again bypassing some security but elevating the permissions and authority of the newly connected user to those of the legitimate user. In either case, the result is a huge hole in security (Shaw & Rushing, 2007).
Organizations are paying a very high price
for wireless management. The Aberdeen Group
estimates that it costs nearly 10 times more to
manage wireless services and devices compared
to wired-lines (Basili, 2007). In spite of that,
Aberdeen found that 80% of respondents were
planning increases in mobile wireless access.
The RSA Conference is an event that draws
thousands of computer users. Many of them
bring their wireless laptops (and other devices).
AirDefense (2005), a wireless security company,
Telephones
Wireless telephones with computer-enabled features (such as e-mail and Internet access) have been compromised; Trend Micro Inc. announced it had found security flaws in MS Windows Mobile, a popular operating system used in smartphones. Many individuals who use these devices are executives who routinely access sensitive information. In this case, the main risk is not malware but the risk of lost devices.
Mobile Encryption
The news regularly reports that laptops with thousands of sensitive records on customers or employees are lost or stolen each month. Organizations know the risks and the threats. These threats are easy to understand, but most organizations do not allocate the resources necessary to protect themselves. Encryption is an effective safeguard for most mobile devices, and one that will relieve some of the legislative pressure. However, it is far from being fully adopted; a survey by Credant (see McGillicuddy, 2006) asked respondents to list reasons why their companies had not adopted encryption for mobile devices.
Data
Organizations accumulate a wide breadth of data that, if stolen, could potentially hurt the enterprise. Loss or theft of confidential information, such as blueprints and engineering plans, tenders, budgets, client lists, e-mails and price lists, credit card and other financial information, medical or other confidential personally identifiable records, classified, restricted, or personal information, scripts, storyboards, source code, database schemas, or proprietary trade secrets, can severely impact the integrity and profitability of a corporation. This risk is amplified by the prevalence of portable computing devices as a part of normal business activities and by the increasing levels of online transactions that occur routinely (GFI-2, 2007).
Fundamentally, there are two types of security. The first is concerned with the integrity of the data; in this case, the modification of records is strictly controlled. The second is the protection of the information content from inappropriate visibility. Names, addresses, phone numbers, and credit card details are good examples
Endpoint (Perimeter-Based) Security
The term endpoint, as its name implies, is any
place that a device can interact with another device.
Generally speaking, an endpoint is an individual
computer system or device that acts as a network
Endpoint Components
Firewalls
In general terms, a firewall is software or a hardware device that controls the flow of traffic between two networks or entities. There are basically three types of firewalls: packet filter, stateful inspection, and application proxy. A packet filter firewall works by inspecting the contents of each network packet header and determining whether the packet is allowed to traverse the network.
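A minimal sketch of the packet filter logic just described, assuming an illustrative first-match rule table (the rules, protocols, and ports are made up for the example):

```python
# Toy packet-filter rule table: each rule inspects header fields only
# (protocol and destination port here) and the first match decides.
RULES = [
    {"action": "allow", "proto": "tcp", "dst_port": 443},   # inbound HTTPS
    {"action": "allow", "proto": "tcp", "dst_port": 25},    # inbound SMTP
    {"action": "deny",  "proto": "any", "dst_port": None},  # default deny
]

def filter_packet(proto: str, dst_port: int) -> str:
    """Return the action of the first rule whose header fields match."""
    for rule in RULES:
        proto_ok = rule["proto"] in ("any", proto)
        port_ok = rule["dst_port"] in (None, dst_port)
        if proto_ok and port_ok:
            return rule["action"]
    return "deny"  # fail closed if no rule matched

print(filter_packet("tcp", 443))   # allow
print(filter_packet("udp", 5353))  # deny
```

Note that such a filter never looks at packet contents or connection state; that is what distinguishes it from the stateful-inspection and application-proxy types mentioned above.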
Preventive Measures
The open nature of PCs in most organizations has resulted in users installing a wide variety of applications that they use to get through their day, and several that they should not. Some IT managers attempt to prohibit the use of unauthorized peripherals (removable media) and applications in the hope that this will shut out malware. The use of portable devices at work can affect corporate network security through the intentional or unintentional introduction of viruses, malware, or crimeware that can bring down the corporate network and/or disrupt business activity.
In an organizational environment, the above still applies; however, the user is usually burdened by user names and passwords. The number one suggestion is to pick a strong password and not share it with anyone for any reason. If you need multiple sign-ons, tailor the passwords for each application; for example, your password for accounts payable may begin with AP. The easiest way to pick a strong password is to create an acronym out of your favorite song lyrics: take the first letter of each of the first 12 words, your application code, and some important number, like the middle digits of your first home address.
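The lyric-acronym scheme suggested above can be sketched as a small helper. The lyric, application code, and digits below are placeholders; this only illustrates the memorability trick, not a vetted password policy:

```python
def acronym_password(lyric: str, app_code: str, digits: str) -> str:
    """Build a password from the first letters of the first 12 words of a
    lyric, prefixed with an application code and suffixed with a number."""
    initials = "".join(word[0] for word in lyric.split()[:12])
    return f"{app_code}{initials}{digits}"

pw = acronym_password(
    "somewhere over the rainbow way up high there's a land that I heard",
    app_code="AP",  # e.g., the accounts payable application
    digits="42",    # e.g., middle digits of a first home address
)
print(pw)  # APsotrwuhtaltI42
```

The result looks random to an onlooker but is easy for the owner to regenerate from memory.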
According to CompTIA's IT security survey, human error, either alone or in combination with a technical malfunction, was blamed for 74% of IT security breaches (Cochetti, 2007). Human involvement in systems is not limited to making
Figure 2.
Vulnerability Management
The process of patch management can be complex and difficult, and it is often sacrificed when an organization is in crisis mode. If shortcuts are taken, they will almost always come back to haunt the organization. Patching in programming has long been defined as trading an error that is known for one that is unknown; it is not something to rush through. Vendors spend considerable time researching vulnerabilities and devising repairs or work-arounds. Many of the repairs depend on earlier updates already being applied. Failure to stay current on updates is one of the main reasons that enterprises struggle with bot infections (Symantec).
Patching is a trade-off between the time required to repair a problem responsibly and completely and the hacker's window of opportunity to exploit a specific vulnerability. Vulnerability management has become a critical aspect of managing application security. Patching vulnerabilities (depending on the severity) can be a time-consuming job. To do it safely, patches should be applied and tested in an isolated environment against a copy of the system.
Conclusion
No matter how hardened a network perimeter is, there are a number of weaknesses that can allow breaches to occur. It is usually recommended that a layered defense approach be adopted to strengthen protection. However, care needs to be taken that additional layers actually add protection instead of just protecting against the exact same vulnerabilities or threats. Reckless implementation or selection of software may not produce the desired outcome; a layered approach may be more like buying overlapping warranty coverage. The harm is that businesses may mistake this approach for real security. Ultimately, they could end up spending more money and resources implementing the wrong security mechanisms without gaining complete security (Ou, 2007).
Remember that the organization is responsible for maintaining the privacy of its stakeholders and consumers while also preserving a harassment-free, discrimination-free, crime-free, and civil business environment. The development, implementation, and enforcement of a comprehensive Internet policy can help toward that goal. Whether employees intentionally violate Internet policy or accidentally surf to an objectionable Web site, under the legal principle known as vicarious liability the employer can be held responsible for the misconduct of the organization's employees, even if the employer is completely unaware that there is a problem.
Simply following security best practice by limiting access rights may be a good first step, but it is just a step. No single approach is going to be totally viable against all malware and protect privacy. The best protection comes from using a layered approach. In addition to using technology, it is important to:
Of the four, they indicate that the most important focus area is the management of data and knowledge to improve results.
This chapter presented an overview of the concerns that organizations must address while working within the Internet community. It was meant to inform management of the potential
References
Aberdeen Group. (2005). Third Brigade business value research series, most important security action: Limiting access to corporate and customer data. Whitepaper. Retrieved October 2007, from
http://www.thirdbrigade.com/uploadedFiles/
Company/Resources/Aberdeen%20White%20P
aper%20--%20Limiting%20Access%20to%20
Data.pdf
org/congressional_testimony/Shimeall_testimony_Aug23.html
Privacy Rights Clearinghouse. (2007). A chronology of data breaches. Retrieved October 2007,
from http://www.privacyrights.org/ar/ChronDataBreaches.htm
Vivisimo. (2006). Restricted access: Is your enterprise search solution revealing too much? Retrieved October 2007, from http://Vivisimo.com/ or
http://www.webbuyersguide.com/bguide/whitepaper/wpDetails.asp_Q_wpId_E_NzYyMQ
Wang, H., Lee, M., & Wang, C. (1998, March). Consumer privacy concerns about Internet marketing. Communications of the ACM, 41(3), 63-70.
Webex. (2006). On-demand vs. on-premise instant messaging. WebEx Communications, Ease of Communications: On-Demand EIM Solutions.
Retrieved October 2007, from http://www.webbuyersguide.com/bguide/Whitepaper/WpDetails.
asp?wpId=Nzc4MQ&hidrestypeid=1&categor
y=
Wilson, T. (2007, November 12). ID thief admits
using botnets to steal data. Retrieved November
2007, from http://www.darkreading.com/document.asp?doc_id=138856
Yank, G. C. (2004, December 21). Canning spam: Consumer protection or a lid on free speech? Retrieved October 2007, from http://www.law.duke.edu/journals/dltr/articles/2004dltr0016.html
Additional Reading
Bächer, P., Holz, T., Kötter, M., & Wicherski, G. (2005). Know your enemy: Tracking botnets;
Section II
Chapter IV
Practical Privacy Assessments
Abstract
Governments and large companies are increasingly relying on information technology to provide enhanced services to the citizens and customers and reduce their operational costs. This means that an
increasing amount of information about ordinary citizens is collected in a growing number of databases.
As the amount of collected information grows and the ability to correlate information from many different databases increases, the risk that some or all of this information is disclosed to unauthorised
third parties grows as well. Although most people appear unaware or unconcerned about this risk, both
governments and large companies have started to worry about the dangers of privacy violations on a
major scale. In this chapter, we present a new method of assessing the privacy protection offered by a
specific IT system. The operational privacy assessment model, presented here, is based on an evaluation
of all the organisational, operational and technical factors that are relevant to the protection of personal
data stored and managed in an IT system. The different factors are measured on a simple scale and the
results presented in a simple graphical form, which makes it easy to compare two systems to each other
or to identify the factors that benefit most from improved privacy enhancing technologies. A standardised assessment of the privacy protection offered by a particular IT system may serve to help system owners understand the privacy risks in their IT system, as well as help individuals whose data is being processed to understand their personal privacy situation. This will facilitate the development and procurement of IT systems with acceptable privacy levels, but the simple standard assessment result may also provide the basis for a certification scheme, which may help raise confidence in the IT system's ability to protect the privacy of the data stored and processed in the system.
Introduction
Existing research into privacy enhancing technology (PET) has provided few answers to many of
the real questions that governments and large
companies are facing when they try to protect the
privacy of their citizens or customers. Most of the
current work has focused on technical solutions
to anonymous communications and pseudonymous interactions, but, in reality, the majority of
privacy violations involve careless management
of government IT systems, inadequate procedures,
or insecure data storage. In this chapter, we introduce a method that helps system developers and
managers to assess the level of privacy protection offered by their system and to identify areas
where privacy should be improved. The method
has been developed in the context of government
IT systems in Europe, which has relatively strict
privacy legislation, but we believe that the method
may also apply to other government systems, nongovernmental organisations (NGOs) and large
private companies. With the privatisation of many
state monopolies, such as telecommunications
and railroads, in many countries and the increasing number of public/private partnerships, the
distinction between the public and private sector
has grown increasingly fuzzy.1 For the purpose
of clarity in our discussions, however, we have
decided to use the vocabulary from government
systems, so we discuss the relationships between
governments and citizens instead of companies
and customers.
Governments are increasingly relying on information technology to provide enhanced services
to the citizens and reduce the costs of the public
sector. This means that an increasing amount of
information about ordinary citizens is collected in
an increasing number of government databases.
As the amount of collected information grows and
the ability to correlate information from many
different databases increases, the risk that some
or all of this information is disclosed to unauthorised third parties grows as well. Although most
the citizen remembers what data has been collected and what information can be inferred from
the data. Informed consent, where the individual
citizen agrees to the collection of private data for
a specific purpose, is one of the most important
instruments in data protection legislation, but it is
not a realistic solution to this problem. Moreover, informed consent only addresses the collection and authorised use of private information; it does little to inform citizens about the way their data is stored and what procedures are in place to keep this data safe.
In this chapter we present an operational privacy model that helps focus the privacy discussions on areas with the most significant privacy
risks. The model defines a method that can be used
to analyse the privacy properties of IT systems,
which are different in their design, functionality,
and the data that they process. The method helps
identify privacy risks in such systems, so that
they may be the subject of further analysis and
discussion. Moreover, the method helps assess the
magnitude of different privacy problems, so that
developers may decide which problem to address
first and citizens may decide whether they wish to
trust the system with their personal data. While
the method may be used to suggest areas of possible privacy enhancements, it does not seek to
provide specific solutions to privacy problems, but
leaves it to the owners of the system to develop
specific solutions for the individual system. In
other words, the goal of the model is to identify
possible privacy risks and provide the basis for
valuable privacy improvements. The assessment
of privacy plays an important role in online
systems because a standard privacy assessment
scheme may provide the basis for a certification
program that will allow individuals to decide
whether they wish to interact with a particular
service provider. The operational privacy assessment model proposed in this chapter is based on
factual answers to a few simple questions, which
makes it ideally suited for a certification scheme.
The results of a privacy assessment are presented
in a simple graphical form, which allows a layman to determine the overall privacy protection
in the examined system and identify areas with
particularly high privacy risks.
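One way to picture such an assessment is to rate each factor on a simple scale and render the results so a layman can compare systems at a glance. The factor names and scores below are invented for illustration and are not the model's actual criteria:

```python
# Hypothetical assessment: each privacy factor rated on a simple 0-4 scale.
factors = {
    "informed consent":      3,
    "access control":        2,
    "storage security":      4,
    "disclosure procedures": 1,
}

def render_assessment(scores: dict, scale: int = 4) -> str:
    """Render per-factor scores as a plain-text bar chart, one line per
    factor, so low-scoring (high-risk) areas stand out immediately."""
    lines = []
    for name, score in scores.items():
        bar = "#" * score + "." * (scale - score)
        lines.append(f"{name:<22} [{bar}] {score}/{scale}")
    return "\n".join(lines)

print(render_assessment(factors))
```

Laying two such charts side by side is one way to compare systems or to spot the factor that would benefit most from improved privacy enhancing technology.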
Privacy in Government IT Systems
There have been a number of proposals for a definition of privacy, but although some have been widely used, none has been able to meet the demands of new technologies and variations in cultures. This is why the concept today is somewhat diffuse and its definition depends highly on the context in which it is used.
Generally, privacy can be described as a form of knowledge about our existence that we may wish to control, as it is something others could potentially use against us. This knowledge can be further divided into three categories: knowledge about our person, knowledge about our relationships, and knowledge about our behavior.
In this classification, knowledge about our
person covers information regarding physical
factors, such as information about our health,
financial situation, consumer profile, current
location, and recent movements. For practical reasons, it is common for individuals to share chosen parts of the information in this category with selected persons or institutions. For example, health information is often shared with doctors and the health insurance agency, but this does not automatically imply a wish to share the same information with friends or co-workers.
The category knowledge about our relationships covers information about relationships with other persons or institutions. Examples of such relationships are political and religious convictions as well as family, sexual, work-related, and other relationships with friends. Even
though most of these relationships involve at
least one other party, the knowledge about their
existence can still be sensitive and private for
the individual.
and so forth. For this reason, these data are extremely sensitive. The consequences of a breach
in the security of these data can be negative for
both the individual and for the government institutions that are responsible for the data. Since
these data serve as the basis of identification of
an individual, a breach can in worst case, lead to
extensive problems with identity theft.
Personal data stored in government systems have in common that they are gathered for a specific purpose. Whether this purpose is fair or is in
itself a breach of privacy is a completely different
discussion concerning the merits of the specific
system versus the limitations to the privacy of
the individual. This chapter does not enter this
discussion beyond looking at whether a system
has a given purpose for collecting private data or
not. The goal must be to limit access, use, and
processing of personal data as much as possible,
while still acknowledging the purpose for which
the data has been gathered.
With this focus on personal data in government
systems, we can finally make a task-specific definition of privacy. This definition is only applicable
for use in the model described in this chapter and
as such is not an attempt to make a complete and
general definition of privacy.
Privacy is the influence on and knowledge about
the existence and use of personal data in government digital records.
The requirement of knowledge about the
information that is stored about the individual,
gives the individual an important tool to know
exactly what personal information the government has registered as well as the purpose of
this registration. The requirement of influence
addresses the important goal of being able to
control who gets access to personal data. We use
the term influence, instead of the stronger term
control, because government systems usually
Privacy Protection
With the growth of online systems and the digitalisation of administrative procedures, the interest
in the topic of user privacy is rapidly increasing.
Much of the work on securing privacy focuses on defining and developing technical tools and standards for the protection of private information, as described by Pfitzmann and Köhntopp (2000) and the European Parliament (1995). These tools seek to maximise privacy in
systems by minimising the amount of and restricting the access to personally identifiable data as
well as the amount of information that can be
gathered from the interactions between private
parties and services. However, these methods
are often hampered or outright prevented by the
functional requirements of the systems they seek
to secure. For example, many financial systems require the real names of their users in order to meet accountability requirements, and these requirements prevent the system from implementing privacy through anonymity or pseudonymity.
When the requirements of the individual systems
prevent the implementation of technical privacy,
other means are needed to ensure the best possible
level of privacy protection. While the technical
privacy enhancing technologies may be readily
available, their full effect on the privacy of a
system is often hard to measure, especially in
the cases where these tools cannot be used to
their full effect.
Influence on Privacy
The level of privacy protection that is required for
a specific system depends on the type of system
and the type of personal identifiable information
that is managed in the system. There are three
primary categories that influence privacy by defining demands that the system must meet.
Actors
Actors are individuals and institutions, such as political parties, interest groups, and industry lobbyists, whose work directly or indirectly influences public opinion about privacy. Their mutual interests and conflicts can often influence the level of privacy protection implemented in a government IT system. Actors that influence privacy rarely aim to violate privacy directly. Often it is the negative side effects of proposals to change other parts of the system that end up influencing privacy negatively; for example, proposals to increase surveillance as part of anti-terror legislation are meant to save innocent lives, but they are obviously detrimental to privacy. Privacy activists, on the other hand,
Legislation
Although laws are the result of the work of politicians, they are still a decisive factor in government systems, because they are more consistent and stable than the minds of politicians and other actors.
An example of such legislation is the Data Protection Directive from the European Parliament (1995), which is designed to protect personal data. The goal of this directive is to harmonise
Culture
Culture is an important factor in the perception of
privacy, and it often decides when privacy issues
are of interest. Cultures with a relaxed attitude towards the protection of privacy are also less likely
to protest against the development of systems to
process personal data. Even within the EU, there are large variations in citizens' privacy concerns between the different countries. According to
a privacy poll performed by the European Opinion Research Group (2003), countries such as
Denmark, Spain, and Portugal have populations
where only 13% are very concerned about their
privacy, but the populations of countries such as
Greece and Sweden are much more concerned
about privacy with 58% and 54% of their respective
populations stating that they are very concerned
Privacy Enhancing Technologies

Privacy Maintenance
Privacy is not clearly defined and the composition
of the three categories of actors in the privacy
field tends to change over time, so the privacy
risks also change over time. This is important
to keep in mind when designing new systems or
legislation, because both will have to be updated
at regular intervals. There are two main areas that serve to maintain the privacy of citizens: legislation-based protection and technology-based protection. The risk of placing protection in legislation is that it can be changed; what is legal today might not be legal next year. Therefore, there are different factors that have to be considered:
These three terms cover the spectrum of privacy risks in government systems, but they are far too generic to base an exact evaluation on. Therefore, these three areas are decomposed into their privacy-relevant components.
Data Storage

Communication
Privacy in communication is very much a matter
of securing the transmissions. But it also extends
Data Processing
A very privacy-sensitive person may argue that privacy is broken whenever personal data is being processed. However, in government systems there is often a reason for the processing of personal information, which simply has to be accepted. Whether the purpose of a specific system is in itself a breach of privacy, such as the current transfer of airline passenger data between the European Union and the United States of America (2007), is another discussion. Here we focus on
the three components of data-processing that are
important in a privacy evaluation.
Operational Privacy Assessment Model
In the following, we present a method to assess
the level of privacy protection offered by government IT systems that contain sensitive data. The
model defines a method to identify and highlight
privacy issues either during the development of the
system or as an evaluation of an existing system.
The method gives a standardised evaluation of
privacy, which is not influenced by the current
state of technology or current moral standards.
This evaluation can then be used to apply more
focused solutions for upgrading privacy, solutions
which are tailored to the design of the specific
system and its functional requirements.
Applying the model to a system in its design
phase allows the system designer to evaluate the
privacy issues of the current design and the effects that different design choices may have on
privacy. Focusing on these issues already, before
implementation, makes it significantly easier to
ensure minimal privacy risks in the final system.
The method does not prescribe or recommend specific solutions to these privacy issues, but only highlights areas that could benefit from additional privacy protection. The method also produces
results in a simple graphical form that can be used
to compare two possible design alternatives for
the current system or two systems with similar
functionality, which can be useful when having
to choose between systems.
Risk profile which is a measure of the sensitivity of the data in a system. The measure
ranges from items of low sensitivity, for
example, phone numbers, to items of high
sensitivity such as medical records.
ID-separation is a measure of the degree to which identification data has been separated from the operational data in a system. The use of pseudonyms, instead of real names, would provide a low degree of separation, while the use of completely anonymized data would provide a high degree of separation.
This category of privacy is less about preventing privacy breaches and more about ensuring
that every person whose private data is collected
and processed is informed of this and given the
necessary knowledge and opportunity to ask
questions and raise objections.
Control: This category covers the controls that the system implements to check the use and storage of the data. This includes both the user's control of data correctness and external audits of the system. The category examines the user's own level of control over data in the system, but also the level of outside impartial controls to ensure that the system is only doing what it is supposed to do. Finally, the category examines the system's own functions to ensure that data is correct and coherent. The major focus areas of the category are:
[Tables 1-3. Standard classes for the Storage, ID-separation, and Communication components. Each component is graded from "No protection" (0%) through an intermediate level (for Storage, unencrypted "Sensitive data", 50%) to full protection (100%).]
the use of common identifiers. The standard surveillance classes are shown in Table 4.
Ordinary use: This category focuses on problems relating to the day-to-day operation of the system. This includes the education of staff with respect to privacy issues, the implementation of the necessary access controls to enforce the need-to-know principle, and the ability to make off-line copies of data that are not subject to the controls implemented by the system. The standard ordinary use classes are shown in Table 5.
Transparency: This category focuses on the registered person's ability to ascertain the purpose
[Tables 4-5 and the remaining class tables. Each component is graded on a similar scale. ID-separation ranges from "No protection: identification is possible" (0%) through "Responsible pseudonyms" (33%) and pseudonymized data (66%) to full separation (100%); development ranges up to "Developed with many external organizations and users, with high public attention" (100%); Outsourcing ranges from fully outsourced development (0%) through development outsourced "to a company that is covered by the same laws" (50%) to "Not outsourced" (100%). The other graded components are Common identifiers, Purpose, Education, Access control, Contacts, Information of rights, Information of data, User storage, Data correctness, and Audit (from "No audit", 0%, through "Occasional audit", 33%, to scheduled audits, 66% and 100%).]
Privacy Assessment Evaluation
When the privacy protection instruments of a given system are evaluated, it is important to examine both the quality of the privacy enhancing technologies employed in the system and the
Privacy Assessment Methodology
Producing the assessment result using the model
means that for each of the seven categories in
the model, the components must be assigned a
percentage value as described in the fifth section.
The specific values are found by establishing the amount of work done to protect privacy on a scale between no work and all that is possible. Such
values can be found through setting a number of
possible levels of work and assigning each level
a value. For example, to determine the amount
of work done to protect privacy through audits,
in the category control, four possible levels can
be used (see Box 1).
When determining which level is achieved by the system, the lowest denominator has precedence; for example, an occasional independent audit will only score 33%. Completing these evaluations for all categories and their components produces the values required to calculate the category scores.
Box 1.
1. No audits: 0%
2. Occasional audit: 33%
3. Internal audits by official schedule: 66%
4. Independent audits by official schedule: 100%
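As a minimal sketch, the level-to-score lookup and the aggregation of component scores into a category score might look as follows. The audit levels follow Box 1; aggregating component percentages by a plain mean is an assumption, since the chapter does not specify the aggregation formula.

```python
# Sketch of the scoring step described above. The audit levels follow
# Box 1; aggregating component percentages into a category score by a
# plain mean is an assumption, not specified in the chapter.

AUDIT_LEVELS = [
    ("No audits", 0),
    ("Occasional audit", 33),
    ("Internal audits by official schedule", 66),
    ("Independent audits by official schedule", 100),
]

def component_score(achieved_level):
    """Return the percentage for the level the system achieves.

    The lowest denominator has precedence: a system whose independent
    audits happen only occasionally is rated 'Occasional audit' (33%).
    """
    for name, score in AUDIT_LEVELS:
        if name == achieved_level:
            return score
    raise ValueError(f"unknown level: {achieved_level!r}")

def category_score(component_scores):
    """Aggregate component percentages into one category score."""
    return sum(component_scores) / len(component_scores)

print(component_score("Occasional audit"))   # 33
print(category_score([33, 50, 100]))         # 61.0
```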
Privacy Assessment Scenarios
In order to demonstrate the feasibility of the
proposed privacy assessment model, we have
defined two application scenarios, which illustrate
how our model may be used to assess the privacy
properties of these scenarios. In each scenario, we
provide a description of the setting and the immediate results of an assessment for each category
Scenario 1
The first scenario defines a simple medical journaling system used by doctors and small clinics. The data stored in the system is personal medical information, which is by nature very sensitive. The journaling system is integrated with the doctor's or small clinic's existing IT system, which also
stores the data. The system does not encrypt data
but the door to the office is locked after opening
hours. The system is developed by a small development company and is only used by a handful
of doctors and clinics. The system uses social
security numbers to identify patients and stores
any personal data that the doctors find relevant.
The clerks at the medical clinic have no special
training regarding management of personal information and they have the same access to the
system as the doctor. The system allows doctors
to print individual patients' journals so that they can bring them on a house call or take them home. Patients' data are entered into the system
the first time they visit the clinic and updated at
subsequent visits. The data in the system does
not expire and there is no external audit of the
system. The patients can, upon request, receive
a copy of their data but they are not otherwise
informed about the contents or functions of the
system and the patient has no direct control of
the stored data.
Assessment Results: Using our model, we get a specific result for each category, indicating the level of privacy risk in that category.
Data protection: While the overall score in
this category is relatively high, the model shows
an evident opportunity for improvement by
encrypting the stored private data. The system
scores 25%.
Sensitivity: The category notes some very
large risks inherent in a system with this sort of
Scenario 2
This scenario is similar to Scenario 1, but with a
different system. The journaling system in this
scenario is developed by the IT branch of a large
medical company which also hosts and serves the
data from its central server farms. These server
farms securely store the data and communicate with the clinics using encrypted channels. The
system is able to share data with hospitals and
provides online access for the patients to their
data. The system uses social security numbers
to identify patients and stores any personal data
the doctors find relevant. The clerks and doctors
ries of data protection, control, transparency, and ordinary use. While some of these improvements are technical solutions, such as encryption and role-based access control, many are the result of improved policies, practices, and awareness. This includes improved training, regular audits, and the effort to inform the registered individuals of what is going on. While these improvements can be more difficult to assess than technical solutions, such as the use of role-based access control, they may prove equally effective and help to significantly reduce privacy risks when technical solutions are not possible.
The comparison reveals that scenario two is in most areas superior to scenario one from a privacy perspective. While some of this is based on technical solutions that may not be easily adaptable to scenario one, many of the more practical improvements are. Transparency, ordinary use, and control are areas where scenario one could benefit by learning from scenario two.
While the system described in scenario one is likely to be cheaper, it comes with the extra cost of poor privacy protection.
Future Trends

Most people are, in one way or another, concerned about their privacy; this is especially true amongst
Conclusion
Privacy is a subjective concept, so its definition
and importance will vary from person to person.
The model presented in this chapter helps to standardise the work of securing privacy in electronic
systems. The model contains seven categories
that together cover all aspects of privacy. Each
category clarifies a range of questions concerning privacy and the model produces a simple
objective result in the form of a score system.
The score system makes it possible to assess the
overall privacy level in a system and to compare
the system to other similar systems. The score in
each category indicates the level of privacy risk
within that category, which helps the developers
and administrators of government IT systems
to identify the privacy factors that should be
addressed first. The score relates to the current state of privacy in the system, but it may also help determine how well the tested system can address future privacy problems. For example, if a system has a low score in sensitivity because the information it manages is highly sensitive, there is little hope for a better score in the future.
The standardisation and the overview of privacy risks provided by the model serve to help system owners understand the privacy risks in their
systems as well as help the individuals, whose
private data is being processed, to understand
their personal privacy situation. Furthermore, the
model addresses all the technical and operational
aspects which influence privacy in a system. The
model has been evaluated in a few real systems
by Jansen and Peen (2007), but we would like
to analyse a whole sector within a government
administration, in order to demonstrate the
general applicability of our model. This analysis
would also provide valuable information about
the general state of privacy in the government
sector. The proposed model focuses on government IT systems, which are governed by privacy legislation, and there are few direct motives for civil servants to ignore privacy laws. The privacy
References
Blaze, M., Feigenbaum, J., & Lacy, J. (1996).
Decentralized trust management. In Proceedings
1996 IEEE Symposium on Security and Privacy
(pp. 164-173).
Chaum, D. (1988). The dining cryptographers problem: Unconditional sender and recipient untraceability. Journal of Cryptology, 1(1), 65-75.
Congress of the United States. (2001). USA PATRIOT ACT of 2001. Retrieved July 16, 2007, from
http://thomas.loc.gov/cgi-bin/bdquery/z?d107:
H.R.3162
Jansen, T. W., & Peen, S. (2007). Privacy i offentlige systemer. Master's thesis, Informatics and Mathematical Modelling, Technical University of Denmark (in Danish).
Lederer, S., Mankoff, J., & Dey, A. (2003). Towards a deconstruction of the privacy space. In
Chapter V
Privacy and Trust in Online Interactions

Leszek Lilien, Western Michigan University, USA
Bharat Bhargava, Purdue University, USA

Abstract
Any interaction, from a simple transaction to a complex collaboration, requires an adequate level of trust between the interacting parties. Trust includes a conviction that one's privacy is protected by the other partner. This is as true in online transactions as in social systems. The recognition of the importance of privacy is growing, since privacy guarantees are absolutely essential for realizing the goal of pervasive computing. This chapter presents the role of trust and privacy in interactions, emphasizing their interplay. In particular, it shows how one's degree of privacy can be traded for a gain in the level of trust perceived by the interaction partner. After a brief overview of related research, the idea and mechanisms of trading privacy for trust are explored. Conclusions and future trends in dealing with privacy and trust problems complete the chapter.
Introduction
Any interaction, from a simple transaction to a complex collaboration, can be successful only if an adequate level of trust exists between
interacting entities. One of the more important
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
perception of a privacy threat from a collaborator may result in a substantial lowering of trust. In particular, any sharing of an entity's private information depends on satisfactory limits on its further dissemination, such as a partner's solid privacy policies. A privacy threat alone can impede sharing of sensitive data among the interacting entities, which results in reduced effectiveness of the interaction and, in extreme cases, even in termination of the interaction. For instance, a user who learns that an Internet service provider (ISP) has carelessly revealed any customer's e-mail will look for another ISP.
The idea and mechanisms of trading privacy for trust, the main topic of this chapter, are explored in the following section. It categorizes types of privacy-for-trust tradeoffs and shows how to exchange one's privacy for trust in an optimal way. The remaining sections, in turn, present our view of future trends in research on privacy and trust; include conclusions; present future research directions for privacy and trust in computing; include references; and suggest additional reading material that can supplement the topics of this chapter.
can simplify security problems by reducing complexity of interactions among system components,
both human and artificial ones.
2. Symmetric vs. asymmetric trust: Symmetric trust assumes that "X trusts Y" implies "Y trusts X," which is not true in general. Asymmetric trust does not assume such an implication and is, therefore, more general; symmetric trust can be viewed as its special case, which should be chosen only in very special circumstances or applications.
3. Gradual vs. binary trust: The former allows for degrees of trust, and can be defined on a multilevel or a continuous scale. The latter is an all-or-nothing proposition, which forces one to specify a single trust threshold above which full trust is assumed. Binary trust, as a special case of gradual trust, is insufficient in general; it can be assumed only in very special and limited settings.
4. Explicit vs. implicit trust: Implicit trust is used by either ignorant or naïve interaction parties. For instance, a user who downloads a file from an unfamiliar Web site trusts it implicitly by not even considering trust in a conscious way; the consequences might include penetration by malware. Explicit trust allows for its clear specification, assuring that trust considerations are not ignored. Given X's need for determin-
Caveats
Threats to Privacy
Related Work
Related Work on Privacy
Many conferences and journals, not only in the area of computer science and other technical disciplines, focus on privacy. We can mention only a few publications that affected our search for the privacy-for-trust solution presented in this chapter.
Reiter and Rubin (1999) use the size of the
anonymity set to measure the degree of anonymity for senders or receivers. The anonymity set
contains all the potential subjects that might have
sent or received data. The size of the anonymity
set does not capture the fact that not all senders
in the set have an equal probability of sending a
message. This may help the attacker in reducing
the size of the set of potential senders. Therefore, the size of the anonymity set may be a misleading measure, showing a higher degree of privacy than actually exists.
Another approach (Diaz, Seys, Claessens, &
Preneel, 2003; Serjantov & Danezis, 2003) uses
entropy to measure the level of privacy that a
system achieves. Differential entropy is used by
Agrawal and Aggarwal (2001) to quantify the
closeness of an attribute value, as estimated by
an attacker, to its original value. These papers
assume a static model of the attacker, in the sense that the attacker does not accumulate information by watching the system over time.
A. Trust Metrics
Trust cannot be built or verified without having
measures of trust or trust gain. We propose a
three-step method for defining a trust gain metric.
In the first step, we determine multilevel trust metrics with n trust levels, measured on a numeric scale from 1 to n, where n could be an arbitrarily large number. Such a metric is generic and applicable to a broad range of applications, with the value of n determined for a particular application or a set of applications. The case of n = 2 reduces multilevel
trust to the simplistic case of binary trust (it might
still be useful in simple trust-based applications),
with trust levels named, perhaps, full_service and
no_service. Selecting, for example, n = 5 results
in having five trust levels that could be named:
no_service, minimal_service, limited_service,
full_service, and privileged_service going from
the lowest to the highest level.
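As a minimal sketch, the n = 5 metric just described can be encoded as follows; the level names follow the text, while the threshold used for the binary special case is an illustrative assumption.

```python
# Sketch of the multilevel trust metric for n = 5, using the level
# names proposed in the text; levels are measured on a numeric scale 1..n.

TRUST_LEVELS = {
    1: "no_service",
    2: "minimal_service",
    3: "limited_service",
    4: "full_service",
    5: "privileged_service",
}

def binary_trust(level, threshold=4):
    """The n = 2 special case: a single threshold above which full
    trust (full_service) is assumed; otherwise no_service.
    The threshold value is an illustrative assumption."""
    return "full_service" if level >= threshold else "no_service"

print(TRUST_LEVELS[3])   # limited_service
print(binary_trust(5))   # full_service
print(binary_trust(2))   # no_service
```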
Protecting Privacy
Protecting privacy requires defining privacy
metrics as a prerequisite. Privacy measures are
discussed first, and methods for protecting privacy,
relying on metrics, are presented next.
A. Privacy Metrics
We cannot protect privacy if we do not know
how to measure it. This indicates the importance
of privacy metrics. More specifically, we need
a privacy metric to determine what degree of
data and communication privacy is provided by
given protection methods. The metric has to work
repeated patterns of data access can leak information to a violator; (iii) dynamics of violators, such as how much information a violator may gain by watching the system for some time; and (iv) costs associated with a metric implementation, such as injected traffic, delays, CPU cycles, and storage use.
We proposed two metrics for assessing the
privacy achieved by a given system: an anonymity
set size metric and an information-theoretic metric
also known as an entropy-based metric. The first
metric can provide a quick estimate of privacy,
while the second gives a more detailed insight into
the privacy aspects of the system it measures.
Figure 1. Hiding in a crowd underlies the metrics based on the anonymity set size

L = |A| · Σ_{i=1}^{|A|} min(p_i, 1/|A|)    (1)

H(A, t) = Σ_{j=1}^{m} w_j · ( −Σ_i p_i log2(p_i) )    (2)
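The two metrics can be sketched as follows. Equations (1) and (2) are partly garbled in the source, so this reading is an assumption: the effective anonymity set size is maximal (equal to |A|) when the attacker's distribution over the anonymity set is uniform, and the entropy metric is a weighted sum of per-attribute Shannon entropies.

```python
import math

# Sketch of the two proposed privacy metrics. The exact formulas are
# partly garbled in the source, so this reading is an assumption:
# Eq. (1): L = |A| * sum_i min(p_i, 1/|A|), maximal (= |A|) when the
#          attacker's distribution over the anonymity set A is uniform.
# Eq. (2): H = sum_j w_j * (-sum_i p_i log2 p_i), one probability
#          distribution per private attribute j, weighted by w_j.

def effective_anonymity_set_size(p):
    n = len(p)
    return n * sum(min(pi, 1.0 / n) for pi in p)

def weighted_entropy(distributions, weights):
    total = 0.0
    for w, dist in zip(weights, distributions):
        total += w * -sum(pi * math.log2(pi) for pi in dist if pi > 0)
    return total

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(effective_anonymity_set_size(uniform))       # 4.0 (the full crowd)
print(effective_anonymity_set_size(skewed) < 4.0)  # True: skew reduces privacy
print(weighted_entropy([uniform], [1.0]))          # 2.0 bits
```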
be triggered to destroy all private data. Data destruction is the ultimate way of preventing their
disclosure.
Let us add a bit more detail to this example.
Consider private data that consists of three attributes: a name, a social security number, and a
zip code. Each attribute has a domain of values.
The owner of private data first computes the
maximum entropy H* for all three attributes. The
owner also determines two values for entropy
mentioned above: the higher value H2 (the threshold for triggering controlled data distortions), and
the lower value H1 (the threshold for triggering
data destruction). Each time private data is shared or transferred from one entity to another, the new value of entropy, Hnew, is calculated using Equation (2). If Hnew stays above H2, no privacy-preserving actions are needed. If Hnew drops below H2 but stays above H1 (H2 > Hnew > H1), a controlled data distortion method is invoked to increase data entropy. Finally, if Hnew drops below H1 (H1 > Hnew), data destruction must be invoked to protect data privacy.
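The threshold scheme just described can be sketched as follows; the concrete H1 and H2 values are illustrative assumptions, since in practice the data owner derives them from the maximum entropy H*.

```python
# Sketch of the entropy-threshold scheme described above. The concrete
# threshold values H1 and H2 are illustrative assumptions; the data
# owner would derive them from the maximum entropy H* of the attributes.

H2 = 4.0   # below H2 (but above H1): trigger controlled data distortion
H1 = 2.0   # below H1: trigger data destruction

def privacy_action(h_new):
    """Decide the privacy-preserving action after a share/transfer,
    given the recomputed entropy H_new of the private data."""
    if h_new > H2:
        return "none"
    if h_new > H1:                      # H2 >= H_new > H1
        return "controlled data distortion"
    return "data destruction"           # H_new <= H1: ultimate protection

print(privacy_action(5.0))   # none
print(privacy_action(3.0))   # controlled data distortion
print(privacy_action(1.0))   # data destruction
```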
The example assumed that the size of the
private data attribute set is fixed. The entropy
metric can be extended to cases when the private
data attribute set is allowed to grow or shrink.
Case (d) in Figure 2, when compared with case (c), illustrates a situation in which the private data attribute set grew. This growth is shown by the larger diameter of the lighter circle, indicating a larger attribute set for sensitive data. The sizes
of subsets of disclosed attributes, indicated by
the darker circles, are identical in cases (d) and
(c); do not be fooled by the optical illusion that
the darker circle in (d) is smaller than the darker
circle in (c). As a result, entropy for case (d) is
higher than for case (c), as indicated by a higher
vertical bar for case (d). This utilizes the principle
of hiding in the (larger) crowd.
Entropy can be increased not only by increasing the size of the private data attribute set, as
just demonstrated, but also by making its subset
of disclosed attributes less valuable. For example,
A. Same-Strength and
Different-Strength Trust
As defined earlier, the strength of a party participating in the relationship is defined by the party's capability to demand private information from the other party, and the means available in
B. Same-Strength and
Different-Strength Privacy-for-Trust
Negotiations
To realize a privacy-for-trust tradeoff, two interacting parties, P1 and P2, must negotiate how much privacy needs to be revealed for trust. We categorize such negotiations as either: (1) same-strength, when both parties are of similar strength; or (2) different-strength, when one party's position is stronger vis-à-vis the other's. In turn, same-strength privacy-for-trust negotiations can be either: (1a) privacy-revealing negotiations, in which parties disclose their certificates or policies; or (1b) privacy-preserving negotiations, in which parties preserve the privacy of their certificates and policies.
We compare all three kinds of privacy-for-trust negotiations, that is, (1a), (1b), and (2), in terms of their behavior during the negotiations. This
behavior includes defining trust level necessary
to enter negotiations, growth of trust level during
negotiations, and the final trust level sufficient
for getting a service.
Same-strength negotiations are very popular in the research literature. Different-strength negotiations, to the best of our knowledge, have first been defined explicitly by us.
B.1. Trust growth in same-strength trust
negotiations: Same-strength trust negotiations
involve partners of similar strength.
a. Trust growth in privacy-revealing same-strength trust negotiations: Negotiations of this type can start only if an initial degree of trust exists between the parties. They must trust each other enough to reveal to each other some certifi-
C. Privacy-for-Trust Optimization in
Different-Strength Trust Negotiations
The optimization procedure for trading privacy for trust in different-strength trust negotiations presented here follows our approach (Zhong & Bhargava, 2004; Zhong, 2005). It includes four steps:
The tradeoff problem changes to a multivariate problem if multiple attributes are taken into
consideration. It is possible that selecting nc1 is
better than nc2 for a1 but worse for a2. We assume
the existence of an m-dimensional weight vector [w1, w2, ..., wm] associated with these private
attributes. The vector determines the protection
priority for the private attributes a1, a2, ..., am,
respectively. We can minimize either: (a) the
weighted sum of privacy losses for all attributes
or (b) the privacy loss of the attribute with the
highest protection weight.
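Both minimization options can be sketched as follows; the loss values and the weight vector are illustrative assumptions:

```python
def weighted_privacy_loss(losses, weights):
    """Option (a): the weighted sum of per-attribute privacy losses."""
    return sum(w * l for w, l in zip(weights, losses))

def priority_privacy_loss(losses, weights):
    """Option (b): the loss of the attribute with the highest
    protection weight."""
    top = max(range(len(weights)), key=lambda i: weights[i])
    return losses[top]

def best_credential(candidates, weights, objective):
    """Choose the candidate whose per-attribute loss vector
    [loss(a1), ..., loss(am)] minimizes the given objective."""
    return min(candidates, key=lambda c: objective(c[1], weights))

# nc1 is better for a1 but worse for a2, as in the text; the loss
# numbers and the weight vector are illustrative.
candidates = [("nc1", [0.1, 0.6]), ("nc2", [0.4, 0.2])]
weights = [0.3, 0.7]  # a2 has the higher protection priority
name, _ = best_credential(candidates, weights, weighted_privacy_loss)
```

Because a2 carries the higher protection weight, both objectives prefer nc2 here even though nc1 leaks less about a1.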
Another factor affecting the tradeoff decision is
the purpose of data collection. It can be specified
in the service provider's privacy policy statement,
for instance, by using P3P (Marchiori, 2002).
Pseudonymous analysis and individual decision
are two data collection purposes defined in P3P.
The former states that collected information will
not be used in any attempts to identify specific
individuals. The latter states that information may
be used to determine the habits, interests, or
other characteristics of individuals. A user could
make different negotiation decisions based on the
stated purpose of data collection. Furthermore,
the service provider's trustworthiness to fulfill
the declared privacy commitment can be taken
into consideration.
C.2. Estimating privacy loss: We distinguish
two types of privacy losses: the query-dependent
and query-independent ones. Query-dependent
privacy loss for a credential nc is defined as the
amount of information that nc provides in answering a specific query. The following example
illustrates a query-dependent privacy loss for a
credential. Suppose that a user's age is a private
attribute. The first query asks: "Are you older
than 15?" The second query tests the condition
for joining a silver insurance plan, and asks:
"Are you older than 50?" If the user has already
presented a valid driver's license, we are 100%
sure that the answer to the first query is "yes,"
but the probability of answering "yes" to the
second query is lower.
Box 1.
PrivacyLoss_aj(ci | ∅) = Tag(ci)
PrivacyLoss_aj(ci | R) = lub( PrivacyLoss_aj(ci | ∅), PrivacyLoss_aj(R | ∅) )
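One hedged reading of query-dependent loss, consistent with the chapter's entropy-based privacy metric, measures the drop in an observer's uncertainty about the query answer once the credential is seen. The probabilities below, and the treatment of lub as a simple maximum over a totally ordered tag scale, are illustrative assumptions:

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy (bits) of a yes/no answer with P(yes) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def query_dependent_loss(p_yes_before: float, p_yes_after: float) -> float:
    """Privacy loss of a credential w.r.t. one query: the drop in the
    observer's uncertainty about the answer after seeing the credential."""
    return binary_entropy(p_yes_before) - binary_entropy(p_yes_after)

def lub(a: float, b: float) -> float:
    """Least upper bound of two loss values; on a totally ordered scale
    (assumed here) this is simply the maximum, as combined in Box 1."""
    return max(a, b)

# "Are you older than 15?" -- the driver's license settles the answer.
loss_q1 = query_dependent_loss(0.5, 1.0)   # full disclosure: 1 bit
# "Are you older than 50?" -- the license only shifts the probability.
loss_q2 = query_dependent_loss(0.5, 0.7)   # partial disclosure: < 1 bit
```

The same credential thus leaks a full bit about the first query but strictly less about the second, matching the example in the text.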
Conclusion
tradeoff between these two phenomena. An overview of problems facing a person wishing to trade
privacy for trust was followed by a description of
our proposed solution. It started with a look at
trust metrics and means for building and verifying
trust, and continued with a presentation of two
privacy metrics: an effective anonymity set size
and an entropy-based metric. We then discussed
technical means for protecting privacy.
We categorized the processes of trading
privacy for trust into same-strength privacy-for-trust negotiations and different-strength privacy-for-trust negotiations, dividing the former into
privacy-revealing and privacy-preserving subcategories. The described privacy-for-trust solution
is intended for optimization in different-strength
trust negotiations. It involves four steps: formulating the tradeoff problem, estimating privacy loss,
estimating trust gain, and minimizing privacy
loss for a required trust gain. We provided a brief
description of PRETTY, a system minimizing
privacy loss for a required trust gain.
In turn, for trust-related solutions, several further
research problems should be addressed (ibid).
Trust and privacy are strongly related to security. Therefore, in addition to the separate research
directions for privacy and trust specified, we can
also indicate threads of research common not only
to them, but also to security. This means research
on intersecting aspects of trust, privacy, and
security (TPS) (Bhargava et al., 2003). The first
common thread includes the tradeoffs, including
not only the tradeoff between privacy and trust,
but also performance vs. TPS, cost and functionality vs. TPS, and data monitoring and mining
vs. TPS. The second common thread contains
policies, regulations, and technologies for TPS.
This includes creation of flexible TPS policies,
appropriate TPS data management (including
collection, usage, dissemination, and sharing of
TPS data), and development of domain- and application-specific TPS approaches (such as TPS
solutions for commercial, government, medical,
and e-commerce fields). The third and the fourth
Acknowledgment
This research was supported in part by the NSF
Grants IIS-0242840, IIS-0209059, ANI-0219110,
and NCCR-0001788. The authors thank Dr. Yuhui
Zhong and Dr. Yi Lu for contributing input for
the subsections on Related Work, Trust Metrics,
Privacy Metrics, and Privacy-for-trust Optimization in Different-strength Trust Negotiations; as
well as Dr. Mohamed Hefeeda for contributing
Figure 2. Any opinions, findings, conclusions,
or recommendations expressed in the chapter are
those of the authors and do not necessarily reflect
the views of the funding agencies or institutions
with which the authors are affiliated.
References
Aberer, K., & Despotovic, Z. (2001). Managing
trust in a peer-2-peer information system. In
Proceedings of the 2001 ACM CIKM International Conference on Information and Knowledge
Management, Atlanta, Georgia (pp. 310-317).
New York: ACM.
Agrawal, D., & Aggarwal, C. (2001). On the
design and quantification of privacy preserving data mining algorithms. In Proceedings of
the Twentieth ACM SIGMOD-SIGACT-SIGART
Symposium on Principles of Database Systems,
PODS '01, Santa Barbara, California (pp. 247-255).
New York: ACM.
The American Heritage Dictionary of the English
Language (4th ed.). (2000). Boston: Houghton
Mifflin.
Barnes, G. R., Cerrito, P. B., & Levi, I. (1998). A
mathematical model for interpersonal relation-
http://www.niap-ccevs.org/cc-scheme/cc_docs/
cc_v21_part2.pdf
Collberg, C., & Thomborson, C. (2002). Watermarking, tamper-proofing, and obfuscation: Tools
for software protection. IEEE Transactions on
Software Engineering, 28(8), 735-746.
Cofta, P. (2006). Impact of convergence on trust
in ecommerce. BT Technology Journal, 24(2),
214-218.
Cover, T., & Thomas, J. (1991). Elements of
information theory. Hoboken, NJ: John Wiley
& Sons.
Cranor, L. F. (2003). P3P: making privacy policies more useful. IEEE Security and Privacy,
1(6), 50-55.
Cranor, L. F., Reagle, J., & Ackerman, M. S.
(1999). Beyond concern: Understanding net users'
attitudes about online privacy (Tech. Rep. No. TR
99.4.3). Middletown, NJ: AT&T Labs-Research.
Retrieved June 5, 2007, from http://citeseer.ist.
psu.edu/cranor99beyond.html
Diaz, C., Seys, S., Claessens, J., & Preneel, B.
(2003, April). Towards measuring anonymity. In R. Dingledine & P. F. Syverson, (Eds.),
Proceedings of the 2nd International Workshop
on Privacy Enhancing Technologies PET 2002,
San Francisco, CA. Lecture Notes in Computer
Science (Vol. 2482, pp. 184-188). Heidelberg,
Germany: Springer.
Donnellan, T. (1968). Lattice theory. Oxford, NY:
Pergamon Press.
Farrell, S., & Housley, R. (2002). RFC3281: An
Internet attribute certificate profile for authorization. The Internet Society. Network Working
Group. Retrieved June 5, 2007, from http://www.
ietf.org/rfc/rfc3281.txt
Fischer-Hübner, S. (2001). IT-security and privacy: Design and use of privacy-enhancing security
mechanisms. Lecture Notes in Computer Science,
1958. Heidelberg, Germany: Springer.
Chapter VI
Abstract
The current measures to protect e-consumers' privacy in Australia include (i) regulation/legislation, (ii)
guidelines, (iii) codes of practice, and (iv) activities of consumer associations and the private sector.
However, information about the outcomes of such measures has not been sufficiently reported, whereas
privacy incidents have increased. Some policy implications for e-consumer protection are drawn from
the analysis. Firstly, national privacy legislation should widen its coverage. Secondly, uniform regulations and guidelines could contribute to providing equal protection to e-consumers. Thirdly, guidelines
and codes of practice need to be supported by legislation and a proper compliance regime. Corporate
social responsibility by e-retailers is also required for effective adoption of self-regulatory measures.
Fourthly, consumer education is important to enhance consumer awareness of online privacy risks and
their ability to deal with such incidents. Finally, a combination of legal frameworks, technological, and
human-behaviour related measures is more likely to address online privacy issues effectively.
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Introduction
E-retailing has generated many benefits to both
e-retailers and e-consumers. At the same time,
it has also raised serious problems for the operation of the online market, especially consumer
protection. Among several problems with online
shopping, privacy concerns are key factors which
discourage consumers from shopping online
(Stoney & Stoney, 2003).
These concerns have been addressed by a
number of measures at both the international and
national levels. However, insufficient information
about these measures and the outcomes of such
measures has been reported.
This chapter examines the current measures to
protect consumers' privacy in the online market,
using Australia as a case study; examines the
current state of e-consumer protection regarding
privacy; and discusses policy implications for the
protection of e-consumers' privacy.
This chapter consists of four main sections.
The first section introduces three main privacy
issues, namely data security, spam/spim, and
spyware. The second section examines several
measures implemented at the international and
national levels to address privacy issues. In Australia, these measures include (i) legislation, (ii)
guidelines, (iii) codes of practice, (iv) initiatives by
the private sector, and (v) activities by consumer
associations. The effectiveness of the current
measures to address privacy concerns has been
examined in the third section by analysing the
current state of e-consumer protection in terms of
privacy. This section also discusses a case study,
using Dell as a subject of investigation. The final
section discusses the policy implications.
The findings suggest that although legislation,
guidelines, and codes of practice are available,
the effectiveness of these measures is limited.
Consumers are not confident to shop online
due to privacy and security concerns. Also, the
protection of consumers' personal information
depends on how e-retailers exercise their corporate social responsibility.
Background
This section first discusses three sub-issues of
concern in the protection of e-consumers' privacy.
It then introduces the concept of consumer rights,
and discusses justification for e-consumer protection. It also analyses the current framework for
e-consumer protection regarding privacy.
Privacy Issues
Privacy is one of the key issues in e-consumer
protection (Stoney & Stoney, 2003; Consumers
International, 2001; Jackson, 2003; Kehoe, Pitkow,
Sutton, Aggarwal, & Rogers, 1999). Internet users
are very concerned about how their personal and
financial data and medical history are collected,
used and disseminated (Consumers International,
2001; Jackson, 2003; Kehoe, Pitkow, Sutton, Aggarwal, & Rogers, 1999; Moghe, 2003). Many
consumers are very reluctant to reveal their
particulars because they do not want e-retailers
to misuse their personal information. However,
by adopting advanced technology, e-retailers can
easily collect personal and financial details of
e-consumers (Lynch, 1997). In addition, many
Web sites request e-shoppers to register or accept cookies which can help in tracking their
Internet itinerary (Yianakos, 2002). Privacy risks
can become a greater danger when e-retailers
share common databases (Egger, 2002). To make
things worse, many e-retailers have not published
privacy policies on their Web sites (Consumer Affairs Victoria, 2003; Consumer Affairs Victoria,
2004). For example, only 28% of the Web sites,
investigated in the Internet Sweep Day 2001, had
privacy policies (Australian Competition and
Consumer Commission, 2003).
This chapter focuses on three sub-issues,
namely data security, spam/spim, and spyware, affecting privacy due to their relevance
to e-consumer protection. Firstly, data security
refers to the security of personal and financial
information during the collection, usage, transmission, and retention stages of e-transactions.
Personal identity includes name, residential and
postal address, driving license, date of birth, social
security number, health card number, passport
number, birth certificate, contact number, contact and place of employment, mother's maiden
name, employee or student identification number,
and e-mail address of the workplace (Lawson &
Lawford, 2003; Milne, 2003). Identity can also
be username, password, cryptographic keys,
physical devices such as dongles, swipe cards, or
even biometric recognition (Marshall & Tompsett, 2005). Financial information includes bank
account number, credit card number, password,
personal identification number (PIN), and tax
file number (Lawson & Lawford, 2003; Milne,
2003). Identity theft occurs when a customer's
financial and personal information is illegally
collected and used by unscrupulous retailers
or unauthorised persons in order to impersonate
another for personal gain or to commit fraud
(Grabosky, Smith, & Dempsey, 2001; Lawson
& Lawford, 2003; Marshall & Tompsett, 2005;
Milne, Rohm, & Bahl, 2004; Smith, 2004). It has
been noted that more cases of identity theft have
been committed via electronic means (Grabosky,
Smith, & Dempsey, 2001; Ha, 2005; Metz, 2005).
Information can be unlawfully obtained during
the transmission process. Employees, independent
hackers, criminal individuals, and organised
crime rings, business competitors, saboteurs,
and cyber terrorists are possible intruders (Crime
Table 1. Consumer protection instruments of the UN, the OECD, the EU, and APEC (table body not recoverable from the source).
Sources: summarised from North American Consumer Project on Electronic Commerce (NACPEC). (2006). Internet consumer protection policy issues. Geneva: The Internet Governance Forum (IGF); and Harland, D. (1987). The United Nations guidelines for consumer protection. Journal of Consumer Policy, 10(2), 245-266.
International Level
The following section discusses legislation and
guidelines relating to consumer protection by the
UN, the OECD, the EU, and APEC (Table 1).
2004). This directive and self-regulatory privacy protection schemes aim to deal with spam
and other privacy incidents (Cheng, 2004). The
EU also acknowledged that sufficient consumer
protection in terms of privacy constitutes a fundamental right to consumers, and new areas of
protection had to be addressed so that the full
potential of e-retailing could be realised, and
both e-consumers and e-retailers could take full
advantage of the benefits which e-retailers could
offer (Buning, Hondius, Prins, & Vries, 2001).
However, Appendix A shows that the EU Principles
of Consumer Protection do not cover the protection of privacy.
Australia
In Australia, privacy issues had already been a
concern of the public and relevant authorities even
before the introduction of e-retailing. Privacy
concerns have been tackled by several measures,
including (i) legislation, (ii) guidelines, (iii) codes
of practice, (iv) initiatives by the private sector,
and (v) activities by consumer associations as
summarised in Table 2.
Source: Ha, H. (2007). Governance to Address
Consumer Protection in E-retailing (unpublished
thesis). Department of Management, Monash
University.
This section discusses the current policy
framework for protecting e-consumers privacy
and the institutional arrangement in Australia.
Legislation
Privacy protection is based on two main mechanisms: (i) general laws that regulate the collection, use, and dissemination of personal data
by both the public and private sectors and (ii)
different acts (Moulinos, Iliadis, & Tsoumas, 2004).
Table 2. The current regulatory framework to address privacy concerns in Australia (based on information in Ha, 2007, unpublished thesis)
Legislation: (activities not recoverable from the source)
Guidelines: Scams and Spam booklet and fact-sheets published by the ACCC and the CAV, respectively
Codes of practice: codes by the Australian Direct Marketing Association (ADMA) and the Internet Industry Association (IIA)
Initiatives by the private sector: privacy and digital seals and trust marks provided by TRUSTe, WebTrust, BBBOnline, and BetterWeb; use of the Platform for Privacy Preferences (P3P)
Activities by consumer associations: activities by the Australian Consumers' Association (ACA) and the Australian Privacy Foundation (APF)
Guidelines
The Treasury (Australia) published the Australian
Guidelines for Electronic Commerce (AGEC) in
March 2006, which has become one of the most
important documents promoting best practice
in e-retailing (The Australian Guidelines for
Electronic Commerce (AGEC) 2006 replaces the
Australian Best Practice Model (BPM) introduced
in 2000). The AGEC consists of 14 main provisions (see Table 3).
Table 3. The 14 main provisions of the AGEC (provision titles not recoverable from the source).
Guideline 9, Items 37 and 38 of the AGEC indicate that consumers' privacy must be respected
and protected. The AGEC encourages small
businesses, which are not under the scope of the
Privacy Act 1988 (Cth), to comply with privacy
legislation so that they can enhance consumer
trust and confidence (Treasury (Australia), 2006).
Nevertheless, some e-retailers do not want to adopt
government guidelines (Ha, 2007).
In 2004, the Australian Competition and Consumer Commission published a Scams and Spam
booklet and other educational material to inform
consumers about the types and the adverse effect
of scams and spam, and how to avoid scams and
spam (Australian Competition and Consumer
Commission, 2004). National enforcement of
consumer protection laws is undertaken by the
ACCC. Established in 1995, it acts independently
of ministerial direction as a national statutory
body to administer the implementation of the
TPA. The main function of the ACCC is to advance the interests of Australian consumers by
promoting fair, dynamic and lawful competition
among all kinds of businesses. The ACCC has
continuously advocated consumer rights and has
conciliated many complaints related to online
purchase (Graeme, 2005; OECD, 2001a).
Other countries, such as the USA, have enacted
a number of pieces of legislation to deal with
privacy issues in general. For example, the U.S.
Can-Spam Act deals with offences relating to
spam e-mails (Cheng, 2004; OECD, 2003). The
state of California (U.S.) has enacted the Anti-Phishing Act of 2005 (Business and Professions
Code sections 22948-22948.3) and the Computer
Spyware Act of 2004 (Business and Professions
Code section 22947), which prohibit
phishing activities and illegal installation or
provision of software that can collect personal
information of the recipients without their knowledge and/or consent (State of California, 2007).
The Online Privacy Protection Act of 2003
(Business and Professions Code sections 22575-22579) came into effect in July 2004.
National Level
In Australia, 62% of respondents in a survey
conducted by Roy Morgan Research in 2004
worried about privacy (Roy Morgan Research,
2004). Australia loses more than $1.1 billion per
year to identity fraud (The Age, 2006). The
National Australia Bank loses
$1 million per month due to Internet fraud, and
Evaluation
Dell does not fully comply with the regulation in
that Dell does not provide sufficient information
regarding identifiers, anonymity, cross-border
data flows, and sensitive information as required
by the NPPs (see the second section). This implies
that Dell has not fully respected consumers' safety
(the first basic consumer right) regarding privacy
(see the second section). Also, the contact number
and the physical address through which consumers
can communicate any privacy concerns to Dell are
in Singapore, and no name of any individual in
charge of privacy at Dell is indicated. This shows
that Dell fails to meet the accepted standard of
information disclosure (Clayton, 2000; Ha, 2007,
unpublished).
Finally, there has been insufficient formal
evaluation of how Dell's codes of conduct improve
the level of privacy protection. Also, most of Dell's
collaboration with industry and consumer associations has taken place in the USA, not in Australia.
Furthermore, insufficient information about how
Dell has worked with industry and consumer associations in Australia has been reported.
Generally, companies registered in Australia, except for small businesses, have to comply
with the Privacy Act 1988 (Cth).
Conclusion
This chapter has examined privacy issues associated with consumer protection in the online
market, including data security, spam/spim, and
spyware, and the policy framework for addressing online privacy incidents at both international
and national levels.
Although international and national policies
to address privacy issues are in place, the effectiveness of the current policy framework has not
been formally evaluated. However, the number of
online privacy incidents has steadily increased,
and most consumers do not know how to deal with
such incidents. The findings reveal that a single
organisation or a single measure is not adequate
to address the complicated and challenging issues
associated with online privacy. A joint effort of
all stakeholders and adoption of a combination of
different measures would be desirable to protect
e-consumers' privacy more effectively.
Given the lack of studies on online consumer
protection in terms of privacy, further research
on online privacy will certainly contribute to
the development of a theoretical framework and
practical approaches to solving persistent problems
with e-consumer protection regarding privacy.
The exposure of loopholes in the current privacy legislation has led the Australian government
to review it (Ruddock, 2006). This confirms the
potential for more research on the extent to which
new legislation could deter undesirable behaviour
relating to privacy.
The cross-border and transient nature of e-retailing justifies more research on how legislation
and guidelines could be adopted at a supra-national
level to more effectively prevent the abuse or
illegal use of e-customers' particulars.
In addition, the limited formal evaluation of
privacy protection measures in e-retailing suggests that they should be further investigated.
Finally, security and privacy issues are interrelated because a lack of security measures may
lead to the unwanted disclosure of customers'
personal and financial information. Addressing
any one of these issues separately is insufficient to
ensure that consumers' interests are fully protected.
Thus, they must be investigated as an integrated
problem and addressed simultaneously.
References
Expert Group on Electronic Commerce (Australia). (2003). Review of building consumer sovereignty in electronic commerce: A best practice
model for business. Canberra, ACT: Treasury
(Australia).
Federal Bureau of Consumer Affairs (Australia).
(1993). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Federal Bureau of Consumer Affairs (Australia).
(1995). Your consumer rights. In K. Healey (Ed.),
Consumer rights (Vol. 38). Balmain NSW: The
Spinney Press.
Forder, J. (1999). The IIA code of practice: Co-regulation of the internet starts here. Retrieved
March 31, 2007, from http://epublications.bond.
edu.au/law pubs/38
Fraud costing Australia $1.1b a Year. (2006, April
7). The Age.
Gilliams, H. (2003, October). Self regulation by
liberal professions and the competition rules.
Paper presented at the Regulation of Professional
Services Conference organized by the European
Commission, Brussels.
Grabner-Kraeuter, S. (2002). The role of consumers' trust in online-shopping. Journal of Business
Ethics, 39(1-2), 43-50.
Grabosky, P., Smith, R. G., & Dempsey, G. (2001).
Electronic theft: Unlawful acquisition in cyberspace. Cambridge and New York: Cambridge
University Press.
Graeme, S. (2005). 30 years of protecting consumers and promoting competition. Keeping Good
Companies, 57(1), 38-41.
Greenleaf, G. (2000a). Private sector bill amendments ignore EU problems. Retrieved October
2007, from http://www.austlii.edu.au/au/journals/
PLPR/2000/30.html
Greenleaf, G. (2000b). Safe harbor's low benchmark for adequacy: EU sells out privacy for
U.S.$. Retrieved October 2, 2007, from www.
austlii.edu.au/au/journals/PLPR/2000/32.html
Ha, H. (2005, October). Consumer protection in
business-to-consumer e-commerce in Victoria,
Australia. Paper presented at the CACS 2005
Oceania Conference, Perth, WA.
Ha, H. (2006, September-October). Security issues
and consumer protection in business-to-consumer
e-commerce in Australia. Paper presented at
the 2nd Australasian Business and Behavioural
Sciences Association International Conference:
Industry, Market, and Regions, Adelaide.
Ha, H. (2007). Governance to address consumer
protection in e-retailing. Unpublished doctoral
thesis, Monash University, Melbourne, Victoria.
Harland, D. (1987). The United Nations guidelines
for consumer protection. Journal of Consumer
Policy, 10(2), 245-266.
Harland, D. (1999). The consumer in the globalised
information society: The impact of the international organisations. Australian Competition and
Consumer Law Journal, 7(1999), 23.
Haymarket Media. (2007). Spam hits records levels in February. Retrieved March
20, 2007, from http://www.crn.com.au/story.
aspx?CIID=75798&r=rss
Huffmann, H. (2004). Consumer protection in
e-commerce. University of Cape Town, Cape
Town.
Information Law Branch. (undated). Information
paper on the introduction of the privacy amendment (private sector) bill 2000. Barton, ACT:
Attorney Generals Department (Australia).
Internet Industry Association (Australia). (2006a).
Content code. Retrieved March 31, 2007, from
http://www.iia.net.au/index.php?option=com_c
ontent&task=category§ionid=3&id=19&It
emid=33
Internet Industry Association (Australian).
(2006b). About the IIA. Retrieved March
24, 2007, from http://www.iia.net.au/index.
php?option=com_content&task=section&id=7
&Itemid=38
Jackson, M. (2003). Internet privacy. Telecommunications Journal of Australia, 53(2), 21-31.
James, M. L., & Murray, B. E. (2003). Computer
crime and compromised commerce (Research
Note No. 6). Canberra, ACT: Department of the
Parliamentary Library.
Kaufman, J. H., Edlund, S., Ford, D. A., & Powers,
C. (2005). The social contract core. Electronic
Commerce Research, 5(1), 141-165.
Kehoe, C., Pitkow, J., Sutton, K., Aggarwal, G., &
Rogers, J. D. (1999). Results of GVU's tenth world
wide web user survey. Retrieved November 16,
2006, from http://www.gvu.gatech.edu/user_surveys/survey-1998-10/tenthreport.html
Kim, S., Williams, R., & Lee, Y. (2003). Attitude
toward online shopping and retail website quality: A comparison of US and Korean consumers.
Journal of International Consumer Marketing,
16(1), 89-111.
Krone, T. (2005). Concepts and terms. Canberra:
The Australian High Tech Crime Centre.
Krone, T. (2006). Gaps in cyberspace can leave
us vulnerable. Platypus Magazine, 90 (March
2006), 31-36.
Lahey, K. (2005, August 30). Red tape on a roll...
and it must stop. The Age, 8.
Lawson, P., & Lawford, J. (2003). Identity theft:
The need for better consumer protection. Ottawa:
The Public Interest Advocacy Centre.
Lekakis, G. (2005). Computer crime: The
Australian facts and figures. Retrieved April
Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers protection of online privacy and identity.
Journal of Consumer Affairs, 38(2), 217-232.
Moghe, V. (2003). Privacy management: A new
era in the Australian business environment. Information Management & Computer Security,
11(2), 60-66.
Moores, T. (2005). Do consumers understand the
role of privacy seals in e-commerce? Communication of the ACM, 48(3), 86-91.
Moulinos, K., Iliadis, J., & Tsoumas, V. (2004).
Towards secure sealing of privacy policies. Information Management & Computer Security,
12(4), 350-361.
Muris, T. J. M. (2002, October). The interface of
competition and consumer protection. Paper presented at the Fordham Corporate Law Institutes
Twenty-Ninth Annual Conference on International Antitrust Law and Policy, New York.
National Consumers League (USA). (n.d.). Essentials for online privacy. Retrieved June 25,
2007, from http://www.nclnet.org/technology/essentials/privacy.html
National Office for the Information Economy
(Australia). (2003). Spam: Final report of the
NOIE review of the spam problem and how it
can be countered. Canberra, ACT: Department
of Communication, Information Technology and
the Arts.
North American Consumer Project on Electronic
Commerce (NACPEC). (2006). Internet consumer
protection policy issues. Geneva: The Internet
Governance Forum (IGF).
NSW Office of Fair Trading. (2003). International
consumer rights: The world view on international
consumer rights. Retrieved November 15, 2006,
from http://www.fairtrading.nsw.gov.au/shopping/shoppingtips/internationalconsumerrights.
html
Acts
Anti-Phishing Act 2005 (California)
Can-Spam Act 2003 (USA)
Computer Spyware Act 2004 (California)
Online Privacy Protection Act 2003 (California)
Privacy Act 1988 (Public sector) (Cth)
Privacy Amendment (Private Sector) Act 2000
(Cth) sch 3
Spam Act 2003 (Cth)
Telecommunications (Interception and Access)
Act 1979 (Cth)
Additional Reading
Iannuzzi, A. (2002). Industry self-regulation
and voluntary environmental compliance. Boca
Raton: Lewis Publishers.
Ang, P. H. (2001). The role of self-regulation of
privacy and the internet. Journal of Interactive
Advertising, 1(2), 1-11.
Australian Privacy Foundation. (2006). Identity
checks for pre-paid mobile phones. Retrieved
April 2, 2007, from http://www.acma.gov.au/
webwr/_assets/main/lib100696/apf.pdf
Business for Social Responsibility. (2005, April).
Privacy (consumer and employee).
Retrieved December 8, 2006, from http://
www.bsr.org/CSRResources/IssueBriefDetail.
cfm?DocumentID=50970
Chung, W. C., & Paynter, J. (2002, January).
Privacy issues on the internet. Paper presented
at The 35th Hawaii International Conference on
System Sciences, Hawaii.
Appendix A
Principles and Guidelines of Consumer Protection by the United Nations (UN), Organisation
for Economic Co-operation and Development (OECD), European Union (EU), and Asia Pacific
Economic Cooperation (APEC) (based on information in Department of Economic and Social
Affairs (UN), 2003; OECD, 2000; European Commission, 2005; Consumer Protection Commission, E. Y. T., n.d.)
Table 1A. (The original four-column layout for the UN, OECD, EU, and APEC could not be recovered; the listed principles include: physical safety; international cooperation; promotion and protection of consumers' economic interests; online disclosures of information about the business, the goods or services, and the transaction; distribution facilities for essential consumer goods and services; confirmation process; measures enabling consumers to obtain redress; payment; education and information programs; promotion of sustainable consumption; privacy; resolution of consumer disputes; measures relating to specific areas; and security.)
Sources: (a) Department of Economic and Social Affairs (UN). (2003). United Nations Guidelines for Consumer Protection
(as expanded in 1999). New York: United Nations.
(b) OECD. (2000). Guidelines for Consumer Protection in the Context of Electronic Commerce. Paris: OECD.
(c) European Commission. (2005). Consumer Protection in the European Union: Ten Basic Principles. Brussels: European
Commission.
(d) Consumer Protection Commission, E. Y. T. (undated). E-Commerce: APEC Voluntary Online Consumer Protection Guidelines.
Consumer Protection Commission, Executive Yuan (Taiwan). Retrieved April 3, 2007, from http://www.cpc.gov.tw/en/index.
asp?Pagenumber=25
Appendix B
Summary of Eight Principles of Consumer Protection in Canada (based on information in Working
Group on Electronic Commerce and Consumers (Canada), 1999)
Table 2B. (Of the eight principles, those recoverable from the source are: information provision; contract formation; privacy; redress; liability; and consumer awareness.)
Sources: Working Group on Electronic Commerce and Consumers (Canada). (1999). Principles of Consumer Protection for
Electronic Commerce: A Canadian Framework. Ottawa: Canadian Bankers Association.
Chapter VII
Abstract
Individuals are generally concerned about their privacy and may refrain from disclosing their personal information while interacting with online vendors. Withholding personal information can prevent online vendors from developing the profiles they use to match consumers' needs and wants. Through a literature review of research on online privacy, we develop an integrative framework of online privacy protection.
Introduction
The latest report on e-commerce by the U.S.
Census Bureau (2007) shows that although there
has been an increase in online purchasing by individuals, the share of online consumer e-commerce in total retail sales remains far smaller than the
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Background
Research has shown that privacy concerns act as
a hindrance to the growth of electronic commerce
(Hoffman, Novak, & Peralta, 1999; Miyazaki &
Fernandez, 2001). In countering privacy concerns,
the Federal Trade Commission has primarily relied
upon fair information practices to guide privacy
regulation in the United States (Milne, 2000).
Fair information practices include the following: notice of the firm's information practices regarding what personal information will be collected and how the collected information will be used; choice or consent regarding the secondary use of the information; access for users to view their own data collected by companies; security of the collected data; and enforcement to ensure that companies comply with fair information practices.
Research shows that fair information practices have not been effective in alleviating the
privacy concerns of consumers (Culnan, 2000).
In the absence of stricter laws to ensure privacy,
consumers adopt differing strategies to protect
their identity online, for instance, falsification,
passive reaction, and identity modification (e.g.,
Sheehan & Hoy, 1999). For the purposes of this chapter, the strategies adopted by consumers to protect their identity are grouped under the general term "privacy protection behavior" in an online environment.
Table 1. Summary of the reviewed studies of online privacy (continued across pages): authors and year, research method, sample size, independent variables (IV), dependent variable (DV), theoretical framework, and findings. The studies tabulated include Hoffman et al. (1999); Sheehan and Hoy (1999, 2000); Miyazaki and Fernandez (2000, 2001); Culnan (2000); Milne (2000); McKnight and Chervany (2001); Belanger et al. (2002); Milne and Culnan (2002); Ranganathan and Ganapathy (2002); Sheehan (2002); Liu et al. (2004); Malhotra, Kim, and Agarwal (2004); Olivero and Lunt (2004); Dinev and Hart (2006); and Awad and Krishnan (2006). Methods were predominantly surveys, with some experiments, interviews, and conceptual studies; sample sizes ranged from roughly 100 to 889. Theoretical framings include fair information practices, the technology acceptance model, contemporary choice theory, and the privacy calculus; dependent variables include purchase likelihood or intent, willingness to disclose information, willingness to be profiled, and privacy-related consumer behavior (e.g., falsifying information, reading unsolicited e-mail). Representative research questions include: "How do information privacy concerns affect the growth and development of consumer-oriented commercial activity on the Internet?" and "Do information transparency features, which provide knowledge of information and procedures, affect willingness for information disclosure?"
Review of Findings
The methodology followed for this chapter was
a literature review. In this conceptual study,
the existing privacy and related literature was
analyzed to identify existing frameworks and
variables related to online privacy. In the review of the literature, we retained studies in which privacy was examined in an online context and the unit of analysis was the individual or the online consumer. The results of the literature review are presented in Table 1. The research articles considered for the review were published from 1999 onwards. This cutoff was chosen because the popular media has been rife with coverage of consumers' heightened privacy concerns since that time. Most of the studies included in the review used a survey methodology, while experiments were the second most frequently used method.
Our review of the privacy literature revealed that most research has studied consumers' willingness to disclose information in light of their privacy concerns (Dinev & Hart, 2006; Hui et al., 2007; Malhotra et al., 2004; Milne & Culnan, 2002; Olivero & Lunt, 2004). Another group of studies examined consumers' willingness to purchase in light of privacy concerns (Belanger et al., 2002; Miyazaki & Fernandez, 2001; Suh & Han, 2003). Very few studies have directly examined privacy protection behavior (Chen & Rea, 2004). The review of current research shows that privacy concerns affect the disclosure of information and the purchase intent of consumers (Belanger et al., 2002; Malhotra et al., 2004). The reviewed literature gives us insight into how the privacy construct is used with related constructs from different perspectives. We therefore propose an integrative framework of privacy, which would help to study more completely the impact of privacy on consumer behavior. As outlined in the beginning, the proposed framework will
Privacy Protection
Approaches taken by individuals to protect their personal information online may be passive or active. Passive protection involves depending upon external entities, such as government or private institutions, rather than adopting any privacy protection oneself. As the name suggests, active protection involves using different protective measures. Some privacy protection strategies are as follows: use personal firewalls, withhold information from a Web site, remove one's name and address from mailing lists, inform Web sites not to share information, avoid using a Web site, disable cookies, use anti-spyware tools, and provide false or incomplete information when registering on a Web site. Privacy protections can be viewed from three perspectives: prerogative, objective, and subjective.
Dependent Variable
Several factors that contribute to an individual's overall privacy protection behavior have been discussed in the literature (Chen & Rea, 2004). The first factor, falsification, refers to altering one's personal information and removing browser cookies when registering on Web sites. The second factor, passive reaction, refers to simply ignoring or deleting the intrusions of others. The third factor, identity modification, refers to changing one's personal identity by using gender-neutral identities or multiple identities when registering for online services.
Independent Variables
Our literature analysis showed that a wide range of variables has been used to predict online privacy protection behavior. These predictor variables can be classified as privacy concerns, Internet experience, demographics, and awareness of privacy issues, as shown in Figure 1.
Privacy Concerns
Framework
In this section, we propose an integrative framework for online privacy protection behavior.
The proposed framework, as shown in Figure 1,
builds upon prior research and integrates research
(Figure: the three perspectives on privacy protection (prerogative, objective, and subjective), which can be addressed by measuring the effectiveness of privacy protection strategies and by focusing on the factors that determine the adoption of specific privacy protection strategies.)
Internet Experience

As consumers become more experienced in using the Internet, they are likely to become familiar with privacy protection strategies. A relationship between an individual's Internet experience and their adoption of privacy protection strategies has been suggested in the literature, since Internet experience helps to increase behavioral control, which is considered significant in the prediction of behavior.
Demographics
Among the demographic variables of age, gender, and race, Chen and Rea (2004) found that gender and race are significant factors in privacy protection behavior. Their findings suggest that male consumers are more likely to falsify personal information than are female users. Their results further implied that the quality of personal data collected online may vary among racial groups. Phelps et al. (2000) found that among demographic variables such as gender, marital status, age, education, employment status, and income, only education was significantly associated with privacy concerns. They reported that respondents with vocational or some college education showed the highest levels of privacy concern. This further supports the contention that demographic variables may be related to privacy protection behavior.
Awareness
Consumers may be more likely to adopt privacy protection behavior if they are aware of the malpractices of online companies and of the extent and severity of the privacy violations that could occur. In their research on anti-spyware tools, Hu and Dinev (2005) found that awareness was a key predictor of anti-spyware adoption behavior. Users were likely to run anti-spyware tools only when they became aware that their personal computers were infected with spyware and when they were aware of the negative consequences posed by spyware. The concept of awareness has been defined as the initial stage in the innovation diffusion process.
Figure 1. Integrative framework of online privacy protection behavior: privacy concerns, Internet experience, demographics, and awareness as antecedents of online privacy protection behavior.
Discussion
The proposed framework provides a comprehensive approach to studying the online privacy protection behavior of consumers. There is a large literature assessing consumers' concerns about privacy, and it is well established that consumers are concerned about their privacy in general. What we need to understand is how consumers deal with these concerns. The framework proposes that, apart from privacy concerns, Internet experience, demographics, and awareness are also important antecedents for predicting online privacy protection behavior. Only those variables that have been researched in the past
Conclusion
This research was undertaken with the objective of
investigating how online privacy has been studied. A framework for online privacy protection
behavior was proposed based on the literature
review. The proposed framework provides us with
a roadmap to further analyze and understand the
privacy concerns of consumers and consequent
strategies taken by consumers to protect their
privacy online. Implications for research and
practice were discussed. Further, we hope that the
proposed research directions will help to encourage more research in this exciting area.
References
Awad, N. F., & Krishnan, M. S. (2006). The
personalization privacy paradox: An empirical
evaluation of information transparency and the
willingness to be profiled online for personalization. MIS Quarterly, 30(1), 13-28.
Belanger, F., Hiller, J. S., & Smith, W. J. (2002). Trustworthiness in electronic commerce: The role of privacy, security, and site attributes. Journal of Strategic Information Systems, 11(3-4), 245-270.
Additional Reading
Culnan, M. J. (1993). "How did they get my name?" An exploratory investigation of consumer attitudes toward secondary information use. MIS Quarterly, 17(3), 341-361.
Culnan, M., & Armstrong, P. (1999). Information privacy concerns, procedural fairness, and
impersonal trust: An empirical investigation.
Organization Science, 10(1), 104-115.
Greenaway, K. E., & Chan, Y. E. (2005). Theoretical explanations for firms' information privacy behaviors. Journal of the Association for Information Systems, 6(6), 171-198.
Henderson, S. C., & Snyder, C. A. (1999). Personal information privacy: Implications for MIS managers. Information & Management, 36(4), 213-220.
Section III
Empirical Assessments
Chapter VIII
Abstract
Protecting personal information while Web surfing has become a struggle. This is especially the case
when transactions require a modicum of trust to be successfully completed. E-businesses argue that
they need personal information so they can create viable data to tailor user interactions and provide
targeted marketing. However, users are wary of providing personal information because they lack trust
in e-businesses' personal information policies and practices. E-businesses have attempted to mitigate
user apprehension and build a relationship base in B2C transactions to facilitate the sharing of personal information. Some efforts have been successful. This chapter presents survey results that suggest
a relationship between gender and how users control personal information. The findings suggest that
e-businesses should modify information and privacy policies to increase information and transactional
exchanges.
Introduction
In the past few years we have witnessed the
competing interests of technological convenience,
personal privacy, and e-business needs. Consumers are finding that e-businesses are asking for, or taking, more personal information than they may be willing to give in order to utilize goods and
Background
Negotiating trust is especially important in this
study because all online interactions require
some level of trust between the consumer and
the e-business. From a simple hyperlink click
to a complex e-commerce purchase, trust must
be negotiated before users are willing to share
personal data in exchange for goods, services,
or information. E-businesses must understand
how to establish and maintain this trust in order
to be successful. A crucial enabling component
for trust is privacy.
Privacy
Privacy takes many forms. Some researchers view
it as a moral, legal, or consumer right (Goodwin,
1991; Papazafeiropoulou & Pouloudi, 2001; Han
& Maclaurin, 2002). Others view it within the
context of a social power struggle (Campbell
& Carlson, 2002), economic theory (Hemphill,
2002), or commitment-trust theory (Mukherjee
& Nath, 2007). Some view it as simply a need
to sustain personal space (Gumpert & Drucker,
1998; Clarke, 1999) or a necessary psychological
condition (Yao, Rice, & Wallis, 2007).
This study does not address the privacy needs
of personal space in terms of biometrics, video
monitoring, or workplace surveillance. Instead,
Trust
Before users choose to enter into a relationship
with a business, they must first be convinced that
it is in their best interest. Consumers look at discounts, reputation, and other factors before they
decide they will enter a physical store to conduct
business (So & Sculli, 2002; Duffy, 2005; Eastlick et al., 2006; Chen & Barnes, 2007; Román, 2007). There must also be some degree of initial trust before a user will enter into an e-business transaction.
networks to make purchasing decisions (Garbarino & Strahilevitz, 2004). Studies of women in
female-only discussion groups show women to
be more focused on self-disclosure and individual
opinions; women respond directly to others in
the discussion groups. Conversely, men in male-only discussion groups do not self-disclose, and instead argue to win discussions (Savicki, Kelley, & Lingenfelter, 1996).
Because the content of communication messages largely affects participants' willingness to share those messages with others, it is likely that women would prefer more privacy protection than men. Therefore, we propose the following hypothesis:
H1: Female users are more concerned with online
privacy practices than male users.
Consumers can be very uncomfortable sharing their personal data with others because they are sensitive to disclosing such data (Phelps et al., 2000; Han & Maclaurin, 2002; So & Sculli, 2002; Duffy, 2005; Aiken & Boush, 2006; Flavián & Guinalíu, 2006). The term information sensitivity refers to the level of privacy concern an individual feels for a type of data in a specific situation (Weible, 1993). Despite concerns about online privacy, consumers do realize that personal information is important to online marketers. Consequently, they are willing to provide such information when Web sites provide privacy statements explaining how the collected information will be used (Hoffman et al., 1999; Han & Maclaurin, 2002; Ashrafi & Kuilboer, 2005; Flavián & Guinalíu, 2006; Pan & Zinkhan, 2006). The disclosure of such privacy statements and similar constructs, such as trustmarks, is related to higher levels of involvement in e-commerce (Miyazaki & Fernandez, 2000; Noteberg et al., 2003; Hu et al., 2003; Kim et al., 2004; Patton & Jøsang, 2004; Moores, 2005; Aiken & Boush, 2006). Thus, online privacy issues are also related to (1) an individual's willingness to share personal
Study Procedures

Survey Procedure
The authors developed a survey (Table 4) based on the existing privacy literature. To the best of our knowledge, no existing study empirically covers online privacy issues focusing on users' preferences to control the unwanted presence of others (either actively or passively) together with concerns about privacy disclosure, revealing personal data, and online privacy practices. In specifying the domain of individual privacy concerns and users' ability to control, the authors used the literature review (detailed in a previous section) as the starting point for this instrument. Questions for three types of privacy concerns (revealing personal data, privacy practice, and disclosure of privacy policy) were adapted from Smith, Milberg, and Burke (1996) and others to suit the context of the current study. The user controls over the presence of others were derived from Goodwin (1991) and others. The survey instrument was distributed internally to faculty members who specialize in e-commerce and online privacy to ensure its content validity. A pilot study followed, and the instrument was further fine-tuned before implementation.
Survey Structure
The survey begins with a series of questions that assess the respondent's concerns over various privacy issues and his or her ability to control personal data and the presence of others. To reduce the likelihood of associating each participant's demographic data with a real individual, questions regarding identifiable information were separated from the main part of the survey. This survey is part of a larger study (Chen & Rea,
Table: Factor analysis results. Loadings for the survey items (lettered (A) through (X)) ranged from .39 to .90 across five factors. Variance explained by the five factors: 24.07%, 17.89%, 10.02%, 7.85%, and 6.90%; Cronbach's alpha: .90, .92, .84, .70, and .62, respectively.
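For readers unfamiliar with the reliability statistic reported here, Cronbach's alpha for a k-item scale is k/(k-1) x (1 - sum of item variances / variance of the summed scale). A minimal sketch in Python, using made-up responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item scale.

    items: one list per survey item, each holding the responses of the
    same respondents in the same order.
    """
    k = len(items)          # number of items in the scale
    n = len(items[0])       # number of respondents

    def var(xs):            # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # total score per respondent across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical 7-point responses for a three-item factor (not the study's data)
item_a = [7, 6, 5, 6, 2, 3]
item_b = [6, 6, 4, 7, 1, 2]
item_c = [7, 5, 5, 6, 2, 3]
print(round(cronbach_alpha([item_a, item_b, item_c]), 2))
```

Items that move together across respondents, as in this toy example, drive alpha toward 1; uncorrelated items drive it toward 0.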
Table: Descriptive statistics by gender for the five factors (mean, standard deviation, standard error, 95% confidence interval for the mean, minimum, and maximum).

Factor  Group   n   Mean  Std. Dev.  Std. Error  95% CI Lower  95% CI Upper  Min.   Max.
1       Female  26  -.08  1.20       .24         -.57          .40           -4.22  1.30
1       Male    71  .02   .93        .11         -.20          .24           -3.56  1.11
1       Total   97  -.05  1.00       .10         -.21          .20           -4.23  1.30
2       Female  26  .38   .75        .15         .08           .69           -1.30  1.36
2       Male    71  -.15  1.05       .12         -.40          .10           -2.22  1.48
2       Total   97  -.08  1.00       .10         -.21          .19           -2.22  1.48
3       Female  26  -.21  1.11       .22         -.66          .24           -2.64  1.31
3       Male    71  .07   .96        .11         -.16          .30           -2.60  1.65
3       Total   97  -.05  1.00       .10         -.21          .20           -2.64  1.65
4       Female  26  -.33  .96        .19         -.72          .05           -1.82  1.51
4       Male    71  .14   .99        .12         -.09          .37           -2.15  1.90
4       Total   97  .01   .99        .10         -.19          .21           -2.15  1.90
5       Female  26  .25   .77        .15         -.06          .56           -1.48  1.50
5       Male    71  -.06  1.03       .12         -.31          .18           -3.91  1.81
5       Total   97  .02   .98        .10         -.18          .22           -3.91  1.81
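The per-group statistics reported here (mean, standard deviation, standard error, and a 95% confidence interval for the mean) follow from the standard formulas SE = SD/sqrt(n) and CI = mean +/- t x SE, where t is the critical value for the group's degrees of freedom. A brief Python sketch with illustrative numbers, not the study's data:

```python
import math

def describe(xs, t_crit):
    """Mean, sample SD, standard error, and a mean +/- t*SE confidence interval."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))  # sample SD
    se = sd / math.sqrt(n)                                      # standard error
    return mean, sd, se, (mean - t_crit * se, mean + t_crit * se)

# Hypothetical standardized factor scores for a small group (not the study's data)
scores = [-0.6, 0.2, 0.5, -0.1, 0.9, -1.2, 0.3]
mean, sd, se, ci = describe(scores, t_crit=2.447)  # t for df = 6 at 95%
print(mean, sd, se, ci)
```

With group sizes like n = 26 and n = 71, the appropriate t values are roughly 2.06 and 1.99, respectively.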
Table: One-way ANOVA results by gender for the five factors.

Factor  Source           Sum of Squares  df  Mean Square  F      Sig.
1       Between groups   .22             1   .22          .22    .64
1       Within groups    96.55           95  1.02
1       Total            96.77           96
2       Between groups   5.45            1   5.45         5.69   .02
2       Within groups    90.91           95  .96
2       Total            96.36           96
3       Between groups   1.47            1   1.47         1.47   .23
3       Within groups    95.28           95  1.00
3       Total            96.75           96
4       Between groups   4.28            1   4.28         4.45   .04
4       Within groups    91.30           95  .96
4       Total            95.58           96
5       Between groups   1.86            1   1.86         1.952  .17
5       Within groups    90.59           95  .95
5       Total            92.45           96
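The gender comparisons reported here are one-way ANOVAs, where F is the ratio of the between-groups mean square to the within-groups mean square. A minimal sketch of that computation in Python, on made-up scores rather than the study's responses (obtaining the significance value would additionally require the F distribution, e.g. scipy.stats.f_oneway):

```python
def one_way_anova(groups):
    """Return (ss_between, ss_within, df_between, df_within, F)."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)

    # between-groups sum of squares: group size times squared mean deviation
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-groups sum of squares: deviations from each group's own mean
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, df_between, df_within, f_stat

# Hypothetical factor scores for two gender groups (not the study's data)
female = [0.4, 0.1, 0.6, 0.3, -0.2]
male = [-0.1, -0.3, 0.0, -0.4, 0.2, -0.2]
print(one_way_anova([female, male]))
```

With two groups (female and male), df between is always 1, matching the tabulated results.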
Table 4. (continued) Survey items, lettered (A) through (X), each rated on a seven-point scale ranging from strongly disagree to strongly agree (strongly disagree, disagree, somewhat disagree, neutral, somewhat agree, agree, strongly agree).
Future Trends
The ongoing tension between the e-business need
for personal information and user willingness
to provide personal information in exchange for
goods and services will increasingly demand our
attention. As more information becomes available
online for individual use, more information is
collected by those offering the services. Consider
Internet behemoths, such as Google, that offer a
powerful search engine, e-mail, map searches
(including actual street views), word processing
software, and various other tools (Google, 2007a),
all online for free access and use by anyone.
However, these features may come at a price of
which most users are unaware: privacy. Whether
Conclusion
Our study suggests that both female and male
users implement controls when utilizing online
services in the current e-business environment.
Males use multiple identities and techniques to
actively thwart data collection, whereas females
passively ignore information requests or altogether forgo participation. Whatever the case, if
e-businesses want to collect viable data in order
to improve online offerings and remain competitive, they must (1) implement an accessible and
easy-to-read privacy statement and (2) obtain
endorsement from well-known privacy groups
such as the BBBOnLine (BBBOnLine, 2007) and
TRUSTe (TRUSTe, 2007), as well as prominently
display the resulting certification logo. These two
items are the foundation upon which an e-business
can begin to form the initial trusting relationship
between itself and users.
Without trust between e-business and consumer, there can be no productive relationship (Duffy, 2005; Flavián & Guinalíu, 2006; Román, 2007). Consumers will share the required information if they understand and agree with the privacy policies and trust that an e-business will use their personal data only to improve the personalized offerings available to them. In the information age, most consumers are bombarded with useless information on a daily basis; customized offerings are a welcome change. However, if that trust is breached, an e-business will face an uphill battle to regain relationships with existing customers and to acquire new ones.
References
Kang, H., & Yang, H. (2006). The visual characteristics of avatars in computer-mediated communication: Comparison of internet relay chat
and instant messenger as of 2003. International
Journal of Human-Computer Studies, 64(12),
1173-1183.
Kilbourne, W., & Weeks, S. (1997). A socio-economic perspective on gender bias in technology.
Journal of Socio-Economics, 26(1), 243-260.
Kim, D. J., Steinfield, C., & Lai, Y. (2004). Revisiting the role of web assurance seals in consumer
trust. In M. Janssen, H. G. Sol, & R. W. Wagenaar (Eds.), Proceedings of the 6th International
Conference on Electronic Commerce (ICEC 04,
60) (pp. 280-287). Delft, The Netherlands: ACM
Press.
King, J., Bond, T., & Blandford, S. (2002). An
investigation of computer anxiety by gender
and grade. Computers in Human Behavior, 18,
69-84.
Krill, P. (2002). DoubleClick discontinues web
tracking service. ComputerWorld. Retrieved February 24, 2002, from http://www.computerworld.
com/storyba/0,4125,NAV47_STO67262,00.html
Large, A., Beheshti, J., & Rahman, T. (2002).
Gender differences in collaborative web searching
behavior: An elementary school study. Information Processing & Management, 38, 427-443.
Lauer, T., & Deng, X. (2007). Building online trust
through privacy practices. International Journal
of Information Security, 6(5), 323-331.
Leitzsey, M. (2006). Facebook can cause problems
for students. Online PacerTimes. Retrieved October 2, 2007, from http://media.www.pacertimes.
com/media/storage/paper795/news/2006/01/31/
News/Facebook.Can.Cause.Problems.For.Students-1545539.shtml
Levy, S., & Gutwin, C. (2005). Security through
the eyes of users. In Proceedings of the 14th In-
Meyers-Levy, J. & Maheswaran, D. (1991). Exploring differences in males and females processing strategies. Journal of Consumer Research,
18(June), 63-70.
Microsoft. (2003). Windows media player 9 series
privacy settings. Retrieved July 7, 2007, from
http://www.microsoft.com/windows/windowsmedia/player/9series/privacy.aspx
Miyazaki, A. D., & Fernandez, A. (2000). Internet
privacy and security: An examination of online
retailer disclosures. Journal of Public Policy &
Marketing, 19(Spring), 54-61.
Miyazaki, A. D., & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. Journal of Consumer Affairs,
35(1), 27-44.
Mukherjee, A., & Nath, P. (2007). Role of electronic trust in online retailing: A re-examination of the commitment-trust theory. European Journal of Marketing, 41(9/10), 1173-1202.
Ono, H., & Zavodny, M. (2005). Gender differences in information technology usage: A U.S.-Japan comparison. Sociological Perspectives, 48(1), 105-133.
Schuman, E. (2006). Gartner: $2 billion in ecommerce sales lost because of security fears.
eWeek.com, November 27. Retrieved October
15, 2007, from http://www.eweek.com/article2/0,1895,2063979,00.asp
Yao, M., Rice, R., & Wallis, K. (2007). Predicting user concerns about online privacy. Journal
of the American Society for Information Science
and Technology, 58(5), 710-722.
Additional Reading
Antón, A., Bertino, E., Li, N., & Yu, T. (2007). A roadmap for online privacy policy management. Communications of the ACM, 50(7), 109-116.
Bonini, S., McKillop, K., & Mendonca, L. (2007).
The trust gap between consumers and corporations. The McKinsey Quarterly, 2, 7-10.
Boyens, C., Günther, O., & Teltzrow, M. (2002).
Privacy conflicts in CRM services for online
shops: A case study. In Proceedings of the IEEE
International Conference on Privacy, Security,
and Data Mining Maebashi City, Japan, (Vol.
14, pp. 27-35).
Curtin, M. (2002). Developing trust: Online privacy and security. New York: Springer-Verlag.
Cvrcek, D., Kumpost, M., Matyas, V., & Danezis,
G. (2006). A study on the value of location privacy. In Proceedings of the 5th ACM Workshop
on Privacy in Electronic Society (pp. 109-118).
Alexandria, VA.
Drennan, J., Sullivan, G., & Previte, J. (2006).
Privacy, risk perception, and expert online behavior: An exploratory study of household end
users. Journal of Organizational and End User
Computing, 18(1), 1-22.
Earp, J., & Baumer, D. (2003). Innovative web
use to learn about consumer behavior and online
privacy. Communications of the ACM, 46(4),
81-83.
Electronic Frontier Foundation (EFF). (2007).
Privacy issues. Retrieved July 7, 2007, from
http://www.eff.org/Privacy/
Electronic Privacy Information Center (EPIC).
(2007). EPIC online guide to practical privacy
tools. Retrieved July 7, 2007, from http://www.
epic.org/privacy/tools.html
Frackman, A., Ray, C., & Martin, R. (2002). Internet and online privacy: A legal and business guide. New York: ALM Publishing.
Freeman, L., & Peace, A. (Eds.). (2005). Information ethics: Privacy and intellectual property.
Hershey, PA: Information Science Publishing.
Golle, P. (2006). Revisiting the uniqueness of
simple demographics in the U.S. population. In
Workshop on Privacy in the Electronic Society:
Proceedings of the 5th ACM Workshop on Privacy
in the Electronic Society (pp. 77-80). Alexandria,
VA.
Gross, R., Acquisti, A., & Heinz, H. (2005).
Information revelation and privacy in online
social networks. In Workshop on Privacy in The
Electronic Society: Proceedings of the 2005 ACM
Workshop on Privacy in the Electronic Society
(pp. 71-80). Alexandria, VA.
Kauppinen, K., Kivimäki, A., Era, T., & Robinson, M. (1998). Producing identity in collaborative virtual environments. In Proceedings of the
ACM Symposium on Virtual Reality Software and
Technology (pp. 35-42). Taipei, Taiwan.
Khalil, A., & Connelly, K. (2006). Context-aware
telephony: Privacy preferences and sharing patterns. In Proceedings of the 2006 20th Anniversary
Conference on Computer Supported Cooperative
Work (pp. 469-478). Banff, Alberta, Canada.
Landesberg, M., Levin, T., Curtain G, & Lev,
O. (1998). Privacy online: A report to congress.
Federal Trade Commission. Retrieved July 7,
2007, from http://www.ftc.gov/reports/privacy3/
toc.shtm
Lumsden, J., & MacKay, L. (2006). How does
personality affect trust in B2C e-commerce?
In ACM International Conference Proceeding
Series, Vol. 156: Proceedings of the 8th International Conference on Electronic Commerce: The
New E-Commerce: Innovations for Conquering
Current Barriers, Obstacles and Limitations to
Conducting Successful Business on the Internet.
(pp. 471-481). Fredericton, New Brunswick,
Canada.
Chapter IX
Abstract
This chapter looks at the literature (myths and realities) surrounding the demographics, psychological predispositions, and social/behavioral patterns of computer hackers, to better understand the harms that can be caused to targeted persons and property by online breaches. The authors suggest that a number of prevailing theories regarding those in the computer underground (CU), such as those espoused by the psychosexual theorists, may be less accurate than theories based on gender role socialization, given recent empirical studies designed to better understand those in the CU and why they engage in hacking and cracking activities. The authors conclude the chapter by maintaining that online breaches and online concerns regarding privacy, security, and trust will require much more complex solutions than currently exist, and that teams of experts in psychology, criminology, law, and information technology security need to collaborate to bring about more effective real-world solutions for the virtual world.
Introduction
Hackers are the elite corps of computer designers
and programmers. They like to see themselves
as the wizards and warriors of tech. Designing
software and inventing algorithms can involve
bravura intellection, and tinkering with them is as
much fun as fiddling with engines. Hackers have
their own culture, their own language. And in the
off-hours, they can turn their ingenuity to sparring
with enemies on the Net, or to the midnight stroll
through systems you should not be able to enter,
were you not so very clever. Dark-side hackers, or
crackers, slip into systems for the smash-and-grab,
but most hackers are in it for the virtuoso ingress.
It is a high-stress life, but it can be amazing fun.
Imagine being paid, well paid, to play forever
with the toys you love. Imagine. (St. Jude, Mondo
2000: User's Guide to the New Edge)
Since its appearance in the United States in
the second half of the twentieth century, the Internet has been the topic of arduous study in
a number of academic disciplines, including the
social sciences and criminology, business, law,
computer science, and political science. In recent
decades, as the Internet has expanded at unprecedented rates, and with different socio-economic
interests becoming increasingly involved, the
Internet's impact on global citizens' daily lives
has been profound. The Internet has become one
of the most important ways of communicating
internationally in real time (such as is the case
with online activism, known in the information technology field as "hacktivism"). Also, the
complex infrastructure of the Internet has, on
the positive side, facilitated a number of common
activities, such as e-commerce, Internet banking, online gaming, and online voting, and has
provided a more level political and economic
playing field for citizens residing in both developed and developing nations, particularly in
China, India, Russia, and Pakistan.
Background
Purpose of This Chapter
The purpose of this chapter is to summarize what
is known in the literature about the demographic,
psychological, and social/behavioral patterns of
computer hackers and crackers. This information
can improve our knowledge of cyber intruders
and aid in the development of effective techniques
and best practices to stop them in their tracks.
There is little question among IT security experts
that when it comes to privacy issues, hackers and
crackers are often ignored. In fact, if
crackers were to attack an international database containing highly sensitive personal and/or homeland
security information (with the 2005 LexisNexis
database exploit serving as a smaller-scale case in
point), such a large-scale exploit could cause massive
disasters affecting citizens across a multitude of
jurisdictions, including a critical infrastructure
failure. This chapter intends to shed
light on what is known about how hackers and
crackers generally tend to think and behave.
Specific topics covered in this chapter include:
Theoretical Framework
From its very beginning as an all-male Tech
Model Railroad Club in the 1960s at MIT, where
the geeks had an insatiable curiosity about how
things (and particularly how a slow-moving hunk
of metal called the PDP-1) worked, the CU has
to this day attracted predominantly men to its
fold. Back then, because of the PDP-1's turtle-like
pace, the smarter computer programmers at MIT
created what they called "hacks," or programming
shortcuts, to complete their computing tasks more
efficiently. In fact, the club's adoption of the term
"hacker" to describe themselves as well as their
acts indicated a creative individual who could
push the envelope around what computers
were designed to do. The club's talented hackers
became the seed of MIT's Artificial Intelligence
(AI) Lab, the world's prime center of AI research.
The AI lab's fame and influence spread fast after 1969,
the year in which the Advanced Research
Projects Agency Network (ARPANET), the first
transcontinental, high-speed computer network, was
created by the U.S. Defense
Department as an experiment in digital communications (Schell et al., 2002).
It is interesting to note that the positive, creative
reputation associated with those in the CU has
over the years taken on a negative connotation.
Since the 1980s, the media seem to have focused
on the darker side, frequently reporting the costs
due to property and personal harm as a result of
computer exploits by those in the CU. Moreover,
this rather less than positive picture has also been
painted by theorists trying to understand this
rather unique population.
One school of thought posited by psychosexual
theorists argues that hacking can be viewed as a
way for young men to fulfill their erotic desires
through electronic means (Taylor, 2003). This
notion is generated, in part, by stereotypical conceptions of hackers as introverted, socially inept,
or awkward males who have difficulty relating to
others (Furnell, 2002; Taylor et al., 2006). Certain
and female) earned, on average, $54,000. Moreover, the Black Hat and White Hat males tended
to work in large companies (with an average of
5,673 employees) and were generally (though not
exclusively) not charged with hacking-related crimes,
whereas the females (White Hat and Black Hat)
tended to prefer working in smaller companies,
with an average of about 1,400 employees.
Interestingly and consistent with previous
study findings and with the myth about hackers
that they value information and activities that
make them smarter, both the Black Hat and the
White Hat hackers in the Schell et al. (2002) study
tended to be self- and other-taught (like the founding hackers at MIT) and were quite well educated,
with at least a community college education. The
female White Hats and Black Hats, consistent with
the gender role socialization theory, admitted to
generally learning their computer skills later in life
at college or university, largely because they were
not steered in the technology direction by parents,
teachers, or career counselors. Also, consistent
with Meyer's (1989) earlier study suggesting that
neophyte hackers are drawn to computers from
an early age and tinker with them on their own
time, the most frequent response (39%, n = 83) to
the item asking the hackers (male and female) how
they learned their computer skills was that they
were self-taught. The next largest pocket (7% of
the respondents) said that they were self-taught,
completed formal courses and courses on the job,
and learned from friends and relatives.
Regarding the myth in the public domain about
odd sleeping patterns of hackers in the CU, a significant 79% of the hacker respondents (males and
females) said that they sleep sometime during the
night from midnight through 8 A.M., similar
to individuals' sleeping patterns in mainstream
culture. Thus, Schell et al. (2002) noted that this
myth about odd sleeping patterns was not supported
by their study findings.
Regarding prevailing culture within the CU,
Thomas (2002) has suggested, as earlier noted, that
those in the CU operate within a predominantly
Theoretical Framework
At the start of this chapter, we addressed a number
of concerns that industry and the general public
have about privacy, security, and trust. However,
in recent months, even those in the CU have
expressed concerns about harmful episodes that
have jeopardized their psychological safety and
could interfere over the longer term with their
personal safety. For example, in March 2007,
anonymous online death threats were levied
against Kathy Sierra, a popular Web developer
within the information technology community,
author, and blogger who encourages companies
to consider human behavior when designing their
technological products. While many bloggers rallied to her supportonline and off-line, a number
of women and men got online to talk about their
incidents of online bullying, harassment, and
stalking (Fost, 2007).
By definition, online bullying entails verbally
abusing targets by threatening to cause harm to
one's reputation; cyber harassment uses cyberspace
to harass a targeted individual; and cyber
stalking occurs when individuals repeatedly
deliver unwanted, threatening, and offensive e-mail or other personal communications online
to targeted individuals, including death threats
(Schell & Martin, 2006). All of these threats are
intended to cause psychological damage to others,
and some of these exploits may actually result in
death to the targets.
One of the most well-known cyber stalking
cases reported in the popular media involved a
young cracker named Eric Burns (a.k.a. Zyklon).
Eric Burns' claim to fame is that he attacked the
Web pages of about 80 businesses and government offices whose pages were hosted by Laser.Net
in Fairfax, Virginia. Burns, a creative individual,
designed a program called "Web bandit"
to identify computers on the Internet that were
vulnerable to attack. He then used the vulnerable systems to advertise his proclamations of
love for a young classmate named Crystal. These
computer exploits became Burns' way of
advertising worldwide his unrelenting love for [or,
more accurately, of getting the attention of and then
taking control of, or "taking root" of] Crystal. He
hoped that by proclaiming his love in the cyber
world he would get her attention, if
not her long-term commitment. This real-world
case ended with the 19-year-old male pleading
guilty to attacking the Web pages of NATO and
Vice President Al Gore. In November 1999, the
judge hearing the case ruled that Burns should
serve 15 months in federal prison for his cracking
exploits, pay $36,240 in restitution, and not be
allowed to touch a computer for 3 years after his
release. The irony in this case is that the young
woman named Crystal attended the same high
school as Eric but hardly knew him. In the end,
she assisted the authorities in his capture, yet Eric
Burns' role as a cyber stalker did not make
the media headlines; only the fact that he cracked
Web sites did (Reuters, 1999).
Mental health experts who assessed Eric Burns
declared that he likely felt more comfortable communicating online than in person, that he had difficulty
overcoming his fear of rejection by people in the
real world, that he lacked the social skills to repair
relationships, and that he was mocking authority by
saying something like, "I can put my favorite
girl's name on your Web site for everyone to see,
but you can't get me." In short, Eric Burns had a
number of pre-existing mental health issues that
he acted out online (Schell et al., 2002).
The case of Eric Burns is symbolic on a number of planes, including that most of what the
literature reports about the psychological profiles
of those in the CU has been gleaned from legal
Closing
This chapter has discussed a number of privacy,
security, and trust issues affecting online consumers. Clearly, understanding these issues and finding solutions for them (legislative, technological,
sociological, or psychological) is a complex chore
requiring experts in multiple fields, including law,
information technology security, business, and
the social sciences. Likely the only way forward in
finding more comprehensive solutions is to adopt
a team approach for finding hybrid solutions that
lie outside any one silo of expertise.
References
Associated Press. (2005). Business schools: Harvard to bar 119 applicants who hacked admissions
site. The Globe and Mail, March 9, B12.
Brenner, S. (2001). Is there such a thing as virtual
crime? Retrieved February 1, 2006, from http://
www.crime-research.org/library/Susan.htm
Cincu, J., & Richardson, R. (2006). Virus attacks
named leading culprit of financial loss by U.S.
companies in 2006 CSI/FBI computer crime and
security survey. Retrieved July 13, 2006, from
http://www.gocsi.com/press/20060712.jhtml
Cline, J. (2003). The ROI of privacy seals.
Retrieved October 12, 2007, from http://www.
computerworld.com/developmentopics/websitemgmt/story/0,10801,81633,00.html
Cohoon, J. M., & Aspray, W. (2006). A critical
review of the research on women's participation
in postsecondary computing education. In J. M.
Cohoon and W. Aspray (Eds.), Women and
information technology: Research on
underrepresentation (pp. 137-180). Cambridge,
MA: MIT Press.
Evans, M., & McKenna, B. (2000). Dragnet targets
Internet vandals. The Globe and Mail, February
10, A1, A10.
Jonczy, J., & Haenni, R. (2005). Credential networks: A general model for distributed trust and
authenticity management. Retrieved October
10, 2007, from http://www.lib.unb.ca/Texts/
PST/2005/pdf/jonczy.pdf
Jordan, T., & Taylor, P. (1998). A sociology of hackers. The Sociological Review, 46(4), 757-780.
Krebs, B. (2003). Hackers to face tougher sentences. Washington Post. Retrieved February 4,
2004, from http://www.washingtonpost.com/ac2/
wp-dyn/A35261-2003Oct2?language=printer
Additional Readings
The Cybercrime Blackmarket. (2007). Retrieved
July 11, 2007, from http://www.symantec.com/avcenter/cybercrime/index_page5.html
Florio, E. (2005). When malware meets rootkits.
Retrieved July 11, 2007, from http://www.symantec.com/avcenter/reference/when.malware.meets.rootkits.pdf
Grabowski, P., & Smith, R. (2001). Telecommunications fraud in the digital age: The convergence
of technologies. In D. S. Wall (Ed.), Crime and the
Internet (pp. 29-43). New York: Routledge.
enforcement, security and surveillance in the information age (pp. 1-14). New York: Routledge.
Chapter X
Abstract
This chapter introduces a situational paradigm as a means of studying online privacy. It argues that
data subjects are not always opposed to data users; they judge contexts before disclosing information. The chapter demonstrates this by examining online privacy concerns and practices within two contexts:
technology platforms and users' motivations. It explores the gratifications of online photo album users in
Taiwan and finds a distinctive "staging" phenomenon under the theory of uses and gratifications
and an a priori theoretical framework, the spectacle/performance paradigm. Users with diffused-audience gratifications are less concerned about privacy but do not disclose more of their information.
Furthermore, it finds that users act differently on diverse platforms, implying that studying the Internet as a
whole is problematic. The author proposes that studying online privacy through a situational
paradigm will enable better research designs for studying privacy and assist in understanding users'
behaviors across technology platforms.
Introduction
The common assumptions of the online privacy
concerns literature claim that net users who have
Background
Studying Privacy with the Situational Paradigm
The major flaw of the current definition of privacy is that it assumes that people are vulnerable
without considering situations; privacy
risks are therefore always deemed dangerous (Raab
& Bennett, 1998). According to the previous discussion, studying the nature of privacy, privacy
risks, and privacy protection within the adversarial
paradigm means one is always coping with new
privacy infringements. As Moor (1997) puts it,
the privacy concept has developed chronologically, and in the current computer age privacy
has become very "informationally enriched."
As such, there should be an updated approach
to studying privacy.
Moor, Raab, and Bennett have each moved the study of
the nature of privacy, privacy risks, and privacy
protection away from an adversarial paradigm toward
a situational paradigm, especially for Internet
settings. However, little research recognizes that
privacy concerns studies are still trapped in the
adversarial paradigm. Today, online privacy
concerns studies mostly adopt the off-line literature and try to find which kinds of Internet
users care more about their privacy by
using demographics as independent variables
(Hoffman, Novak, & Peralta, 1999; Kate, 1998;
Milne & Rohm, 2000; O'Neil, 2001; Sheehan,
2002). Nevertheless, the findings of the privacy
concerns literature focusing on demographics
usually conflict with each other. This implies
that privacy concerns are not static, but vary
with contexts.
Diffused Audiences
As Abercrombie and Longhurst (1998) argue, audience research has to take account of the changing
nature of audiences and social processes, which
current research ignores. The common uses and
gratifications approach (and the effects literature) is
categorized as a behavioral paradigm. According
to Hall (1980), this paradigm has a number of problems,
such as its lack of attention to power
relations, to the textual nature of media messages,
and to the understanding of social life. Hall's critical
approach to the study of the media is categorized
as the incorporation/resistance paradigm (IRP),
which examines how social structuring and social location
influence the decoding of media texts. The key argument concerns the extent to which audiences resist, or are
incorporated by, media texts in ideological terms.
However, overemphasizing the coherence of the
response to different texts is problematic.
Abercrombie and Longhurst (1998) argue that
the spectacle/performance paradigm (SPP) is
much better at understanding the changing audience and conceptualizations of the audience, and at
recognizing the audience's identity formation and
reformation in everyday life. They propose that
there are three different types of audiences: simple,
mass, and diffused. All of them co-exist.
The simple audience involves direct communication from performers to an audience. The mass
audience reflects more mediated forms of
communication. The diffused audience implies
that everyone becomes an audience all the time,
which entails people spending increasing amounts
of time in media consumption. Thus, the audience
interacts with the form of mediascapes, rather than
Data Collection
The survey sample was drawn from volunteers
recruited from the biggest online photo album
Web site in Taiwan, Wretch (http://www.wretch.cc).
The advertisement recruiting volunteers
was placed on the homepage with a hyperlink to
a Web survey interface. A sample of 893 users
participated in this Web survey. The sample
consisted of 91.3% of users (815) who had at least
one online photo album and 8.6% of users (77) who did
not have any. The average time spent
on the Internet was reported to be 5.64 hours per week.
The most frequently visited album types
were those of relatives (56.2%), "gorgeous persons" (48.0%),
celebrities (19.4%), and others.
Instrument Construction
This study draws on motives identified in previous studies of uses and gratifications from mass
media and the Internet, adds the virtual community factor found by Song and his colleagues,
and develops new items based upon Abercrombie
and Longhurst's diffused audience cycle. The
diffused audience questions are developed by
Data Analysis
The results are presented in the order of the
research questions and hypotheses. All analyses
were done using the SPSS 14.0 statistical program.
A principal component solution with varimax rotation
was adopted to find gratification groupings.
This chapter retains items with a primary
loading of 0.5 or higher and a secondary loading
no greater than 0.40. Each factor is extracted
with an eigenvalue greater than 1.0 and a minimum
reliability above 0.60. All hypotheses were
tested by Pearson product-moment correlations.
A regression analysis was employed to evaluate the relationship between privacy concerns
(dependent variable) and Internet gratifications
(independent variables).
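The extraction and retention rules just described (principal components on the item correlation matrix, Kaiser's eigenvalue-greater-than-1.0 criterion, varimax rotation, a 0.5 primary-loading cutoff, and a regression of privacy concerns on the resulting factors) can be sketched with numpy alone. The data below are randomly generated stand-ins, not the survey responses, and the item count is a placeholder; the study itself ran these steps in SPSS 14.0.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Rotate a loading matrix so each item loads mainly on one factor."""
    p, k = loadings.shape
    rotation = np.eye(k)
    last_objective = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p))
        rotation = u @ vt
        objective = s.sum()
        if last_objective and objective < last_objective * (1 + tol):
            break
        last_objective = objective
    return loadings @ rotation

# Stand-in data: 893 respondents x 24 hypothetical gratification items.
rng = np.random.default_rng(0)
items = rng.normal(size=(893, 24))
corr = np.corrcoef(items, rowvar=False)

# Principal component extraction; keep factors with eigenvalues > 1.0.
eigvals, eigvecs = np.linalg.eigh(corr)
order = eigvals.argsort()[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int((eigvals > 1.0).sum())
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(loadings)

# Retain items whose primary (largest absolute) loading is at least 0.5.
retained = np.abs(rotated).max(axis=1) >= 0.5

# Regression analogue: a privacy-concerns index (DV) on factor scores (IVs).
z = (items - items.mean(axis=0)) / items.std(axis=0)
scores = z @ rotated                 # crude factor-score estimates
concerns = rng.normal(size=893)      # stand-in dependent variable
X = np.column_stack([np.ones(893), scores])
beta, *_ = np.linalg.lstsq(X, concerns, rcond=None)
```

In practice one would use SPSS's FACTOR procedure (as the study did) or a dedicated package rather than hand-rolling the rotation; the sketch only makes the thresholds in the text concrete.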
Results
RQ1: What is the relationship between privacy
concerns and practices?
H1a: The more information users disclose, the
lower their privacy concerns.
H1b: The more visual cues users disclose, the
lower their privacy concerns.
H1c: The more visual cues users disclose (i.e., the
more pictures they post), the more information
they disclose.
It seems that the respondents are not greatly concerned
about privacy, as the means are all below 2
(see Table 1). However, respondents'
concern about their privacy varies considerably, as
the standard deviations are all above .600, and for
questions 1, 5, 7, and 12 the standard deviations are
over .800. To examine the details, all 12 questions
were summed into a general privacy concerns
measure and further analyzed
against gratifications and other variables.
The correlation between frequency of information
disclosure and general privacy concerns reveals
some critical points. First, privacy practices
do not parallel privacy concerns: the
coefficients are quite weak and contradictory (see
Table 2). Those who post more clear pictures of
themselves (-.130**) or clear pictures of their
friends or relatives (-.124**) hold fewer privacy
concerns. Second, users who usually disclose
visual cues on the Internet are less concerned
about privacy. However, even though family
members' information, contact information, and
demographic information are somewhat more
sensitive than everyday life and online contact
information, respondents with higher privacy
concerns unexpectedly disclose more of these
sensitive types of information. To clarify this
contradiction, it is necessary to examine the
differences between users who have online photo
albums and those who do not.
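The owner versus non-owner comparison that follows (Table 3) is a two-independent-samples mean comparison. Below is a minimal numpy sketch using a Welch-style t statistic; the sample sizes, means, and standard deviations are illustrative values echoing Table 3, generated randomly here rather than taken from the study's raw data, and the study's exact SPSS procedure may differ.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic and mean difference for two independent samples."""
    var_a, var_b = a.var(ddof=1), b.var(ddof=1)
    mean_diff = a.mean() - b.mean()
    t = mean_diff / np.sqrt(var_a / len(a) + var_b / len(b))
    return t, mean_diff

# Illustrative stand-in samples shaped like the album-owner comparison.
rng = np.random.default_rng(42)
owners = rng.normal(loc=1.53, scale=0.463, size=702)      # have an album
non_owners = rng.normal(loc=1.84, scale=0.873, size=63)   # do not
t_stat, mean_diff = welch_t(owners, non_owners)
```

With these parameters the statistic comes out negative, matching the direction of the privacy-concern comparison reported in Table 3.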
Table 1. General privacy concern items (means and standard deviations)

1. [...] (M = 1.85, SD = .887)
2. When a Web site asks me for personal information, I sometimes think twice before providing it. (M = 1.77, SD = .789)
3. Web sites should take more steps to make sure that unauthorized people cannot access personal information in their computers. (M = 1.48, SD = .687)
4. When people give personal information to a Web site for some reason, the Web site should never use the information for any other reason. (M = 1.34, SD = .637)
5. Web sites should have better procedures to correct errors in personal information. (M = 1.69, SD = .815)
6. Computer databases that contain personal information should be protected from unauthorized access, no matter how much it costs. (M = 1.35, SD = .638)
7. Some Web sites ask me to register or submit personal information before using them. It bothers me to give my personal information to so many Web sites. (M = 1.82, SD = .910)
8. Web sites should never sell the personal information in their databases to other companies. (M = 1.26, SD = .605)
9. Web sites should never share personal information with other Web sites unless it has been authorized by the individual who provided the information. (M = 1.31, SD = .627)
10. I am concerned that Web sites are collecting too much personal information about me. (M = 1.60, SD = .799)
11. The best way to protect personal privacy on the Internet would be through strong laws. (M = 1.50, SD = .781)
12. The best way to protect personal privacy on the Internet would be through corporate policies, which the corporations develop themselves. (M = 1.63, SD = .908)
[Table 2. Correlations between information disclosure and general privacy concerns: clear pictures of self (-.130**), clear pictures of friends or relatives (-.124**), demographic information (.126**), contact information (.183***).]
Table 3. Effect of having an online photo album or not on privacy concerns and information disclosure

Variable                                     Album   N     Mean   SD      t         Mean difference
General privacy concerns                     YES     702   1.53   .463    -4.544a   -.31
                                             NO      63    1.84   .873
Clear pictures of self (PCU)                 YES     746   3.72   1.253   4.526a    1.724
                                             NO      11    2.00   1.342
Clear pictures of friends/relatives (PFR)    YES     744   3.63   1.256   3.549a    1.360
                                             NO      11    2.27   1.618
Demographic information (DEM)                YES     734   2.21   1.159   -2.109c   -.786
                                             NO      10    3.00   1.886
Contact information (CON)                    YES     741   1.49   .912    -4.445a   -1.309
                                             NO      10    2.80   1.687
Online contact information (OCO)             YES     741   2.66   1.348   -.093     -.040
                                             NO      10    2.70   1.767
Family members' information (FAM)            YES     743   1.49   .832    -4.475a   -1.210
                                             NO      10    2.70   1.767
Everyday life (LIF)                          YES     743   3.53   1.124   .916      .330
                                             NO      10    3.20   1.687

a Significant at the 0.001 level (2-tailed). b Significant at the 0.01 level (2-tailed). c Significant at the 0.05 level (2-tailed).
[Table 4. Factor analysis of online photo album gratification items (principal components with varimax rotation): factor loadings, means, variance explained, and reliability. Recoverable item loadings include "Feel excited" (.520), "Feel entertained" (.754), "Feel relaxed" (.738), "Kill time" (.655), "Get noted" (.610), and "Find companionship" (.755).]
[Table 5. Pearson correlations between the 12 privacy concern items (and the general privacy concerns measure) and the gratification factors (MD, PF, NA, RM, AE, VC, FN, RE). a Correlation is significant at the 0.001 level (2-tailed). b Correlation is significant at the 0.01 level (2-tailed). c Correlation is significant at the 0.05 level (2-tailed).]
[Table 6. Regression analysis with the enter method; Table 7. Regression analysis with the stepwise method. Privacy concerns regressed on gratification factors; significant predictors include relationship maintenance, diversion, and reference.]
[Table: Pearson correlations between information disclosure practices (PCU, PFR, DEM, CON, OCO, FAM, LIF) and the gratification factors (PF, NA, RM, AE, VC, FN, RE, MD). a Significant at the 0.001 level; b at the 0.01 level; c at the 0.05 level (2-tailed).]
and protecting privacy through strong laws. Accordingly, it is quite reasonable that users manage
purely online relationships with special caution about
being cheated or having their real-world identity
divulged.
This in turn begs the question: which gratifications might lead users to care more about their
privacy? Looking at both the correlation and regression analyses, the relationship maintenance, diversion, and
reference gratifications predict privacy concerns,
explaining 10.4% of the variance. These three gratifications are
gained from using online photo albums privately
or sharing them with some in-group persons. Thus,
these users care about collection, data errors, unauthorized access, and secondary uses, and hope that
privacy can be protected by both strong laws and
corporate policy. The other gratifications
are not significant predictors at all. Moreover, relationship maintenance raises an interesting challenge
to understanding international/intercultural
differences in privacy (Taylor et al., 2000). Does
the in-group privacy phenomenon only happen
in Taiwan or Asian countries? Could it be applied
to other countries and cultures?
Through this research, the author tries to show
that current online privacy theories, frameworks,
and models are insufficiently context-aware. The situational
paradigm may offer a solution for online
privacy research. Two common questions are most
often raised about the situational paradigm. First, when it comes to contexts, there
are no concrete answers: it is not
possible to depict privacy definitively, because it depends on
the contexts. Second, how can we account for every
kind of context? The situational paradigm does
not mean contexts or situations are omnipotent.
We need to keep in mind that users' behavior
patterns and environmental risks are not static, but
dynamic, especially in Internet settings. Thus, it
would be unguarded and arbitrary to conclude
which types of persons care more about their privacy and disclose less information. The findings
also give a positive answer to the main question
of this book: there are no absolute truths on
Future Trends
In previous studies, Web site categories, regulation, culture/nation, demographics, physical
environment, and technology contexts have all been
examined, but many contexts remain uncovered.
It is crucial to find more contexts that are able
to predict Net users' behaviors on the Internet.
It additionally seems that the purposes of
using varied forms of computer-mediated communication (CMC) influence Net users' privacy
concerns and practices, so it is also essential to identify
their purposes in using CMC. Take this research
as an example: the uses and gratifications approach and the SPP provide a glimpse of how
users' individual needs influence their privacy
concerns and practices. Users who have more
private gratifications act, and are concerned, quite
differently from those who have more spectacle
and performance gratifications.
Researchers should not only establish the predictive
power of contexts, but also examine their
interrelationships. The next step is to use structural equation modeling (SEM) tools, such as LISREL,
to confirm these relationships.
Conclusion
This study explores the gratifications of online photo
album users in Taiwan and finds a distinctive
"staging" phenomenon using media gratifications
and an a priori theoretical framework, the spectacle/
performance paradigm (SPP). Media drenching,
performance, function, and reference are new
gratifications that no prior research has found.
The performance, narcissism, and virtual community
gratifications are consistent with the argument of
the diffused audience on the Internet.
This research then examines online privacy
concerns and practices within two contexts under the
situational paradigm: technology platforms (online photo album Web sites) and
users' gratifications (the staging phenomenon).
It finds that users' gratifications influence their privacy concerns and
practices accordingly. Users with more diffused-audience gratifications are less concerned about
privacy, but do not necessarily disclose more of
their information; they judge the contexts first
before disclosing information.
Studying online privacy therefore needs to
adopt the situational paradigm instead of the
adversarial paradigm. The findings carry implications for policy-making: online privacy
protection should not be fixed, but rather should
take human relativism into consideration.
References
Abercrombie, N., & Longhurst, B. (1998). Audiences: A sociological theory of performance and
imagination. CA: Sage Publications.
Barreto, M., & Ellemers, N. (2002). The impact of
anonymity and group identification on progroup
behavior in computer-mediated groups. Small
Group Research, 33, 590-610.
Bennett, C. J. (1992). Regulating privacy. Ithaca,
NY: Cornell University Press.
Hsu, C. W. (2002). Online privacy issues: Comparison between net users' concerns and Web
sites' privacy statements. Paper presented to the
52nd Annual Conference of the International Communication Association, Seoul, Korea.
Lea, M., Spears, R., & de Groot, D. (2001). Knowing me, knowing you: Anonymity effects on social
identity processes within groups. Personality and
Social Psychology Bulletin, 27, 526-537.
Lelia, G. (2001). Treating internet users as audiences: Suggesting some research directions.
Young, K. S. (1998). Internet addiction: The emergence of a new clinical disorder. CyberPsychology
& Behavior, 1(3), 237-244.
Additional Reading
Bennett, C., & Raab, C. (2006). The governance of
privacy: Policy instruments in global perspective
(2nd ed.). Cambridge, MA: The MIT Press.
Bird, S. E. (2003). The audience in everyday
life: Living in a media world. New York and London:
Routledge.
Section IV
Chapter XI
Abstract
While delivering content via the Internet can be efficient and economical, content owners risk losing
control of their intellectual property. Any business that wishes to control access to, and use of, its intellectual property is a potential user of digital rights management (DRM) technologies. DRM technologies control content
delivery and distribution, but they may affect users' privacy rights. Exactly how privacy should be preserved is
a matter of controversy; the main industry solution is the W3C Platform for Privacy Preferences
(P3P) initiative. But the issue remains unresolved, and industries and consumers are confronted with
several incompatible standards.
Introduction
The Internet has made an unprecedented impact
on daily life. Broadband subscriptions in the U.S.
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Digital Watermarking
Steganography is the art and science of hiding
a message in a medium, such as a digital image,
audio, or video file, in a way that defies detection. The application of steganography can be traced back
to antiquity. In the story of The 300 Spartans,
Demeratus needed to warn the Spartans that Xerxes, King of Persia, was about to invade Greece.
To send the message without detection, Demeratus
removed the wax from a writing tablet, wrote his
message on the wood underneath, and then covered
the message with wax, making the tablet look
like a blank one. While these kinds of methods
worked in times past, they were replaced in the
modern era by more sophisticated techniques such
as invisible inks and microdots. In the computer
age, digital images, as well as audio and video
files, offer a rich medium for hiding an almost
unlimited amount of data.
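As an illustration of how little machinery this takes, here is a toy least-significant-bit (LSB) scheme over raw bytes. The function names are mine; real steganographic tools operate on image or audio container formats and typically add encryption and error correction:

```python
def embed(carrier: bytearray, message: bytes) -> bytearray:
    """Hide `message` in the least-significant bits of `carrier`."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    stego = bytearray(carrier)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the lowest bit
    return stego

def extract(stego: bytearray, n_bytes: int) -> bytes:
    """Read the hidden message back out of the low bits."""
    bits = [stego[i] & 1 for i in range(n_bytes * 8)]
    return bytes(
        sum(bit << (7 - j) for j, bit in enumerate(bits[k:k + 8]))
        for k in range(0, n_bytes * 8, 8)
    )
```

Each carrier byte changes by at most one unit, which is imperceptible in typical 8-bit image or audio samples.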
The rise of peer-to-peer (P2P) networks has
been an inevitable outgrowth of the growth of the
Internet. Unfortunately, P2P networks have grown
from helpful tools in information sharing to havens
for unauthorized copies of copyrighted materials.
Digital content identification, or steganography,
is crucial in setting up controlled distribution
systems and provides an efficient means of copyright
protection. Steganographic techniques can be
divided into two categories: digital watermarking
and digital fingerprinting. While watermarking
is an arrangement of digital bits hidden in the
content of the carrier, fingerprinting derives a unique identifier from the characteristics of the content.
There is no perfect lossless compression. Watermarking techniques face a similar trade-off: the
stronger the embedded watermarking signal, the
easier it is to detect, but the more likely it is to
affect the quality of the reproduced content (Cox,
Miller, & Bloom, 2002).
Digital Fingerprinting
A digital fingerprint is a unique pattern that
describes the content for identification purposes.
Digital fingerprinting cannot prevent illegal copying, but it enables the copyright owners or content
distributors to track the recipients who leak or
redistribute the fingerprinted content. Unlike
watermarks, a fingerprint is not added into, but
is extracted from the existing characteristics of
the digital content. While the watermark for a
recorded song may be a faint background sound,
the fingerprint would be derived from the song's
tempo, rhythms, the length of verses or movements, the mix of instruments used, or other
features. Fingerprint identification works by
matching a small sample of digital content
against a database of fingerprints derived from
the original content. Since digital fingerprinting does not add
data to the content, it can be applied to content
that is already published. On the other hand,
as the fingerprint is derived from the content,
it cannot be used to store information about the
content like a watermark does. Furthermore,
taking the fingerprint of existing content does
require added work and equipment. Once the
fingerprint is created and stored in a database, it
could then perform similar functions as a digital
watermark, acting as a unique identifier for each
piece of content. In any case, to be persistent, a
watermark or fingerprint must be able to survive
any of the digital duplication and transformations
that the copyright infringer is likely to attempt
on the content.
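The idea of deriving an identifier from coarse, perceptually stable features can be sketched as follows. The specific feature used here, the direction of energy change between adjacent windows, is an illustrative stand-in for the acoustic features a real system would extract:

```python
import hashlib

def fingerprint(samples, window=50):
    """Derive a short fingerprint from coarse signal features: the
    direction of energy change between adjacent windows.  Small amounts
    of noise or re-encoding rarely flip these directions, so a lightly
    degraded copy still yields the same fingerprint as the original."""
    energies = [
        sum(s * s for s in samples[i:i + window])
        for i in range(0, len(samples) - window + 1, window)
    ]
    trend = "".join("1" if b > a else "0" for a, b in zip(energies, energies[1:]))
    return hashlib.sha256(trend.encode()).hexdigest()[:16]
```

Matching then reduces to a database lookup on the fingerprint value, which is why the technique works for content that is already in circulation.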
Rather than preventing illegal copying,
steganography is used to detect copying after
the fact. An example of such an application is traitor tracing (Chor, Fiat, Naor, & Pinkas, 2000).
On a positive note, this approach does not affect user privacy in terms of personal information
gathering and the actual use of digital content
can take place in total privacy, even off-line, free
of obstructions. However, apart from the obvious
security risks, such an approach constitutes a
more serious invasion of privacy by installing
software onto a user's computer, which is personal
property, without the user's full knowledge. Even if
the user consented to the EULA, one could argue that such an agreement is based on unconscionable
licensing terms. Distributors must be extremely
careful with their embedded DRM software, especially after the Sony BMG episode of creating
rootkit security vulnerabilities in millions of
users' computers (EFF, 2005). Embedded DRM
technology should always be subject to an independent and vigorous security review. Even with
such review, the strategy may not be worthwhile
given the legal liability and negative publicity
that could result.
DRM with encryption is less intrusive, as it
does not involve installing software on a user's
computer. Encryption keys are used to set and
automatically enforce limits on user behavior.
Different keys can have different privileges, and
the keys can even be tied to a particular device
or set of devices. Under this approach, the user
would submit the serial numbers of the devices
where content will be used. The encryption keys
can be generated using the submitted numbers as
a factor. This is a more complicated and persistent
approach, and it requires ongoing and periodic
contact between user and distributor. Users would
want to be able to continue using the content when
they replace an old device with a new one.
While the information collected under this scenario
is minimal and less sensitive, it does build
up a device-ownership database in which information can be mined. Users' privacy rights can be
affected, since the results of the data mining could
be sold to others.
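How a per-device key might be generated from the submitted serial numbers can be sketched with the standard library. The use of PBKDF2 and the field layout are illustrative assumptions; real DRM systems use their own, usually proprietary, key-derivation protocols:

```python
import hashlib

def device_key(master_secret: bytes, device_serial: str, content_id: str) -> bytes:
    """Derive a per-device, per-content encryption key, using the serial
    number the user submitted at registration as a derivation factor."""
    salt = f"{device_serial}|{content_id}".encode()
    return hashlib.pbkdf2_hmac("sha256", master_secret, salt, 100_000)
```

Because the serial number enters the derivation, a key issued for one device cannot decrypt content on another, which is exactly the tying of keys to devices described above.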
While DRM encryption does not raise privacy issues in itself, the functional restrictions
Deploying P3P
P3P implementation on a Web site involves both
business and technical issues, and it requires
a wide range of knowledge and input from legal,
technical, and marketing perspectives. While it
can be done by a single individual in the case
of a small organization, it is more appropriately
conducted by a team whose members have diverse
backgrounds and resources. The first
team assignment is to review the organization's current information practices, followed
by an in-depth discussion of long-term business
plans for the use of consumer information. The
entire organization should be involved in deciding exactly what the privacy policy should be,
Future Trends
DRM systems can have many variations in implementation details, but they generally conform
to a common structure (Rosenblatt, Trippe, &
Mooney, 2001). Consumers download the digital
content from a Web site. The digital content
contains encrypted content and a key, along with a
license that stipulates the usage right. A software
or hardware controller interprets the licensing
agreement. If authentication and authorization are
successful, the controller allows the consumer to
perform what is intended, such as playing, viewing, or copying of the digital content. However,
within the general framework, there are several
important implementation specific issues that
must be addressed.
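The general controller flow described above (interpret the license, then authenticate and authorize before permitting playback, viewing, or copying) can be sketched as follows; the types and field names are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class License:
    content_id: str
    allowed_actions: frozenset   # e.g. frozenset({"play", "view"})
    device_ids: frozenset        # devices this license is tied to
    expires: datetime

def controller_allows(lic: License, content_id: str, action: str,
                      device_id: str, now: datetime) -> bool:
    """Interpret the license: permit the requested action only for the
    named content, on a registered device, before the license expires."""
    return (lic.content_id == content_id
            and device_id in lic.device_ids
            and action in lic.allowed_actions
            and now < lic.expires)
```

In a deployed system this check runs inside a tamper-resistant software or hardware controller rather than in open application code.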
Conclusion
Information goods are characterized by negligible
marginal costs, and therefore arguments in favor of
1. What are the benefits and challenges of different systems and business models?
2. What are the current trends in the digital content distribution and delivery market?
References
American Heritage. (2006). American Heritage
Dictionary of the English Language (4th ed.).
Retrieved April 16, 2007, from http://dictionary.
reference.com/browse/copyright
Chor, B., Fiat, A., Naor, M., & Pinkas, B. (2000).
Tracing traitors. IEEE Transactions on Information Theory, 46(3), 893-910.
ClickZ. (2003). Users still resistant to paid content.
Jupitermedia, April 11. Retrieved April 16, 2007,
from http://www.ecommerce-guide.com/news/
news/print.php/2189551
Cohen, J. (2003). DRM and privacy. Communications of the ACM, 46(4), 47-49.
Cox, I., Miller, M., & Bloom, J. (2002). Digital
watermarking: Principles & practice. San Diego,
CA: Academic Press.
Cravens, A. (2004), Speeding ticket: The U.S.
residential broadband market by segment and
technology. In-Stat/MDR.
EFF. (2005). Sony BMG Litigation Info, Electronic
Frontier Foundation. Retrieved April 16, 2007,
from http://www.eff.org/IP/DRM/Sony-BMG/
EMI. (2007, April 2). EMI Music launches DRM-free superior sound quality downloads across
its entire digital repertoire. EMI Group, Press
Release.
EPIC. (2002, June 5). Letter to house judiciary
subcommittee on the courts, the internet, and intellectual property. The Electronic Privacy Information Center and the Electronic Frontier Foundation.
Retrieved April 16, 2007, from http://www.epic.
org/privacy/drm/hjdrmltr6.5.02.html
Estes, A. (2007, May 6). Bill would ban lenders
alerts to homebuyers. Boston Globe, p. B3.
Fraser, A. (1999, August 8). Chronicle of Higher
Education, 48, p. B8.
http://www.sans.org/reading_room/whitepapers/
basics/434.php
UCITA. (2002). Uniform Computer Information
Transactions Act. National Conference of Commissioners on Uniform State Laws.
USG. (1998). The Digital Millennium Copyright
Act of 1998, U.S. Copyright Office, Pub. 05-304,
112 Stat. 2860.
W3C. (2002a). How to create and publish your
company's P3P policy in 6 easy steps. W3C P3P
Working Group. Retrieved April 16, 2007, from
http://www.w3.org/P3P/details.html
W3C (2002b). The platform for privacy preferences 1.0 (P3P1.0) specification. W3C P3P
Working Group. Retrieved April 16, 2007, from
http://www.w3.org/TR/P3P/
Wikipedia (2007). Retrieved October 1, 2007, from
http://en.wikipedia.org/wiki/Public_key_infrastructure
Additional Reading
Ahrens, F. (2005). Hard news, daily papers face
unprecedented competition. Washington Post
Sunday, February 20, p. F01
BEUC. (2007). Consumers' digital rights initiative,
European consumers' organization. Retrieved
June 22, 2007, from http://www.consumersdigitalrights.org
Boiko, B. (2002). Content management bible. New
York: Wiley Publishing Inc.
Brands, S. (2000). Rethinking public key infrastructures and digital certificates: Building in
privacy. Cambridge, MA: The MIT Press.
Chapter XII
Abstract
Marketing practices have always presented challenges for consumers seeking to protect their privacy.
This chapter discusses the ways in which the Internet as a marketing medium introduces additional
privacy concerns. Current privacy issues include the use of spyware and cookies, word-of-mouth marketing, online marketing to children, and the use of social networks. Related privacy practices, concerns,
and recommendations are presented from the perspectives of Internet users, marketers, and government
agencies. The chapter concludes with a discussion of the ways in which consumers' privacy concerns,
as they apply to Internet marketing, would benefit from additional research.
Introduction
"Privacy has once again become the price computer
users pay for their use of the information technology infrastructure." (Mathias Klang, 2004)
The Internet is a marketers dream come true.
No other medium comes close to providing the
Act                                          Year
Privacy Act                                  1974
Communications Decency Act                   1996
Children's Online Privacy Protection Act     1998
CAN-SPAM Act                                 2003
Marketers have traditionally preferred self-regulation to government regulation. Self-regulation efforts by business, such as the Network
Advertising Initiative (NAI), represent industry's
efforts to protect privacy without the aid of the
federal government. The NAI, which was formed
in 1999 in response to the controversial use by
DoubleClick of click stream data combined with
personal information, represented a historical
moment for privacy protection in marketing. Two
other industry groups, the U.S. Better Business
Bureau and TRUSTe, now provide downloadable
templates that let businesses add accurate
privacy policies quickly and easily to corporate
Web sites.
Companies have also taken a stand for self-regulation. In 1999 IBM announced that it would
cancel its Internet advertisements from any Web
site that did not provide clear privacy policies
(Auerbach, 1999). Nevertheless, an effective and
comprehensive self-regulatory policy for online
consumer privacy has not yet emerged (Culnan,
2000).
There are a number of recommended best
practices to be applied to the issue of data collection, ownership, and dissemination. In addition
to providing clearly labeled privacy policies and
prominent data collection warnings, marketers
will find it beneficial to practice good public and
customer relationships in order to generate online
registrations and information from consumers.
Building relationships with consumers before
making unsolicited contact or asking them to
register or provide information has been recommended as an important strategy for marketers
(Milne, 1997; Sheehan & Hoy, 1999). There is also
a highly positive relationship between a company's
positive reputation and consumers' decisions to
provide accurate personal information to the Web
site (Xie, Teo, & Wan, 2006).
Internet Advertising Practices: The Use of Cookies and Spyware
The advertising model, in which free or low-cost media content is exchanged for advertising
placements, has been a standard business model
in the off-line world. It appears that the Internet
has embraced a similar advertising-for-content
model, although there are examples of subscription-based content providers (The Wall Street
Journal, for example) and other models of online
exchange. Unlike traditional media such as television, magazines, or newspapers, the Internet can
deliver metrics such as advertising impressions,
click-through rates, and page views on a daily or
hourly basis. Despite the advantages that Internet
advertising provides, this new medium has also
created a new list of privacy issues for consumers
and marketers alike.
The most basic and vital tools in the marketer's
toolbox are segmentation of the market and targeting of users, and the Internet helps marketers to
segment and target with great accuracy. Marketers are rapidly shifting mass media dollars to the
Internet, which has grown from 0.7% of total ad
dollars spent in 1998 to an estimated 3% in 2006,
which translates to $8.7 billion (Media Week
Marketer's Guide to Media, 2006).
Behavioral targeting/marketing is online advertising that serves ads to individuals who are
most likely interested in them based on previous
online activity (Shimp, 2007). Behavioral targeting can be an improvement on traditional segmentation and targeting because the marketer has a
strong indication that the individual being targeted
has exhibited a behavior that indicates potential
interest in the product. Behavioral marketing
takes as its premise that past behavior, such as
site visits or page views, is the best predictor
of future behavior. The information technology
forming the basis of behavioral marketing is the
cookie.
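In outline the mechanism is simple. The sketch below uses Python's standard http.cookies module; the visitor identifier and the profile store are invented for illustration:

```python
from http.cookies import SimpleCookie

# Server side: assign a persistent identifier on the first visit.
cookie = SimpleCookie()
cookie["visitor_id"] = "a1b2c3"
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365   # persist for one year
header = cookie["visitor_id"].OutputString()           # sent as Set-Cookie

# On later requests the browser returns the identifier, and the ad server
# can look up the behavioral profile keyed by it to select an ad.
profiles = {"a1b2c3": ["golf clubs", "travel"]}
incoming = SimpleCookie()
incoming.load("visitor_id=a1b2c3")
segments = profiles.get(incoming["visitor_id"].value, [])
```

The privacy question raised in this section is precisely that the profile store on the server side is invisible to the user, even though the cookie itself is stored on the user's machine.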
Conclusions and Recommendations for Further Research
This chapter has provided an overview of some
of the key issues related to privacy in marketing
contexts and recommendations for the future.
While it is not an exhaustive review of all marketing-related issues, it is a comprehensive discussion
of several timely and important online marketing
privacy issues affecting marketers and consumers.
But there will be many more marketing issues
to analyze and critique, particularly when one
considers the dynamic nature of both marketing
and the Internet medium.
The marketing issues presented in this chapter
provide fertile research opportunities because
many of the marketing technologies and techniques discussed here are relatively new and
changing rapidly. Privacy policy disclosure and
clarity is one area where we can expect to see
activity in the future. Much of the literature about
marketing and online privacy recommends the
inclusion of clearly written, prominently displayed
privacy policies. The thinking by many of those
who study this issue is that information is powerful and that awareness is the first step towards
compliance. Much needs to be done, because
only 5% of Fortune 500 companies were found to
fully comply with the guidelines for provision of
privacy policy notices on their Web sites (Hong,
McLaughlin, Pryor, Beaudoin, & Grabowicz,
2005). It may be that the provision of privacy
policies could become a Web-based requirement
for e-commerce, similar to the financial disclosures consumers must receive prior to investing
in a mutual fund. Turow (2003) recommends that
References
Ashworth, L., & Free, C. (2006). Marketing
dataveillance and digital privacy: Using theories of justice to understand consumers online
privacy concerns. Journal of Business Ethics,
67, 107-123.
Auerbach, J. G. (1999, March 31). To get IBM ad,
sites must post privacy policies. The Wall Street
Journal, pp. B1, B4.
Barbaro, M. (2006, March 7). Wal-Mart enlists bloggers in P.R. campaign. The New York
Times.
Belch, G. E., & Belch, M. A. (2004). Advertising and promotion: An integrated marketing
communications perspective (6th ed.). New York:
McGraw-Hill/Irwin.
Additional Readings
Cantos, L., Fine, L., Porcell, N., & Selby, S. E.
(2001). FTC approves first COPPA safe harbor
application. Intellectual Property & Technology
Law Journal, 13(4), 24.
Caudill, E. M., & Murphy, P. E. (2000). Consumer
online privacy: Legal and ethical issues. Journal
of Public Policy & Marketing, 19(1), 7-19.
Dobrow, L. (2006). Privacy issues loom for marketers. Advertising Age, 77(11), S6.
Dommeyer, C., & Gross, B. L. (2003). What consumers know and what they do: An investigation
of consumer knowledge, awareness, and use of
privacy protection strategies. Journal of Interactive Marketing, 17(2), 34-51.
Eastlick, M. A., Lotz, S. L., & Warrington, P.
(2006). Understanding online B-to-C relationships: An integrated model of privacy concerns,
trust, and commitment. Journal of Business
Research, 59, 877-886.
George, J. F. (2004). The theory of planned behavior and Internet purchasing. Internet Research,
14(3), 198-212.
Han, P., & Maclaurin, A. (2002). Do consumers really care about online privacy? Marketing
Management, 11(1), 35-38.
Hann, I., Hui, K., Lee, T., & Png, I (2002). Online
information privacy: Measuring the cost-benefit
trade-off. In Proceedings of the Twenty-third
International Conference on Information Systems.
Heckman, J. (1999). E-marketers have three child-privacy options. Marketing News, 33(16), 5-6.
Kiang, M. Y., Raghu, T. S., & Shang, K. H.-M.
(2000). Marketing on the Internet: Who can benefit from an online marketing approach? Decision
Support Systems, 27(4), 383-393.
Langenderfer, J., & Cook, D. L. (2004). Oh, what
a tangled web we weave: The state of privacy
protection in the information economy and recommendations for governance. Journal of Business
Research, 57, 734-747.
Luo, X. (2002). Trust production and privacy
concerns on the Internet: A framework based
on relationship marketing and social exchange
theory. Industrial Marketing Management, 31,
111-118.
Metzger, M. J., & Docter, S. (2003). Public opinion
and policy initiatives for online privacy protection.
Journal of Broadcasting & Electronic Media,
47(3), 350-374.
Milne, G. R., & Culnan, M. J. (2004). Strategies
for reducing online privacy risks: Why consumers
read (or dont read) online privacy notices. Journal
of Interactive Marketing, 18(3), 15-29.
Miyazaki, A. D. & Fernandez, A. (2001). Consumer perceptions of privacy and security risks
for online shopping. Journal of Consumer Affairs,
35(1), 27-44.
Morrison, K. L. (2003). Children reading commercial messages on the Internet: Web sites that
merge education, information, entertainment, and
advertising. Doctoral dissertation, University of
California, Los Angeles.
Palmer, D. E. (2005). Pop-ups, cookies, and spam:
Toward a deeper analysis of the ethical significance of Internet marketing practices. Journal of
Business Ethics, 58, 271-280.
Pan, Y., & Zinkhan, G. M. (2006). Exploring the
impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.
Roman, S. (2007). The ethics of online retailing: A scale development and validation from
the consumers perspective. Journal of Business
Ethics, 72, 131-148.
Schwartz, G. (2003). Mobile marketing 101.
Marketing Magazine, 108(26), 21.
Sheehan, K. B. (2002). Toward a typology of
Internet users and online privacy concerns. The
Information Society, 18(1), 21-32.
Chapter XIII
Abstract
The purpose of this chapter is to investigate the current status of online privacy policies of Fortune
100 Companies. It was found that 94% of the surveyed companies have posted an online privacy policy
and 82% of them collect personal information from consumers. The majority of the companies only
partially follow the four principles (notice, choice, access, and security) of fair information practices.
For example, most of the organizations give consumers some notice and choice in terms of the collection and use of their personal information. However, organizations fall short in security requirements.
Only 19% of organizations mention that they have taken steps to provide security for information both
during transmission and after their sites have received the information. The results also reveal that a
few organizations have obtained third-party privacy seals including TRUSTe, BBBOnline Privacy, and
Safe Harbor.
Introduction
Privacy is defined as "the right to be let alone,"
which is part of the basic human right to enjoy
life (Warren, 1890). As an extension of privacy
in the information age, information privacy is
the legitimate collection, use, and disclosure of
personal information, or the claims of individuals
that data about themselves should generally not be
available to other individuals and organizations,
and that, where data is possessed by another party,
the individual must be able to exercise a substantial degree of control over that data and its use
(Clarke, 1999). One type of information privacy
is online privacy, which is defined as consumer
concerns about what data is being collected by
an online vendor about the customer and how it
will be used (Nyshadham, 2000). Compared
to an off-line environment, the Internet enables
organizations to collect more information from
consumers cost effectively, sometimes even without the consent of consumers. The Internet poses
greater security threats for consumers as their personal information is transmitted over the Internet
if an organization does not have a good security
mechanism in place. Furthermore, the connectivity of the Internet allows organizations to capture
and build electronic profiles of consumers and
potential consumers. Therefore, consumers today
are facing a high level of privacy threat/invasion.
One way to show an organization's commitment
to protecting consumers' online privacy is to post
an online privacy policy and follow the policy
truthfully. Online privacy has been viewed as a
significant factor contributing to consumer trust
and therefore an imperative for business success
(Privacy & American Business, 2002). However,
its provision is often at odds with organizational
goals, such as the maximization of personal
information value obtained from disclosure to
third parties (often for commercial gain) and
the retention of customer loyalty via enhanced
personalized services (Lichtenstein, Swatman,
& Babu, 2003).
0
Background
An online privacy policy (OPP) is a key organizational measure for assuring online privacy for Web
site users (Lichtenstein et al., 2003). Serving as
the high-level guideline for an organization's
information privacy, the promises made in OPPs
vary from one organization to another, reflecting
differences in organizational recognition of online
privacy issues and in privacy practices.
Several studies have focused on the investigation of OPPs using survey methodology (Culnan,
1999; Nyshadham, 2000; Desai et al., 2003; Lichtenstein et al., 2002; McRobb & Rogerson, 2004).
(FTC, 1998), the Culnan report to the FTC (Culnan, 1999), and Online Privacy Alliance (OPA,
2005). The data collection process was as follows:
First, the Web site of each Fortune 100 company
was visited by the authors. Second, the online
privacy policy of each Web site, if available, was
reviewed and evaluated carefully by the authors
based on the four groups of questions described.
To guarantee the validity and consistency of data
collected, the authors first developed and agreed
on the way of interpreting the information and
one author would double check with the other if
any discrepancies arose during data collection.
The same method was used in most studies of
online privacy policies.
Data Analysis
The Number of Web Sites Having a Privacy Policy
It was found that out of 100 Web sites, six do not
post a privacy policy; these companies usually
do not have direct contact with consumers.
Of the 94 companies that have an online privacy
policy, five indicate that their Web sites are
only for the purpose of displaying information. In
sum, 89 companies have an online privacy policy and
also collect consumer information online. These
companies are used in the later analysis.
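The percentages reported in the following tables are computed against this base of 89 companies, for example:

```python
# Base: the 89 companies that post a policy and collect consumer data.
def pct(count: int, base: int = 89) -> float:
    """Share of the analyzed companies, rounded to one decimal place."""
    return round(100 * count / base, 1)
```

So 73 companies collecting personal information corresponds to 82.0% of the analyzed sites.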
Information collected                        Number   Percentage

Personal Information
  Personal information                         73      82.0%
  Name                                         53      59.6%
  E-mail address                               50      56.2%
  Postal address                               41      46.1%
  Phone number                                 34      38.2%
                                               18      20.2%
                                                5       5.6%
  Age/date of birth                            11      12.4%
  Family information                            3       3.4%
  Gender                                        4       4.5%
  Education                                     2       2.2%
  Income                                        3       3.4%
  Preferences/interests                         8       9.0%
  Occupation                                    7       7.9%

Non-Personal Information
  Cookies/tracers                              70      78.7%
  Clear GIFs/Web beacons/Web bugs              25      28.1%
Practice                                                                              Number   Percentage

Notice
  Does the site say anything about what information it collects from consumers?         63      70.8%
  Does the site say anything about how it collects information from consumers?          62      69.7%
  Does the site say how the information it collected from consumers will be used?       56      62.9%
  Does the site say that this organization may use information the site has
  collected to contact consumers for marketing or other purposes?                       61      68.5%

Choice
  Does the site say that it gives consumers choice about whether they want to be
  contacted by this organization for marketing or other purposes?                       46      51.7%
  Does the site say that the information collected from consumers may be disclosed
  to outside third parties (e.g., advertisers, business partners, or affiliates)?       53      59.6%
  Does the site say it only discloses this information to outside third parties in
  aggregate form?                                                                       11      12.4%
  Does the site say it gives consumers choice about having collected information
  disclosed to outside third parties?                                                   20      22.5%

Access
  Does the site say that it allows consumers to review or modify the information
  that the site has collected?                                                          50      56.2%
  Does the site say how inaccuracies with the personal information the site has
  collected are handled?                                                                30      33.7%

Security
  Does the site say anything about the steps it takes to provide security for
  information during transmission?                                                      14      15.7%
  Does the site say anything about the steps it takes to provide security for
  information after the site has received it (not during transmission, but after
  collection)?                                                                          18      20.2%
  Mentioned both                                                                        17      19.1%
  SSL                                                                                   26      29.2%

Security Seal
  Verisign                                                                              25      28.1%

Contact Information
  Does the site say how to submit a question about privacy? (e.g., provide contact
  information)                                                                          44      49.4%
  Does the site say how to complain to the company or another organization about
  privacy? (e.g., provide contact information)                                          16      18.0%
Privacy Seals
Table 3 shows that a few firms have begun to use
third-party privacy seals (19%), including TRUSTe
(9%), BBBOnline Privacy (8%), and Safe Harbor
(2%). These third-party firms survey an applying
company's online privacy practices and certify them.
A certified company can display the graphic seal
on its Web site. This is a form of self-regulation
by which a company can abide.
In addition, four companies have support for
the Platform for Privacy Preferences (P3P) standard (W3C, 2002), which can be used by Web
browsers to automatically shield users from sites
that do not provide the level of privacy protection
they desire. Using the P3P standard, a Web site
can publish a summary of the privacy practices
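A minimal policy in the P3P 1.0 vocabulary can be generated as follows. The statement's contents are illustrative, and a deployable policy also requires ENTITY and ACCESS elements, among others:

```python
import xml.etree.ElementTree as ET

# A minimal policy document loosely following the P3P 1.0 vocabulary.
NS = "http://www.w3.org/2002/01/P3Pv1"
policies = ET.Element("POLICIES", xmlns=NS)
policy = ET.SubElement(policies, "POLICY",
                       name="basic", discuri="https://example.com/privacy")
stmt = ET.SubElement(policy, "STATEMENT")
ET.SubElement(ET.SubElement(stmt, "PURPOSE"), "admin")      # site administration
ET.SubElement(ET.SubElement(stmt, "RECIPIENT"), "ours")     # no third parties
ET.SubElement(ET.SubElement(stmt, "RETENTION"), "stated-purpose")
data_group = ET.SubElement(stmt, "DATA-GROUP")
ET.SubElement(data_group, "DATA", ref="#dynamic.clickstream")

xml_summary = ET.tostring(policies, encoding="unicode")
```

A P3P-aware browser fetches such a document, compares it to the user's stated preferences, and can block cookies or warn the user when the site's practices fall short.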
Privacy seal                                        Number   Percentage
TRUSTe                                                 8        9.0%
BBBOnline Privacy (Better Business Bureau
online privacy)                                        7        7.9%
Safe Harbor                                            2        2.2%
P3P                                                    4        4.5%
Future Trends
As the Internet and e-commerce become an integrated part of doing business in today's digital
economy, and as companies collect more personal
information from consumers and use it to understand
consumers' needs and market their products, consumers
will demand stricter protection of their privacy,
which in turn will force companies to implement high
standards of privacy protection. For example, compared
to the survey conducted by Culnan in 1999, our research
shows that more companies have posted an online privacy
policy, which implies a rise in privacy-protection
awareness from 1999 to 2005.
Conclusion
This chapter investigates the current status of online privacy policies of Fortune 100 companies. It
is found that 94% of the surveyed companies have
posted an online privacy policy and 82% of them
References
Baumer, D. L., Earp, J. B., & Poindexter, J.
C. (2004). Internet privacy law: a comparison
between the United States and the European
Union. Journal of Computers & Security, 23(5),
400-412.
Additional Readings
Antón, A. I., Bertino, E., Li, N., & Yu, T. (2007).
A roadmap for comprehensive online privacy
policy management. Communications of the ACM,
50(7), 109-116.
Antón, A. I., Earp, J. B., He, Q. F., Stufflebeam,
W., Bolchini, D., & Jensen, C. (2004). Financial
privacy policies and the need for standardization.
IEEE Security & Privacy, 2(2), 36-45.
Ashrafi, N., & Kuilboer, J. (2005). Online privacy
policies: an empirical perspective on self-regulatory practices. Journal of Electronic Commerce
in Organizations, 3(4), 61.
Brown, D. H., & Blevins, J. L. (2002). The safeharbor agreement between the United States
and Europe: A missed opportunity to balance
the interests of e-commerce and privacy online?
Journal of Broadcasting & Electronic Media,
46(4), 565.
Chen, K., & Rea, A. L., Jr. (2004). Protecting
personal information online: a survey of user
privacy concerns and control techniques. Journal
of Computer Information Systems, 44(4), 85.
Cockcroft, S. (2002). Gaps between policy and
practice in the protection of data privacy. Journal
of Information Technology Theory and Application, 4(3), 1.
Cranor, L., Guduru, P., & Arjula, M. (2006).
User interfaces for privacy agents. ACM Transactions on Computer-Human Interaction, 13(2),
135-178.
Liu, C., Marchewka, J. T., & Ku, C. (2004).
American and Taiwanese perceptions concerning
privacy trust and behavioral intentions in electronic commerce. Journal of Global Information
Management, 12(1), 18.
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004).
Internet users' information privacy concerns
(IUIPC): The construct, the scale, and a causal
model. Information Systems Research, 15(4),
336-355.
Mascarenhas, O. A., Kesavan, R., & Bernacchi,
M. (2003). Co-managing online privacy: a call for
joint ownership. Journal of Consumer Marketing, 20(7), 686.
Meinert, D., Peterson, D., Criswell, J., & Crossland, M. (2006). Privacy policy statements and
consumer willingness to provide personal information. Journal of Electronic Commerce in
Organizations, 4(1), 1.
Metzger, M., & Docter, S. (2003). Public opinion
and policy initiatives for online privacy protection.
Journal of Broadcasting & Electronic Media,
47(3), 350.
Milne, G., & Culnan, M. (2002). Information society, using the content of online privacy notices to
inform public policy: a longitudinal analysis of the
19982001 U.S. Web Surveys, 18(5), 345-359.
Milne, G., Rohm, A., & Bahl, S. (2004). Consumers protection of online privacy and identity.
Journal of Consumer Affairs, 38(2), 217.
Pan, Y., & Zinkhan, G. (2006). Exploring the
impact of online privacy disclosures on consumer
trust. Journal of Retailing, 82(4), 331-338.
Papacharissi, Z., & Fernback, J. (2005).Online
privacy and consumer protection: an analysis of
portal privacy statements. Journal of Broadcasting & Electronic Media, 49(3), 259.
Peslak, A. (2006). Internet privacy policies of
the largest international companies. Journal of
Electronic Commerce in Organizations, 4(3),
46-62.
Rowland, D. (2003). Privacy, freedom of expression and cyberslapps; fostering anonymity on the
internet? International Review of Law Computers
& Technology, 17(3), 303-312.
8
APPENDIX

Fortune 100 Companies (Accessed in February 2005)

1. Wal-Mart Stores
2. Exxon Mobil
3. General Motors
4. Ford Motor
5. General Electric
6. ChevronTexaco
7. ConocoPhillips
8. Citigroup
9. Intl. Business Machines
10. American Intl. Group
11. Hewlett-Packard
12. Verizon Communications
13. Home Depot
14. Berkshire Hathaway
15. Altria Group
16. McKesson
17. Cardinal Health
18. State Farm Insurance Cos.
19. Kroger
20. Fannie Mae
21. Boeing
22. AmerisourceBergen
23. Target
24. Bank of America Corp.
25. Pfizer
26. J.P. Morgan Chase & Co.
27. Time Warner
28. Procter & Gamble
29. Costco Wholesale
30. Johnson & Johnson
31. Dell
32. Sears Roebuck
33. SBC Communications
34. Valero Energy
35. Marathon Oil
36. MetLife
37. Safeway
38. Albertsons
39. Morgan Stanley
40. AT&T
41.
42.
43.
44.
45.
46.
47.
48.
49.
50.
51.
52.
53.
54.
55.
56.
57.
58.
59.
60.
61.
62.
63.
64.
65.
66.
67.
68.
69.
70.
71.
72.
73.
74.
75.
76.
77.
78.
79.
80.
81. Ingram Micro
82. FedEx
83. Merck
84. ConAgra Foods
85. HCA
86. Alcoa
87. Electronic Data Systems
88. Bank One Corp.
89. Comcast
90. Mass. Mutual Life Ins.
91. Coca-Cola
92. Bristol-Myers Squibb
93. WellPoint Health Networks
94. Georgia-Pacific
95. Weyerhaeuser
96. Abbott Laboratories
97. AutoNation
98. Williams
99. Supervalu
100. Cisco Systems
Chapter XIV
Cross Cultural Perceptions on Privacy in the United States, Vietnam, Indonesia, and Taiwan

ABSTRACT
In this chapter, the authors briefly discuss some cross-cultural concerns regarding Internet privacy. The authors believe that, because of the cross-cultural nature of the Internet itself, different cultures will tend to have different concerns regarding Internet privacy. As such, no single system of protecting Internet privacy may be suitable for all cultures. The authors also draw on focus groups from countries spanning Asia and the United States to explore the differences between cultures. An understanding of such differences should help future research on Internet privacy take a more culture-sensitive approach.
INTRODUCTION

As the world's population becomes increasingly
plugged into the Internet, many of the new and
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Cross Cultural Perceptions on Privacy in the United States, Vietnam, Indonesia, and Taiwan
BACKGROUND

Just as the Internet has allowed for the instant transmission of much-needed information, it has also become a channel for some unsavory elements. Organizations and individuals can now collect information on individuals with speed, ease, and relative accuracy. Masses of unwanted solicitation e-mails, commonly known as spam,
CAN-SPAM ACT

The CAN-SPAM Act went into effect in January 2004. It covers e-mails whose primary purpose is advertising. The act establishes requirements for companies sending out commercial e-mail and clearly spells out the punishments for spammers and companies that do not comply with its regulations. These regulations include a ban on misleading header information, a requirement to state the originating domain name and e-mail address, and a ban on deceptive subject lines. The act also gives consumers the right to ask advertisers to stop sending them spam: every spammer or company must offer consumers an opt-out option. Lastly, it requires that the e-mail be identified as an advertisement and include the sender's postal address (CAN-SPAM Act, 2003). However, one severe limitation of the CAN-SPAM Act is that it cannot effectively stop spam, which is what most Internet users would wish for. Instead, the CAN-SPAM Act grants spam a legitimacy that was previously lacking. Spammers are now merely required to state the purpose of their e-mails openly, sending out e-mails entitled "Advertisement: BUY VIAGRA NOW" instead of thinly veiled messages. In doing so, spammers become law-abiding citizens, albeit something of a nuisance. The second limitation of the CAN-SPAM Act is that it is not enforceable outside of the United States, where most spam originates.
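The act's mechanical requirements lend themselves to a simple checklist. The sketch below is a toy illustration in Python; the Email fields and the wording of the checks are assumptions of ours, not drawn from the statute or from any real compliance library, and a program of this kind can only test for the presence of the required elements, not judge whether a subject line is actually deceptive.

```python
# Minimal sketch of the presence-based CAN-SPAM requirements described
# above. All field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Email:
    from_domain: str      # originating domain name
    subject: str
    body: str
    labeled_as_ad: bool   # message identified as an advertisement
    postal_address: str   # sender's physical postal address
    opt_out_link: str     # working opt-out mechanism

def can_spam_violations(msg: Email) -> list:
    """Return the list of requirements the message fails to meet."""
    problems = []
    if not msg.from_domain:
        problems.append("missing originating domain / sender address")
    if not msg.labeled_as_ad:
        problems.append("not identified as an advertisement")
    if not msg.postal_address:
        problems.append("missing sender postal address")
    if not msg.opt_out_link:
        problems.append("no opt-out mechanism offered")
    return problems

msg = Email(from_domain="example.com", subject="Advertisement: sale",
            body="...", labeled_as_ad=True, postal_address="",
            opt_out_link="")
print(can_spam_violations(msg))  # two of the checks above fail
```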
SPY ACT OF 2007

In the past few years, anti-spyware efforts have gained momentum. The Spy Act bans the more obvious forms of spyware, such as programs that hijack a person's computer or log keystrokes. Keystroke-logging software records all the text typed on the keyboard and runs completely hidden, without the person knowing it. However, there are issues with the Spy Act. Many people point
SOLUTIONS AND RECOMMENDATIONS

Since its inception in the late 1980s, the Internet has created more questions than answers about how people from different cultural backgrounds view privacy. It is a complex issue that wrestles with many theoretical norms in regard to how best to frame it, if at all. Hsu (2006) investigated differences in privacy and cultural attitudes by comparing two Eastern cultures, China and Taiwan, with two Western nations, the United States and Holland. The findings showed that individuals in each of the four countries differed significantly not only in their reactions towards different types of information-gathering Web sites, but also in their privacy concerns and privacy practices. Users in the United States were more likely to disclose information to non-profit and commercial Web sites, and less likely to disclose to government and health Web sites. Furthermore,
Research Question 2: The more collectivist the culture, the less legal protection for
individual privacy.
Government Regulation

The Vietnamese group indicated that they felt no great need for government regulation of online private information, as did the Indonesian group. Participants in both groups stated that while spam was a nuisance, they were content with merely deleting the offending e-mail. The Taiwanese group, while expressing opinions that the government should do something about it, also exhibited distrust of the telecommunications industry and believed that it was the ISPs that were
Summary of Findings

Although our focus group discussions were not in depth, the brief discussions conducted indicated that more collectivistic Asian cultures may tend to be less sensitive to violations of personal privacy than individualistic Western cultures are. American participants also tended to lean towards a more legislative solution to the problem of personal privacy. Vietnamese and Indonesian participants did not actively express desires for personal privacy to be legislated. However, Taiwanese participants did express opinions that personal privacy
CONCLUSION

FUTURE TRENDS
At present, there is still no universal standard for Internet privacy protection, and no light is yet visible at the end of the tunnel. The United States alone will still be required to wrangle over its own internal standard of privacy protection and obtain a consensus from all 50 states. The European Union, while possessing theoretically sound protection of personal data, has yet to vigorously test its own laws. Differences of opinion between European and American standards on privacy protection have also yet to be settled. Asian countries face an even greater challenge as their populations become increasingly wired into the Internet while possessing no historically strong legal or cultural basis for privacy protection.

However, as individual citizens become increasingly aware of the pains of spam and inadequate privacy protection, it is possible that nations will eventually step up to the task of legislating privacy protection in an acceptable manner. A series of high-profile murders of "spam kings" in Russia at the time of this writing would also indicate possible interactions with the criminal world that will eventually force law enforcement agencies to see spam as a serious problem.

The trend of increasing awareness of privacy protection is counterbalanced by the need to
REFERENCES
ADDITIONAL READING

Harper, J. (2006). Identity crisis: How identification is overused and misunderstood. Washington, D.C.: Cato Institute.

Milberg, S. J., Burke, S. J., Smith, H. J., & Kallman, E. A. (1995). Values, personal information privacy, and regulatory approaches. Communications of the ACM, 38(12), 65-74.

REAL ID Act of 2005, H.R. 418 (2005).

Rotenberg, M. (2006). Real ID, real trouble? Communications of the ACM, 49(3), 128.

State of Arkansas (86th General Assembly, 2007). To urge Congress and the United States Department of Homeland Security to add critical privacy and civil liberty safeguards to the REAL ID Act of 2005 and to fully fund or suspend implementation of the REAL ID Act, SCR 22. Available at http://www.realnightmare.org/images/File/AR%20SCR22.pdf
Section V
Chapter XV

ABSTRACT
Biometrics is an application of technology to authenticate users' identities through the measurement of physiological or behavioral patterns. Such verification systems offer greater security than the use of passwords or smart cards: biometric characteristics cannot be lost or forgotten. Because biometric characteristics are concerned with the very makeup of who we are, there are also security, privacy, and ethical concerns in their adoption. Fingerprint, iris, voice, hand geometry, face, and signature are all considered biometric characteristics and used in the authentication process. Examples of everyday biometric applications include thumbprint locks on laptop computers, fingerprint scanners to enter a locked door on a house, and facial recognition scans for forensic use. While there are several examples of biometrics currently in use, it is still an emerging technology. The purpose of this chapter is to provide a descriptive discussion of the current and future state of biometrics.
INTRODUCTION
The world is growing increasingly digital as
information systems and networks span the
globe. As individuals, customers, employees, and
Figure 1. How victims' information is misused (categories include government documents/benefits fraud, checking/savings account fraud, phone/utilities fraud, and loan fraud)
The history of biometrics dates back centuries. The use of fingerprints by law enforcement agencies to identify individuals suspected of committing a crime dates back to the late 19th century (Jain, Ross, & Prabhakar, 2003). Going back even earlier, artists in East Asia were known for leaving fingerprints on their work, and traders in Egypt were commonly identified by physical characteristics like height, hair, and eye color. Citizens and societies have commonly used unique human characteristics to identify individuals. Biometric tools are not a new addition to authenticating identity; rather, modern technology has enabled new methods of conducting biometrics.
The Theory of Human Identification holds that there are three methods of associating data with a particular human being (Clark, 1994). Knowledge-based identification relies on knowledge that only the individual would possess; for example, passwords, birthdates, and social security numbers represent forms of knowledge-based identification. Token-based identification recognizes a person's possession of an item; examples include a driver's license, a passport, or an ID card. The third method is biometric identification, which uses unique and personal physiological characteristics to recognize an individual's identity.
Using human characteristics is seen as a reliable method of confirming one's identity or matching an unknown identity to a list of possibilities. For example, a user may be granted access by a biometric device, thus authenticating his/her identity as a legitimate user. Biometrics can also be used to search the human characteristics of an unknown identity for possible matches. An example of this second use of biometrics can be found in detective work, matching the fingerprints of an unknown suspect to a database of known offenders.

Knowledge- and token-based identification are the traditional methods used to authenticate user identity. Confirming identity was based on some-
EXAMPLES OF BIOMETRIC APPLICATIONS
While the standard example of a biometric human characteristic is the fingerprint, there are many others in use as well. Common biometric characteristics are profiled below, and Figure 2 compares them on ease of use and accuracy (Liu & Silverman, 2001).

Fingerprint recognition focuses on the ridges on the skin of the fingers. An existing fingerprint can be queried across a database of prints to find a match to the user's identity. A small number of people are unable to use fingerprint recognition systems due to excessive wear, dryness, or exposure to corrosive chemicals (Rosenzweig, Kochems, & Schwartz, 2004). Fingerprint recognition is presently the most common form of biometric device in use and can be found in a variety of devices, from simple USB plug-ins to wireless phone and PDA protection to entry door locks.
Facial recognition is the ability to match facial patterns with those in a database to determine identity. Facial characteristics measured can include the location and shape of the eyes, eyebrows, nose, and lips. Due to the complexity of the patterns, facial recognition is regarded as the most expensive of the biometric options. However, biometric devices using facial detection are being used by industries to scan large areas to recognize suspected persons. Examples include airport systems screening for known terrorists and casinos screening against prohibited players.

Hand recognition measures the shape and distance of key features like finger length, finger
Figure 2. Comparison of biometric characteristics on ease of use and accuracy (Liu & Silverman, 2001)

Characteristic        Ease of use   Accuracy
fingerprint           high          high
facial recognition    medium        high
hand recognition      high          high
retina                low           very high
iris                  medium        very high
PERFORMANCE ISSUES

Important terminology in the field of biometrics includes false rejection rate (FRR) and false acceptance rate (FAR). It must be noted that biometric devices look to match the user's characteristics against a database of patterns of characteristics. It is unlikely, due to environmental and physical factors, that the database will return an exact match. Therefore, biometric measures look for a probable, or likely, match, leaving open the possibility of false positives and negatives. Figure 3 summarizes the activities of a typical biometric device. These activities assume that the user has been enrolled into the system. Enrollment is simply the process of registering new users into the biometric system.
The first step is the actual use of the biometric device by the user. Once the user is scanned, the system will create a biometric template. The template is simply a mathematical representation of the user's characteristics. The third step is to search for potential matches to the biometric template. The system looks for matches among the users who have been enrolled in the system. These results are analyzed and scored for probability of identity. Finally, the system will return the results of its processes and positively or negatively acknowledge the identity of the user (Liu & Silverman, 2001).
Figure 3. Activities of a typical biometric device

Activity            Description
user scan           user scans hand, finger, or eye using a biometric reader
template creation   a mathematical template of the user's characteristics is created
query database      the biometric database is searched for similar results
analyze results     algorithms are used to determine possible positive matches
return action       the system negatively or positively acknowledges the user's identity
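The accept/reject decision at the heart of this pipeline can be sketched as a similarity score compared against a threshold. The code below is a toy illustration, not any real biometric algorithm: the templates are plain feature vectors, and the cosine-similarity scoring and the 0.95 threshold are assumptions for demonstration only. Raising the threshold trades a lower false acceptance rate (FAR) for a higher false rejection rate (FRR), and vice versa.

```python
# Toy sketch of threshold-based biometric matching (illustrative only).
import math
from typing import Optional

def similarity(a, b):
    """Cosine similarity between two templates (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(scan, enrolled, threshold=0.95) -> Optional[str]:
    """Return the best-matching enrolled user, or None (a rejection).

    A higher threshold lowers false acceptances (FAR) but raises
    false rejections (FRR); a lower threshold does the opposite.
    """
    best_user, best_score = None, 0.0
    for user, template in enrolled.items():
        score = similarity(scan, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= threshold else None

# Two enrolled templates; a fresh scan is compared against each.
enrolled = {"alice": [0.9, 0.1, 0.4], "bob": [0.2, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.41], enrolled))  # scan close to alice's template
```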
KEY ISSUES

There are a number of important considerations to weigh when analyzing biometrics in use today. This section will profile the technological, economic, business, and legal/ethical issues.
Technological

Before biometric measures can be properly implemented, users must be aware of the technologies needed to support them. Every biometric device will need to store a database of positive patterns. For example, a retina scanner will need to be linked to a database of acceptable retina scans, used to confirm or reject the identities of users. The biometric industry has been fortunate to create several open standards that support the interoperability of devices. BioAPI and the Common Biometric Exchange File Format are examples of these standards (Liu & Silverman, 2001).

BioAPI is a joint effort of more than 120 different organizations. The BioAPI group created a joint, open standard application programming interface (API) in 2000. This standard allowed biometric software applications to communicate with different biometric technologies. BioAPI is
Economic

One of the largest drawbacks of biometrics to this point is the physical cost associated with their adoption. If an organization implements a biometric security system to guard its entryways, the system will be needed at all accessible points to be reliable. In addition to the physical costs, firms will need to acquire the expertise needed to set up, maintain, and audit these systems. Finally, compatibility issues with other information systems will need to be explored in order to prevent mismatched programs and reports. A comprehensive biometric solution can range in cost from a few hundred dollars to hundreds of thousands, depending on the scope and level of complexity needed.
There are more indirect costs to biometric systems as well. Users may be initially resistant to change, especially if the need for the new system is not properly communicated. Users may also be hesitant to provide data for the system, as they may question the need to scan personal and unique human characteristics. A perfect example of this reluctance may be found in fingerprint scanning systems: due to their use in criminal justice applications, some may be suspicious of a biometric database.

Still, there are numerous benefits to biometric systems, such as secure points of entry into physical and digital assets. Rigorous authentication and identity confirmation can be an asset for those looking to better protect their systems. Biometric systems do away with the need to remember passwords and carry ID cards. While there is a cost associated with these systems, that cost may be more easily borne to prevent a cybercrime attack than in response to one.
Business

There are a number of business considerations for biometrics. From an end-user standpoint, employees may appreciate the benefits of biometric systems. First, these devices offer increased security, which alone may make them worthwhile for highly secretive firms and industries. Second, biometric measures do offer greater convenience because they cannot be forgotten or left behind: users will not have to remember a number of passwords or worry about bringing an ID card. However, users must be trained in the proper use of the new systems. This is especially true as the positioning of the eye for a retina or iris scanning device is critical to its operation.
From an operational standpoint, firms will need to evaluate vendors in this industry to find a successful fit with their individual needs. Depending on its technical skill and savvy, a firm may be able to manage the system in-house or may rely on outside suppliers and consultants. Any successful supplier relationship will require adequate planning and communication.

It must be noted that biometrics will be only one component of a comprehensive security plan. Organizations will use biometrics in conjunction with other security devices and checkpoints. Businesses must have multiple layers of security. For example, a firm may choose to increase the security of an entryway by using a biometric fingerprint scanner in conjunction with a swipeable ID card and a password keypad. Furthermore, the firm's methods of guarding its entryways must be integrated with its overall security strategy. The sharing of data and intelligence about who is accessing these systems and what they are using the systems for is invaluable. Likewise, it would be smart to share data on when someone was denied access to a system and compare it against other company systems. Viewing security in a holistic manner can help identify attempted breaches before a crime can actually occur.
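A layered entryway of the kind described (biometric scanner plus swipeable card plus keypad) can be sketched as a set of independent checks that must all pass, with every attempt logged for the holistic review suggested above. The factor names and logging scheme here are invented for illustration, not taken from any real access-control product:

```python
# Sketch of multi-factor entry control: every configured factor must
# pass before access is granted, and every attempt is logged so that
# failures can be correlated across company systems.
from typing import Callable

AttemptLog = list  # records (factor_name, passed) tuples for later review

def grant_access(factors: dict, log: AttemptLog) -> bool:
    """Grant entry only if all factors (biometric, card, PIN, ...) pass."""
    ok = True
    for name, check in factors.items():
        passed = check()
        log.append((name, passed))
        if not passed:
            ok = False          # keep evaluating so every factor is logged
    return ok

log: AttemptLog = []
factors = {
    "fingerprint": lambda: True,   # stand-ins for real device checks
    "id_card":     lambda: True,
    "pin_pad":     lambda: False,  # wrong code entered
}
print(grant_access(factors, log))  # denied: one factor failed
```

Because the log keeps failed factors alongside successful ones, a denied attempt at one entryway can later be compared with activity on other company systems, in the holistic spirit described above.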
Legal/Ethical

Biometrics, by their very nature, touch on a host of legal and ethical issues. Any device that requires the capture, storage, and analysis of unique human characteristics must be managed with discretion. Users must be made aware of the need for biometrics and provide their consent for their use.

Due to the sensitive nature of these devices, they are affected by a number of privacy laws that have been created in the United States. For example, a medical institution using biometrics would need to verify that its processes comply with the Health Insurance Portability and Accountability Act (HIPAA). Furthermore, any loss of data or breach in the system would need to be recognized and dealt with immediately (Down & Sands, 2004).

An additional legal concern for biometric systems is the Identity Theft and Assumption Deterrence Act, passed by the United States Congress in 1998. This law protects against the transfer of material that can be used for identity theft. The act explicitly includes biometric data in its list of materials that can be used as a means of identification (Hemphill, 2001).
FUTURE OF BIOMETRICS

Biometrics will continue to evolve. In the future, possible biometric traits to be incorporated into pattern-matching systems include body odor, ear shape, facial thermography, and DNA matching (Rosenzweig et al., 2004). It is expected that
CONCLUSION

Today, personal and digital privacy have become mainstream issues as a result of the growth of cybercriminal activity and the severity of identity theft. While individuals must take greater precautions with their own private data, biometrics can assist in securing against fraudulent access to information systems and the impersonation of identity.

Biometric devices offer another layer of security for physical and digital systems and assets. Their adoption, given proper planning, can be a valuable method to authenticate users and identify unknown persons. Modern technology is improving the techniques biometric systems use and lowering their cost of adoption.

The designers and creators of biometric systems must be wary of the legal and ethical questions that their devices are likely to create. Indeed, biometric devices affect privacy on two fronts. First, they can help protect user privacy by legitimately authenticating identity; their use can protect against cybercrime and identity theft. Second, due to their use and storage of personal and unique characteristics, biometrics open a host of questions about their impact on civil liberties. However, with adequate communication, users are likely to appreciate systems that allow them the ease of use and convenience that biometric systems offer. Biometrics offer increased security when used properly and in conjunction with a well-thought-out security plan. For this reason, we should expect their use to continue to grow in the future.
REFERENCES

Clark, R. (1994). Human identification in information systems: Management challenges and public policy issues. Information Technology and People, 7(4), 6-37.
ADDITIONAL READING

Pons, A. P. (2006). Biometric marketing: Targeting the online consumer. Communications of the ACM, 49(8), 60-66.
Chapter XVI
Government Stewardship of
Online Information:
FOIA Requirements and Other
Considerations
G. Scott Erickson
Ithaca College, USA
ABSTRACT

This chapter focuses on the specific issue of the federal Freedom of Information Act and associated state and local freedom of information laws. While full of good intentions regarding openness in government, the statutes have increasingly been applied in circumstances in which individuals or organizations seek government records for legal or business purposes. As such, confidential business information and private personal information are both vulnerable when data are in government hands. Given the maze of exemptions and agency interpretations regarding freedom of information requests, the circumstances are both highly variable and unpredictable. A better understanding of the statutes and their interpretations will help individuals and organizations make better decisions regarding data interactions with various levels of government.
INTRODUCTION
In an age with ever increasing amounts of personal data held in commercial and government
databases, many individuals view the government
BACKGROUND

Freedom of Information
The Freedom of Information Act (FOIA) (Apfelroth, 2006; Uhl, 2003; Halstuk & Davis, 2002;
Perritt, 1998; Perritt, 1995) was enacted in the U.S.
Privacy

Although a lot of discussion about privacy rights takes place in the U.S., the actual legal standing of the concept is not as straightforward as it might seem. Indeed, in the previous section, we used the term rather frequently, but extending it to individuals who might have information concerning them within government databases can be problematic.
The right to privacy generally derives from the Fourth Amendment: "[t]he right of the people to be secure in their persons, houses, papers, and effects against unreasonable searches and seizures, shall not be violated." As should be clear from the context, this text tends to refer to government rather than private individuals or organizations (Brenner & Clark, 2006). So the U.S. Constitution deals with privacy from the government, but not necessarily from Google. The right to privacy as we generally perceive it comes more from court rulings over time (Dalal, 2006), particularly Olmstead, wherein Brandeis coined the term "the right to be left alone." A later case, Katz, formalized the concept, establishing the criteria of whether an individual expected privacy and whether the expectation was reasonable; if so, their privacy should not be invaded (Brenner, 2005). This decision also extended Olmstead in that it made clear privacy applied to people, not just places (the home).
How does privacy apply to records, particularly electronic ones? There are existing laws,
MAIN THRUST

Contemporary Databases and the Internet
In an age of an ever-increasing number of electronic and online databases containing an ever-increasing depth of information, privacy has become a considerable concern to both individuals and organizations. The past 10 years have seen an explosion of spending on information systems that tie parts of firms, collaborator firms, and customers together. These extended networks of data exchange are now a critical part of doing business, transferring information between individuals and organizations on an ongoing and ever-increasing basis.
All of this becomes important in that governments, at all levels, are also capable of building
these types of databases (indeed, one of the initial adopters of RFID technology was the U.S.
Department of Defense, mandating tagging for
all major suppliers). Further, corporations submit
vast amounts of information from their proprietary
databases in regulatory filings, including all sorts
of operational and marketing data down to the
individual consumer. The federal government is
the largest single producer and collector of information in the United States (Leahy, 1998). And,
finally, governments purchase marketing and other
databases, combining them with other, existing
databases in some cases. We will discuss some
specific examples in the following sections.
The fact that all of this activity is occurring in
an age of ever-increasing transfer and publication
A further complication is the changing nature of what might be considered CBI. Standard intellectual property such as patents, copyrights, and the like has always had some sense of protection; all are revealed anyway, though additional information covering how patented technology works best, and so forth, might not be. Intellectual property clearly has some commercial value but also has specific and generally effective protection, even when revealed. Trade secrets and more loosely defined or protected information are a stickier issue. Trade secrets, by definition, are secret and must be kept so in order to have value. One interesting development in recent years has been the firming up of trade secret law in the U.S., as the Economic Espionage Act more explicitly established what a trade secret is (basically, any business information of value) and what must be done to keep it so (strong attempts to maintain its secrecy) (Carr, Erickson, & Rothberg, 2004).

The EEA and its interpretation create a couple of interesting issues regarding commercial information and the FOIA. First, what might be considered CBI or a trade secret is wider than what had been standard practice in the past. In the enforcement of the EEA, quite a number of prosecutions have centered on marketing data, including customer lists and consumer information. We discussed casinos and their mountains of customer data earlier in the chapter. Harrah's customer database, the state of the art in this industry, can undoubtedly now be classified as a trade secret; it might not have been a few years ago. Second, the onus is really on the holder of the trade secret to keep it secret. Sloppy protection mechanisms, loose internal and external security, and other such lapses can invalidate trade secret status. The holder must have proper procedures to keep the information hidden. After objecting to some new product information being posted by bloggers, Apple, for example, was basically told "tough luck" by the courts, at least in part because it showed loose controls when the information got out in the first place (O'Grady, 2006).
review and standards for determining this balance between openness and privacy. Others still
remain less concerned and review requests based
on such concerns in an ad hoc manner, if at all. As
long as different agencies have different attitudes,
approaches, and eventual decisions, individuals
will need to remain concerned about their personal
privacy. And as we discussed relating to Internet-gathered data and RFID data, information can be gathered unobtrusively, without individuals even knowing they have become part of a file, perhaps
held by a government agency that routinely honors
FOIA requests.
Further, the joining of data is a major part
of the problem, as both firms and government
entities build ever bigger databases by combining holdings. What could very well happen is
government data without individually identifiable features might be released under FOIA and
then combined with a database with extensive
personal details. With modern technology, it
would not be that great a chore to match up some
details in each database and add the government
data on a person-by-person basis. So the original government database would have no grounds to be withheld for reasons of personal privacy, but
would be personally identifiable when combined
with other data.
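The joining step described above can be sketched in a few lines. The records and field names below are entirely hypothetical, and real linkage attacks use fuzzier matching, but the principle is the same: shared quasi-identifiers let an "anonymous" release be tied back to named individuals.

```python
# Toy illustration of the re-identification risk described above.
# All field names and records are hypothetical.

# "Anonymised" government release: no names, but quasi-identifiers remain.
government_records = [
    {"zip": "49008", "birth_year": 1961, "benefit": "disability"},
    {"zip": "10001", "birth_year": 1975, "benefit": "housing"},
]

# Commercial marketing database with extensive personal details.
marketing_records = [
    {"name": "A. Smith", "zip": "49008", "birth_year": 1961},
    {"name": "B. Jones", "zip": "10001", "birth_year": 1975},
]

def link(gov, market):
    """Join the two databases on shared quasi-identifiers."""
    matches = []
    for g in gov:
        for m in market:
            if g["zip"] == m["zip"] and g["birth_year"] == m["birth_year"]:
                matches.append({**m, **g})  # name now attached to benefit data
    return matches

for person in link(government_records, marketing_records):
    print(person["name"], "->", person["benefit"])
```

With only two quasi-identifiers matching, every "anonymous" record here regains a name, which is exactly why release of the government database alone does not settle the privacy question.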
Finally, and somewhat adding to this general
issue, FOIA is supported by 51 FOIL statutes in
the states and District of Columbia. While very
similar in many respects, the state laws do not
precisely mirror the federal statute. Indeed, they
tend to vary considerably by state (Tesler, 2000;
Westin, 1996; Vaughn, 1984) and, of course, in
their actual administration and specific judicial
guidance. State and local agencies are likely to
have far less guidance in terms of how to process
and evaluate FOIL requests, and, of course, are
far less likely to have established procedures,
openness or privacy officials, or any other such
systems. Bloom (2006) discusses at length the
case of Greenwich, CT, recently decided by the Connecticut Supreme Court. The court required the
Recommendations
Future Research
As reiterated often in this chapter, this is a constantly changing field with new decisions made
on FOIA and FOIL applications on a daily basis
at all levels of government. There are new statutes and new court decisions constantly flowing
from various legislatures and courts. With each
major change, opportunities for new research
are created.
In terms of immediately relevant and predictable research directions, however, we have the
obvious differences in FOIA/FOIL practice that
we have discussed. Initially, at the federal level,
different agencies can have dramatically different
procedures and responses to FOIA requests. Some
have officers or groups in charge of processing and
determining whether exemptions apply. Millions
of dollars are spent at the federal level answering hundreds of thousands of annual requests.
Analysis of differences in agency approach and
process could be fruitful research directions and
help our understanding of the benefits and burdens
of the FOIA.
Similarly, at the state level, laws, agency behavior, and processes differ markedly. There is some
research on differences in statute and court decisions, but the field is again ripe for more detailed
examinations of specific agencies, approaches,
staffing, and so forth. In line with that theme, as
freedom of information concerns become more
formalized in Europe and elsewhere, similar
research directions could prove fruitful.
References
Apfelroth, J. (2006). The open government act: A
proposed bill to ensure the efficient implementation of the freedom of information act. Administrative Law Review, 58(1), 219.
Bloom, I. (2006). Freedom of information law in
the digital age: The death knell of informational
privacy. Richmond Journal of Law & Technology,
12(Spring), 9.
Brenner, S. W. (2005). The search and seizure of
computers and electronic evidence: The fourth
amendment in an era of ubiquitous technology.
Mississippi Law Journal, 75(Fall), 1.
Brenner, S. W., & Clark, L. L. (2006). Fourth
amendment protection for shared privacy rights
in stored transactional data. Journal of Law and
Policy, 14, 211.
Perritt, Jr., H. H. (1995). Sources of rights to access public information. William & Mary Bill of
Rights Journal, 4(Summer), 179-221.
Perritt, Jr., H. H. (1998). Electronic freedom of
information. Administrative Law Review, 50(2),
391-419.
Perritt, Jr., H. H., & Lhulier, C. J. (1997). Information access rights based on international
human rights law. Buffalo Law Review, 45(Fall),
899-929.
Prime, J. S. (1996). A double-barrelled assault:
How technology and judicial interpretations
threaten public access to law enforcement records. Federal Communications Law Journal,
48(March), 341-369.
Rice, S. (2000). Public environmental records: A treasure chest of competitive information. Competitive Intelligence Magazine, 3(3), 13-19.
Rothberg, H. N., & Erickson, G. S. (2005). From
knowledge to intelligence: Creating competitive
advantage in the next economy. Woburn, MA:
Elsevier Butterworth-Heinemann.
Sawyer, S., & Tapia, A. (2005). The sociotechnical nature of mobile computing work: Evidence
from a study of policing in the United States.
International Journal of Technology and Human
Interaction, 1(3), 1-14.
Sharrott, D. (1992). Provider-specific quality-of-care data: A proposal for limited mandatory disclosure. Brooklyn Law Review, 58(Spring), 85-153.
Solove, D. J. (2005). The coexistence of privacy
and security: Fourth amendment codification
and Professor Kerr's misguided call for judicial
deference. Fordham Law Review, 74(November),
747.
Susman, T. M. (1988). The privacy act and the
freedom of information act: Conflict and resolution. John Marshall Law Review, 21, 703-733.
Tesler, W. (2000). Gould debunked: The prohibition against using New York's freedom of information law as a criminal discovery tool. New York
Law School Law Review, 44, 71-129.
Uhl, K. E. (2003). The freedom of information act
post-9/11: Balancing the publics right to know,
critical infrastructure protection, and homeland
security. American University Law Review,
53(October), 261-311.
Vaughn, R. G. (1984). Administrative alternatives
and the federal freedom of information act. Ohio
State Law Journal, 45(Winter), 185-214.
Westin, M. (1996). The Minnesota government
data practices act: A practitioner's guide and observations on access to government information.
William Mitchell Law Review, 22, 839-902.
Additional Reading
Andrussier, S. E. (1991). The freedom of information act in 1990: More freedom for the government,
less freedom for the people. Duke Law Journal,
41(June), 753.
Beall, C. P. (1996). The exaltation of privacy
doctrines over public information law. Duke Law
Journal, 45, 1249.
Bunker, M. D., Splichal, S. L., Chamberlin, B. F.,
& Perry, L. M. (1993). Access to government-held
information in the computer age: Applying legal
doctrine to emerging technology. Florida State
University Law Review, 20(Winter), 543.
Davis, C. N., & Splichal, S.L. (Eds.). (2000). Access
denied: Freedom of information in the information
age. Ames, IA: Iowa State University Press.
Grunewald, M. H. (1988). Freedom of information act dispute resolution. Administrative Law
Review, 46, 1.
Chapter XVII
Abstract
This chapter will discuss the legal framework for consumer and data protection in Europe. Central to this discussion will be the law of the European Union (EU) on data and consumer protection. Recent years have seen the creation of legal frameworks in Europe which seek to secure the protection of consumers while simultaneously facilitating economic growth in the European Union. This chapter will outline the main sources of law which protect consumers and their privacy, critically analyse their important provisions, and point up the gaps and deficiencies in the consumer and data protection legal structures.
Consumer Protection
There is a need for commercial law to respond to
the challenges posed by technology and the means
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
The European Commission produced Directive 2000/31/EC in 2000, which established the
basic legal framework for electronic commerce
in the internal market. This directive removed
obstacles to cross border online services in the
European Union. Importantly, the directive also
provided legal certainty to business and citizens of
the European Union. The European Commission
introduced the directive with a view to creating
a legal framework that ensures:
E-Commerce Directive
It was recognised in the mid-1990s that the Internet,
which was once only accessible to a privileged few,
was becoming increasingly popular as a means
of communication and of doing business. The
challenge for the EU was to design a regulatory
framework which would protect parties doing
business online but also to enable EU citizens to
take advantage of new markets, making the EU a true knowledge-based economy (European Commission, Directorate General Press and Communication, 2002, Towards a knowledge-based Europe).
The need for a regulatory framework for e-commerce was first officially recognised in 1997, when the commission adopted its communication 'A European initiative on electronic commerce' (IP/97/313, 16/04/1997). Single market
principles were recognised as being central to
the continued growth and regulation of the sector, to ensure maximum efficiency and certainty.
The commission identified four key areas where
action should be taken.
Firstly, it was recognised that there would need to be standardised systems and upgraded technological infrastructure within the EU. Secondly, and perhaps most importantly, single market principles were found to be integral to any future regulatory framework in the area. Thus the fundamental freedoms, such as the free movement of goods, services, people, and capital, and the freedom of establishment, would guide any future developments in the area. Therefore, it was made clear
that single market principles would be grafted
on to the emerging Internet market and would
govern the variety of important issues such as:
distance selling, data protection, electronic signatures, and electronic payments. Importantly,
it was also stated that the need for an analogous
Electronic Signatures Directive
The Electronic Signatures Directive 1999/93/EC, a community framework for electronic signatures, addresses the problems associated with the legal status of electronic signatures. Under Article 3 of the directive, certification service providers may establish themselves without advance authorisation and act as third parties in certifying the authenticity of a digital signature.
Future Challenges
To become binding within individual member
states, the directives must be transposed into
national law. The nature of a directive leaves flexibility to individual member states to transpose
the contents of a directive into law in a manner which suits its traditions and legal system best. In
Ireland, the Electronic Commerce Act 2000 (No.
27 of 2000) implemented the Electronic Signatures
Directive and a number of provisions contained
in the E-Commerce Directive. The greater bulk
of the provisions of the E-Commerce Directive
were transposed into law by the European Communities (Directive 2000/31/EC) Regulations
2003. These regulations are enforced by the data
protection commissioner and director of consumer
affairs in Ireland. The Irish approach is a light regulatory approach, which the government regards as the best method of regulation in Ireland (Colgan, 2003).
The 10 new member states began transposition
prior to their accession into the EU. Poland, for
example, passed an act on July 18, 2002 relating
to the provision of services by electronic means.
The transposition of the directive into the law of all
the member states of a larger EU is an important
development for the future growth of e-commerce
in the EU. This harmonisation ensures a large
and efficient e-market in the EU. However, it has been recognised that the same perceived flaws identified in the directive remain in the transposing legislation, since the flexibility allowed in transposition must remain faithful to the directive's provisions (Kryczka, 2004). Therefore, if there are problems with the E-Commerce Directive, at least they are uniform and can therefore be addressed by a review if one is thought required
in the future. It is clear from the legal framework
that has been constructed for e-commerce in the
EU that the focus has been to ensure the proper
functioning of the internal market, competitiveness, and economic growth in Europe. While this
is important, there needs to be a greater emphasis
placed on consumers and the protection that they
are afforded.
Sub-Conclusion
The directives discussed here constitute an important contribution to the overall e-regulation
architecture. The transposition of the directives
into the national laws of member states enhances
the harmonisation of the rules governing e-commerce in Europe. This is vital to the European
economy of both the more established member
states and the new accession member states.
There is considerable unease that common law
principles are being squeezed out in favour of
civil law principles in this new application of law
to e-commerce. Roebuck contends that a greater
notice should be taken of common law approaches
as they are always more trade friendly than their
civil law counterparts (Roebuck, 2002). Flexibility
remains important and it is feared that heightened
legislative activity may inhibit this.
A strong regulatory scheme which is flexible
is important for consumer confidence. Certain commentators, however, have found the exclusion of consumer contracts from the country-of-origin principle difficult to justify (Moerel, 2001). However, as is so often the case with resolutions adopted between states, a certain amount
of political compromise is needed which lawyers
may find difficult to work in practice. Ultimately,
e-commerce is of a global nature and the success
of the EU venture points to the greater need for a
global answer to a global regulatory problem.
Data Protection
Consumers traditionally have been able to shop
anonymously with little or no intrusion into their
private lives. However, shopping online requires
consumers to divulge much more personal information than was required in the past. Even where
consumers are merely enquiring about goods
and services, they may be required to provide
Council of Europe Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data 1981
This convention drew inspiration from the European Convention on Human Rights, in particular Article 8. The Council of Europe, through these guidelines, established a structure of precise principles
and norms which were aimed at preventing the
unjust compilation and processing of personal
data. This work culminated in the Convention
for the Protection of Individuals with regard
to automatic processing of personal data. This
the control system in a democratic society. Complaints to these data protection agencies will result
in an investigation into the complaint and action
by the agency to address the complaint.
These data protection agencies maintain a
register which provides general information about
data handling practices of many important data
controllers. Organisations such as government
departments, financial institutions, and organisations who maintain sensitive types of personal
data are included on this register.
These data protection agencies have an important role in protecting consumers from excessive
collection of personal data. For example, the Irish
data protection commissioner publishes case
studies of where his office takes action against
different institutions, including financial institutions for collection of excessive amounts of
personal data. These case studies are illustrative
and easily understandable and are a good source
of information for people who have concerns.
As already mentioned, Article 29 of the Data Protection Directive 1995 established a working
party. Under Article 29(2) the membership of
the working party is made up of the data protection commissioners from the different European
Union member states along with a representative
of the European Commission. The involvement
of these data protection agencies in the working
party means that problems encountered by these
agencies in monitoring the application and operation of the data protection laws are formally
filtered back to the European Commission. The
working party advises the European Commission on the levels of data protection measures in
place in countries outside the European Union.
The European Commission benefits from the advice of these European data protection agencies.
This is beneficial as these bodies often deal with
complaints regarding the processing of personal
information outside of the European Union.
These data protection agencies have demonstrated their ability to ensure compliance with
the law. Recent examples of this include a raid by
Privacy Notices
Privacy statements inform online consumers how their personal information will be used by the company from which they purchase goods and services, and are thus an important source of protection.
However, there has been much criticism about the
effectiveness of these privacy notices, particularly
as these notices are presented in a convoluted way,
using language which is unnecessarily technical
and legal. This issue has been raised by Consumers
International in a study published in 2007. The Article 29 Data Protection Working Party (an independent EU advisory body set up under Article 29 of the Data Protection Directive, with a remit in the area of data protection and privacy) issued guidance on corporate privacy notices in 2004. This concept
of multi-layered privacy notices is important
as such notices can effect improvements in the
quality of information received by consumers on
data protection. It does this, as the working party
pointed out, by focusing on the information that
the individual needs to understand their position
and to make decisions. The working party called
for layered and easily readable privacy statements
Consumer Protections
As Fielder (2002) pointed out, despite the creation
of data protection laws throughout the EU through
the Data Protection Directive 1995, research demonstrates that there is still widespread neglect
of good privacy practice and lack of compliance
with data protection legislation. This has obvious and serious implications for consumers, and
is particularly concerning due to the continuing
growth and availability of more sophisticated data
collection technology.
In 2004 the European Commission published a
report entitled 'Commission Staff Working Document: Consumer Confidence in E-Commerce: Lessons Learned from the E-Confidence Initiative' (Brussels), which examined consumer confidence in e-commerce. The report identified that com-
Future Developments
It is clear that data protection agencies have an
important role to play in ensuring that the data
protection legislation achieves its goals. In this regard, the publication of research which highlights
international difficulties with data protection is
paramount. Of course this will be contingent upon
adequate funding which will permit these agencies
to produce this work. Consumer literacy in terms
of data protection will also form a key component
in realising the data protection principles set out
in the Data Protection Directive 1995.
An important component of ensuring the
protection of personal data is the availability of
adequate training for businesses and data holders.
It is essential that businesses know the law and are
provided with incentives for protecting personal
data and processing the data in accordance with
the law. It is important also that businesses are
provided with model privacy policies. Policies
which are concise and readily understandable
could be effective tools in promoting a culture
of respect and further compliance with the data
protection laws throughout Europe.
Significant also will be the increased availability of fast and effective redress for consumers
and data subjects where there is infringement of
their rights under the data protection laws. These
issues require consideration and action by the
European Commission and national European
governments.
Sub-Conclusion
Recent years have seen a rise in concern with how
personal data is handled. These concerns have
resulted in the introduction of data protection laws
which aim at guaranteeing that personal data is
handled appropriately. However, the creation of
these legal rights and legal protections will only
Conclusion
The legal frameworks that have been constructed
to protect consumers and their privacy are not perfect; many gaps, deficiencies, and shortcomings still exist. However, the legal framework
constructed does provide the foundation for the
development of laws in Europe, which will be
responsive and effective in protecting consumers from unethical processing and use of their
personal data and from exploitative business
practices. The European Commission and European governments need to keep the law under
review to ensure that it responds and evolves to
protect consumers. Consumer rights agencies and
data protection agencies will increasingly have a
role to play in ensuring that consumers and data
subjects are aware of their legal rights and are
empowered to assert them.
References
Anassutzi, M. (2002). E-commerce directive
00/31. International Company and Commercial
Law Review, 13(9), 337-342.
Brazell, L. (2004). Electronic signatures law and
regulation (1st ed.). London: Sweet & Maxwell.
Colgan, N. (2003). Ireland: Electronic commerce directive: Implementation into Irish law. International Trade Law and Regulation, 9(2).
Corbett, R. (1993). The treaty of Maastricht.
London: Longman.
Craig, P., & de Búrca, G. (2007). EU law: Text,
cases & materials (4th ed.). Oxford, UK: Oxford
University Press.
Ellis, H. (2004). Modern Irish commercial and
consumer law. London: Jordan Publishing.
Fielder, A. (2002). Better compliance: guidance,
enforcement & self-regulation. Paper presented
at the Data Protection Conference and Report on
the implementation of Directive 95/46/EC, 2002.
Retrieved October 13, 2007, from http://ec.europa.
eu/justice_home/fsj/privacy/docs/lawreport/
fielder_en.pdf
Gillies, L. (2001). A review of the new jurisdiction
rules for electronic consumer contracts within the
European Union. Journal of Information Law and
Technology (1).
Hornle, J. (2005). Country of origin regulation in
cross-border media: one step beyond the freedom
The Convention on jurisdiction and the enforcement of judgments in civil and commercial matters, signed at Brussels, 27 September 1968, OJ
L299/32 1968. Retrieved 13 October, 2007, from
http://eur-lex.europa.eu/LexUriServ/LexUriServ.
do?uri=CELEX:41968A0927(01):EN:HTML
Council Regulation (EC) No 44/2001 of 22
Dec 2000 on jurisdiction and the recognition
and enforcement of judgments in civil and commercial matters. Also known as the Brussels
1 Regulation. Retrieved October, 13 2007, from
http://eur-lex.europa.eu/LexUriServ/LexUriServ.
do?uri=CELEX:32001R0044:EN:HTML
Directive 2000/31/EC. Retrieved October 11, 2007,
from http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:EN:NOT
Directives 98/34/EC and 98/84/EC.
European Commission, Directorate General
Press and Communication (2002). Towards a
knowledge-based Europe: The European Union
and the information society. Retrieved October
13, 2007, from http://ec.europa.eu/publications/
booklets/move/36/en.pdf
European Commission, The Special Eurobarometer Survey on Data Protection. Retrieved October
13, 2007, from http://ec.europa.eu/public_opinion/archives/ebs/ebs_196_en.pdf
First Report on the Implementation of the Data
Protection Directive (95/46/EC), COM(2003) 265
final (Brussels, May 15, 2003). Retrieved October 13, 2007, from http://eurlex.europa.eu/LexUriServ/site/en/com/2003/com2003_0265en01.
pdf
IP/97/313Date:16/04/1997. Retrieved October 13,
2007, from http://europa.eu/rapid/pressReleasesAction.do?reference=IP/97/format=HTML&age
d=1&language=EN&guiLanguage=en
OECD Conference to Examine Alternative Dispute Resolution Mechanisms for On-Line Commerce. The Hague, 11-12 December 2000.
Sources of Law
Data Protection Act 1998 (UK).
Data Protection Directive 95/46/EC.
Directive 2002/58/EC
Electronic Commerce Act, Number 27 of 2000
(Ireland).
European Communities (Directive 2000/31/EC)
Regulations 2003 S.I. No. 68 of 2003 (Ireland).
European Convention on Human Rights and Additional Protocols is available from the Council of
Europe's Web site. Retrieved October 13, 2007,
from http://www.echr.coe.int/NR/rdonlyres/
D5CC24A7-DC13-4318-B457-5C9014916D7A/0/
EnglishAnglais.pdf
OECD Guidelines on the Protection of Privacy and
Transborder Flows of Personal Data. Retrieved
October 13, 2007, from http://www.oecd.org/
document/18/0,2340,en_2649_34255_1815186_
1_1_1_1,00.html
Additional Reading
Bygrave, L. (2002). Data protection law: Approaching its rationale, logic and limits. Springer.
Carey, P. (2007). Data protection: A practical guide to UK and EU law. Oxford University Press.
Schultz, A. (2006). Legal aspects of an e-commerce transaction. In Proceedings of the International Conference in Europe. Sellier: European Law Publishers.
Singleton, S. (2003). E-commerce: A practical guide to the law. UK: Gower.
Stein, S. D. (2003). Law on the web: A guide for students and practitioners. USA: Pearson Education.
Endnotes
Where the data subject has unambiguously given his or her consent for the processing;
Chapter XVIII
Cybermedicine, Telemedicine,
and Data Protection in the
United States
Karin Mika
Cleveland State University, USA
Barbara J. Tyler
Cleveland State University, USA
Abstract
This chapter provides an overview of law relating to online and Internet medical practice, data protection, and consumer information privacy. It provides a comprehensive overview of federal (HIPAA) and
state privacy laws, concluding that both those legal resources leave gaps in consumer protection and
provide no real penalties for violating the laws. The authors educate the readers to the legal and data
protection problems consumers will encounter in purchasing medical and health services on the Internet.
Furthermore, the authors recount some actual case studies and follow those with expert advice for those
Internet consumers who wish to be not merely informed, but also safe. The authors not only educate
the readers to the lack of protection afforded to them but also advocate throughout the chapter that the
United States must enact more federal protection for the consumer in order to deter privacy violations
and punish criminal, negligent, and wilful violations of personal consumer privacy.
Introduction
The practice of medicine is not immune from
the information age. The use of the Internet,
including e-mail, in medical practice is altering
the traditional method of delivering medical care.
Millions of Americans now rely upon the Internet
as a primary source of medical information or
education about their own symptoms, conditions, diagnoses, and treatments. The practice of
telemedicine, consulting with another physician
by using technology, is constantly evolving and
expanding into areas never before imagined.
Physicians are establishing their own Web sites
and a few are now practicing medicine on
the Internet.
The progression of the traditional practice
of medicine in cyberspace has brought with it
many issues related to privacy and online data
protection. No longer is the physician-patient
relationship limited to an in-person office consultation that carries with it the legal protections
of doctor-patient privilege. Rather, the practice
of medicine has evolved to include interactions
that might not have ordinarily been considered
a physician-patient relationship, and these
contacts may stretch across both real and virtual
boundaries. In fact, the interactions are, at times,
both real and virtual, and the consumer-patient is
now in a situation where it is difficult to identify
exactly who is the party on the other end.
This chapter will provide an overview of the
law relating to cybermedicine, medicine practiced without traditional in-person contact, and
telemedicine, in terms of data protection and
other legal complications related to licensing
and a conflict of state laws. The chapter will
examine the laws applicable to Web sites where
medical diagnosis or the purchase of medical
services (including prescriptions) is available. The
chapter will discuss how the new methodology of
acquiring medical care is at odds with traditional
notions of state regulation and how current laws,
both federal and state, leave many gaps related to
Board Certification
Traditional medical licensing has changed in
recent years to require more education for those
Informed Consent
In general, consent for most treatment must be an
informed consent. This type of consent means
that the treating provider is required to give the
patient or decision maker several elements of
information before the decision on treatment is
made. As the American Medical Association
states so eloquently on its Web site: 'Informed consent is more than simply getting a patient to sign a written consent form. It is a process of communication between a patient and physician that results in the patient's authorization or agreement to undergo a specific medical intervention' (AMA, 2007). To establish a cause of action based
upon lack of informed consent, the patient must
prove that a practitioner failed to disclose to the
patient the various alternatives and the reasonably
foreseeable risks and benefits involved which a
reasonable medical practitioner under similar circumstances would have disclosed (AMA, 2007).
The ethical obligation to communicate certain
information to the patient exists in statutes and
case law in all 50 states.
In 2007, Medicare and Medicaid circulated
new interpretive guidelines, contained in the Code of Federal Regulations, that significantly expanded the scope and documentation of informed
consent that must be obtained by hospitals prior
to performing surgical procedures. For example,
the new Medicare/Medicaid guidelines require
that patients be informed if a practitioner, other
than the primary surgeon, would perform important parts of the procedure, even when the
covering the privacy of an individual's medical records. The privacy provisions of HIPAA
are intended to allow a patient to limit who will
have access to medical records and further provides a limitation on the internal use of sharing
information for purposes of diagnosis in that it
restricts the disclosure of health information to the 'minimum amount necessary' for the intended purpose (Schmidt, 2000).
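The 'minimum necessary' principle can be illustrated with a small sketch. The record fields, purposes, and allowlists below are hypothetical illustrations, not a statement of what HIPAA requires in any concrete case; the point is only the mechanism of limiting disclosure to a purpose-specific subset of fields.

```python
# Toy sketch of a "minimum necessary" disclosure filter.
# Field names, purposes, and allowlists are hypothetical illustrations only.

RECORD = {
    "name": "Jane Doe",
    "diagnosis": "hypertension",
    "billing_code": "I10",
    "ssn": "000-00-0000",
    "insurance_id": "XYZ-123",
}

# Each purpose maps to the minimum set of fields needed for it.
ALLOWED_FIELDS = {
    "billing": {"name", "billing_code", "insurance_id"},
    "treatment": {"name", "diagnosis"},
}

def disclose(record, purpose):
    """Return only the fields permitted for the stated purpose."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

print(disclose(RECORD, "billing"))  # SSN and diagnosis are withheld
```

An unrecognised purpose discloses nothing, mirroring the default-deny posture the rule is meant to encourage.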
HIPAA specifically covers health information
'oral or recorded in any form or medium' that:
a) is created or received by a health care provider,
health plan, public health authority, employer,
life insurer, school or university, or health care
clearing house; and
b) relates to the past, present, or future, physical
or mental health condition of an individual; the
provision of health care towards an individual;
or the past, present, or future payment for the
provision of health care to an individual. (Health
Insurance Portability and Accountability Act of
1996)
HIPAA essentially applies to three specific health entities: health care providers (such as doctors and hospitals), health care plans, and health care clearinghouses, which include such entities as third-party billing services that may be hired to code certain medical procedures for insurance companies (Public Welfare, 45 C.F.R. 160.102). HIPAA also applies to employers, insurance companies, and public agencies that deliver social security or welfare benefits to the extent that they work with or necessarily disseminate information related to an employee's medical records (§§ 160, 164). HIPAA applies to all individual health information that is maintained or transmitted and includes health claims, health plan eligibility, enrollment and disenrollment, payments for care and health plan premiums, claim status, first injury reports, coordination of benefits, and related transactions (Metz, 2004). Thus, the purpose
3. HIPAA appears to limit enforcement actions to those brought by the respective states or the Secretary of Health and Human Services (42 U.S.C. 300gg-22(a); O'Donnell v. Blue Cross Blue Shield of Wyo., 2001). Thus, reporting a HIPAA violation might be the best that an individual harmed by the release of data can do. However, some plaintiffs have attempted to bring indirect causes of action for a HIPAA violation by bringing a common law invasion of privacy claim.
Telemedicine, Cybermedicine, and Informational Web Sites

The Basics
When considering the complications related to medical practice and privacy on the Internet, it is important to understand the distinctions between the types of medical Web sites that one might encounter and the medical interactions that one might have. Like the traditional in-person patient/physician relationship, some virtual interactions between patients and physicians are between known individuals who might actually see each other, although far removed from each other geographically (Harrington, 1999). In other interactions, however, the person logging into a Web site might have no idea who is on the other end, or even whether the person on the other end is a physician. These relationships make the definitions of patient and physician hazy when one attempts to discern the application of any existing privacy laws to Internet or telemedical transactions (Lewis, 2004).
Telemedicine
Telemedicine is defined as the use of telecommunications technology to provide health care services to patients who are distant from a physician or other health care provider (Granade & Sanders, 1996). Generally, a consultant is used to diagnose a patient's condition via two-way interactive television, remote sensing equipment, and computers. Such medical practice has advantages, not only for improving access
Cybermedicine
A. Patient Treatment
An offspring of the information and technology revolution, cybermedicine is the embodiment of a discipline that applies the Internet to medicine. The field uses global networking technologies to educate, innovate, and communicate in ways that enhance and promote medical practice, rapidly transforming medicine into a different discipline. In many respects, cybermedicine is broadly defined as the practice of medicine without the necessity of any physical in-person consultation or examination (Scott, 2001).
Cybermedicine is a relatively new phenomenon that has been around for less than 15 years. On Friday, October 4, 1996, the news announced that the first virtual, real, live doctor's office on the World Wide Web had opened (Cyberdocs Today, 1996). On this site, for a reasonable fee of $65, a patient could enter her name and vital statistics, her medical problem, send the service her symptoms,
In some instances, the privacy aspects that make many sites attractive are problematic in and of themselves. Individuals can often order drugs by giving a brief description of symptoms, symptoms that they may or may not have (FDA FAQs, 2007). In addition, because the Internet guarantees near anonymity, there is no way for the Web site to tell whether the person ordering online is the person actually on the other end of the transaction. Although this type of scenario raises the specter of fraud, it also raises other issues related to privacy.
The truth is that anyone who owns a credit card and a computer can order controlled substances or any other drugs online. Pharmacies that do not require a faxed or mailed prescription from a licensed physician present serious concerns not only for privacy but for life itself. Drugs offering help for erectile dysfunction (Viagra, Cialis, and Levitra) rank among the top 10 drugs bought online. Other drugs in the top 10 sought online are Propecia, for baldness, as well as drugs for acid reflux, cholesterol, and bone density (Lade, 2007). Anonymity seems to be the deciding factor in ordering such drugs online. Legitimate pharmacies operating online can also be problematic from a privacy standpoint. An FDA advisory was issued in spring 2007 after an examination of foreign drug purchases found that drugs with potentially dangerous side effects could easily be ordered online, without a prescription. Two deadly examples of the resulting problems with online ordering of drugs follow.
C. A Suicide Case
D. An Overdose Case
Christopher Smith made over $24 million selling prescription painkillers illegally through his Internet pharmacy before he was indicted (Unze, 2007). Minneapolis court documents show that Smith sold prescription painkillers to
degree possible is to be aware of the privacy policies of the Web site with which they are dealing. Financial data should be entered only into trusted Web sites that have encryption methodology in place, and personal data (such as names and addresses) should be shared only on Web sites that have similar encryption or on those that have privacy policies with which the individual agrees. Consumers should be aware that the Internet makes it very easy to share data with other businesses, and that most businesses would prefer to engage in quick, targeted advertising. Any registration with any type of Web site may make the consumer part of a larger database, which will not only result in unwanted e-mail solicitations but will also make the consumer more susceptible to scam advertising.
Consumers should be aware that Web sites, even those related directly to the medical profession, have different types of privacy policies. Some highlights from these privacy policies are included in Table 1.
When accessing any Web site where personal information is disclosed, consumers who have any privacy concerns at all should be aware of the privacy policies of those Web sites. As the examples indicate, not all information that might be thought to be private, or even protected by HIPAA, is necessarily private. For instance, many pharmaceutical Web sites are not United States Web sites and are not subject to United States laws. Although many of these Web sites have American-sounding names, the consumer should be aware that, when logging in, entering data, and ordering, the information entered might not be as private as the consumer thought.
In addition, many Web sites contain something of a caveat emptor proviso indicating to the consumer that, if any links are accessed from the first Web site, the privacy policies of the first Web site do not apply. Further, various Web sites disclose that private data may be accessed by a company doing
Table 1.

WebMD: A lengthy privacy policy that includes information about what cookies are collected. The policy also provides that personally identifiable information will not be disclosed except (1) to meet legal requirements and (2) when there is a threat requiring disclosure. The site informs the consumer that the consumer will be informed of material changes to the policy and provides that complaints may be lodged with the TRUSTe privacy watchdog.

American Academy of Family Physicians (AAFP): A shorter privacy policy discloses the use of cookies and states that member information may be provided to constituent chapters. The policy also provides that some information may be disclosed for purposes of targeted sales. There is a disclaimer providing that some information may be disclosed when legally required. The site also forewarns the consumer that it cannot be held responsible for the actions of third parties who have links within the site.

Merck.com: Merck provides that consumers may elect a level of privacy protection. The policy states that "Personal information about you will be accessible to Merck, including its subsidiaries, divisions, and groups worldwide, and to individuals and organizations that use personal information solely for and at the direction of Merck," and further provides that information will be disclosed only to those working on its behalf. The policy gives the consumer the ability to opt out of disclosure, but also provides that aggregate information is sometimes disclosed for research purposes. There may be disclosure as required by law.

Revolutionhealth: The policy provides that the information provided by the consumer may be used to acquire information about other people in your demographic area for targeted advertising. The policy states that information may be disclosed for legal reasons, or when a threat is involved (e.g., national security). The site has a disclaimer that if third-party sites are accessed, the privacy policies of the third-party sites should be reviewed. Consumers have an opportunity to opt out of particular disclosures.

MedRx-One (no-prescription-necessary site): Non-U.S. company with a one-line privacy policy: medrx-one pledges that "the information you enter will not be shared with any parties not directly involved with the ordering or delivery process without your expressed consent (except for fraud cases)" and that "any e-mails you receive from medrx-one will be related to your order." The terms of use indicate that local laws (i.e., those of the country of origin) will apply to any legal issues.

CVS

Walmart: Walmart's privacy policy provides, "We may use or disclose your PHI for prescription refill reminders, to tell you about health-related products or services, or to recommend possible treatment alternatives that may be of interest to you," and, "We may disclose your PHI to a family member or friend who is involved in your medical care or payment for your care, provided you agree to this disclosure, or we give you an opportunity to object to the disclosure. If you are unavailable or are unable to object, we will use our best judgment to decide whether this disclosure is in your best interests."
Additionally, the Privacy Rule requires supplemental authorization if a health care provider intends to disseminate any private information for the purpose of research, some types of marketing, or fundraising. Information related to psychotherapy is also considered especially sensitive, and HIPAA requires supplemental authorization
go to an e-mail address, or may even require information such as an address. Because these sites are not health care providers, there is no prohibition that would prevent continuous e-mailings, or postal mailings that a person may not want, about a condition that a person may not want others to know about. Because there is no health care provider relationship, there is nothing prohibiting any of these sites from selling information to other entities.
There are numerous privacy concerns that
may develop by virtue of the fact that lines blur
as the electronic trail becomes more extensive. In
an ordinary situation, a health care provider deals
with secretaries who may deal with insurance
companies, who may deal with other insurance
companies, who may deal with outsourcing of
billing and/or data processing, who may ultimately
deal with collection agencies for unpaid bills.
Although each of these entities would technically
be bound to keep data private under an initial
business entity and employee agreement, the more
people involved, the more difficult it is to control
who has what data.
Where telemedicine and cybermedicine are concerned, the trail extends even farther. In telemedicine situations, there may be staff in a room, unseen by a patient, who have nothing to do with the procedure or medical consultation (such as janitors, computer technicians, or camera operators). For both telemedicine and cybermedicine, there are Web designers and engineers who may have access to information in databases within the Web site. None of these ancillary workers is generally bound to keep health information private.
When the Web site itself is set up to work as a place where health care is provided, there is a good argument that these technical employees are bound by HIPAA as employees dealing with health treatment information; however, that characterization is debatable when dealing with a third-party Web administrator (i.e., an independent contractor), or when the Web administrators are part of a larger conglomerate, such as when a retailer (e.g., Walmart) has a site where prescriptions may be filled. There is also some uncertainty as to the status of a third-party owner of a Web site when that owner is not in the health care field but rather has purchased the Web site for investment purposes, or even when the Web site itself is a subsidiary of a health care company (Nath, 2006).
In all instances involving use of the Internet for medical care, the consumer must not assume that whatever personal data is being handled will be kept confidential. A consumer attempting to protect private data must be aware of the policies of any entity with which any type of medical information is shared.
2. A New York court legally enjoined a psychoanalyst from circulating a book in which detailed information concerning a patient was written (Doe v. Roe, 1977).
The Oregon Supreme Court ruled that a physician was liable for revealing his patient's identity to the patient's natural child, who had been adopted (Humphers v. Inter. Bank,
References
42 C.F.R. 482.51
42 U.S.C. 290dd-2 (2006).
American Board of Medical Specialties. (2007).
Retrieved June 1, 2007 from http://www.abms.
org
Fox v. Smith, 594 So. 2d 596 (Miss. 1992). Retrieved June 1, 2007, from LexisNexis Cases, Law
School database.
Boyer, M. C. (2004). Texas administrative agencies tackle compliance with the health insurance portability and accountability act's privacy rule. Texas Tech Journal of Administrative Law, 5, 100-111.
Can Google diagnose illness better than doctors? (Nov. 6, 2006). The Daily Mail (London).
Retrieved June 4, 2007, from http://www.seroundtable.com/archives /006667.html
Chamness, J. N. Liability of physicians for communicating over the internet. Retrieved June 1,
2007, from http://www.Cornelius-collins.com/
CM/Whats New/asp
Chiang, M., & Starren, J. (2002). Telemedicine
and HIPAA: Data confidentiality and HIPAA.
Retrieved May 23, 2007, from http://www.ideatel.
org/syllabus/hipaa.html
Controlling the Assault of Non-Solicited Pornography and Marketing Act of 2003, 15 U.S.C.
7706.
Granade, P. F., & Sanders, J. H. (1996). Implementing telemedicine nationwide: Analyzing the
legal issues. 63 Defense Counsel, 67.
Mencik, S. (1999). Are secure internet transactions really secure? Retrieved June 6, 2007, from
http://www.jsweb.net/paper.htm
Metz, J. (2004). Practical insights to HIPAA:
Overview and general background regarding
HIPAA. Retrieved May 21, 2007, from http://www.
dilworthlaw.com/pdf/hipaa.pdf
Miller, R. D. (2006). Health care law. Sudbury,
Massachusetts: Jones Bartlett Publishers.
Moser v. Stallings, 387 N.W.2d 599 (Iowa 1986).
Retrieved June 1, 2007 from LexisNexis law
School database.
Nath, S. W. (2006). Relief for the e-patient?
Legislative and judicial remedies to fill HIPAAs
privacy gaps. George Washington Law Review,
74, 532-540.
ODonnell v. Blue Cross Blue Shield of Wyo., 173
F. Supp. 2d 1176, 1179-80 (D. Wyo. 2001).
Office of Health Information Technology (2004).
Retrieved June 21, 2001 from http://www.Whitehouse.gov/omb/egov/gtob/health_informatics.
htm
Ohio Rev. Code Ann. 2743.43 (A)-(D) (LexisNexis 2007) Expert testimony on liability issues.
Retrieved June 1, 2007, from LexisNexis Ohio
Cases.
Ostrov, B. (2007, March 14). Menlo Park teen's suicide shines light on shadowy market. The Mercury News. Retrieved September 18, 2007, from LexisNexis News library.
Patients Rights Conditions of Participation
(Revised 2007). 42 C.F. R. 482.13, 482.24 and
482.51.
Pettus v. Cole, 57 Cal. Rptr. 2d 46 (Cal. Ct. App.
1996).
Pillar, C. (2001, November 7). Web mishap: Kids' psychological files posted. L.A. Times, p. A1.
Suggested Readings
Beaver, K., & Herold, R. (2003). Practical guide
to HIPAA privacy and security compliance. New
York: CRC Press.
Hall, G. (2006). Privacy crisis: Identity theft
prevention plan and guide to anonymous living.
United States: James Clark King, LLC.
HIPAA: Health Insurance Portability and Accountability Act of 1996, 42 U.S.C. 1320(d).
Sanbar, S. S., Fiscina, S., & Firestone, M. H.
(2007). Legal medicine (6th ed.). Philadelphia:
Elsevier Health Sciences.
Slack, W. V., & Nader, R. (2001). Cybermedicine: How computing empowers doctors and patients for better health care. United States: Jossey-Bass Inc.
Web Sites
Data Privacy Lab. (2007). The laboratory at Carnegie Mellon seeks balanced integrated solutions that weave technology and policy together. Retrieved November 4, 2007, from http://www.cs.cmu.edu
Chapter XIX
Abstract
This chapter explores the current status and practices of online privacy protection in Japan. Since the concept of privacy in Japan is different from that in Western countries, the background of online privacy concepts and control mechanisms is discussed. The chapter then introduces Japan's Act on the Protection of Personal Information along with the privacy protection system in Japan. Following the discussion of the privacy law, Japan's privacy protection mechanisms to support and implement the new act are examined. To help companies make smooth adjustments and transitions, a four-stage privacy protection solution model is presented. Further, this chapter discusses two case studies to exemplify the problems and dilemmas encountered by two Japanese enterprises. The cases are analyzed and their implications are discussed. The chapter concludes with future trends and research directions.
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Introduction
In the past, privacy protection was not considered necessary for business in Japan. Instead, the market determined how companies were to deal with consumers' private information. However, information technology (IT) has advanced rapidly, and business standards changed to the use of electronic files. Companies began to store tremendous amounts of information in databases rather than in paper-based file cabinets. IT has changed business structure, but it has also exacerbated privacy problems: private data leaks, unauthorized data collection, and the loss of private data.
After more and more privacy-related problems were revealed by the media, consumers began to pay attention to the protection of their private information. As a result, the Japanese government established the Act on the Protection of Personal Information in 2005 to protect consumers and regulate companies' business activities involving customers' private information (Yamazaki, 2005). After this law took effect, many companies' weaknesses in privacy protection and unethical uses of private data were exposed. The role of privacy had begun to shift to the consumer side. When consumers decided to purchase or conduct business transactions online, they assumed that there would be a reliable system and trustworthy privacy protection (Tahara & Yokohari, 2005).
The organization of this chapter can be overviewed as follows. In the next section, the background of online privacy concepts and control mechanisms is discussed. The chapter then explores Japan's Act on the Protection of Personal Information along with the privacy protection system in Japan. Following the discussion of the privacy law, Japan's privacy protection mechanisms to support and implement the new act are discussed. To help companies make smooth adjustments and transitions, the authors present a four-stage privacy protection solution model.
Background
The concept of privacy in Japan is different from that in Western countries. Japan is a country with a very high population density. People live right next to each other, and it may seem as if there are no boundaries and no privacy. However, the Japanese people indeed understand and respect privacy (Makoto et al., 2005). Even though there is only a thin wall between rooms, people can have privacy through "as if" behavior: for example, even though one person knows another's secret, he or she will act as if they do not know the secret (Mizutani, Dorsey, & Moor, 2004). This describes how Japanese people respect each other's boundaries and help keep secrets. It is also important to understand that Japanese culture is group-based. Within the group, people respect each other and try to think and move in the same direction. Although they have their own minds and thoughts, they always consider first how other people in the group think, and then decide what to do, depending heavily on the group's opinions. Often people do not use eye or body contact as frequently as Western people do because of the different perception of privacy they have in mind (Makoto et al., 2005).
However, the Internet has created a new environment for privacy. People can obtain, access, and manage enormous amounts of information without actual face-to-face interaction. People can express their opinions anonymously, and they can act any way they like on the Internet. Anonymity has a major impact on the Japanese conception of privacy because people no longer have to depend on the group mind, since on the
Opt-In
Opt-Out
Under opt-out, a company must guarantee the right of individuals to opt their private information out of third-party use. It requires companies to post their privacy policy, including the details of the opt-out policy, on their Web site so that consumers can access and understand it before they actually send their private information to the company. Companies also have to provide the opportunity for consumers to request, whenever they wish, that their private information not be used by third parties (Tahara & Yokohari, 2005).
Once a company collects private information under an opt-out policy, it can use that information for third-party purposes until consumers ask it to stop. The consumer's act of sending private information to the company is considered an agreement to third-party use. Under the opt-out policy, a company is free to share private information with third parties, for example for direct phone marketing and direct e-mail advertisement. Most consumers are not aware of this policy, and in most cases companies can take advantage of consumers to make their business easier.
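The difference between the two defaults can be sketched in a few lines of Python. This is purely an illustration, not anything from the Act or the chapter; the names `ConsumerRecord` and `may_share_with_third_party` are invented for the example.

```python
# Illustrative sketch only: under opt-out, sharing is permitted until the
# consumer objects; under opt-in, sharing is permitted only after
# explicit consent. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class ConsumerRecord:
    name: str
    objected: bool = False    # consumer asked the company to stop sharing
    consented: bool = False   # consumer affirmatively agreed to sharing

def may_share_with_third_party(record: ConsumerRecord, regime: str) -> bool:
    if regime == "opt-out":
        return not record.objected   # default: sharing allowed
    if regime == "opt-in":
        return record.consented      # default: sharing forbidden
    raise ValueError(f"unknown regime: {regime}")

record = ConsumerRecord("example consumer")
print(may_share_with_third_party(record, "opt-out"))  # True
print(may_share_with_third_party(record, "opt-in"))   # False
```

The point the sketch makes is the one the text emphasizes: under opt-out, a consumer who does nothing is shared by default, which is why most consumers are unaware of it.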
The Act

Until 2005, there was no law enforcing privacy protection in Japan, only guidelines.
Limitations

Data Collection

The Act does not designate any categories of private data that companies are prohibited from collecting from consumers. In the U.S., it is prohibited to collect sensitive data, such as political, religious, racial, and ethical views; medical history; sexual orientation; and criminal record. In addition, there is the Children's Online Privacy Protection Act to protect young people under the age of 13. However, no such protection laws exist in Japan, and companies can collect any kind of data from anyone (Yamazaki, 2005).
Data Transfer
P3P System
The New Media Development Association in Japan decided to adopt the Platform for Privacy Preferences (P3P) for privacy protection. However, adjustments based on Japanese law and customs are required to use P3P in Japan, because P3P was mainly developed in Europe and the U.S. According to a survey by the New Media Development Association (2006), 11% of companies used P3P on their Web sites in 2005. Comparing the surveys from 2004 and 2005, the use of P3P decreased by 7% because of the following issues:
In order to promote P3P as a standard technology in Japan, it is important to resolve these issues. The history of privacy protection in Japan is not long, and consumers' attention remains low. Consumers' involvement is key to the popularity of protection mechanisms and will help these mechanisms become adopted by more and more companies in Japan.
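As a concrete illustration of what P3P automates, the sketch below (hypothetical code, not from the chapter) parses the compact-policy tokens a site may send in its HTTP `P3P` response header and checks them against a user's preferences. The tokens used (CAO, PSA, OUR, OTR) come from the W3C P3P 1.0 compact-policy vocabulary; everything else is invented for the example.

```python
# Illustrative sketch only -- not a full P3P implementation. A P3P-aware
# user agent can read a site's compact policy from the HTTP "P3P" header
# and compare its tokens against the user's stated preferences.
import re

def parse_compact_policy(header_value):
    """Extract the CP="..." token list from a P3P header value."""
    match = re.search(r'CP="([^"]*)"', header_value)
    return set(match.group(1).split()) if match else set()

def acceptable(tokens, disallowed):
    """A policy is acceptable when it contains none of the disallowed tokens."""
    return tokens.isdisjoint(disallowed)

# A user who refuses sites that hand data to unrelated third parties
# would disallow the "OTR" (other-recipient) token.
header = 'P3P: CP="CAO PSA OUR"'   # data kept by the site and its agents
tokens = parse_compact_policy(header)
print(acceptable(tokens, {"OTR"}))  # True: no unrelated third-party sharing declared
```

This machine-readable comparison is what lets P3P shift the privacy decision to the consumer rather than leaving it buried in a written policy, which is the property the chapter's conclusion attributes to it.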
Privacy Seals

In Japan, there is a privacy seal system called the privacy mark. It has a cross-relationship with TRUSTe and BBBOnLine in the U.S. The privacy mark uses JIS Q 15001, a privacy protection management standard, to evaluate candidates. If candidates are qualified, they can use the privacy mark on their Web sites and join other privacy-marked companies (Okamura, 2004). The privacy mark system started in 1998. At that time there was no law to protect private information. As a response, the Japan Private Information Development Asso-
Four-Stage Privacy Protection Solution Model

Privacy protection has become a major issue in Japan since the Act went into effect. Companies have been trying to integrate their systems and win their customers' confidence. In this section, guidelines for establishing a privacy protection system are presented. Adapted from Tahara and Yokohari (2005) and Yamazaki (2005), a four-stage model is proposed as follows.
Case Studies

Case I: Yahoo! BB's Data Leak
Employee working rules: The working environment was almost unrestricted. There were no rules for dealing with customers' files. Employees could copy any files to a personal device if they could access the database.
In summary, the two major privacy breaches occurred internally rather than externally. Both cases exhibit major problems at the stage of establishing privacy policy, resulting from a lack of access control, physical security control, and security training and education. The companies were not fully compliant with the Act because it was the first privacy law for commercial companies. However, their weak privacy protection degraded their businesses' reputations. The companies were required to report their privacy incidents to the government according to the Act. Both companies need to review and reorganize their privacy protection systems to prevent similar incidents from happening again.
Future Trends

The year 2005 was a major turning point for all businesses in Japan. The first law on the protection of personal information for commercial companies was launched, and companies were trying to adapt to it. The Act on the Protection of Personal Information has brought a new perspective on privacy to all Japanese companies and consumers. The stage of privacy protection in
Conclusion

For Japanese companies and consumers, it is important to understand the details of the new privacy law and make adjustments from the traditional privacy protection system to the new one. The Act also helps reveal many privacy breaches and violations, and many privacy-related suits are expected. However, companies cannot escape the required changes, since online privacy has become a priority problem for consumers in Japan.
Consumer online privacy is receiving more and more attention because consumers are being educated about the importance of private information protection and the impact of privacy breaches. For companies, protecting consumers' privacy has become a primary issue, since it will help gain their customers' trust and business. However, companies should take the responsibility not only to enhance their online privacy protection systems but also to educate their employees and customers and to incorporate the privacy concept into Japanese culture toward a new stage. In e-commerce, online privacy can be protected by means of online privacy protection mechanisms such as opt-in, the privacy mark, and P3P. A privacy mark seal program can help a company provide more information in its privacy policy for consumers. P3P allows consumers to choose their privacy level themselves rather than let the company determine the level of privacy that they will have. The opt-in policy allows consumers to select how their private information is used online.
Weak privacy protection could cost companies even more than what was exhibited in the Yahoo! BB and Michinoku Bank cases. When implementing a privacy protection system, companies need to ensure compliance with regulatory codes and legal obligations.
How does a company in Japan make adjustments in compliance with the new Act on the Protection of Personal Information?
How is consumers' attention shifting to the new Act, which somewhat contradicts Japanese culture? Japanese people respect privacy in their culture, but the new law will stimulate their thinking about, and need for, privacy.
The longitudinal change of the Act should be observed and analyzed. There is a lot of room for the Act to be corrected or adjusted as time goes on. Research can help establish new models for the future privacy protection system.
References
Act on the protection of personal information. Cabinet Office, Government of Japan (2005). Retrieved from http://www5.cao.go.jp/seikatsu/kojin/foreign/act.pdf

Analysis of Japanese privacy protection technology (2005). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf

Analysis of new technology for privacy protection system (2006, March). Retrieved from New Media Development Association: http://www.nmda.or.jp/enc/privacy/pr_rep_2006.pdf

Arai, T. (2005). Outline of personal information protection act. Information Management, 2-15.

Investigation result of data leak from Lawson. (2003). Retrieved from CNET Japan: http://japan.cnet.com/news/media/story/0,2000056023,20060378,00.htm
Additional Readings
Ackerman, M. (2004). Privacy in pervasive environments: Next generation labeling protocols. Personal and Ubiquitous Computing, 8(6), 430-439.
Agrawal, R., Kiernan, J., Srikant, R., & Xu, Y.
(2003). An XPath-based preference language for
P3P. In Proceedings of the Twelfth International
World Wide Web conference (WWW2003) (pp.
629-639).
Cassidy, C., & Chae, M. (2006). Consumer information use and misuse in electronic business: An alternative to privacy regulation. Information Systems Management, 23(3), 75-87.

Kawai, M. (2004). Anzen sinwa houkai no paradokkus (Paradox of the collapse of belief in safety in Japan). Tokyo: Iwanami.

Milne, G., Rohm, A., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-232.
Compilation of References
42 C.F.R. 482.51
42 U.S.C. 290dd-2 (2006).
Abercrombie, N., & Longhurst, B. (1998). Audiences:
A sociological theory of performance and imagination.
CA: Sage Publication.
Aberdeen Group. (2005). Third brigadebusiness
value research seriesmost important security action: Limiting access to corporate and customer data.
Whitepaper. Retrieved October 2007, from http://www.
thirdbrigade.com/uploadedFiles/Company/Resources/
Aberdeen%20White%20Paper%20--%20Limiting%20
Access%20to%20Data.pdf
Aberer, K., & Despotovic, Z. (2001). Managing trust in
a peer-2-peer information system. In Proceedings of the
2001 ACM CIKM International Conference on Information and Knowledge Management, Atlanta, Georgia (pp.
310-317). New York: ACM.
Act on the protection of personal information. Cabinet
Office, Government of Japan (2005). http://www5.cao.
go.jp/seikatsu/kojin/foreign/act.pdf
Agrawal, D., & Aggarwal, C. (2001). On the design
and quantification of privacy preserving data mining
algorithms. In Proceedings of the Twentieth ACM SIGMOD-SIGACT-SIGART Symposium on Principles of
Database Systems, PODS '01, Santa Barbara, California
(pp. 247-255). New York: ACM.
Aiken, K., & Boush, D. (2006). Trustmarks, objective-source ratings, and implied investments in advertising:
Investigating online trust and the context-specific nature
of internet signals. Academy of Marketing Science Journal, 34(3), 308-323.
Air Defense Press Release. (2005, February 17). AirDefense monitors wireless airwaves at RSA 2005 conference. Retrieved October 2007, from http://airdefense.
net/newsandpress/02_07_05.shtm
Akenji, L. (2004). The eight basic consumer rights.
Retrieved November 8, 2006, from http://www.tudatosvasarlo.hu/english/article/print/254
Allen, A. (2000). Gender and privacy in cyberspace.
Stanford Law Review, 52(5), 1175-1200.
Allens Arthur Robinson. (2007). International data
flow. Retrieved October 2, 2007, from www.aar.com.
au/privacy/over/data.htm
Ambrose, S., & Gelb, J. (2006). Consumer privacy litigation and enforcement actions in the United States. The
Business Lawyer, 61, 2.
American Board of Medical Specialties. (2007). Retrieved
June 1, 2007 from http://www.abms.org
American Civil Liberties Union (ACLU). (2007). Privacy
section of their web site. Retrieved July 12, 2007, from
http://www.aclu.org/privacy
American Heritage. (2006). American Heritage Dictionary of the English Language (4th ed.). Retrieved April
16, 2007, from http://dictionary.reference.com/browse/
copyright
American Management Association. (2005). Electronic
monitoring & surveillance survey. Retrieved October
2007, from http://www.amanet.org/research/pdfs/
EMS_summary05.pdf
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Australian Institute of Criminology. (2006). More malware: Adware, spyware, spam and spim. High Tech Crime Brief, 1(2006), 1-2.
Australian Privacy Foundation. (2005). Rule 3: Objectives and purposes. Retrieved April 2, 2007, from
http://www.privacy.org.au/About/Objectives.html
Australian Privacy Foundation. (2006). Identity checks
for pre-paid mobile phones. Retrieved April 2, 2007,
from http://www.acma.gov.au/webwr/_assets/main/
lib100696/apf.pdf
Australian Privacy Foundation. (2006). International
instruments relating to privacy law. Retrieved February
23, 2007, from http://www.privacy.org.au/Resources/
PLawsIntl.html
Awad, N. F., & Krishnan, M. S. (2006). The personalization privacy paradox: An empirical evaluation of information transparency and the willingness to be profiled online
for personalization. MIS Quarterly, 30(1), 13-28.
Barnes, G. R., Cerrito, P. B., & Levi, I. (1998). A mathematical model for interpersonal relationships in social
networks. Social Networks, 20(2), 179-196.
Barreto, M., & Ellemers, N. (2002). The impact of anonymity and group identification on progroup behavior
in computer-mediated groups. Small Group Research,
33, 590-610.
Basili, J., Sahir, A., Baroudi, C., & Bartolini, A. (2007,
January). The real cost of enterprise wireless mobility
(Abridged ed.). The Aberdeen Group. Retrieved October
2007, from http://www.aberdeen.com/summary/report/
benchmark/Mobility_Management_JB_3822.asp
Baumer, D. L., Earp, J. B., & Poindexter, J. C. (2004).
Internet privacy law: a comparison between the United
States and the European Union. Journal of Computers
& Security, 23(5), 400-412.
Baylor, K. (2006, October 26). Killing botnets. McAfee.
Retrieved March 2007, from http://blogs.techrepublic.
com.com/networking/?cat=2
Belanger, F., Hiller, J. S., & Smith, W. J. (2002). Trustworthiness in electronic commerce: The role of privacy,
security, and site attributes. Journal of Strategic Information Systems, 11(3-4), 245-270.
Belch, G. E., & Belch, M. A. (2004). Advertising and
promotion: An integrated marketing communications
perspective (6th ed.). New York: McGraw-Hill/Irwin.
Bellman, S., Johnson, J. J., Kobrin, S. J., & Lohse, L. L.
(2004). International differences in information privacy
concerns: A global survey of consumers. The Information
Society, 20, 313-324.
Beltramini, R. F. (2003). Application of the unfairness
doctrine to marketing communications on the Internet.
Journal of Business Ethics, 42(4), 393-400.
Benassi, P. (1999). TRUSTe: An online privacy seal program. Communications of the ACM, 42(2), 56-59.
Bennett, C. (2006). Keeping up with the kids. Young
Consumers, Quarter 2, 28-32.
Bennett, C. J. (1992). Regulating privacy. Ithaca, NY:
Cornell University Press.
Berendt, B., Günther, O., & Spiekermann, S. (2005).
Privacy in e-commerce: Stated preferences vs. actual
behavior. Communications of the ACM, 48(4), 101-106.
Bernard, A. (2006). McAfee's top ten security threats
for 2007. Retrieved October, from http://www.cioupdate.
com/print.php/3646826
Berner, R. (2006, May 29). I sold it through the grapevine.
Business Week, pp. 32-33.
Bhargava, B. (2006, September). Innovative ideas in
privacy research (Keynote talk). In Proceedings of the
Seventeenth International Workshop on Database and
Expert Systems Applications DEXA '06, Kraków, Poland (pp. 677-681). Los Alamitos, CA: IEEE Computer
Society.
Bhargava, B., & Zhong, Y. (2002). Authorization based
on evidence and trust. In Y. Kambayashi, W. Winiwarter,
& M. Arikawa (Eds.), Proceedings of 4th International
Conference on Data Warehousing and Knowledge
Carbo, J., Molina, J., & Davila, J. (2003). Trust management through fuzzy reputation. International Journal of
Cooperative Information Systems, 12(1), 135-155.
Children's Online Privacy Protection Act. (2000). Enacted April 22, 2000. Retrieved from http://www.epic.
org/privacy/kids/
Cho, H., & LaRose, R. (1999). Privacy issues in Internet surveys. Social Science Computer Review, 17(4),
421-434.
Choice. (2006). The eight basic consumer rights. Retrieved November 5, 2006, from http://www.choice.com.
au/viewArticle.aspx?id=100736&catId=100528&tid=100
008&p=1&title=The+eight+basic+consumer+rights
Chor, B., Fiat, A., Naor, M., & Pinkas, B. (2000). Tracing traitors. IEEE Transactions on Information Theory,
44(3), 893-910.
Chou, C., & Hsiao, M. C. (2000). Internet addiction, usage, gratification, and pleasure experience: The Taiwan college students' case. Computers & Education, 35, 65-80.
Christ, R. E., Berges, J. S., & Trevino, S. C. (2007). Social
networking sites: To monitor or not to monitor users and
their content? Intellectual Property & Technology Law
Journal, 19(7), 13-17.
Chua, S. L., Chen, D.T., & Wong, A. F. L. (1999). Computer
anxiety and its correlates: A meta-analysis. Computers
in Human Behaviors, 15, 609-623.
Cincu, J., & Richardson, R. (2006). Virus attacks named leading culprit of financial loss by U.S. companies in 2006 CSI/FBI computer crime and security survey.
Retrieved July 13, 2006, from http://www.gocsi.com/
press/20060712.jhtml
Claburn, T. (2004). Dell believes education is best way to
fight spyware. InformationWeek, October 20. Retrieved
September 30, from http://www.informationweek.com/
showArticle.jhtml;jsessionid=GHVMAU4IX1LXGQS
NDLOSKHSCJUNN2JVN?articleID=50900097&que
ryText=Dell+Believes+Education+Is+Best+Way+To+F
ight+Spyware
Clark, R. (1989). The Australian privacy act 1988 as an
implementation of the OECD data protection guidelines.
Clarke, R. (1998). Direct marketing and privacy. Retrieved March 24, 2007, from http://www.anu.edu.
au/people/Roger.Clarke/DV/DirectMkting.html
Clarke, R. (1999). Internet privacy concerns confirm
the case for intervention. Communications of the ACM,
42(2), 60-67.
Class FPR: Privacy. (1999). Common criteria for
information technology security evaluation. Part 2:
Security functional requirements. Version 2.1. (Report
CCIMB-99-032) (pp.109-118). Ft. Meade, MD: National
Information Assurance Partnership (NIAP). Retrieved
June 5, 2007, from http://www.niap-ccevs.org/cc-scheme/
cc_docs/cc_v21_part2.pdf
Clayton, G. (2000). Privacy evaluation: Dell. Retrieved
July 20, 2006, from http://www.informationweek.com/
privacy/dell.htm
Clearswift. (2006, October). Simplifying content security: Ensuring best-practice e-mail and web use. The
need for advanced, certified email protection. Retrieved
October 2007, from http://whitepapers.zdnet.com/whitepaper.aspx?&scid=280&docid=271750
ClickZ. (2003). Users still resistant to paid content.
Jupitermedia, April 11. Retrieved April 16, 2007, from
http://www.ecommerce-guide.com/news/news/print.
php/2189551
Culnan, M. J. (1995). Consumer awareness of name
removal procedures: Implications for direct marketing.
Journal of Direct Marketing, 7, 10-19.
Culnan, M. J. (1999). The Georgetown Internet privacy
policy survey: Report to the Federal Trade Commission. Retrieved April 15, 2005, from http://www.msb.
edu/faculty/culnanm/gipps/gipps1.pdf
Culnan, M. J. (2000). Protecting privacy online: Is
self-regulation working? Journal of Public Policy &
Marketing, 19(1), 20-26.
Culnan, M., & Armstrong, P. (1999). Information privacy concerns, procedural fairness, and impersonal trust: Empirical evidence. Organization Science,
10(1), 104-115.
Cranor, L. F., Reagle, J., & Ackerman, M. S. (1999). Beyond concern: Understanding net users' attitudes about
online privacy (Tech. Rep. No. TR 99.4.3). Middletown,
NJ: AT&T Labs-Research. Retrieved June 5, 2007, from
citeseer.ist.psu.edu/cranor99beyond.html
Curtis, K. (2005, September). The importance of selfregulation in the implementation of data protection
principles: the Australian private sector experience.
Paper presented at the 27th International Conference of
Data Protection and Privacy Commissioners, Montreux,
Switzerland.
Cravens, A. (2004), Speeding ticket: The U.S. residential broadband market by segment and technology.
In-Stat/MDR.
D'Astous, A. (2000). Irritating aspects of the shopping environment. Journal of Business Research, 49, 149-156.
Dalal, R. S. (2006). Chipping away at the constitution:
The increasing use of RFID chips could lead to an erosion of privacy rights. Boston University Law Review,
86(April), 485.
Daub, T. R. (2001). Surfing the net safely and smoothly:
A new standard for protecting personal information
from harmful and discriminatory waves. Washington
University Law Quarterly, 79, 913-949.
Davies, S. (2004, February). The loose cannon: An
overview of campaigns of opposition to national identity
card proposals. Paper presented at the Unisys seminar:
e-ID: Securing the mobility of citizens and commerce
in a Greater Europe, Nice.
Day, J. (1999, February 22). Report: Teen online spending increases. Retrieved July 12, 2007, from http://www.
ecommercetimes.com/perl/story/366.html
Dell Inc. Australia. (2004). Dell's privacy policy. Retrieved March 2, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/en/
privacy?c=au&l=en&s=gen
Dell Inc. Australia. (2005). Dell's online policies. Retrieved March 28, 2006, from http://www1.
ap.dell.com/content/topics/topic.aspx/ap/policy/en/au/
termsau?c=au&l=en&s=gen
Dell Inc. Australia. (2007). Online communication
policy. Retrieved June 5, 2007, from http://www.dell.
com/content/topics/global.aspx/corp/governance/en/online_comm?c=us&l=en&s=corp
Denoon, D. J. (2007). Internet drug pushing up again.
More websites advertising, selling controlled prescription drugs. Health News. Retrieved June 21, 2007, from
WebMD, http://www.webmd.com/news/200770518/interent-drug-pushing-up-again
Department of Economic and Social Affairs (UN). (2003).
United nations guidelines for consumer protection (as
expanded in 1999). New York: United Nations.
Desai, M. S., Richards, T. C., & Desai, K. J. (2003). Ecommerce policies and customer privacy. Information
Manage & Computer Security, 11(1), 19-27.
Diaz, C., Seys, S., Claessens, J., & Preneel, B. (2003,
April). Towards measuring anonymity. In R. Dingledine
& P. F. Syverson, (Eds.), Proceedings of the 2nd International Workshop on Privacy Enhancing Technologies PET 2002, San Francisco, CA. Lecture Notes in
Computer Science (Vol. 2482, pp. 184-188). Heidelberg,
Germany: Springer.
Dietrich, W. (2006). Are journalists the 21st century's
buggy whip makers? Nieman Reports, 60(4), 31-33.
Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information
Systems Research, 17(1), 61-80.
Dinev, T., & Hu, Q. (2007). The centrality of awareness
in the formation of user behavioral intention toward
protective information technologies. Journal of the Association for Information Systems, 8(7), 386-408.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., & Colautti, C. (2006). Internet users' privacy concerns and
beliefs about government surveillance: An exploratory
study of differences between Italy and the United
States. Journal of Global Information Management,
14(4), 57-93.
Dingledine, R., Freedman, M. J., & Molnar, D. (2000).
The free haven project: Distributed anonymous storage
service. In H. Federrath (Ed.), Workshop on Design
Issues in Anonymity and Unobservability (pp. 67-95).
Springer Verlag.
Dobosz, B., Green, K., & Sisler, G. (2006). Behavioral
marketing: Security and privacy issues. Journal of Information Privacy & Security, 2(4), 45-59.
Doe v. Roe, 93 Misc. 2d 201, 400 N.Y.S.2d 668 (Sup.
Ct. 1977).
Dominick, J. (1999). Who do you think you are? Personal
home pages and self-presentation on the world wide
web. Journalism and Mass Communication Quarterly,
76, 646-658.
Eden, J. M. (2005). When big brother privatizes: Commercial surveillance, the privacy act of 1974 and the future
of RFID. Duke Law & Technology Review, 20.
Endpointsecurity. (2004). What is endpoint security? Retrieved October 2007, from http://www.endpointsecurity.
org/Documents/What_is_endpointsecurity.pdf
Harris, A. J., & Yen, D. C. (2002). Biometric authentication: Assuring access to information. Information
Management and Computer Security, 10(1), 12-19.
Harvard Law Review. (2007). Developments in the law
of media. 120(4), 990.
Haymarket Media. (2007). Spam hits records levels in
February. Retrieved March 20, 2007, from http://www.
crn.com.au/story.aspx?CIID=75798&r=rss
Health Insurance Portability and Accountability Act of
1996, 42 U.S.C. 1320(d).
Health Insurance Portability and Accountability Act of 1996 (HIPAA; Kennedy-Kassebaum Act). (n.d.).
Retrieved from http://aspe.hhs.gov/admnsimp/pl104191.
htm
Hemphill, T. (2001). Identity theft: A cost of business?
Business and Society Review, 106(1), 51-63.
Hemphill, T. (2002). Electronic commerce and consumer privacy: Establishing online trust in the U.S.
digital economy. Business and Society Review, 107(2),
221-239.
Herbert, N. (2006). Conquering spam in concert: Antispam legislative efforts in the Asia Pacific region. Law
Technology, 39(2), 1-12.
Herring, S. C. (1992). Gender and participation in computer-mediated linguistic discourse. Washington, DC:
ERIC Clearinghouse on Languages and Linguistics,
Document no. ED 345552.
sandiego.com/uniontrib/20060316/news_lz1e16hinman.
html
Hoffman, D. L., & Novak, T. P. (1996). Marketing in hypermedia computer-mediated environments: conceptual
foundations. Journal of Marketing, 60(3), 50-68.
Hoffman, D. L., Novak, T. P. & Peralta, M. A. (1999,
April-June). Information privacy in the marketspace:
Implications for the commercial uses of anonymity on
the web. Information Society, 15(2), 129-139.
Hoffman, D. L., Novak, T. P., & Peralta, M. (1999).
Building consumer trust online. Communications of the
ACM, 42(4), 80-85.
Hofstede, G. (1997). Culture and organizations. New
York: McGraw Hill.
Holmes, S. (2006, May 22). Into the wild blog yonder.
Business Week, pp. 84-86.
Holt, T.J. (2006, June). Gender and hacking. Paper
presented at the CarolinaCon 06 Convention, Raleigh,
NC.
Hildner, L. (2006). Defusing the threat of RFID: Protecting consumer privacy through technology-specific
legislation at the state level. Harvard Civil Rights-Civil
Liberties Law Review, 41(Winter), 133.
Hornle, J. (2005). Country of origin regulation in crossborder media: one step beyond the freedom to provide
services? International and Comparative Law Quarterly,
54, 89-126.
Kimery, K., & McCord, M. (2006). Signals of trustworthiness in e-commerce: Consumer understanding
of third-party assurance seals. Journal of Electronic
Commerce in Organizations, 4(4), 52-74.
Krill, P. (2002). DoubleClick discontinues web tracking service. ComputerWorld. Retrieved February 24,
2002, from http://www.computerworld.com/storyba/
0,4125,NAV47_STO67262,00.html
King, J., Bond, T., & Blandford, S. (2002). An investigation of computer anxiety by gender and grade. Computers
in Human Behavior, 18, 69-84.
Krone, T. (2006). Gaps in cyberspace can leave us vulnerable. Platypus Magazine, 90 (March 2006), 31-36.
Kryczka, K. (2004). Ready to join the EU information society? Implementation of e-commerce directive
2000/31/EC in the EU acceding countries: The example
of Poland. International Journal of Law & Information
Technology, 12, 55.
Kuner, C. (2007). European data protection law:
Corporate regulation and compliance. USA: Oxford
University Press.
Lade, D. (2007, July 15). Getting medication in privacy
is part of internet appeal, but there are risks. Sun Sentinel. Retrieved September 17, 2007, from LexisNexis
current News file.
LaRose, R., Mastro, D. A., & Eastin, M. S. (2001). Understanding internet usage: A social cognitive approach to uses and gratifications. Social Science Computer Review, 19, 395-413.
Lauer, T., & Deng, X. (2007). Building online trust
through privacy practices. International Journal of
Information Security, 6(5), 323-331.
Lawson, P., & Lawford, J. (2003). Identity theft: The
need for better consumer protection. Ottawa: The Public
Interest Advocacy Centre.
Lawson, S. (2006, Nov. 29). Google describes its wi-fi pitch. Retrieved December 1, 2006, from http://www.pcworld.com/article/id,123157-page,1/article.html
Lichtenstein, S., Swatman, P., & Babu, K. (2003). Adding value to online privacy for consumers: remedying
deficiencies in online privacy policies with a holistic approach. In Proceedings of the 36th Hawaii International
Conference on System Sciences.
LII. (2007). US Code Collection, Title 17, Copyrights,
Legal Information Institute, Cornell Law School.
LII. (2007). U.S. Code Collection, Title 18, § 2510, The
Electronic Communications Privacy Act of 1986, Legal
Information Institute, Cornell Law School.
LII. (2007). United States Constitution Article I Section
8, Legal Information Institute, Cornell Law School.
LII. (2007c). The Bill Of Rights: U.S. Constitution,
Amendment IV, Legal Information Institute, Cornell
Law School.
Lilien, L., & Bhargava, B. (2006). A scheme for privacypreserving data dissemination. IEEE Transactions on
Systems, Man and Cybernetics, Part A: Systems and
Humans, 36(3), 503-506.
Lilien, L., Gupta, A., & Yang, Z. (2007). Opportunistic
networks for emergency applications and their standard
implementation framework. In Proceedings of the First
International Workshop on Next Generation Networks for
First Responders and Critical Infrastructure, NetCri07,
New Orleans, LA (pp. 588-593).
Lilien, L., Kamal, Z. H., Bhuse, V., & Gupta, A. (2006).
Opportunistic networks: The concept and research challenges in privacy and security. In P. Reiher, K. Makki,
& S. Makki (Eds.), Proceedings of the International
Workshop on Research Challenges in Security and
Privacy for Mobile and Wireless Networks (WSPWN06),
Miami, FL (pp. 134-147).
Lin, C. A. (1996). Looking back: The contribution of
Blumler and Katz's uses of mass communication to
communication research. Journal of Broadcasting &
Electronic Media, 40(4), 574-581.
Lisse, J. (2007). Behçet disease. Retrieved June 3, 2007, from http://www.emedicine.com/med/topic218.htm
Liu, C., Marchewka, J., Lu, J., & Yu, C. (2004). Beyond
concern: A privacy-trust-behavioral intention model
of electronic commerce. Information & Management,
42(1), 127-142.
Livingstone, S. (2004). The challenge of changing audiences: Or, what is the audience researcher to do in the
age of the Internet? European Journal of Communication, 19(1), 75-86.
Marchiori, M. (2002). The platform for privacy preferences 1.0 (P3P1.0) specification. W3C recommendation. W3C. Retrieved June 5, 2007, from http://www.
w3.org/TR/P3P/
Lohmann, F. (2002). Fair Use and Digital rights management, Computers, Freedom & Privacy, Electronic
Frontier Foundation.
Longhurst, B., Bagnall, G., & Savage, M. (2004). Audiences, museums and the English middle class. Museum
and Society, 2(2), 104-124.
Lurie, P., & Zieve, A. (2006). Sometimes the silence
can be like thunder: Access to pharmaceutical data at
the FDA. Law and Contemporary Problems, 69(Summer), 85-97.
Lynch, E. (1997). Protecting consumers in the cybermarket. OECD Observer, 208(Oct/Nov), 11-15.
Mack, A. (2000, January 3). Op-Ed Re: Sometimes the
patient knows best. New York Times. Retrieved June 5,
2007, from LexisNexis News service.
MacRae, P. (2003). Avoiding eternal spamnation.
Chatswood, NSW: Australian Telecommunications Users
Group Limited (ATUG).
Majoras, D. P., Swindle, O., Leary, T. B., Harbour, P. J., &
Leibowitz, J. (2005). The US SAFE WEB Act: Protecting
consumers from spam, spyware, and fraud. A legislative
recommendation to congress. Washington D.C.: Federal
Trade Commission (US).
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004). Internet users' information privacy concerns (IUIPC): The
construct, the scale, and a causal model. Information
Systems Research, 15(4), 336-355.
Microsoft. (2003). Windows media player 9 series privacy settings. Retrieved July 7, 2007, from http://www.
microsoft.com/windows/windowsmedia/player/9series/
privacy.aspx
Milberg, S. J., Smith, H. J., & Burke, S. J. (2000). Information privacy: Corporate management and national regulation. Organization Science, 11(1), 35-57.
Miller, R. D. (2006). Health care law. Sudbury, Massachusetts: Jones and Bartlett Publishers.
Milloy, M., Fink, D., & Morris, R. (2002, June). Modelling online security and privacy to increase consumer
purchasing intent. Paper presented at the Informing
Science + IT Education Conference, Ireland.
Mills, E. (2005, July 14). CNET.com. Retrieved November 7, 2006, from news.com.com/Google+balances+privacy,+reach/2100-1032_3
Milne, G. R. (2000). Privacy and ethical issues in database/interactive marketing and public policy: A research
framework and overview of the special issue. Journal of
Public Policy & Marketing, 19(1), 1-6.
Milne, G. R. (2003). How well do consumers protect
themselves from identity theft? The Journal of Consumer
Affairs, 37(2), 388-402.
Milne, G. R., & Culnan, M. J. (2002). Using the content
of online privacy notices to inform public policy: A
longitudinal analysis of the 1998-2001 U.S. web surveys.
The Information Society, 18(5), 345-359.
Milne, G. R., Culnan, M. J., & Greene, H. (2006). A longitudinal assessment of online privacy notice readability.
Journal of Public Policy & Marketing, 25(2), 238-249.
Milne, G. R., & Rohm, A. J. (2000). Consumer privacy
and name removal across direct marketing channels:
Exploring opt-in and opt-out alternatives. Journal of
Public Policy & Marketing, 19(2), 238-249.
Milne, G. R., Rohm, A. J., & Bahl, S. (2004). Consumers' protection of online privacy and identity. Journal of Consumer Affairs, 38(2), 217-232.
from http://csdl2.computer.org/comp/proceedings/
hicss/2002/1435/07/14350188.pdf
Morinaga, S., Yamanishi, K., Tateishi, K., & Fukushima, T. (2002). Mining product reputations on the web.
In Proceedings of the 8th ACM SIGKDD International
Conference on Knowledge Discovery and Data Mining
(pp. 341-349). New York: ACM Press. Retrieved June
5, 2007, from citeseer.ist.psu.edu/morinaga02mining.
html
thefreecountry.com. (2007). Free anonymous surfing. Retrieved October 25, 2007, from http://www.
thefreecountry.com/security/anonymous.shtml
Reinard, J. C. (2006). Communication research statistics.
CA: Sage.
Reips, U.-D. (2007). Internet users' perceptions of privacy concerns and privacy actions. International Journal of Human-Computer Studies, 65(6), 526-536.
Reiter, M., & Rubin, A. (1999). Anonymous web transactions with crowds. Communications of the ACM,
42(2), 32-48.
Reiter, M., & Rubin, A. (1999). Crowds: Anonymity for
web transactions. Communications of the ACM, 42(2),
32-48.
Report to Congressional Requestors. (2005). Information
security: Emerging cybersecurity issues threaten federal
information systems. Retrieved December 12, 2006, from
http://www.gao.gov/new.items/d05231.pdf
Rescorla, E. (2005). Is finding security holes a good idea? IEEE Security and Privacy, 3(1), 14-19.
Reuters. (1999). Teen pleads guilty to government
hacks. Retrieved October 13, 2007, from http://scout.
wisc.edu/Projects/PastProjects/net-news/99-11/99-1123/0007.html
Rice, S. (2000). Public environmental records: A
treasure chest of competitive information. Competitive
Intelligence Magazine, 3(3), 13-19.
Richards, N. M. (2006). Reviewing the digital person:
Privacy and technology in the information age by Daniel
J. Solove. Georgetown Law Journal, 94, 4.
Richardson, R. (2007). 2007 CSI Computer Crime and
Security Survey. Retrieved October 12, 2007, from
http://www.GoCSI.com
Riegelsberger, J., Sasse, M., & McCarthy, J. (2003). Shiny
happy people building trust?: Photos on e-commerce
websites and consumer trust. In Proceedings of the
SIGCHI Conference on Human Factors in Computing
Systems (pp. 121-128). Ft. Lauderdale, FL.
Royal, C. (2005). A meta-analysis of journal articles intersecting issues of internet and gender. Journal of Technical
Writing and Communication, 35(4), 403-429.
Ruddock, P. (2006). Australian law reform commission to review privacy act. Retrieved June 7, 2007, from http://
www.ag.gov.au/agd/WWW/MinisterRuddockHome.
nsf/Page/Media_Releases_2006_First_Quarter_31_
January_2006_-_Australian_Law_Reform_Commission_to_review_Privacy_Act_-_0062006#
Rule, C. (2002). Online dispute resolution for business.
San Francisco: Jossey-Bass.
Russ, A. (2001). Digital rights management overview.
SANS Information Security Reading Room, July 26.
Retrieved April 16, 2007, from http://www.sans.org/reading_room/whitepapers/basics/434.php
Saarenpää, T., & Tiainen, T. (2003). Consumers and e-commerce in information system studies. In M. Hannula, A.-M. Järvelin, & M. Seppä (Eds.), Frontiers of e-business research: 2003 (pp. 62-76). Tampere: Tampere University of Technology and University of Tampere.
Sabater, J., & Sierra, C. (2002). Social ReGreT, a reputation model based on social relations. ACM SIGecom
Exchanges, 3(1), 44-56.
Sagan, S. (2000). Hacker women are few but strong. Retrieved August 5, 2004, from http://abcnews.go.com/sections/tech/DailyNews/hackerwomen000602.html#top
Salehnia, A. (2002). Ethical issues of information systems.
Hershey, PA: Idea Group Incorporated.
Salganik, M. W. (2000, August 10). Health data on 858
patients mistakenly emailed to others; medical information was among messages sent out by Kaiser Health Care.
Baltimore Sun, Aug., 1C.
Shaw, E., Post, J., & Ruby, K. (1999). Inside the mind of
the insider. Security Management, 43(12), 34-44.
Shaw, K., & Rushing, R. (2007). Podcast: Keith Shaw (NetworkWorld) talks with Richard Rushing, chief security officer at ... data, listen to this podcast. Retrieved October 2007, from http://www.networkingsmallbusiness.
com/podcasts/panorama/2007/022807pan-airdefense.
html?zb&rc=wireless_sec
Sheehan, K. B. & Hoy, M. G. (1999). Flaming, complaining, abstaining: How online users respond to privacy
concerns. Journal of Advertising, 28(3), 37-51.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals' concerns about organizational practices. MIS Quarterly, 20(2), 167-196.
Sheehan, K. B., & Hoy, M. G. (2000). Dimensions of
privacy concern among online consumers. Journal of
Public Policy & Marketing, 19(1), 62-73.
Shimeall, T. (2001, August 23). Internet fraud. Testimony of Timothy J. Shimeall, Ph.D., CERT Analysis Center, Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA, before the Pennsylvania House Committee on Commerce and Economic Development, Subcommittee on Economic Development. Retrieved October 2007, from http://www.cert.org/congressional_testimony/Shimeall_testimony_Aug23.html
Shimp, T. (2007). Advertising, promotion, and other
aspects of integrated marketing communications (7th
ed.). Mason, OH: Thomson South-Western.
Shinder, D. (2007, February 9). How SMBs can enforce
user access policies. Retrieved April 2007, from http://
articles.techrepublic.com.com/5100-1009_11-6157054.
html?tag=nl.e101
Singh, B. (2002). Consumer education on consumer rights
and responsibilities, code of conduct for ethical business, importance of product labelling. Kuala Lumpur:
Consumers International.
Sitton, J. V. (2006). When the right to know and the right
to privacy collide. Information Management Journal,
40(5), 76-80.
Sixsmith, J., & Murray, C. D. (2001). Ethical issues in the
documentary data analysis of internet posts and archives.
Qualitative Health Research, 11(3), 423-432.
disclosure, and data access: Theory and practical applications for statistical agencies (26 pages). Washington,
D.C.: Urban Institute. Retrieved June 5, 2007, from http://
privacy.cs.cmu.edu/people/sweeney/explosion2.pdf
Thompson, B. (2003, February 21). Is Google too powerful? Retrieved December 12, 2006, from http://news.
bbc.co.uk/2/hi/technology/2786761.stm
Timothy, R. (1999). The construction of the world wide
web audience. Media, Culture & Society, 21(5), 673-684.
Tyler, B. J. (1998). Cyberdoctors: The virtual housecall: The actual practice of medicine on the internet
is here; is it a telemedical accident waiting to happen?
Indiana Law Review, 13, 259-290.
Trammell, K. D., Williams, A. P., Postelnicu, M., & Landreville, K. D. (2006). Evolution of online campaigning:
Increasing interactivity in candidate web sites and blogs
through text and technical features. Mass Communication & Society, 9(1), 21-44.
Treasury (Australia). (2006). The Australian guidelines
for electronic commerce (March 2006). Canberra, ACT:
Treasury (Australia).
TRUSTe. (2007). Retrieved December 9, 2007, from
http://www.truste.org/
Trustworthy Computing White Paper. (2003). Redmond,
Washington: Microsoft. Retrieved June 5, 2007, from
http://www.microsoft.com/mscorp/twc/twc_whitepaper.
mspx
Tschudin, C. (1999). Apoptosis: The programmed
death of distributed services. In J. Vitek & C. D. Jensen
(Eds.), Secure internet programming. Security issues
for mobile and distributed objects. Lecture Notes in
Computer Science (Vol. 1603, pp. 253-260). Heidelberg,
Germany: Springer.
Turkle, S. (1984). The second self: Computers and the
human spirit. New York: Simon and Schuster.
Turow, J. (2003). Americans & online privacy: The system
is broken. Report from the Annenberg Public Policy
Center of the University of Pennsylvania, June.
Turow, J., & Hennessy, M. (2007). Internet privacy and
institutional trust: Insights from a national survey. New
Media & Society, 9(2), 300-318.
Tygar, J. D., & Yee, B. (1994). Dyad: A system for using
physically secure coprocessors. In Proceedings of the
Joint Harvard-MIT Workshop Technological Strategies
for Protecting Intellectual Property in the Networked
Multimedia Environment. Annapolis, MD: Interactive
us/congress/bill.xpd?tab=summary&bill=s109-1608
USA Patriot Act of 2001, enacted October 26, 2001. (2001).
Retrieved from http://www.govtrack.us/congress/bill.
xpd?bill=h107-3162
USG. (1998). The Digital Millennium Copyright Act
of 1998, U.S. Copyright Office, Pub. 05-304, 112 Stat.
2860.
Vaile, D. (2004). Spam canned: New laws for Australia.
Internet Law Bulletin, 6(9), 113-115.
Van Dyke, T., Midha, V., & Nemati, H. (2007). The
effect of consumer privacy empowerment on trust and
privacy concerns in e-commerce. Electronic Markets,
17(1), 68-81.
Vasek, S. (2006). When the right to know and right
to privacy collide. Information Management Journal,
40(5), 76-81.
Vasiu, L., Warren, M., & Mackay, D. (2002, December).
Personal information privacy issues in B2C e-commerce:
a theoretical framework. Paper presented at the 7th Annual CollECTeR Conference on Electronic Commerce (CollECTeR '02), Melbourne, Victoria.
Vaughn, R. G. (1984). Administrative alternatives and
the federal freedom of information act. Ohio State Law
Journal, 45(Winter), 185-214.
Vivisimo. (2006). Restricted access: Is your enterprise search solution revealing too much? Retrieved October 2007, from http://Vivisimo.com/ or
http://www.webbuyersguide.com/bguide/whitepaper/
wpDetails.asp_Q_wpId_E_NzYyMQ
Vogelstein, F. (2007, April 9). Text of Wired's interview
with Google CEO Eric Schmidt. Retrieved July 15,
2007, from http://www.wired.com/techbiz/people/
news/2007/04/mag_schmidt_trans?curren
W3C (2002b). The platform for privacy preferences
1.0 (P3P1.0) specification. W3C P3P Working Group.
Retrieved April 16, 2007, from http://www.w3.org/TR/
P3P/
W3C. (2002). The platform for privacy preferences 1.0
(P3P1.0) specification. Retrieved October 25, 2007, from
http://www.w3.org/TR/P3P/
Working Group on Electronic Commerce and Consumers (Canada). (2004). Canadian code of practice for
consumer protection in electronic commerce. Ottawa:
Office of Consumer Affairs, Industry Canada.
World Wide Web Consortium (W3C). (2006). The platform for privacy preferences 1.1 (P3P1.1) specification.
World Wide Web Consortium. Retrieved July 7, 2007,
from http://www.w3.org/TR/P3P11/
World Wide Web Consortium (W3C). (2007). The
platform for privacy preferences (P3P) project. World
Wide Web Consortium. Retrieved July 5, 2007, from
http://www.w3.org/P3P/
Xie, E., Teo, H.-H., & Wan, W. (2006). Volunteering
personal information on the internet: Effects of reputation, privacy notices, and rewards on online consumer
behavior. Marketing Letters, 17, 61-74.
Yamazaki, F. (2005). Privacy protection law. Nikkei
BP.
Yank, G. C. (2004, December 21). Canning spam: Consumer protection or a lid on free speech? Retrieved October 2007, from http://www.law.duke.edu/journals/
dltr/articles/2004dltr0016.html
Yao, M. Z. (2005). Predicting the adoption of self-protections of online privacy: A test of an expanded theory of planned behavior model. Unpublished dissertation, University of California, Santa Barbara.
Ackerman, M. S., & Cranor, L. F. (1999). Privacy critics: UI components to safeguard users' privacy. In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '99) (pp. 258-259). Pittsburgh, PA.
Yao, M., Rice, R., & Wallis, K. (2007). Predicting user
concerns about online privacy. Journal of the American
Society for Information Science and Technology, 58(5),
710-722.
Yianakos, C. (2002). Nameless in cyberspace: Protecting online privacy. B+FS, 116(6), 48-49.
About the Contributors
Tom S. Chan is an associate professor in the Information Department at Southern New Hampshire University in Manchester, New Hampshire, USA. He holds an EdD from Texas Tech University and an MSCS from the University of Southern California. Prior to SNHU, he was an assistant professor at Marist College and, as a project manager and software designer, specialized in data communication at Citibank. He has published in the areas of instructional design, distance learning, technology adaptation, information security, and Web design.
Jengchung V. Chen is an assistant professor in telecommunications management at National Cheng Kung University, Taiwan. He has published articles dealing with privacy issues in journals such as the International Journal of Organizational Analysis, Labor Law Journal, and Information, Communication, and Society. He holds a PhD in communication and information sciences from the University of Hawaii and master's degrees in policy and management from SUNY-Stony Brook and in computer science from Polytechnic University.
Andy Chiou is an alumnus of New York University with a BA in sociology and economics, and is, at the time of this writing, completing his MBA at National Cheng Kung University in Taiwan. Upon completion of his MBA, Andy plans to pursue a doctorate in the United States. His current areas of interest are general management and cross-cultural issues.
Ken Coghill was born in Australia (1944) and has been a veterinarian, a public servant, an elected member of Wodonga Council, and a member and Speaker of parliament. He joined Monash University in 1996, where he teaches Governance and Business & Government in master's programs and supervises PhD students researching diverse aspects of governance. Associate Professor Coghill is a co-director of the Monash Governance Research Unit, where he directs and undertakes research on integrated governance, that is, the dynamic, evolving inter-relationships of the public, corporate, and civil society sectors as they affect the governance of nation-states.
J. Stephanie Collins earned a PhD in management information systems in 1990 from the University of Wisconsin. She has taught in the field since 1988, published papers in various journals, and presented at conferences, including work on information technology outsourcing, technology applications for economic development, IT education, and technical issues. She has also worked as an IT consultant and has developed several systems. Her current research focuses on how the use of internet technologies changes the environment for business and for education.
G. Scott Erickson is associate professor and chair of the Marketing/Law Department in the School of
Business at Ithaca College, Ithaca, NY. He holds a PhD from Lehigh University and master's degrees from
Thunderbird and SMU. He has published widely on intellectual property, intellectual capital, competitive
intelligence, and a number of other related topics. His book with Helen Rothberg, From Knowledge to
Intelligence, was published by Elsevier in 2005. His consulting work began over 20 years ago with the
Alexander Proudfoot Company and continues today.
Louis K. Falk received his doctorate in mass communication from the University of Southern Mississippi, where he graduated with an emphasis in advertising and public relations. Dr. Falk is an associate professor in the Department of English & Communication at the University of Texas at Brownsville. His research interests include the impact of new technologies on marketing, advertising, and public relations. Dr. Falk has recently been published in a variety of journals, including The Journal of Website Promotion, the Journal of E-Business, and the Journal of Promotion Management. Dr. Falk is also an elected board
member and Webmaster of the International Academy of Business Disciplines. His Web site address is
http://www.louisfalk.org/
Philip Flaherty was awarded a Bachelor of Civil Law (BCL) from NUI, Galway in 2005, and a diploma in legal Irish in 2004. He holds an LLM in public law from NUI, Galway. He co-authored a consultation paper on statute law restatement for the Law Reform Commission in 2007 and conducted research on the commission's e-conveyancing road map project. He is currently researching the Settled Land Acts and will produce a consultation paper on this area of law for the commission in 2008. Philip will join the Dublin law firm McCann FitzGerald in 2008. He is a regular contributor to legal conferences and legal journals.
Anil Gurung is an assistant professor of business and information management at Neumann College.
He received his PhD from the University of Texas at Arlington. Current research interests are in the areas
of IT adoption, information security and privacy, ecommerce, and cultural and social aspects of business
computing. He has published in various journals and conference proceedings.
Huong Ha is currently deputy course director at TMC Business School, TMC Academy, Singapore, and a PhD candidate at Monash University, Australia. She holds a master's degree in public policy from the National University of Singapore. She has published many book chapters, journal articles, refereed conference papers, and encyclopedia articles. She has been awarded a research grant by Consumer Affairs Victoria (Australia), a distinguished paper award (Turkey and the USA), and many international travel grants.
Naoki Hamamoto received his MBA in computer information systems from Western Michigan University, Kalamazoo, Michigan. He was involved in several major research projects during his graduate study. He is currently an information professional in internal system auditing. Mr. Hamamoto's research interests include global information security management, Web 2.0 social networking and community, and business process management.
Thomas J. Holt is an assistant professor in the Department of Criminal Justice at the University of
North Carolina at Charlotte. He has a doctorate in criminology and criminal justice from the University
of Missouri-Saint Louis. His research focuses on computer crime, cyber crime, and the role that technology and the Internet play in facilitating all manner of crime and deviance. Dr. Holt has authored several
papers on the topics of hacking, cyber crime, and deviance that have appeared in journals such as Deviant
Behavior and the International Journal of Comparative and Applied Criminal Justice. He is also a member
of the editorial board of the International Journal of Cyber Criminology.
Chiung-wen (Julia) Hsu received her PhD in communication from SUNY Buffalo in 2003. She is now an assistant professor in the Department of Radio & Television at National Cheng Chi University, Taiwan. Julia's research interests include communication technology, journalism, mass communication, and Internet research, especially online privacy issues. She is interested in users' behavioral differences between the online and offline worlds and across different Internet platforms. She developed a situational model and has conducted several empirical studies. Julia's research has been published in journals such as the Asian Journal of Communication, Telematics & Informatics, Online Information Review, and Cyberpsychology & Behavior.
Anurag Jain is an assistant professor at the Bertolon School of Business, Salem State College, MA. He has over 12 years of industry experience, including strategic brand management, financial planning, global bilateral business promotion, and IT services. His research has appeared in the proceedings of several leading conferences, including the Americas Conference on Information Systems and meetings of the Decision Sciences Institute, the Southwest Decision Sciences Institute, and the Northeast Decision Sciences Institute. His current research interests include IT capabilities, value, and the management of IT resources; knowledge management; the adaptive and sustainable competitive enterprise; business intelligence and activity monitoring; and the influence of information technology on organizations and society. He holds a PhD from the University of Texas-Arlington, a master of science degree from the University of Illinois at Urbana-Champaign, a post graduate diploma in business management from Sydenham Institute of Management, and a bachelor of commerce degree from the University of Bombay.
Thejs Willem Jansen holds a master of science (computer science engineering) from the Technical
University of Denmark. He co-authored his MSc thesis entitled Privacy in Government IT-Systems with
Søren Peen, which developed the model presented in this chapter and resulted in a paper published at the
Sustaining Privacy in Autonomous Collaborative Environments 2007 workshop. Thejs Jansen currently
holds a position as an IT auditor at PricewaterhouseCoopers, where he reviews IT security procedures by
identifying issues, developing criteria, reviewing and documenting client processes and procedures. His
work includes working with Sarbanes-Oxley clients.
Christian Damsgaard Jensen holds a master of science (computer science) from the University of
Copenhagen (Denmark) and a PhD (computer science) from Université Joseph Fourier (Grenoble, France).
Dr. Jensen currently holds a position as associate professor at the Technical University of Denmark, where
he supervised the MSc thesis project of Thejs Willem Jansen and Søren Peen. Dr. Jensen works in the areas
of system security and distributed systems, with a particular focus on pervasive computing systems. Dr.
Jensen has published more than 50 papers in international peer-reviewed journals, conferences, symposia,
and workshops.
Sean Lancaster is a lecturer with the Department of Decision Sciences and Management Information Systems at Miami University. He teaches undergraduate courses on information systems & business
strategy, Web design, VisualBasic.NET, database design, and e-commerce. Sean earned his MBA from
Miami University in 2002.
Suhong Li is an associate professor of computer information systems at Bryant University. She earned
her PhD from the University of Toledo in 2002. She is a member of the Council of Supply Chain Management Professionals, the Decision Sciences Institute, and the International Association for Computer Information
Systems. She has published in academic journals including Journal of Operations Management, OMEGA:
the International Journal of Management Science, Decision Support Systems, Journal of Computer Information Systems, Journal of Managerial Issues, and International Journal of Integrated Supply Management. Her research interests include supply chain management, electronic commerce, and adoption and
implementation of IT innovation.
Leszek Lilien's research focuses on opportunistic networks or oppnets (a specialized kind of ad hoc
networks); and trust, privacy, and security in open computing systems. He serves on the editorial boards of
the International Journal of Communication Networks and Distributed Systems, The Open Cybernetics &
Systemics Journal, and Recent Patents on Computer Science. He was the main organizer and chair of the
International Workshop on Specialized Ad Hoc Networks and Systems (SAHNS 2007), held in conjunction with the 27th IEEE International Conference on Distributed Computing Systems (ICDCS 2007). He
is a senior member of the Institute of Electrical and Electronics Engineers (IEEE).
Elizabeth Ann Maharaj is a statistician and senior lecturer in the Department of Econometrics and
Business Statistics at the Caulfield Campus of Monash University. She teaches subjects on business data
analysis, business forecasting, and survey data analysis. Her main research interests are in time series
classification and forecasting. She has presented several papers on time series classification at international conferences, and much of this work has been published in international journals. Elizabeth has also been involved
in research projects in climatology, environmental science, labour markets, human mobility, and finance,
and she has also published in journals in some of these fields.
Karin Mika has been associated with the first-year legal writing program at Cleveland-Marshall College of Law since 1988. She has also worked as an adjunct professor of English at Cuyahoga Community
College and is a research consultant for various firms and businesses in the Cleveland area. In addition,
Professor Mika was faculty advisor for the law school's moot court program and currently judges at various national moot court competitions. She has lectured on essay writing technique for several bar review courses and has written bar exam essay questions for both the California and Minnesota bar examiners. Professor Mika's areas of scholarly research are varied; she has published in the areas of Native American law, Internet law, and health care. Her administrative and teaching responsibilities cover the legal writing, research, and advocacy program.
Shahriar Movafaghi received a PhD in computer science from Northwestern University and has over 20 years of hands-on technical experience. Dr. Movafaghi has published numerous papers in the areas of digital rights, data warehousing, databases, system architecture, software engineering, object-oriented technology, application development, and teaching techniques for IT. He has architected and led many software
system projects in the financial, apparel, publishing, and computer hardware/software industries as well
as directed government-funded research and development projects. He has taught courses at various universities including SNHU, UNH, BU, and UMASS Lowell.
Charles O'Mahony is a legal researcher with the Law Reform Commission of Ireland. He was awarded a BA in 2003 and an LLB in 2004 from NUI, Galway. He holds a master's in law (LLM) from University College London and an LLM in public law from NUI, Galway. He was the principal legal researcher for the Commission's Third Programme of Law Reform and authored a report on the Third Programme. He
is currently preparing a consultation paper on reform of the jury system in Ireland, which is due for publication in 2008. He is a senior tutor in law at University College Dublin, where he has tutored courses on
legal systems and methods, tort law, and constitutional frameworks.
Betty J. Parker is associate professor of marketing & advertising at Western Michigan University,
where she teaches undergraduate and graduate courses in Internet marketing, media research, and advertising. She has published papers about the Internet as a communication tool and the public policy aspects
of marketing alcohol, food, and prescription drugs. She holds a PhD from the University of Missouri, an
MBA from the University of Illinois-Chicago, and a BA from Purdue University.
Andrew Pauxtis is currently a graduate student in the information systems management master's program at Quinnipiac University in Hamden, CT. His areas of interest in information technology include Web development and design technologies, search engine technologies, search engine
optimization, Internet marketing, and Web 2.0. He graduated from Quinnipiac University in 2007 with a
BA in mass communications.
Søren Peen holds a master of science (computer science engineering) from the Technical University of Denmark. He co-authored his MSc thesis entitled Privacy in Government IT-Systems with Thejs Jansen, which developed the model presented in this chapter and resulted in a paper published at the Sustaining Privacy in Autonomous Collaborative Environments 2007 workshop. Since his graduation, Søren Peen has completed his national service in the Danish Defence Forces and is currently employed as an IT specialist at IBM's Copenhagen Crypto Competence Center.
Alan Rea earned his BA at The Pennsylvania State University, an MA at Youngstown State University,
an MS at the University of Maryland, Baltimore County, and his PhD at Bowling Green State University.
Alan is an associate professor of business information systems at Western Michigan University. Since
1997 Alan has taught courses in Web development and design, system administration, and various object-oriented language programming courses. Alan researches topics in artificial intelligence, hacker culture,
security, and virtual reality. He has published articles in these fields, as well as authored several textbooks
on an assortment of information technology topics.
Bernadette H. Schell, the founding dean of the Faculty of Business and Information Technology at the
University of Ontario Institute of Technology in Canada, has authored four books on the topic of hacking:
The Hacking of America: Who's Doing It, Why, and How (2002); Contemporary World Issues: Cybercrime (2004); Webster's New World Hacker Dictionary (2006); and Contemporary World Issues: The Internet and
Society (2007). She has also written numerous journal articles on topics related to violence in society and
is the author of three books dealing with stress-coping in the workplace (1997), the stress and emotional
dysfunction of corporate leaders (1999), and stalking, harassment, and murder in the workplace (2000).
Angelena M. Secor received her MBA in computer information systems from Western Michigan
University, Kalamazoo, Michigan. She is currently an IT consultant in the healthcare field. Ms. Secor's research interests include information security, information privacy, and offshore outsourcing of healthcare services.
Hy Sockel received his doctorate in management information systems from Cleveland State University.
Dr. Sockel is a visiting professor in the management information systems area at Indiana University South
Bend. His research interests include the impact of technology on the organization and its employees, electronic commerce, database technologies, and application systems. Dr. Sockel has recently been published
in a variety of journals including: Journal of Management Systems, The Journal of Website Promotion,
and the International Journal of Web Engineering and Technology.
J. Michael Tarn is a professor of business information systems at Western Michigan University. He
holds a PhD and an MS in information systems from Virginia Commonwealth University. Dr. Tarn specializes in multidisciplinary research, involving ICT, EC, and strategic management. He has published
various articles in professional journals, book chapters, and refereed conference proceedings. Professor
Tarn coauthored the first scholarly book in ES education, Enterprise Systems Education in the 21st Century. He is managing editor of International Journal of Management Theory and Practices. His areas of
expertise are information security management, data communications management, Internet research,
international MIS, and critical systems management.
Barbara J. Tyler is a registered nurse and attorney who taught legal writing and drafting for 16 years, as
well as directing the Legal Writing Department for 6 years. She serves as a consultant to practitioners and
firms on medical malpractice claims, risk management, and writing and drafting. The most rewarding parts of her teaching career were serving as faculty advisor to the Cleveland-Marshall Journal of Law and Health for 6 years and as advisor for 4 years to the Delta Theta Phi law fraternity. Professor Tyler was honored
with the Wilson Stapleton law alumni award for teaching excellence in 2005, as well as the International Delta Theta Phi law fraternity's most outstanding teacher of the year award for 2005-06. She is active in many legal
and community organizations. Her research and writing interests are varied, and she has published in the
areas of medicine and law, Internet law, insurance law, and art law, as well as learning theory.
Bruce A. White is the chair of information systems management at Quinnipiac University in Hamden,
CT. He is also on the Educational Foundation for the Institute for Certification of Computer Professionals,
on the editorial review board for the Journal of Information Systems Education and the Global Journal of
Information Management. He has chaired the ISECON conference four times. His current research is on
Web 2.0 technologies, assessment processes, health information systems, and outsourcing.
David C. Yen currently holds the Jennifer J. Petters chair in Asian business and is a professor of MIS in the Department of Decision Sciences and Management Information Systems at Miami University. He held the Raymond E. Glos professorship in business from 2005-2007 and was department chair from 1995-2005. Since receiving his PhD in MIS and MS in computer sciences in 1985, Professor Yen has been active in research. He
has published books and articles which have appeared in Communications of the ACM, Decision Support
Systems, Information & Management, Information Sciences, Computer Standards and Interfaces, Information Society, Omega, International Journal of Organizational Computing and Electronic Commerce, and
Communications of AIS, among others. Professor Yens research interests include data communications,
electronic/mobile commerce, and systems analysis and design.
Chen Zhang is an assistant professor of computer information systems at Bryant University. He received
his MS and PhD in computer science from the University of Alabama in 2000 and 2002, and a BS from
Tsinghua University, Beijing, China. Dr. Zhang's primary research interests fall into the areas of computer
networks and distributed systems. He is a professional member of ACM and IEEE.
Index
A
Adobe.com 7
AdWords 1, 3, 7, 8, 9, 10, 11, 12, 13
aesthetic experience 224, 225
AltaVista 5
Amazon 4, 205, 260
anonymity 19, 30, 65, 83, 86, 100, 117, 118, 217, 235, 356, 371, 394, 397, 398, 409, 416, 418
anonymity set 93, 100, 101, 113
anonymous surfing tools 278
ANSI bombs 40
Asia Pacific Economic Co-Operation (APEC) 129
asset theft 36
attack machine 41
Australian Consumers Association (ACA) 134
Australian Guidelines for Electronic Commerce (AGEC) 132
automated trust negotiation (ATN) 96
Automatic Processing of Personal Data 333, 340
B
banner ads 11, 181, 263
BBBOnline Privacy 134, 269, 276, 277
Better Business Bureau (BBB) 88
BioAPI 305
biometric characteristics 300, 303
black listing 48
blogs 52, 53, 210, 217, 218, 229, 237, 257, 261, 264, 285, 390, 400, 415, 424
Boeing 261, 264, 282
bot 8, 40, 41, 42, 50
bot herder 41, 42
botnet 41, 42, 43
Botnet Task Force 43
Brussels Convention on Jurisdiction and Recognition of Foreign Judgments 330
business-to-business (B2B) 372
business to customer (B2C) 372
C
CartManager 19
casual consumer 2
Children's Online Privacy and Protection Act 17
Children's Online Privacy Protection Act (COPPA) 262, 271
civil liberties 307, 308, 309, 406
click streams 181
Coca-Cola 264, 283
code of ethics 137
common biometric exchange file format (CBEFF) 305
common identifiers 69, 74
computer-mediated communication (CMC) 233
computer cache 27
computer underground (CU) 190, 192, 195
confidential business information (CBI) 316
confidentiality 66, 105, 118, 121, 422
confidentiality-integrity-availability (CIA) 88
consumer protection 2, 17, 18, 111, 123, 124, 125, 126, 127, 128, 129, 131, 132, 133, 134, 135, 136, 137, 138, 139, 141, 142, 143, 144, 145, 146, 232, 255, 280, 326, 327, 340, 347, 358, 361, 389, 395, 397, 403, 408, 413, 414, 418, 420, 427
consumer rights 126
content abuse 36
cookie 6, 7, 22, 259, 260, 266, 274, 419
Council of Europe's European Convention on Human Rights 333
CPA Web Trust 271
CPU cycles 100
Crackers 194, 195
credit card data 18, 315
cyber crime 196
Cybercrime Act 42
cyberfly 92, 122
cybermedicine 347, 355, 361, 363, 367, 368, 409, 420
D
data-processing 65
data-storage 65, 66
Data Protection Directive 63, 127, 128, 320, 333, 335, 336, 337, 338, 339, 340, 341, 343, 345
denial of service (DoS) 36
Department of Justice 43, 313
diffused audiences 219
digital fingerprint 244, 245, 246
digital rights management (DRM) 240, 241
disposition to trust 94, 95, 170
disruption 36
distributed DoS (DDoS) 36
Do Not Call (DNC) Registry 285, 286
E
e-commerce xvii, 3, 51, 81, 86, 95, 112, 114, 117, 120, 142, 143, 144, 145, 146, 148, 151, 154, 155, 160, 161, 162, 163, 168, 170, 171, 173, 175, 177, 178, 182, 185, 186, 187, 188, 189, 191, 195, 196, 216, 241, 243, 249, 251, 252, 263, 266, 270, 271, 272, 273, 277, 279, 280, 301, 326, 327, 328, 329, 331, 332, 333, 337, 339, 340, 342, 344, 372, 373, 384, 385, 386, 391, 397, 398, 403, 405, 406, 408, 411, 412, 413, 415, 418, 419, 420, 422, 424, 425
e-retailer 136
Electronic Commerce Directive 329
Electronic Privacy Information Center (EPIC) 84, 167, 183, 188, 398
electronic risk management 38
encryption system 247
end user 47
Entertainment Software Rating Board (ESRB) 271
European Court of Human Rights 334, 345
European extra-judicial network (EEJ-Net) 328
European Parliament 55, 61, 63, 66, 80, 82, 84, 399
European Union (EU) 128, 149, 326, 327
Excite 5
exposure 38
F
Facebook 181, 182, 183, 184, 187, 293, 294, 393, 399, 409, 422
Fair and Accurate Credit Transactions 22, 23
falsification of private information 27
FBI 43, 45, 46, 53, 194, 209, 210, 211, 394, 402, 426
Federal Information Security Management Act 23, 25
Federal Trade Commission (FTC) 4, 17, 249, 258, 270, 286
file-sharing 41
firewalls 46
flooding 197
focus 38, 292
Fortune 100 207, 269, 271, 273, 274, 275, 276, 277, 282
Freedom of Information Act (FOIA) 311
FTC v. Petco 18, 25
G
gender-neutral IDs 28
Gmail 1, 8, 9, 13, 264
Google 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 21, 39, 49, 54, 166, 180, 181, 183, 185, 191, 264, 314, 366, 392, 393, 395, 398, 402, 404, 408, 411, 412, 414, 417, 423, 424, 425
Google Calendar 3
Google Checkout 3, 11
Google Desktop 3, 12
Google Earth 3, 7
Google Groups 3
Google Maps 3
Google News 3
Google Notebook 3
Google Talk 3
gray listing 48
H
half-life 38
Health Insurance Portability and Accountability Act (HIPAA) 17, 20, 24, 31, 86, 207, 253, 271, 307, 348, 352, 353, 366, 368, 404
Homeland Security Act 17, 23, 31, 198, 404
House of Representatives 17
hybrid listing 48
I
identity theft 18, 19, 307
Identity Theft and Assumption Deterrence Act 307
IIA Content Regulation Code of Practice 133
informational privacy 90, 322, 391
information technology (IT) 34, 371
Institute of International Management (IIM) 292
integrity 24, 25, 45, 47, 51, 87, 88, 95, 103, 104, 105, 114, 154, 232, 242, 258
intellectual property rights (IPR) 196, 197, 207
Internet cafés 34
Internet itinerary 124
Internet protocol (IP) 274
Internet relay chat (IRC) 42
Internet service providers (ISPs) 285
IP-based black list technique 42
IP delivery 6
iTunes 166, 168, 182, 185, 389, 391, 410
K
keyloggers 42
L
LAN 53, 88, 415
legal privacy controls 105
limiting data 69, 74
Lycos 5
M
malware 18, 21, 30, 39, 40, 140, 390
McDonald's 261
media drenching gratification 230
Medicaid 351, 353
Medicare 351, 353
Michinoku Bank 382, 383, 384
MiniStore 166, 185, 410
MIT.edu 7
mobile encryption 44
multiple identities 28, 158, 175, 181
MX hosts 19
N
NASA.gov 7
National Principles for the Fair Handling of Personal Information (NPPs) 130
non-governmental organisations (NGOs) xvi, 58
NSF.gov 7
NSW Office of Fair Trading 126, 144, 414
O
obfuscating 94
O'Donnell v. Blue Cross Blue Shield of Wyo. 354, 367, 414
Olmstead 314
online privacy policy (OPP) 271
online privacy protection behavior xvii, 152, 158, 160, 161
online trust 208
Operation Bot Roast 43
opt-out system 378
Organisation for Economic Co-Operation and Development (OECD) 128
outsourcing 69, 73
P
PageRank feature 6
patient rights 360
PayPal 43, 99
PC Magazine 6, 9, 13, 135, 144, 145, 411, 416
persistence 38
personal computers (PCs) 191
personal digital assistants (PDAs) 33
personal privacy xv, xviii, 1, 3, 4, 57, 81, 90, 165, 167, 171, 187, 222, 242, 294, 295, 297, 313, 318, 319, 321, 423, 426
phishing 21, 132, 147, 212, 213
phreaking 125, 197
Platform for Privacy Preferences (P3P) 134, 372
prevalence 38
Privacy Act 1988 130, 131, 132, 147
privacy enhancing technology (PET) 58
privacy evaluation 67
privacy protection behavior xvii, 152, 156, 158, 159, 160, 161
Privacy Rights Clearinghouse 38, 54, 59, 83, 362, 368, 369, 417
pseudo-voluntary data dissemination 90
public key infrastructure (PKI) 246
Q
questionnaires 81
R
Rbot 42
Real.com 7
Record Industry Association of America (RIAA) 285
remote process control (RPC) 41
repudiation 36
reputation deviation 95
V
verifying trust 97, 98
vulnerability management 50
W
Walt Disney 261, 282
watermarking 86, 94, 244, 245, 253, 396
Web bugs 177, 181, 192, 260, 274
WebTrends 166, 167, 187, 426
white listing 48
WiFi 88
Wired Magazine 4, 11
World Wide Web Consortium (W3C) 167, 187, 249, 372, 427
Y
Yahoo! 3, 5, 21, 381, 382, 383, 384, 385, 422
YouTube 3, 9, 10, 11
Z
zombie machine 41
Copyright 2009, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.