Foundations and Trends® in Human–Computer Interaction Vol. 1, No. 1 (2007) 1–137 © 2007 G. Iachello and J. Hong DOI: 10.1561/1100000004

End-User Privacy in Human–Computer Interaction

Giovanni Iachello¹ and Jason Hong²
¹ Georgia Institute of Technology, USA, email@example.com
² Carnegie Mellon University, USA, firstname.lastname@example.org

Abstract

The purpose of this article is twofold. First, we summarize research on the topic of privacy in Human–Computer Interaction (HCI), outlining current approaches, results, and trends. Practitioners and researchers can draw upon this review when working on topics related to privacy in the context of HCI and CSCW. The second purpose is that of charting future research trends and of pointing out areas of research that are timely but lagging. This work is based on a comprehensive analysis of published academic and industrial literature spanning three decades, and on the experience of both ourselves and of many of our colleagues.
1 Introduction

Privacy is emerging as a critical design element for interactive systems in areas as diverse as e-commerce, health care, office work, and personal communications. These systems face the same fundamental tension. On the one hand, personal information can be used to streamline interactions, facilitate communication, and improve services. On the other hand, this same information introduces risks, ranging from mere distractions to extreme threats.

Government reports [239, 283], essays, books [17, 93, 196, 303], and media coverage [252, 295, 312] testify to people's concerns regarding the potential for abuse and general unease over the lack of control over a variety of computer systems. Similarly, application developers worry that privacy concerns can impair the acceptance and adoption of their systems.

No end-to-end solutions exist to design privacy-respecting systems that cater to user concerns. Lessig provided a very high-level framework for structuring the protection of individuals' privacy, which leverages four forces: laws, social norms, the market, and technical mechanisms. However, the challenge is in turning these broad guidelines into actionable design solutions. Our thesis is that
researchers in Human–Computer Interaction (HCI) and Computer-Supported Cooperative Work (CSCW) can greatly improve the protection of individuals' personal information, because many of the threats and vulnerabilities associated with privacy originate from the interactions between the people using information systems, rather than the actual systems themselves.

Approaching the topic of privacy can be daunting for the HCI practitioner, because the research literature on privacy is dispersed across multiple communities, including computer networking, systems, HCI, requirements engineering, management information systems (MIS), marketing, jurisprudence, and the social sciences. Even within HCI, the privacy literature is fairly spread out. Furthermore, many IT professionals have common-sense notions about privacy that can turn out to be inaccurate.

Hence, the goal of this article is to provide a unified overview of privacy research in HCI, focusing specifically on issues related to the design and evaluation of end-user systems that have privacy implications. In Section 2, we present two philosophical outlooks on privacy that will help the practitioner frame research questions and design issues. We also show how privacy research has evolved in parallel with HCI over the past 30 years. Section 3 presents an overview of the research literature, structured along an ideal inquiry-build-evaluate development cycle. Finally, in Section 4, we outline key research challenges, where we think that HCI methods and research approaches can make a significant impact in furthering our knowledge about information privacy and personal data protection.

In the remainder of this section, we explain why we think privacy research is challenging and interesting for HCI, and map out relevant literature published in HCI conferences and journals, and in neighboring fields such as MIS and CSCW.
1.1 Why Should HCI Researchers Care About Privacy?

Human–computer interaction is uniquely suited to help design teams manage the challenges brought by the need to protect privacy and personal information. First, HCI can help understand the many notions
of privacy that people have. For example, Westin describes four states of privacy: solitude, intimacy, anonymity, and reserve. Similarly, Murphy lists the following as expressions of privacy: "to be free from physical invasion of one's home or person," "the right to make certain personal and intimate decisions free from government interference," "the right to prevent commercial publicity of one's own name and image," and "the control of information concerning an individual's person." These perspectives represent different and sometimes conflicting worldviews on privacy. For example, while some scholars argue that privacy is a fundamental right, Moor claims that privacy is not a "core value" on par with life, security, and freedom, and asserts that privacy is merely instrumental for protecting personal security.

Second, a concept of tradeoff is implicit in most discussions about privacy. In 1890, Warren and Brandeis pointed out that privacy should be limited by the public interest, a position that has been supported by a long history of court rulings and legal analysis. Tradeoffs must also be made between competing interests in system design. For example, the developer of a retail web site may have security or business requirements that compete with end-user privacy requirements, thus creating a tension that must be resolved through tradeoffs. Because HCI practitioners possess a holistic view of the interaction of the user with the technology, they are ideally positioned to work through and resolve these tradeoffs.

Third, privacy interacts with other social concerns, such as control, authority, appropriateness, and appearance. For example, while parents may view location-tracking phones as a way of ensuring safety and maintaining peace of mind, their children may perceive the same technology as smothering and an obstacle to establishing their identity.
These relationships are compellingly exemplified in Goffman's description of the behavior of individuals in small social groups. For instance, closing one's office door not only protects an individual's privacy, but asserts his ability to do so and emphasizes the difference from colleagues who do not have an individual office. Here, the discriminating application of HCI tools can vastly improve the accuracy and quality of the assumptions and requirements feeding into system design.
Fourth, privacy can be hard to rationalize. Multiple studies have demonstrated that there is a difference between privacy preferences and actual behavior [8, 39]. Many people are also unable to accurately evaluate low-probability but high-impact risks, especially those related to events that may be far removed from the time and place of the initial cause. For example, a hastily written blog entry or impulsive photograph on MySpace may cause unintentional embarrassment several years down the road. Furthermore, privacy is fraught with exceptions, due to contingent situations and historical context. The need for flexibility in these constructs is reflected by all the exceptions present in data protection legislation and by social science literature that describes privacy as a continuous interpersonal "boundary-definition process" rather than a static condition. The use of modern "behavioral" inquiry techniques in HCI can help explicate these behaviors and exceptions.

Finally, it is often difficult to evaluate the effects of technology on privacy. There are few well-defined methods for anticipating what privacy features are necessary for a system to gain wide-scale adoption by consumers. Similarly, there is little guidance for measuring what level of privacy a system effectively offers or what its overall return on investment is. Like "usability" and "security," privacy is a holistic property of interactive systems, which include the people using them. An entire system may be ruined by a single poorly implemented component that leaks personal information, or by a poor interface that users cannot understand.

In our opinion, HCI is uniquely suited to help design teams manage these challenges.
HCI provides a rich set of tools that can be used to probe how people perceive privacy threats, understand how people share personal information with others, and evaluate how well a given system facilitates (or inhibits) desired privacy practices. Indeed, the bulk of this paper examines past work that has shed light on these issues of privacy.

As much as we have progressed our understanding of privacy within HCI in the last 30 years, we also recognize that there are major research challenges remaining. Hence, we close this article by identifying five
"grand challenges" in HCI and privacy:

- Developing standard privacy-enhancing interaction techniques.
- Developing analysis techniques and survey tools.
- Documenting the effectiveness of design tools, and creating a "privacy toolbox."
- Furthering organizational support for managing personal data.
- Developing a theory of technological acceptance, specifically related to privacy.

These are only a few of the challenges facing the field. We believe that focusing research efforts on these issues will lead to bountiful, timely, and relevant results that will positively affect all users of information technology.

1.2 Sources Used and Limitations of this Survey

In this survey paper, we primarily draw on the research literature in HCI, CSCW, and other branches of Computer Science. However, readers should be aware that there is a great deal of literature on privacy in the MIS, advertising and marketing, human factors, and legal communities.

The MIS community has focused primarily on corporate organizations, where privacy perceptions and preferences have a strong impact on the adoption of technologies by customers and on relationships between employees. The advertising and marketing communities have examined privacy issues in reference to privacy policies and the effects that these have on consumers (e.g., work by Sheehan).

The legal community has long focused on the implications of specific technologies on existing balances, such as court rulings and the constitutional status quo. We did not include legal literature in this article because much scholarly work in this area is difficult to use in practice during IT design. However, this work has some bearing on HCI, and researchers may find some analyses inspiring, including articles on data protection, the relation between legislation and technology
, identity, data mining, and employee privacy. As one specific example, Strahilevitz outlines a methodology for helping courts decide whether an individual has a reasonable expectation of privacy, based on the social networking literature. As another example, Murphy discusses whether or not the default privacy rule should allow disclosure or protection of personal information.

Privacy research is closely intertwined with security research. However, we will not cover HCI work in the security field. Instead, we direct readers to the books Security and Usability and Multilateral Security in Communications for more information.

We also only tangentially mention IT management. Management is becoming increasingly important in connection to privacy, especially after the enactment of data protection legislation. However, academia largely ignores these issues, and industry does not publish on these topics because specialists perceive knowledge in this area as a strategic and confidential asset. Governments occasionally publish reports on privacy management. However, the reader should be aware that there is much unpublished knowledge in the privacy management field, especially in CSCW and e-commerce contexts.

This survey paper also focuses primarily on end-users who employ personal applications, such as those used in telecommunications and e-commerce. We only partially consider applications in workplaces. However, perceived control of information is one of the elements of acceptance models such as Venkatesh et al.'s extension of the Technology Acceptance Model. Kraut et al. discuss similar acceptance issues in a CSCW context, pointing out that in addition to usefulness, critical mass and social influences affect the adoption of novel technologies.
2 The Privacy Landscape

In this chapter, we introduce often-cited foundations of the privacy discourse. We then discuss two perspectives on privacy that provide useful characterizations of research and design efforts, perspectives that affect how we bring to bear the notions of law and architecture on the issue of privacy. These perspectives are (1) the grounding of privacy on principled views as opposed to on common interest, and (2) the differences between informational self-determination and personal privacy. Finally, we provide a historical outlook on 30 years of privacy research in HCI and on how privacy expectations co-evolved with technology.

2.1 Often-Cited Legal Foundations

In this section, we describe a set of legal resources often cited by privacy researchers. In our opinion, HCI researchers working in the field of privacy should be familiar with these texts because they show how to approach many privacy issues from a social and legal standpoint, while uncovering areas where legislation may be lacking.

Many authors in the privacy literature cite a renowned 1890 Harvard Law Review article by Samuel Warren and Louis Brandeis entitled The Right
to Privacy as a seminal work in the US legal tradition. Warren and Brandeis explicitly argued that the right of individuals to "be let alone" was a distinct and unique right, claiming that individuals should be protected from unwarranted publications of any details of their personal life that they might want to keep confidential.¹ In this sense, this right to privacy relates to the modern concept of informational self-determination. It is interesting to note that Warren and Brandeis did not cite the US Constitution's Fourth Amendment,² which protects the property and dwelling of individuals from unwarranted search and seizure (and, by extension, their electronic property and communications).

The Fourth Amendment is often cited by privacy advocates, especially in relation to surveillance technologies and to attempts to control cryptographic tools. The Fourth Amendment also underpins much privacy legislation in the United States, such as the Electronic Communications Privacy Act, or ECPA.³ Constitutional guarantees of privacy also exist in other legal texts, for example the European Convention on Human Rights [61, Article 8].

In the United States, case law provides more material for HCI practitioners. Famous cases involving the impact of new technologies on the privacy of individuals in the United States include Olmstead vs. United States (1928), which declared telephone wiretapping constitutional; Katz vs. United States (1967), again on telephone wiretapping and overturning Olmstead; Kyllo vs. United States (2001), on the use of advanced sensing technologies by police; and Bartnicki vs. Vopper (2001), on the interception of over-the-air cell phone transmissions.

Regulatory entities such as the FTC, the FCC, and European Data Protection Authorities also publish rulings and reports with which HCI professionals working in the field of privacy should be familiar.
¹ Warren and Brandeis claimed that the right to privacy is unique because the object of privacy (e.g., personal writings) cannot be characterized as intellectual property nor as property granting future profits.
² "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, [. . .]."
³ The ECPA regulates the recording of telecommunications and personal communications at the US Federal level, including wiretapping by government agencies. It generally outlaws any recording of which at least one party being recorded is not aware, and requires various types of warrants for wiretapping or recording other telecommunication data for law enforcement purposes.
For example, the EU Article 29 Working Party has issued a series of rulings and expressed opinions on such topics as the impact of video surveillance, the use of biometric technologies, and the need for simplified privacy policies.

Finally, HCI researchers often cite legal resources such as the European Data Protection Directive of 1995 and HIPAA, the US Health Insurance Portability and Accountability Act of 1996. Many of these data protection laws were inspired by the Fair Information Practices (discussed in more detail in Section 3.5.1), and impose a complex set of data management requirements and end-user rights. HCI practitioners should be aware that different jurisdictions use legislation differently to protect privacy, and that there is much more to privacy than the constitutional rights and laws described above.

2.2 Philosophical Perspectives on Privacy

Arguments about privacy often hinge on one's specific outlook, because designers' values and priorities influence how one thinks about and designs solutions. In this section, we present alternative perspectives on privacy without advocating one particular view. The reader should instead refer to ethical principles suggested by professional organizations, such as the ACM or the IFIP [25, 41]. Still, we believe that an understanding of different perspectives is useful, because it provides a framework for designers to select the most appropriate approach for solving a specific problem.

2.2.1 Principled Views and Common Interests

The first perspective contrasts a principled view with a communitarian view. The principled view sees privacy as a fundamental human right. This view is supported by modern constitutions, for example the US Fourth Amendment, and texts such as the European Convention on Human Rights.
In contrast, the communitarian view emphasizes the common interest and espouses a utilitarian view of privacy, in which individual rights may be circumscribed to benefit society at large. For an example of how this dichotomy has been translated into a
framework for assessing the privacy concerns brought about by ubiquitous computing technologies, see work by Terrel, Jacobs, and Abowd [159, 278].

The tension between principled approaches and utilitarian views is reflected in debates over the use of many technologies. For example, Etzioni discusses the merits and disadvantages of mandatory HIV testing and video surveillance. In the case of information and communication technologies, the contrast between these two views can be seen in the ongoing debate between civil liberties associations (e.g., the Electronic Frontier Foundation) and governments over strong encryption technologies and surveillance systems.

These contrasting views can also help explain differences in approaches within the privacy research community. For example, some privacy-enhancing technologies (PETs) have been developed more as a matter of principle than on solid commercial grounds. Some researchers in the privacy community argue that the mere existence of these PETs is more important for their impact on policy debate than their actual widespread use or even commercial viability. Reportedly, this is the reason why organizations such as the Electronic Frontier Foundation support some of these projects.

2.2.2 Data Protection and Personal Privacy

The second perspective contrasts data protection with personal privacy. Data protection (also known as informational self-determination) refers to the management of personally identifiable information, typically by governments or commercial entities. Here, the focus is on protecting such data by regulating how, when, and for what purpose data can be collected, used, and disclosed. The modern version of this concept stems from work by Alan Westin and others [302, 303], and came about because of concerns over how databases could be used to collect and search personal information.
Westin's work led to the creation of the influential Fair Information Practices (FIPS), which are a set of guidelines for personal information management. The FIPS include notions such as purpose specification, participation, and accountability (see Section 3.5.1). The FIPS have
greatly influenced research on privacy, including standards like P3P, privacy policies on web sites, and data management policies. More recently, the FIPS have been reinterpreted with reference to RFID systems and ubiquitous computing.

In contrast, personal privacy describes how people manage their privacy with respect to other individuals, as opposed to large organizations. Drawing from Irwin Altman's research on how people manage personal space, Palen and Dourish argue that privacy is not simply a problem of setting rules and enforcing them, but rather an ongoing and organic "boundary definition process" in which disclosure and identity are fluidly negotiated. The use of window blinds and doors to achieve varying levels of privacy or openness is an example of such boundary setting. Other scholars have made similar observations. Darrah et al. observed that people tend to devise strategies "to restrict their own accessibility to others while simultaneously seeking to maximize their ability to reach people." Westin argued that "each individual is continually engaged in a personal adjustment process in which he balances the desire for privacy with the desire for disclosure and communication."

Altman's work is in part inspired by Goffman's work on social and interpersonal relations in small groups [119, 120]. One of Goffman's key insights is that we project different personas to different people in different situations. For example, a doctor might present a professional persona while working in the hospital, but might be far more casual and open with close friends and family. The problem with respect to the design of interactive systems is that these roles cannot always be easily captured or algorithmically modeled.
Personal privacy appears to be a better model for explaining people's use of IT in cases where the information requiring protection is not well defined, such as managing one's availability to interruption or minute interpersonal communication. Here, the choice of whether or not to disclose personal information to others is highly situational, depending on the social and historical context of the people involved. An example of this is whether or not to disclose one's location when on the go using cell phones or other kinds of "friend finders." Current research suggests that these kinds of situations tend to be difficult
to model using the rigid privacy policies that are typical of data protection guidelines.

In summary, data protection focuses on the relationship between individual citizens and large organizations. To use a blunt expression, the power of knowledge here lies in quantity. In contrast, personal privacy focuses more on interpersonal relationships and tight social circles, where the concern is about intimacy.

This distinction is not just academic, but has direct consequences for design. Modeling privacy according to data protection guidelines will likely result in refined access control and usage policies for personal information. This is appropriate for many IT applications today, ranging from healthcare to e-commerce. Typical design tools based on the data protection viewpoint include privacy policies on web sites, consent checkboxes, certification programs (such as TRUSTe), and regulations that increase the trust of consumers toward organizations.

For applications that manage access to one's physical space, attention, or interpersonal communication (e.g., chat, email, and social networking sites, as well as some location-enhanced applications such as person finders), a data protection outlook may result in a cumbersome design. For example, imagine highly detailed policies to limit when others can send instant messages to you. Instead, IM clients provide refined moment-by-moment control of availability through "away" features and plausible deniability. For applications affecting personal privacy, negotiation needs to be dialectic and continuous, making it easy for people to project a desired persona, depending on social context, pressures, and expectations of appropriate conduct.

How should these different views of privacy be reconciled? Our best answer to this question is that they should not be.
Each approach to privacy has produced a wealth of tools, including analytic instruments, design guidelines, legislation, and social expectations. Furthermore, many applications see both aspects at work at the same time. For example, a social networking web site has to apply a data protection perspective to protect the data it collects from individuals, a personal privacy perspective to let individuals project a desired image of themselves, and a data protection perspective again to prevent users from crawling and data mining the site.
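The design consequences of the two perspectives can be made concrete with a small sketch. The following is a minimal, hypothetical illustration (not drawn from any system discussed in this survey; all class, field, and rule names are invented): a data-protection-style design enumerates explicit disclosure rules per requester group, resource, and time window, while a personal-privacy-style design, like the IM "away" feature mentioned above, exposes a single coarse, situational toggle that the user renegotiates moment by moment.

```python
# Hypothetical sketch contrasting two privacy-design styles.
# All names here are illustrative, not from the surveyed literature.

from dataclasses import dataclass, field

@dataclass
class DisclosureRule:
    """Data-protection style: a static rule stating who may see what, and when."""
    requester_group: str   # e.g., "coworkers", "family"
    resource: str          # e.g., "location", "im_availability"
    allowed_hours: range   # hours of day during which access is granted

@dataclass
class PolicyEngine:
    """Evaluates explicit rules; every permitted disclosure must be enumerated."""
    rules: list[DisclosureRule] = field(default_factory=list)

    def permits(self, group: str, resource: str, hour: int) -> bool:
        return any(r.requester_group == group and r.resource == resource
                   and hour in r.allowed_hours for r in self.rules)

@dataclass
class ImClient:
    """Personal-privacy style: one coarse, situational toggle.
    'Away' offers plausible deniability; no rule explains *why* the user
    is unreachable."""
    status: str = "available"   # "available" or "away", set moment by moment

    def accepts_messages(self) -> bool:
        return self.status == "available"

# The rule engine needs a rule for every group/resource/time combination...
engine = PolicyEngine([DisclosureRule("coworkers", "im_availability", range(9, 17))])
print(engine.permits("coworkers", "im_availability", 10))  # True
print(engine.permits("coworkers", "im_availability", 20))  # False

# ...while the IM model collapses all of that into a single switch.
client = ImClient()
client.status = "away"
print(client.accepts_messages())  # False
```

The sketch makes the survey's point tangible: the rule-based design scales poorly as social situations multiply, whereas the toggle supports fluid, per-moment boundary negotiation at the cost of expressiveness.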
2.3 A Historic Perspective on Privacy

Privacy is not a static target: changes in technology, in our understanding of the specific social uses of such technologies, and in social expectations have led to shifts in the focus of privacy research in HCI. In this section, we discuss changes in the expectation of privacy over the past three decades and summarize the consequences of these changes for HCI practice.

2.3.1 Changes in Expectations of Privacy

While the basic structures of social relations (for example, power relations and the presentation of self) have remained relatively stable with technical evolution, there have been large shifts in perceptions and expectations of privacy. These shifts can be seen in the gradual adoption of telecommunication technologies, electronic payment systems, and surveillance systems, notwithstanding initial privacy worries.

There are two noteworthy aspects of how privacy expectations have changed. The first is that social practice and expectations co-evolve with technical development, making it difficult to establish causal effects between the two. The second is that privacy expectations evolve along multi-dimensional lines, and the same technology can have opposite effects on different types of privacy.

Social practice and technology co-evolve. For example, the introduction of digital cameras, or of location technology in cell phones, happened alongside the gradual introduction of legislation [78, 284, 286] and the emergence of a social etiquette regulating their use. Legislation often follows technical development, but in some cases it preempts technical development. For example, digital signature legislation in some European countries was enacted well before the technology was fully developed, which may have in fact slowed down adoption by negatively affecting its usability.
It is often difficult to tease cause and effect apart: whether social practices and expectations drive the development of technology, or vice versa. Some observers have noted that the relationship between social constructs and technology is better described as co-evolution. Latour talks of "socio-technological hybrids," undividable structures
encompassing technology as well as culture: norms, social practices, and perceptions. Latour claims that these hybrids should be studied as a whole. This viewpoint is reflected by HCI researchers, including the proponents of participatory design [88, 251] and researchers of social computing. Iachello et al. even go as far as claiming that in the domain of privacy, adoption patterns should be "designed" as part of the application and can be influenced to maximize the chances of successful acceptance.

The reader should note that in some cases, technologies that affect privacy are developed without much public debate. For example, Geographic Information Systems (GIS) classify geographic units based on census, credit, and consumer information. Curry and Philips note that GIS had a strong impact on the concepts of community and individual, but were introduced almost silently, over the course of several decades, by a combination of government action, developments in IT, and private enterprises, without spurring much public debate.

Understanding these changes is not a straightforward task, because technical development often has contradictory effects on social practice. The same artifact may produce apparently opposite consequences in terms of privacy, strengthening some aspects of privacy and reducing others. For example, cell phones increase social connectedness, by enabling distant friends and acquaintances to talk more often and in a less scheduled way than previously possible, but also raise barriers between physically co-present individuals, creating "bubbles" of private space in very public and crowded places such as a train compartment.

From this standpoint, privacy-sensitive IT design becomes an exercise in systematically reconciling potentially conflicting effects of new devices and services.
For example, interruption management systems based on sensing networks (such as those prototyped by Nagel et al.) aim at increasing personal and environmental privacy by reducing unwanted phone calls, but can affect information privacy due to the collection of additional information through activity sensors. We highlight this issue of how expectations of privacy change over time as an ongoing research challenge in Section 4.5.