5. Privacy Interfaces Flashcards

1
Q

Just-in-time privacy interfaces

A

Just-in-time interfaces are shown in the same transactional context as the data practice they pertain to, which supports reasoning in the moment and means they can be specific and short, communicating only the most relevant information and choices.

2
Q

Context-dependent privacy interfaces

A

Context-dependent privacy interfaces are triggered by certain aspects of the user’s context. For instance, being in physical proximity to an IoT sensor may cause the device to announce its presence (e.g., flashing an LED, beeping, sending a description of its data practices to the user’s phone). Other context factors might be someone accessing previously uploaded information or analyzing the user’s previous privacy settings behavior to warn about potentially unintended privacy settings. For example, Facebook displays a message warning that it is about to post publicly when the user’s last post was public, but they are typically not in the habit of posting publicly.
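A minimal sketch of the habit-based warning described above, assuming invented names and a simple majority-habit heuristic (no platform's real API):

```python
# Context-dependent privacy nudge: warn when the audience of a new post
# deviates from the user's recent posting habit. Illustrative only.
from collections import Counter

def habitual_audience(recent: list[str], threshold: float = 0.8) -> str | None:
    """Return the dominant audience ('friends', 'public', ...) if one exists."""
    if not recent:
        return None
    audience, count = Counter(recent).most_common(1)[0]
    return audience if count / len(recent) >= threshold else None

def should_warn(current_audience: str, recent: list[str]) -> bool:
    """Warn only when the user habitually posts to a different audience."""
    habit = habitual_audience(recent)
    return habit is not None and current_audience != habit

if __name__ == "__main__":
    history = ["friends"] * 9 + ["public"]  # user almost always posts to friends
    if should_warn("public", history):
        print("Heads up: this post will be public, but you usually share with friends only.")
```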

3
Q

Periodic reminders

A

Periodic reminders are useful to remind users about data practices that they agreed to previously and to renew consent if the practice is still ongoing, especially if the data practice occurs in the background, invisible to the user.

4
Q

Persistent privacy indicators

A

Persistent privacy indicators are shown whenever a data practice is active. For instance, cameras often have lights to indicate when the camera is recording. Persistent indicators can provide an unobtrusive cue about especially critical data practices, but there’s also a risk that the indicator will not be noticed.

5
Q

On-demand privacy information and controls

A

On-demand privacy information and controls allow users to seek out and review privacy information or their privacy settings and opt-outs at any time. On-demand interfaces should be made available in a well-known or easily findable location (e.g., a website should have a “privacy” subdomain or “/privacy/” folder and provide links to privacy controls from their privacy policy and in relevant menus).
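A minimal sketch (using Flask) of exposing on-demand privacy resources at a well-known, easily findable location; the paths and content are illustrative, not a prescribed standard:

```python
# Central, linkable /privacy/ entry point with links to policy, settings
# and opt-outs, so users can find privacy resources on demand.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/privacy/")
def privacy_overview():
    # One place that aggregates all privacy destinations.
    return (
        "<h1>Privacy at Example Corp</h1>"
        '<ul><li><a href="/privacy/policy">Privacy policy</a></li>'
        '<li><a href="/privacy/settings">Privacy settings</a></li>'
        '<li><a href="/privacy/opt-out">Opt-outs</a></li></ul>'
    )

@app.route("/privacy/policy")
def privacy_policy():
    # Machine-readable pointer alongside the human-readable page.
    return jsonify({"policy_version": "2024-01", "url": "/privacy/policy.html"})

if __name__ == "__main__":
    app.run(port=8080)
```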

6
Q

Channels to deliver privacy interfaces

A

Privacy interfaces can be delivered through different communication channels:
Primary channel (the device the user is interacting with)
Secondary channel (a different device, used when the primary device is constrained)
Public channels (e.g., posted signage that notifies anyone in a space)

7
Q

Modality to present privacy interfaces

A

Privacy interfaces can be presented in different ways, with different interaction modalities:
Visual privacy interfaces
Auditory privacy interfaces
Haptic and other modalities may also be leveraged as privacy interfaces. For instance, device vibration could be used as an indicator for data collection.
Olfactory displays could be used to inform about privacy risks with different scents (e.g., lavender scent when visiting a privacy-friendly website; sulphur when visiting a privacy-invasive one). Ambient lights could also serve as privacy indicators. Could taste or skin conduction be used in privacy interfaces? Although it might not be immediately apparent how less conventional modalities could be used for privacy interfaces, the important point is to creatively explore even unconventional design opportunities. Such exploration often leads to helpful insights that can inform practical solutions.
Machine-readable specifications of privacy notices and controls enable the consistent presentation and aggregation of privacy information and controls from different systems or apps. Mobile apps have to declare their permission requests in a machine-readable format, and the mobile operating system is responsible for providing permission prompts and managing the user’s consent. IoT devices that lack screens and input capabilities could broadcast their machine-readable privacy notices to smartphones of nearby users, which then present the respective privacy information and choices to the user.
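A sketch of the IoT broadcast idea above: a screenless device announces a machine-readable privacy notice over local UDP so nearby phones can render it. The notice schema is invented for illustration; a real deployment would use an agreed-upon standard format.

```python
# Broadcast a JSON privacy notice on the local network.
import json
import socket

NOTICE = {
    "device": "lobby-camera-3",
    "data_practices": [
        {"data": "video", "purpose": "security monitoring", "retention_days": 30},
        {"data": "audio", "purpose": "none", "collected": False},
    ],
    "controls_url": "https://example.com/privacy/lobby-camera-3",
}

def broadcast_notice(port: int = 50000) -> None:
    payload = json.dumps(NOTICE).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, ("255.255.255.255", port))  # local-network broadcast

if __name__ == "__main__":
    broadcast_notice()
```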

8
Q

Control of privacy notices

A

User choices, consent dialogs and privacy settings can be delivered in different ways that affect how users interact with them.
Blocking privacy controls force users to interact with the privacy interface in order to be able to proceed. Blocking controls are useful when the user must make a choice, e.g., when consent is needed. However, how choices are presented affects whether the interaction is actually an expression of the user’s preference, and so should be considered carefully. For example, presenting a complex privacy policy and providing only the options to accept or not use the app is not suitable for eliciting consent, as users are likely to click the warning away without reading. Preferable are prompts that are specific to a single data practice and provide options to both allow or deny the practice (e.g., mobile permission requests). All choices should be equally easy for the user to exercise (see the sketch after this list).
Nonblocking privacy controls do not interrupt the interaction flow but are rather integrated as user interface elements into the UX. For example, social media apps might provide an audience selector (e.g., private, friends, public) within the interface for creating a post. The control is available but does not have to be used and, at the same time, reminds the user of their current privacy settings.
Decoupled privacy controls are not part of the primary UX. They are useful to provide the user the opportunity to inspect their privacy settings or the data the system has collected about the user. Common examples are privacy settings and privacy dashboards. The advantage of decoupled privacy controls is that they can be more comprehensive and expressive than integrated controls; the downside is that users need to actively seek them out. Good practice is to provide decoupled privacy controls at a central place and then point to them from other, more concise privacy notices and controls where appropriate.
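A minimal sketch of the blocking pattern described above: a prompt for a single data practice where allow and deny are equally easy, and no default or dismissal lets the user proceed without an explicit decision. Purely illustrative.

```python
# Blocking consent prompt with symmetric allow/deny choices.
def blocking_consent(practice: str) -> bool:
    prompt = f"Allow {practice}? [a]llow / [d]eny: "
    while True:                      # blocks until the user decides
        answer = input(prompt).strip().lower()
        if answer in ("a", "allow"):
            return True
        if answer in ("d", "deny"):
            return False
        print("Please choose 'allow' or 'deny'.")  # no default, no dismissal

if __name__ == "__main__":
    if blocking_consent("this app to access your location while in use"):
        print("Location access granted.")
    else:
        print("Location denied; the app continues without location features.")
```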

9
Q

Notice and choice model

A

Transparency and user rights are core concepts of privacy legislation and guidelines globally, ranging from the Organisation for Economic Co-operation and Development (OECD) privacy guidelines, to the U.S. Federal Trade Commission’s (FTC’s) fair information practice principles, to Europe’s GDPR, and privacy legislation in many other countries. While specific requirements may vary, companies that collect or process personally identifiable information (PII) typically have to be transparent about their data practices and to inform data subjects about their rights and options for controlling or preventing certain data practices. This is often known as the notice and choice model.

10
Q

Privacy harm

A

A negative impact of data processing on the data subject’s privacy. Sometimes people observe tangible harm, but often they are not aware of the harm. Privacy harms may also not manifest immediately but rather much later.

11
Q

Control paradox

A

The control paradox: perceived control over privacy may lead to increased sharing, which in turn may increase privacy risks.

12
Q

Bounded rationality

A

Generally, humans are limited in their ability and time to acquire, memorize and process all information relevant to making a fully informed and rational decision. Behavioral economists call this deviation from the ideal of a fully rational actor bounded rationality. To compensate for the inability and impracticality of considering all potential outcomes and risks, humans rely on heuristics in their decision-making to reach a satisfactory solution rather than an optimal one. However, decision heuristics can lead to inaccurate assessments of complex situations. Rational decision-making is further affected by cognitive and behavioral biases—systematic errors in judgment and behaviors.
Bounded rationality also persists in privacy decision-making.

13
Q

Dark Patterns

A

Dark patterns are interface or system designs that purposefully exploit cognitive and behavioral biases in order to get people to behave a certain way regardless of whether that behavior aligns with their preferences. Some common privacy dark patterns include the following:
Default settings—Default settings frequently exploit status quo bias. Similarly, preselecting a certain option nudges users towards accepting that choice (a privacy-friendly alternative is sketched after this list).
Cumbersome privacy choices—Making it more difficult, arduous and lengthy to select a privacy-friendly choice compared with a privacy-invasive one deters users from privacy-friendly options.
Framing—How a choice is described and presented can affect behavior. An emphasis on benefits, a de-emphasis on risks or the presentation of trust cues may lead to people making riskier privacy decisions than they would with a neutral presentation of choices.
Rewards and punishment—Users are enticed to select a service’s preferred choice with rewards or are deterred from privacy-friendlier choices with punishments.
Forced action—Users must accept a data practice or privacy choice in order to continue to a desired service, regardless of whether they actually agree with the practice. They are forced to act against their privacy preference.
Norm shaping—Other people’s observed information-sharing behavior, say, on a social media service, shapes the perceived norms of information sharing on a platform and individuals’ own disclosure behavior. For example, a controlled experiment showed that people who see more revealing posts in the feed of a photo-sharing service tend to consider such photos more appropriate and are more likely to share more revealing information themselves than those who see less revealing photos. Thus, the algorithm for determining news feed content, which might purposefully highlight or suppress certain content to activate such anchoring effects, has a lot of power in steering users’ behavior, regardless of whether the displayed posts are actually representative of user behavior.
Distractions and delays—Even small distractions or delays can create a distance between awareness of privacy risks and behavior that can cancel out the effects of a privacy notice.
Such manipulations of privacy behavior are unethical, as they constrain people in their self-determination and agency over their privacy. Moreover, manipulations of privacy behavior further exacerbate the misalignment between people’s privacy preferences, expectations and their actual behavior and the data practices to which they are subject.
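As a sketch of the privacy-friendly alternative to the defaults dark pattern above, here nonessential data uses default to off and require a deliberate opt-in; the field names are invented for illustration:

```python
# Consent defaults that avoid exploiting status quo bias: nonessential
# purposes start unchecked and stay off unless the user acts.
from dataclasses import dataclass

@dataclass
class SignupConsent:
    essential_service_data: bool = True    # needed to provide the service
    marketing_emails: bool = False         # opt-in, unchecked by default
    third_party_sharing: bool = False      # opt-in, unchecked by default

# Dark pattern to avoid: prechecking nonessential uses, e.g.,
# SignupConsent(marketing_emails=True), so inertia leaves them enabled.

if __name__ == "__main__":
    consent = SignupConsent()
    print(consent)  # nonessential practices remain off unless the user opts in
```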

14
Q

Usability

A

ISO 9241-11 defines usability as the “extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”
Jakob Nielsen identifies five components that determine a system’s usability:
Learnability—How easy is it for users to accomplish basic tasks the first time they encounter the system?
Efficiency—Once users have learned the system, how quickly can they perform tasks?
Memorability—When users return to the system after a period of not using it, how easily can they reestablish proficiency?
Errors—How many errors do users make, how severe are these errors and how easily can they recover from the errors?
Satisfaction—How pleasant is it to use the system?

15
Q

Utility

A

Utility is about functionality. Does the system support users in satisfying their needs and accomplishing their goals? An interface can be very usable, but it is useless if it does not align with users’ actual needs and expectations.
For example, an unsubscribe mechanism might be easy, fast and pleasant to use (great usability) but only give users the option to unsubscribe from all of an organization’s communications or none of them, even though some users might want to unsubscribe from marketing emails but continue receiving important notifications about their account activity. As a result, people may not use the very usable opt-out mechanism because it is not useful for them.
A system with high utility meets the exact needs of users. A useful system has both good utility and good usability.

16
Q

User Experience

A

Usability is important but, on its own, often not sufficient to characterize what constitutes a good or bad experience for users. UX design takes a more holistic perspective that places users and their needs at the center and “encompasses all aspects of the end-user’s interaction with the company, its services, and its products.” This might include the actual product; terms of service and privacy policies tied to the product; the product’s purchase, unboxing and sign-up experience as well as customer support, documentation, manuals, privacy settings and so on.
A system’s UX encompasses the extent to which a system meets users’ needs (utility), usability, aesthetics and simplicity, and the joy, emotional reactions and fulfillment a system provides. UX design therefore integrates user interface design, engineering, visual design, product design, marketing and branding as well as business models and related considerations.

17
Q

User-centered design process

A

UX design follows a principled and systematic process. At the center of the design process are users—typically a set of anticipated user populations and stakeholders. While different methodologies follow slightly different steps, generally the user-centered design process consists of three phases—research, design, evaluation—with projects typically going through multiple iterations of these phases to refine designs and better align them with user needs.

18
Q

UX Design research & analysis phase

A

UX design starts with research and analysis. The goal of this phase is to understand the context in which a certain system and its UX will operate and function. An important aspect of this is identifying a system’s user populations and then analyzing their specific characteristics and needs as they relate to both the system and its context of use. This often includes learning about people’s mental models of systems or processes in order to identify and understand potential misconceptions.
Common user research methods and activities include semistructured interviews, diary studies, contextual inquiry, survey research and usability tests (with the current system or related competitors). User research is typically complemented by desk research, involving competitive analysis, heuristic evaluation and review of relevant academic research and literature. Personas, scenarios, user journeys and affinity diagrams are tools used for synthesizing findings into higher-level insights that can be used and referenced throughout the design process.
Based on the research findings, the requirements for the UX design are defined.

19
Q

UX Design Phase

A

The requirements identified in the research phase inform the design phase. UX design aims to find and create solutions that meet user needs, as well as other system requirements. Good UX design takes users’ cognitive and physical characteristics into account. UX design is highly iterative and user centric. Solutions are developed in an iterative process that aims to put potential solution designs into the hands of users early on and throughout the refinement process in order to ensure that designs are properly supporting user needs. Designs typically start with very basic prototypes, often sketches and paper prototypes, which look far from the final product but make it possible to simulate and test interaction flows before investing time, effort and resources in further development.

20
Q

UX Evaluation

A

Throughout the design process, designs should be iteratively evaluated with current or future users of a system. As such, the design phase and the evaluation phase are closely interlinked. The purpose of UX evaluation is to validate that the system’s designs and prototypes indeed meet the user needs and requirements identified in the research phase. Evaluation methods are often the same or similar to the user research methods mentioned in Section 5.3.2.1, with the addition of A/B testing and production deployment of developed solutions. UX validation may include both quantitative assessments (e.g., log file analysis, interaction times, success rates) and qualitative assessments (e.g., perceived usability, perceived cognitive load, joy of use, comprehension), with both providing important insights to evaluate and further refine designs and potentially also the design requirements.

21
Q

Value-sensitive design

A

Value-sensitive design is a design approach that accounts for ethical values, such as privacy, in addition to usability-oriented design goals. Value-sensitive design methods help to systematically assess the values at play in relation to a specific technology and respective stakeholders and the ways the technology might meet or violate those values. They also help to iteratively develop designs that are sensitive to and respectful of those values.

22
Q

Privacy reminders

A

Organizations may choose or be required to remind people about data practices they are subject to. For instance, financial institutions in the United States are required to provide an annual privacy notice to their customers under the Gramm-Leach-Bliley Act (GLBA). However, privacy reminders can also take a more proactive shape and make users aware of data practices they had previously agreed to or nudge them to check and update their privacy settings.

23
Q

Privacy notices

A

Privacy notices aim to provide transparency about an organization’s data practices and other privacy-related information, such as measures taken to ensure security and privacy of users’ information. Privacy notices need to be provided to users—typically before a data practice takes place—and explain what information about data subjects is being collected, processed, retained or transferred for what purpose. Laws and regulations in different countries may pose specific transparency requirements in terms of what information needs to be provided, when it needs to be provided and how it needs to be provided. Privacy notices can take different shapes and forms.

24
Q

Privacy policies

A

Privacy policies are probably the most common type of privacy notice. A privacy policy holistically documents an organization’s data collection, processing and transfer practices and also includes other privacy-related information. While most common, privacy policies are also among the most ineffective privacy user interfaces. Most people do not read privacy policies, as they have little incentive to do so.

25
Q

Informational privacy resources

A

Informational privacy resources are a type of privacy notice that complements privacy policies with informational resources that summarize or highlight important data practices and aim to communicate to users the value and role privacy plays for the organization. These resources serve the purpose of educating users about how their data is used and protected, as well as what privacy choices and settings are available to them. The content should be consistent with the organization’s privacy policy but presented more concisely and often with visual aids or videos to make the information more accessible and easier to understand for a general audience. While better at informing users about data practices than privacy policies, these informational privacy resources still need to be actively sought out by users, which means most users won’t see them.

26
Q

Integrated privacy notices

A

Integrated privacy notices are a type of privacy notice presented in a relevant part of the service or product’s UX. For instance, an account creation process may include short explanations about how requested information (e.g., email address) will be used and protected.

27
Q

Privacy indicators

A

Privacy indicators are a type of privacy notice: cues in a user interface or on a device. Privacy indicators are particularly useful for conveying either the state of a data practice (e.g., an LED indicating when a video camera is recording) or the state of a privacy setting (e.g., a small icon in a social media site’s message posting interface indicating the post’s audience).

28
Q

Privacy reminders

A

Privacy reminders are privacy notices with which organizations, by choice or by legal requirement, remind people about data practices they are subject to. For instance, financial institutions in the United States are required to provide an annual privacy notice to their customers under the Gramm-Leach-Bliley Act (GLBA). However, privacy reminders can also take a more proactive shape and make users aware of data practices they had previously agreed to or nudge them to check and update their privacy settings.

29
Q

User consent

A

User consent is an established way to legitimize a data practice. To be valid, consent needs to be a freely given, specific, informed and unambiguous indication of an individual’s agreement. This means consent should pertain to a single data practice rather than be bundled together with consent for multiple data practices. Individuals need to be provided with sufficient information to make a consent decision, and it must be equally possible to agree or disagree. Consent interfaces typically need to enable users to provide or deny initial consent as well as check their consent status and possibly change it, which may mean revoking consent.

30
Q

Integrated consent prompt

A

Integrated consent prompts—One way of implementing consent interfaces. Opt-in consent requests are typically integrated into a system’s UX. Examples include requests to accept a product’s terms of service or privacy policy before use, or checkboxes to opt into data use for nonessential purposes during account creation. Requiring users to uncheck prechecked boxes to opt out of a data practice during setup is a discouraged dark pattern.

31
Q

Decoupled opt-out consent interface

A

Decoupled opt-outs—Opt-outs might be decoupled from a user’s interaction with a system, e.g., when they are described in a privacy policy or in separate advertising or cookie policies, or when they are part of privacy settings. The challenge with decoupled opt-outs is that people have to seek them out and may not be aware of their existence or of the data practice they pertain to.

32
Q

Integrated opt-out consent interface

A

Integrated opt-out—Certain opt-outs are integrated with the UX. One example is an Unsubscribe link in email communication. The advantage of integrated opt-outs is that they are present in the context in which people might need and want to use them.

33
Q

Delegated consent interface

A

Delegated consent interfaces—Sometimes consent is not obtained directly by the first party but rather by a different service or platform provider. One example of delegated opt-in consent is a mobile permission. Apps do not directly ask users for access to resources on the smartphone (e.g., location, contacts, text messages), but instead programmatically declare required permissions to the smartphone. The smartphone operating system then generates a respective permission prompt, asking the user to accept or deny the permission request. An advantage of this model is that the presentation and functionality of permission requests are consistent regardless of the underlying app, which facilitates learnability of the interface and reduces cognitive load. A challenge with such delegated consent mechanisms is that the first party has little control over their usability.
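A minimal sketch of the delegation pattern above, with the platform (not the app) prompting the user and storing the decision; class and method names are invented for illustration:

```python
# Delegated consent: apps declare permissions to the platform, and the
# platform mediates the prompt and remembers the user's decision.
class PlatformPermissionManager:
    """Plays the role of the mobile OS mediating consent between app and user."""

    def __init__(self):
        self._grants: dict[tuple[str, str], bool] = {}  # (app, permission) -> decision

    def request(self, app: str, permission: str) -> bool:
        key = (app, permission)
        if key not in self._grants:                     # prompt only once
            answer = input(f'Allow "{app}" to access {permission}? [y/n]: ')
            self._grants[key] = answer.strip().lower() == "y"
        return self._grants[key]

if __name__ == "__main__":
    platform = PlatformPermissionManager()
    # The app never talks to the user directly; the prompt's wording and look
    # are consistent across apps because the platform renders it.
    if platform.request("WeatherFun", "location"):
        print("App receives location data.")
    else:
        print("App must degrade gracefully without location.")
```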

34
Q

Privacy settings

A

Privacy settings typically aggregate the privacy choices and controls available to a user of a given product or service in one place. Privacy settings are typically available from within account settings or referenced in the privacy policy, ideally both. Privacy settings interfaces can integrate a multitude of controls, including controls to inspect and change previously made consent decisions, settings to regulate the granularity of information, settings to regulate the visibility of one’s information (e.g., what information is public, visible to friends, or private), and settings to manage the sharing of information with users or with other apps or services. We distinguish two types of privacy settings:
First-party privacy settings—Privacy settings commonly refer to settings made available by the service provider or product manufacturer. The service provider controls what settings are made available, how they are made available and how they are integrated with the overall UX.
Platform privacy settings—What information an app or service has access to is also often controlled by the privacy settings of the respective platform used to provision the service, such as a mobile operating system for mobile apps, a smart assistant platform for voice-based skills or actions, or a web browser. Platform privacy settings may be specific to a certain service or app (e.g., app X can access location; app Y can’t) or may allow users to set general privacy settings (e.g., whether third-party cookies are accepted or tracking protections are enabled).

35
Q

Privacy dashboards

A

Privacy dashboards typically give users access to the data an organization has about them. This may be in the form of activity timelines, data summaries or access to the actual data. Privacy dashboards should further provide support for other mandated user/data subject rights, such as enabling users to correct data or request rectification of inaccurate data. Users should further be able to export their data as well as delete some or all of it.
Often, privacy dashboards not only facilitate data access, edit, deletion and support but also include all of an organization’s privacy settings, and possibly privacy information resources and privacy policies, in order to create a single destination for users for all things privacy.
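A sketch of the data-subject-rights operations a dashboard backend might expose (access, export, rectification, deletion); the storage and names are illustrative stand-ins, not a real product's API:

```python
# Backend operations behind a privacy dashboard.
import json

class PrivacyDashboardBackend:
    def __init__(self, store: dict[str, dict]):
        self._store = store                      # user_id -> collected data

    def access(self, user_id: str) -> dict:
        """Right of access: show the user what the organization holds."""
        return self._store.get(user_id, {})

    def export(self, user_id: str) -> str:
        """Data portability: machine-readable export."""
        return json.dumps(self.access(user_id), indent=2)

    def rectify(self, user_id: str, field: str, value) -> None:
        """Rectification of inaccurate data."""
        self._store.setdefault(user_id, {})[field] = value

    def delete(self, user_id: str, field: str | None = None) -> None:
        """Erasure of some (one field) or all of the user's data."""
        if field is None:
            self._store.pop(user_id, None)
        else:
            self._store.get(user_id, {}).pop(field, None)

if __name__ == "__main__":
    backend = PrivacyDashboardBackend({"u1": {"email": "a@example.com", "city": "Oslo"}})
    backend.rectify("u1", "city", "Bergen")
    print(backend.export("u1"))
    backend.delete("u1")                         # erase everything on request
```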

36
Q

Usability issues of privacy interfaces

A

CONFLATING COMPLIANCE AND USER NEEDS: Privacy notices are often written to satisfy compliance requirements rather than as tools for creating awareness and transparency for users. Similarly, privacy controls are implemented according to regulatory requirements but may not be sufficient to meet users’ actual privacy control needs. Yet those privacy notices and controls are treated as if they provided transparency and control to users.
Designing for users’ privacy needs does not require neglecting or ignoring regulatory requirements. Rather, it is essential to distinguish between the information that must be provided for compliance reasons and the information that users might need to make informed privacy decisions—they are sometimes the same, but often not. As users are unlikely to read privacy policies, relevant information needs to be provided to them through additional means in order to reduce surprise. Similarly, privacy controls and choices required by regulation must be provided in a usable and useful manner. Furthermore, additional privacy controls may be necessary to ensure that users can control privacy in alignment with their context-specific privacy preferences and expectations.
LACK OF MEANINGFUL CHOICES: Users are often given a take-it-or-leave-it “choice”: accept all data practices or do not use the service.
POOR INTEGRATION WITH UX: Privacy notices and consent prompts are often shown at inopportune times, without regard for the user’s primary task. Providing users with all privacy-relevant information when downloading an app or signing up for a service is common practice, but futile from a UX perspective.
POOR DISCOVERABILITY: Often privacy notices, opt-outs and other controls are decoupled from a system’s primary UX. This dissociation of privacy interfaces from a system has severe usability consequences. Without being exposed to privacy interfaces as part of the system’s UX, users require substantial digital literacy and an advanced mental model of how a certain technology works in order to anticipate a system’s data practices and what privacy information or privacy controls might be available to them.
CONFUSING INTERFACES: When privacy interfaces are not designed carefully, they may be confusing and suffer from usability issues that could have been uncovered through user testing. Common issues include wording or signage (e.g., an icon) that is confusing or ambiguous, privacy choices and opt-outs whose effects are unclear, or privacy controls that behave contrary to expectations.

37
Q

Privacy design principles

A

USER-CENTRIC: A UX is successful when it meets users’ needs. For privacy interfaces, that means first understanding what the privacy needs of different stakeholders and user populations are, both their informational needs and their control needs. Identifying those needs requires investigating and understanding people’s privacy preferences, concerns and expectations with respect to a specific information system. It further requires understanding users’ mental models of the technology, its data practices and its privacy protections. Such insights help determine what information may be necessary to help users make informed decisions or create awareness of otherwise unexpected data practices. When designing privacy notices and controls, it is helpful to be aware of how humans receive, process and react to information. Wogalter’s communication-human information processing (C-HIP) model explains how humans perceive, process and react to warnings, and the human-in-the-loop model adapts the C-HIP model for security (and privacy).
RELEVANT: To be useful for privacy decision-making and behavior, privacy information and controls need to be relevant to the user’s specific context or transaction. Relevant information and controls should be easily accessible from within a context and organized according to the steps in user journeys typical for different user populations. Rather than providing abstract descriptions, privacy notices should be explicit about what information is specifically being collected, processed or shared in a given context or transaction, why this data practice is necessary and how it benefits the data subject (if at all), and what controls are available regarding the practice.
UNDERSTANDABLE: When presenting privacy information or providing privacy controls, it is important that the information and controls are understandable by the users they are targeting. This UX best practice is starting to find its way into privacy regulation. For instance, the GDPR requires that “any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand.”
ACTIONABLE: To be useful, information needs to be actionable. It is typically not advisable to give users a privacy notice without an associated action or choice they can take. While privacy information can support users in making more informed privacy decisions, as well as help correct potentially misaligned mental models or expectations, without something to do with that information, there is not much point in the increased awareness. Users have little incentive to engage with provided information unless there are also options for them to realize their privacy decisions. Therefore, privacy information and privacy controls should go hand in hand whenever possible.
INTEGRATED: Privacy information and controls should be integrated with a system’s primary UX rather than added on. Relevant information should be provided at points in a system’s UX where users actually make privacy decisions, or where it might be important to make users aware of data practices or privacy risks in order to help them develop an accurate mental model of the system and its information flows. Similarly, privacy controls and indicators (e.g., to see and control a social media post’s audience) should be available at interaction points when they matter (e.g., when writing the post).
Whenever possible, privacy interfaces should use the same interaction methods as the system’s primary UX. This ensures that interacting with privacy interfaces feels natural in a system’s context and is as easy as any other user interaction with the system.

38
Q

C-HIP Model

A

The communication-human information processing (C-HIP) model (Wogalter) explains how humans perceive, process and react to warnings.

39
Q

Privacy design process

A

To put the privacy design principles into practice, it helps to follow a systematic privacy design process. We discuss a general process that combines UX design, privacy impact assessment (PIA) and value-sensitive design. This process consists of six steps:
BUILD ON PRIVACY ASSESSMENT, PRIVACY MANAGEMENT AND PRIVACY ENGINEERING PRACTICE to systematically identify a system’s user rights and transparency requirements
IDENTIFY USERS AND THEIR PRIVACY NEEDS by identifying stakeholders and eliciting their privacy expectations and privacy concerns as well as their privacy information needs and privacy control needs
IDENTIFY UNEXPECTED DATA PRACTICES, which are those that users are unaware of or might be surprised by, to help prioritize which data practices and controls to highlight
INTEGRATE PRIVACY INTERFACES INTO SYSTEM’S UX by determining which privacy notices and controls are most relevant to a user at which points in the UX
LEVERAGE THE AVAILABLE DESIGN SPACE for privacy notices and controls to develop user-centric privacy interfaces that work within a system’s constraints
CONDUCT USER TESTING to evaluate the usability and usefulness of developed privacy interfaces

40
Q

Types of users (privacy design perspective)

A

PRIMARY users, who interact with the system directly
SECONDARY users, who do not use the system themselves but interact with it indirectly or through others
INCIDENTAL users, such as bystanders, whose data the system may capture even though they never use it
Depending on the system, other or additional user groups may need to be considered. There may also be specific regulatory requirements for data collection, notice and user rights regarding protected user groups, such as children.
System designers and privacy professionals need to understand how the privacy of each identified user group may be affected by the system. Often a system’s user groups are affected differently by the same data practices.

41
Q

Layered notices

A

Notices and controls can be layered. A short notice may highlight a practice, fact or control to gain the user’s attention and provide a link to additional information or more controls, which in turn might consist of multiple layers that users can reveal and explore as needed. In UX design, this design pattern is called details on demand—providing an overview or summary first, with options for users to retrieve details. Thus, rather than trying to provide a single notice or control that does everything, it’s better to craft a privacy UX that is composed of many complementary privacy notices and controls at different levels of detail tailored to the respective user group and context.
A privacy UX combines privacy interfaces shown at different times, using different modalities and channels, and varying in terms of content and granularity in a structured approach. With such an integrated and multilayered approach, individual users still receive information and choices for data practices that pertain to them but won’t be confronted with data practices that are irrelevant for them until they are using a respective feature of the system. Users should still have the option to read the full privacy policy and explore all privacy settings whenever they want. Creating and maintaining a privacy user experience composed of multilayered privacy interfaces does incur engineering and management costs, but it has multiple benefits. Users’ privacy needs and the system’s data practices can be better aligned. Where user consent is required, this approach can yield properly informed consent. Reducing surprise and explaining privacy protections is likely to facilitate user trust.
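A minimal sketch of the details-on-demand pattern described above: a short top layer with deeper layers the user can expand. The schema is invented for illustration.

```python
# Layered privacy notice: summary first, details on demand.
from dataclasses import dataclass, field

@dataclass
class NoticeLayer:
    summary: str                                   # short text shown first
    details: list["NoticeLayer"] = field(default_factory=list)

    def render(self, depth: int = 0, expand: bool = False) -> str:
        lines = ["  " * depth + self.summary]
        if expand:
            for layer in self.details:
                lines.append(layer.render(depth + 1, expand=True))
        elif self.details:
            lines.append("  " * (depth + 1) + "[tap for details]")
        return "\n".join(lines)

if __name__ == "__main__":
    notice = NoticeLayer(
        "We use your location to show nearby stores.",
        [NoticeLayer("Location is processed on-device and never stored.",
                     [NoticeLayer("Full policy: https://example.com/privacy/")])],
    )
    print(notice.render())                # top layer only
    print(notice.render(expand=True))     # user asked for details
```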

42
Q

Dimensions to be considered in privacy interface design

A

Schaub et al. identified four design dimensions that should be considered in privacy interface design:
TIME (The point in time at which privacy information or controls are presented to the user has a substantial impact on a privacy interface’s usability and utility. Users may ignore privacy interfaces shown at inopportune times, but may seamlessly interact with a consent prompt that appears exactly when it is needed.)
CHANNEL (Privacy interfaces can be delivered through different communication channels and can be: primary, secondary, public)
MODALITY (Privacy interfaces can be presented in different ways, with different interaction modalities.)
CONTROL (User choices, consent dialogs and privacy settings can be delivered in different ways that affect how users interact with them.)

43
Q

At-setup privacy interfaces

A

At-setup interfaces are shown before or at initial use. However, only information and choices that are truly essential before use should be communicated at setup, because users’ attention is typically focused on the primary UX at this point.

44
Q

Just-in-time privacy interfaces

A

Just-in-time interfaces are shown in the same transactional context as the data practice they pertain to, which supports reasoning in the moment and means they can be specific and short, communicating only the most relevant information and choices.

45
Q

Context-dependent privacy interfaces

A

Context-dependent privacy interfaces are triggered by certain aspects of the user’s context. For instance, being in physical proximity to an IoT sensor may cause the device to announce its presence (e.g., flashing an LED, beeping, sending a description of its data practices to the user’s phone). Other context factors might be someone accessing previously uploaded information or analyzing the user’s previous privacy settings behavior to warn about potentially unintended privacy settings.

46
Q

Periodic reminders

A

Periodic reminders are useful to remind users about data practices that they agreed to previously and to renew consent if the practice is still ongoing, especially if the data practice occurs in the background, invisible to the user.

47
Q

Persistent privacy indicators

A

Persistent privacy indicators are shown whenever a data practice is active. For instance, cameras often have lights to indicate when the camera is recording. Persistent indicators can provide an unobtrusive cue about especially critical data practices, but there’s also a risk that the indicator will not be noticed.

48
Q

On-demand privacy notices

A

On-demand privacy information and controls allow users to seek out and review privacy information or their privacy settings and opt-outs at any time. On-demand interfaces should be made available in a well-known or easily findable location.

49
Q

Visual privacy interfaces

A

Visual privacy interfaces are most common. Privacy notices are presented as text or with icons and illustrations; privacy controls rely on graphical user interfaces. Many aspects can affect the usability of visual privacy interfaces, including color, font, white space, combinations of icons and text, the layout of interface elements and how the state of settings is conveyed. Privacy concepts are difficult to represent as icons only; combining icons with explanatory text or using more expressive visualizations and videos is preferable.

50
Q

Auditory privacy interfaces

A

Auditory privacy interfaces use sounds or voice to convey privacy information, or voice commands to enable privacy controls. Certain sounds can convey data collection practices very effectively.

51
Q

Haptic privacy interfaces

A

Haptic and other modalities may also be leveraged as privacy interfaces. For instance, device vibration could be used as an indicator for data collection.

52
Q

Machine readable privacy notices

A

Machine-readable specifications of privacy notices and controls enable the consistent presentation and aggregation of privacy information and controls from different systems or apps. Mobile apps have to declare their permission requests in a machine-readable format, and the mobile operating system is responsible for providing permission prompts and managing the user’s consent. IoT devices that lack screens and input capabilities could broadcast their machine-readable privacy notices to smartphones of nearby users, which then present the respective privacy information and choices to the user.

53
Q

Blocking privacy controls

A

Blocking privacy controls force users to interact with the privacy interface in order to be able to proceed. Blocking controls are useful when the user must make a choice, e.g., when consent is needed. However, how choices are presented affects whether the interaction is actually an expression of the user’s preference, and so should be considered carefully.

54
Q

Nonblocking privacy controls

A

Nonblocking privacy controls do not interrupt the interaction flow but are rather integrated as user interface elements into the UX. For example, social media apps might provide an audience selector (e.g., private, friends, public) within the interface for creating a post. The control is available but does not have to be used and, at the same time, reminds the user of their current privacy settings.

55
Q

Decoupled privacy controls

A

Decoupled privacy controls are not part of the primary UX. They are useful to provide the user the opportunity to inspect their privacy settings or the data the system has collected about the user. Common examples are privacy settings and privacy dashboards. The advantage of decoupled privacy controls is that they can be more comprehensive and expressive than integrated controls; the downside is that users need to actively seek them out. Good practice is to provide decoupled privacy controls at a central place and then point to them from other, more concise privacy notices and controls where appropriate.

56
Q

Usability testing and user studies

A

Usability testing and user studies play several important roles: they can help to assess needs, examine tradeoffs, evaluate designs and find root causes of problems.
At the beginning of a design process, exploratory user studies can inform design requirements by providing feedback on which user needs are not being met by existing systems, as well as identify users’ pain points and points of confusion. Usability testing is critical for determining that a design actually meets its stated requirements. Before declaring that their privacy settings allow users to exercise privacy choices quickly and easily, organizations should run usability tests that assess whether users can find privacy settings (discoverability), use them to select the settings that match their personal preferences and understand the effect of the settings they selected.
Finally, after discovering a usability deficiency, usability testing is useful to determine the underlying cause of the problem, which will hopefully lead to solutions.

57
Q

Formative and summative evaluation

A

Usability evaluation can be categorized as either formative or summative. Formative evaluations are used to gain insights into which aspects of a prototype or product could use improvements, while summative evaluations are used to draw comparisons between a prototype or product and some benchmark (e.g., previous version, competing product). Generally, formative evaluations are small scale and focus on gathering rich qualitative insights that can be used to improve a product. They may be conducted iteratively, with several rounds of evaluation and design changes. Summative evaluations are generally conducted once the design team believes they are done or even after the product has shipped. They are generally larger studies that focus on gathering quantitative data. Summative studies are usually done to validate that the product (hopefully) meets the design requirements or performs better than a previous version or a competing product.

58
Q

Online panels and crowdsourcing services

A

Online panels are pools of people who have signed up to be contacted about research studies. Researchers can pay to gain access to these panels to recruit participants for their studies. Some organizations maintain their own in-house panels, recruited from among users of a particular product or members of a university community, for example.
Crowdsourcing platforms such as Amazon Mechanical Turk allow people to sign up as workers and perform tasks for pay (generally at rates similar to minimum wage). These platforms can be a quick and inexpensive way to recruit participants for online surveys and other online user studies. If you use these services, you need to be especially careful about how you screen participants for any desired attributes and make sure that they don’t just randomly click through your survey in order to get a quick payment. Some crowdsourcing services such as Prolific and CrowdFlower have been shown to deliver more diverse participants and achieve higher-quality survey results than the more popular Mechanical Turk.
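A sketch of the screening step described above: drop respondents who fail an instructed attention check or finish implausibly fast. Thresholds and field names are illustrative assumptions.

```python
# Quality screening for crowdsourced survey responses.
responses = [
    {"id": "r1", "attention_check": "agree", "seconds": 412},
    {"id": "r2", "attention_check": "agree", "seconds": 38},   # too fast
    {"id": "r3", "attention_check": "oops",  "seconds": 390},  # failed check
]

MIN_SECONDS = 120          # e.g., derived from pilot completion times
EXPECTED_CHECK = "agree"   # instructed item: "select 'agree' for this question"

def keep(r: dict) -> bool:
    return r["attention_check"] == EXPECTED_CHECK and r["seconds"] >= MIN_SECONDS

valid = [r for r in responses if keep(r)]
print([r["id"] for r in valid])   # -> ['r1']
```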

59
Q

A/B Testing

A

A/B testing refers to tests where some users of a product or service see version A and others see version B. Generally, there are small differences between A and B and some metric to compare the two.
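A minimal sketch of A/B testing mechanics: each user is deterministically assigned to variant A or B, and a simple metric (here, a hypothetical opt-out completion rate) is compared between the two. All data is made up for illustration.

```python
# Stable 50/50 A/B assignment plus a per-variant metric.
import hashlib

def variant(user_id: str) -> str:
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"   # same user, same variant

def completion_rate(events: list[tuple[str, bool]]) -> dict[str, float]:
    totals: dict[str, list[int]] = {"A": [0, 0], "B": [0, 0]}
    for user_id, completed in events:
        bucket = totals[variant(user_id)]
        bucket[0] += 1                 # users who saw the variant
        bucket[1] += int(completed)    # users who completed the opt-out
    return {v: done / n if n else 0.0 for v, (n, done) in totals.items()}

if __name__ == "__main__":
    events = [("u1", True), ("u2", False), ("u3", True), ("u4", True), ("u5", False)]
    print(completion_rate(events))     # compare opt-out success between variants
```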

60
Q

Testing metrics

A

Before testing begins, it is important to identify a set of metrics that will be used. Metrics might relate to any of the areas discussed in section 5.2, including discoverability, awareness, comprehension, utility, behavior, satisfaction and other factors. For example, when testing a privacy tool or interface, metrics might include speed with which users complete a task, number of errors made while completing a task and accuracy of users when answering questions about the meaning of symbols or other interface components. Measuring the effectiveness of consent or opt-out interfaces poses challenges. The percentage of users who consent or opt out is not an indicator of the effectiveness of the interface. What we really want to capture is user awareness and comprehension of choices and whether they successfully select options that align with their personal preferences. One way to measure this would be to survey users shortly after they engage with a consent interface to find out what they believe they consented to (or chose not to consent to) and how this aligns with their preferences.
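A sketch of the alignment measurement suggested above: after users interact with a consent interface, survey their stated preference and compare it with the choice they actually recorded. Field names are illustrative.

```python
# Consent-interface effectiveness as preference alignment, not opt-out rate.
def alignment_rate(records: list[dict]) -> float:
    """Share of users whose recorded consent matches their stated preference."""
    matched = sum(1 for r in records if r["recorded_consent"] == r["stated_preference"])
    return matched / len(records) if records else 0.0

if __name__ == "__main__":
    records = [
        {"user": "u1", "recorded_consent": True,  "stated_preference": True},
        {"user": "u2", "recorded_consent": True,  "stated_preference": False},  # misaligned
        {"user": "u3", "recorded_consent": False, "stated_preference": False},
    ]
    print(f"alignment: {alignment_rate(records):.0%}")   # -> 67%
```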

61
Q

Ecological validity of usability testing

A

Ecological validity refers to the realism of the methods, materials and setting of a user study or usability test. The most reliable results are obtained through field testing. In field tests, designs are evaluated as part of a production system. The advantage of field testing is that new designs are tested under real-world conditions with real users rather than under artificial lab conditions with recruited participants. However, this also limits them to late stages of the design process, when a design has sufficiently matured to be used in production systems without placing users at risk or creating liabilities for the organization.
One component of ecological validity is the context in which user study tasks are embedded. If a study participant sitting in a usability lab is shown a privacy policy or any other privacy interface and asked questions about it without being provided with any context or reason for wanting to read the policy, the resulting usability evaluation will lack ecological validity. In this case, participants may pay more attention to the policy than they would in real life, when privacy is likely not their primary concern. Thus, comprehension testing without a real-world context is likely to result in a best-case scenario. Awareness and behavior testing are likely to be difficult to conduct meaningfully without embedding tasks in context.
Often context is provided by embedding tasks in a hypothetical scenario such as those mentioned above. However, participants may not be motivated to behave realistically in a hypothetical scenario, especially one in which they have to imagine a particular privacy or security risk. Sometimes study designs employ deception to increase ecological validity and observe how users behave when simulated privacy or security risk is introduced.
