Things I Don't Get That Need Review Flashcards

Cards needed since I am a complete dumb fuck

1
Q

Collection Limitation

A

FIPPS Principle: Means that organizations should only collect personal data that is necessary for a specific purpose, and they should collect it in a lawful and fair manner. It also means they should ask for consent from the individual before collecting the data when possible.

Example: Imagine a company offering a free newsletter. If they ask for your email to send it to you, that’s fine. But if they also ask for your home address, birth date, and phone number without explaining why they need that information, they would be violating the Collection Limitation principle, because they’re collecting more information than necessary.

2
Q

Openness Principle

A

FIPPS Principle: Means that organizations should be clear and transparent about their data practices. They should openly communicate how they collect, use, store, and share personal information. Individuals should be able to easily find information about what data is being collected and how it’s being used.

Example: If you’re using a social media app, the company should provide a clear privacy policy explaining what kind of personal information they collect (like your name, email, location, etc.), why they collect it, and who they might share it with (like advertisers). If they change how they use your data, they should let you know.

3
Q

Individual Participation Principle

A

FIPPS Principle: People should have control over their personal information. This principle gives individuals the right to know what data organizations have about them, request access to it, correct any errors, and even ask for the information to be deleted or removed if it’s not necessary anymore.

Example: If a company collects your information when you sign up for a service, you should be able to log into your account later, see the information they have on file (like your name and email), and, if it’s wrong, request a correction. You should also be able to delete your account and have your personal data removed if you no longer want to use the service. If the company refuses to let you view or change your information, they’re violating the Individual Participation Principle.

4
Q

Purpose Specification Principle

A

FIPPS Principle: Means that organizations must clearly state the specific reasons why they are collecting personal data before or at the time of collection. They should also only use the data for those stated purposes and not for anything else unless they get permission.

Example: Imagine you sign up for a fitness app that collects your name, age, and workout details. If the app says it collects this data to give you personalized workout plans, that’s the purpose they specified. If later, they decide to sell your data to a marketing company without telling you or getting your consent, they would be violating the Purpose Specification principle because they are using your data for something they didn’t originally specify.

5
Q

Data Quality Principle

A

FIPPS Principle: Means that organizations should ensure the personal information they collect is accurate, complete, and up to date. This helps prevent any errors or misunderstandings that could occur from incorrect or outdated data.

Example: If you apply for a credit card and provide your current income, the company should make sure to store that information accurately. If they accidentally enter the wrong amount and use that incorrect data to make decisions, they are violating the Data Quality Principle. Keeping data correct and up to date is key to preventing issues.

6
Q

Use Limitation Principle

A

FIPPS Principle: Means that organizations should only use personal data for the purposes they have clearly stated and agreed to. They cannot use the data for any other reason unless they get permission from the individual.

Example: If you sign up for an online shopping site and they collect your address to send you products, they can’t use that same address to send marketing mail or sell it to another company unless they ask for and receive your permission.

7
Q

Security Safeguards Principle

A

FIPPS Principle: Means that organizations must take steps to protect personal information from being lost, stolen, or accessed by unauthorized people. This includes using safeguards like encryption, strong passwords, and secure storage methods.

Example: If a hospital stores patient records electronically, they must use strong passwords, encrypt the data, and restrict access to only authorized personnel. If they fail to secure the system and an outsider hacks into the database, the hospital would be violating the Security Principle because they didn’t take proper steps to protect sensitive information.

8
Q

Accountability Principle

A

FIPPS Principle: Means that organizations are responsible for ensuring that they follow the rules and practices they’ve set for handling personal information. They must make sure that their employees, systems, and processes all comply with privacy regulations, and they should take responsibility if something goes wrong.

Example: If a company promises in its privacy policy that it will protect your data but then experiences a data breach due to weak security, the company should take responsibility by notifying affected customers, fixing the security issue, and possibly offering help, like free credit monitoring. This shows they are accountable for their mistake and are actively working to resolve it. If they just ignore the problem, they are violating the Accountability Principle.

9
Q

Service Level Agreement (SLA)

A

Is a contract between a service provider and a customer that clearly outlines what services will be provided, the quality or performance standards that must be met, and the responsibilities of both parties. It often includes things like how quickly issues will be resolved, how often the service will be available (uptime), and what happens if the provider doesn’t meet the agreement.

Why should a privacy professional care? It sets clear expectations about how data will be handled and protected by a service provider. For example, it can specify:

  1. Data Security Standards: The SLA may include requirements for encrypting sensitive information and regular security audits to prevent data breaches.
  2. Response Time for Security Incidents: It might state how quickly the provider must respond to a data breach or privacy issue, ensuring fast action to minimize harm.
  3. Data Privacy Compliance: The SLA can outline the provider’s responsibility to comply with privacy laws (like GDPR or HIPAA), ensuring that personal data is handled according to legal standards.
10
Q

Controller (Data Controller)

A

The person or organization that determines the purposes and means of collecting, using, and processing personal data. Essentially, they decide why and how personal data will be used. The Data Controller is responsible for ensuring that data is handled in compliance with privacy laws, such as the GDPR or other data protection regulations.

Example: A hospital that collects patients’ medical information is a Data Controller because it decides how the personal data (such as medical history and contact information) will be used (e.g., for medical treatments, billing, etc.). The hospital is responsible for ensuring that the data is protected and only used for the stated purposes.

In this case, the hospital must comply with privacy laws, protect the data, and inform patients about how their information will be used. They ensure patients’ privacy rights are respected by not using their data for unauthorized purposes, like marketing, unless they get explicit consent.

11
Q

Data Processing Agreement (DPA)

A

A legal document between a company (Data Controller) and another party (Data Processor) that handles personal data on behalf of the company. The agreement outlines how the data should be processed, protected, and used. It ensures that both parties follow data protection laws and that the Data Processor handles the personal data securely and responsibly.

12
Q

Conceptual

A

A type of value-sensitive design investigation that identifies the direct and indirect stakeholders and attempts to identify their values and how those values may be affected by the design.

13
Q

Empirical

A

A type of value-sensitive design investigation that focuses on how stakeholders configure, use, or are otherwise affected by the technology.

14
Q

Technical

A

A type of value-sensitive design investigation that focuses on how existing technology supports or hinders human values and how the technology might be designed to support the values identified in the conceptual investigation.

14
Q

Privacy Audit

A

A check-up for a company’s data privacy practices. It involves reviewing how the company collects, stores, and uses personal information to make sure they are following privacy laws and policies. The goal is to find any weaknesses or areas where they might be putting people’s personal information at risk and then fix those issues.

Example: Imagine a social media company wants to make sure they are keeping users’ data safe. They hire a privacy auditor who looks at:

  • What personal data (like emails or photos) the company collects.
  • How the company is storing this data (whether it’s securely stored).
  • Who has access to the data (to make sure only authorized people
    can see it).

If the audit finds that the company is not encrypting data properly, the company would need to improve its security to protect user privacy. This ensures that the company is not only following the law but also protecting users’ personal information from breaches or misuse.

15
Q

Predictability Objective

A

Means that people should be able to clearly understand how their personal data will be collected, used, shared, and protected. This allows individuals to have confidence and control over their data because they know what to expect. The goal is to ensure there are no surprises when it comes to data handling.

Example: If you sign up for a streaming service, the company should tell you upfront what data they will collect (like your viewing habits) and how they will use it (e.g., to recommend shows or send you marketing emails). If they later decide to sell your viewing data to advertisers without telling you, they would be violating the Predictability objective because you wouldn’t have expected that.

In simple terms, predictability means you should always know what a company is doing with your personal information, so there are no hidden or unexpected uses.

16
Q

Software Evolution Process

A

The ongoing process of making changes and updates to software after it has been released. This includes fixing bugs, adding new features, improving performance, and ensuring the software stays up to date with the latest technology. It’s like maintaining and upgrading a car—you need to keep it in good condition and make improvements over time.

As software evolves, it’s important to also update its data privacy features. This means making sure that as new features are added or changes are made, the software continues to protect users’ personal information and complies with current privacy laws. For example, if a company adds a new feature that collects more personal data, they need to ensure this data is handled securely and that users are informed about how it will be used.

Example: A mobile app might evolve by adding a location-tracking feature to improve user experience. In this case, the app developers must update the privacy settings to ensure users’ location data is protected, get users’ consent, and explain how the new data will be used. If they fail to do this, they could put users’ privacy at risk.

So, as the software changes, privacy protections must also evolve to keep personal data safe.

17
Q

Holistic Data Protection

A

Looking at data privacy and security from all angles to ensure that personal information is fully protected throughout its entire life cycle. This approach involves considering not just how data is stored but also how it’s collected, used, shared, and disposed of, while making sure all systems, people, and processes that interact with the data are secure.

Example: Imagine a hospital that handles sensitive patient data. A holistic data protection strategy would involve:

  • Securing the data when it’s collected (like ensuring patients’ health
    records are entered into secure systems).
  • Limiting access so only authorized staff can view the data.
  • Encrypting data to protect it while it’s stored.
  • Properly disposing of the data when it’s no longer needed.

This ensures that every step—from data collection to deletion—is covered, preventing data leaks or misuse.

In simple terms, holistic data protection means taking care of personal data in every possible way, from start to finish.

18
Q

Record of Processing Activity (RoPA)

A

Detailed document that lists all the ways an organization collects, stores, and uses personal data. It’s like a diary for how data is handled within the company. This record helps the company keep track of their data processing activities and ensures they are complying with data privacy laws like GDPR.

Example: If a retail company collects customer information for orders, the RoPA would include:

  • What kind of data is collected (e.g., names, addresses, payment
    information).
  • Why the data is collected (e.g., to process orders and ship
    products).
  • Who has access to the data (e.g., customer service, shipping team).
  • How long the data will be kept before it’s deleted.

This document helps the company understand where personal data is being used and allows authorities to review the company’s compliance with privacy regulations.

In simple terms, RoPA is a log that helps companies track how they handle personal data to ensure they are following privacy rules and keeping data secure.

19
Q

Defect

A

Flaw or mistake in the requirements, design, or implementation. It’s something that is wrong at the start, even before the system is used. It may or may not lead to problems later, but it’s there from the beginning.

20
Q

Fault

A

A fault happens when the system runs and encounters the defect. It's the specific place in the system where something goes wrong due to the defect. More specifically, it is an incorrect step, process, or data definition in a computer program.

21
Q

Error

A

The difference between the computed, observed or measured value and the true or theoretically correct value.

22
Q

Failure

A

The inability of a system or component to perform its required function within specified performance requirements.

23
Q

Harm

A

The actual or potential ill effect or danger to an individual’s personal privacy. (Sometimes called a hazard).

24
Q

Information Collection

A

A category of Solove’s taxonomy that includes the following:

  • Surveillance involves the observation and/or capturing of an individual’s activities. Example: An advertising website embeds HTML iframes into multiple third-party news, social networking and travel websites to track users by what pages they visit and what links they click on.
  • Interrogation involves actively questioning an individual or otherwise probing for information. Example: A website requires a user to enter their mobile phone number as a condition of registration, although the website’s primary function does not require the phone number and there is no statutory or regulatory requirement to do so.
25
Q

Information Processing

A

A category of Solove’s taxonomy that includes the following:

  • Aggregation involves combining multiple pieces of information about an individual to produce a whole that is greater than the sum of its parts. Example: A retail company correlates purchases of unscented lotions, large tote bags and prenatal vitamins to infer that a customer is likely pregnant.
  • Identification links information to specific individuals. Example: A
    website uses cookies, a recurring IP address or unique device identifier to link an individual’s browsing history to their identity.
  • Insecurity results from failure to properly protect individuals’
    information. Example: A website fails to encrypt private
    communications, thus exposing users to potential future harm.
  • Secondary use involves using an individual’s information without
    consent for purposes unrelated to the original reasons for which it was collected. Example: A retailer uses an email address for marketing purposes when the address was originally collected to correspond about a purchase.
  • Exclusion denies an individual knowledge of and/or participation in
    what is being done with their information. Example: A marketing firm
    secretly purchases consumer data to advertise to the customer under a different company name without their knowledge.
26
Q

Information Dissemination

A

A category of Solove’s taxonomy that includes the following:

  • Breach of confidentiality results from revealing an individual’s personal information, despite a promise not to do so. Example: A platform releases a user’s data to a third-party plug-in despite the platform’s privacy notice promising not to disclose the data to anyone.
  • Disclosure involves revealing truthful information about an individual that negatively affects how others view them. Example: A private “lifestyle” service discloses a list of members, which is obtained by groups who disapprove of the lifestyle.
  • Distortion involves spreading false and inaccurate information about an individual. Example: An employment history verification service incorrectly identifies a job applicant as a felon.
  • Exposure results from the revelation of information that we normally conceal from most others, including private physical details about our bodies. Example: A person’s prior purchase of a urinary incontinence product is used as a promotional endorsement and sent to the person’s broader social network.
  • Increased accessibility involves rendering an individual’s information more easily obtainable. Example: A children’s online entertainment service allows any adult to register and interact with child members, leaving these children accessible to strangers without parental consent.
  • Blackmail is the threat to disclose an individual’s information against their will. Example: An overseas medical claims processor threatens to release patient data to the internet unless new employment conditions are met.
  • Appropriation involves using someone’s identity for another person’s purposes. Example: An online dating service uses a customer’s personal history, including age, biography and education, to promote its website to new customers.
27
Q

Invasion

A

A category of Solove’s taxonomy that includes the following:

  • Intrusion consists of acts that disturb an individual’s solitude or
    tranquility. Example: A mobile alert notifies potential customers that
    they are within the proximity of a sale.
  • Decisional interference involves others inserting themselves into a
    decision-making process that affects the individual’s personal affairs.
    Example: A website limits access to negative product reviews to bias a new user toward a specific product selection.
28
Q

Traceability Matrix

A

Encodes relationships between requirements and other software artifacts. In practice it is a simple checklist or table that keeps track of what needs to be done in a project, showing how every part of the project matches up with the original goals or requirements so nothing is forgotten. In privacy, for example, a trace matrix can link requirements to a privacy law or principle.
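
A minimal Python sketch of the idea (the requirement IDs, artifact names, and legal citations below are invented placeholders, not taken from the card):

    # A tiny traceability matrix: each requirement maps the legal or policy
    # source it satisfies to the design and test artifacts that implement it.
    trace_matrix = {
        "REQ-01 Show a privacy notice link on every page": {
            "source": "FIPPS Openness Principle",
            "design": "footer_component",
            "test": "test_footer_contains_privacy_link",
        },
        "REQ-02 Delete account data on user request": {
            "source": "GDPR Art. 17 (right to erasure)",
            "design": "account_deletion_service",
            "test": "test_account_purge_removes_personal_data",
        },
    }

    # A simple gap check: any requirement without a verifying test is untraced.
    for requirement, links in trace_matrix.items():
        if not links.get("test"):
            print("UNTRACED:", requirement)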

29
Q

Standard Operating Procedure (SOP)

A

Detailed instruction manual for how to do a specific task or process in the same way every time. It explains step-by-step what needs to be done, by whom, and how to do it correctly to avoid mistakes.

In data privacy, an SOP ensures that everyone handling sensitive information, like personal data, follows the same rules to keep that information safe. This can include rules about how to collect, store, share, and protect data, making sure that the company complies with laws and regulations that protect people’s privacy.

30
Q

Confidentiality

A

A quality attribute that ensures information is protected and accessible only to authorized individuals, preventing unauthorized access or disclosure.

31
Q

Integrity

A

A quality attribute that ensures information has not been unintentionally modified. It ensures that data remains accurate, consistent, and unaltered except by authorized people or processes.

32
Q

Availability

A

A quality attribute that ensures information is readily available whenever needed. It refers to how easily and reliably a system or data can be accessed by authorized users when needed.

33
Q

Privacy Design Pattern

A

A reusable solution in system or software design that helps protect user privacy by embedding privacy principles directly into the design process. It ensures that privacy is considered and integrated from the start, not as an afterthought, to minimize risks associated with personal data. Here are the four elements of a design pattern:

  • Pattern Name (Context): Name which references the pattern.
  • Pattern Description: This explains the privacy challenge or concern that the pattern addresses. It describes the potential risk to personal data or user privacy, highlighting why the pattern is needed in that specific context.
  • Pattern Solution: This outlines the recommended approach or technique to mitigate the privacy concern. The solution provides a detailed description of how to design the system or process to protect privacy effectively.
  • Consequence: This element explains the potential outcomes, both positive and negative, of applying the pattern. It describes the benefits of enhanced privacy and any trade-offs, such as increased complexity in implementation or reduced system performance.
34
Q

Functional

A

Describes what the system is supposed to do. It refers to the specific operations or actions the system will perform to achieve its goals. For example, a functional requirement may be that the system shall provide a link to a privacy notice at the bottom of every page (this is the functionality of what must be built into the system).

35
Q

Nonfunctional

A

Refers to the constraints or conditions that the system must meet. These often concern quality attributes or the standards of performance, security, or usability. An example would be a system shall not disclose personal information without authorization or consent. It focuses on how the system will perform or adhere to specific guidelines or constraints.

36
Q

Functional versus Nonfunctional

A

Functional requirements are a to-do list for the system: the tasks it must perform to fulfill its purpose. Nonfunctional requirements are rules or constraints that affect its performance, security, or user experience (for example, the system should load within 5 seconds for all users).

37
Q

Identifiability

A

Quality attribute for privacy, which refers to how easily a user can be recognized or identified within a system. By controlling the types of information that are shared or logged, like using pseudonyms instead of real names, it becomes harder for unauthorized people to figure out someone’s identity. The goal is to reduce the chances of someone being identified by keeping certain information separate or hidden. An example of this attribute would be in a system where users sign in, instead of recording their full name in web server logs, the system might use a code or nickname. This way, even if someone accesses the logs, they won’t know the user’s real identity.
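
A small illustrative Python sketch of that example (the salt value and function names are hypothetical):

    import hashlib

    # Keep the salt secret and separate from the logs; without it, reversing
    # the pseudonym back to a username is much harder.
    SECRET_SALT = b"replace-with-a-secret-value"

    def pseudonym(username: str) -> str:
        return hashlib.sha256(SECRET_SALT + username.encode()).hexdigest()[:12]

    def log_sign_in(username: str) -> None:
        # The web server log records only the pseudonym, never the real name.
        print(f"sign-in event user={pseudonym(username)}")

    log_sign_in("alice.smith")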

38
Q

Network Centricity

A

Quality attribute for privacy: the extent to which personal information remains local to the client (information is retained on the client side and transferred only to complete a transaction).

39
Q

Mobility

A

Quality attribute for privacy: the extent to which a system moves from one location to another, as with laptop and mobile phone capabilities.

40
Q

Architecture

A

A CONTROL to minimize privacy risk: Focus on designing systems to minimize how easily personal data can be identified or misused. The goal is to make data less personal (pseudonymized or anonymous) and decentralize its storage and use, making it harder for threat actors to access or misuse it.

Example: Instead of storing sensitive personal data in one central location, it’s split and stored in different places. Also, instead of using real names, data could be pseudonymized by replacing names with unique codes or IDs. This way, even if someone gets access to the system, they can’t easily identify individuals.

41
Q

Security

A

A CONTROL to minimize privacy risk: Protect data by hiding or encrypting it so that even if someone gains access to the system, they cannot read or misuse the data. This might involve encrypting data at different stages during its collection or storage.

Example: When a user submits their credit card information to an online store, the system encrypts that data as soon as it is entered, making it unreadable without a special decryption key. This prevents hackers from seeing the real credit card number even if they access the data.
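
A minimal sketch of encrypt-on-receipt using the Fernet recipe from the third-party cryptography package (assumed to be installed); in a real system the key would live in a key-management service, not in the code:

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # generated once, stored securely elsewhere
    cipher = Fernet(key)

    # Encrypt the card number as soon as it is received; only ciphertext is stored.
    card_number = "4111 1111 1111 1111"
    token = cipher.encrypt(card_number.encode())

    # Later, an authorized process holding the key can decrypt when needed.
    assert cipher.decrypt(token).decode() == card_number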

42
Q

Supervision

A

A CONTROL to minimize privacy risk: Ensure that an organization, and anyone it works with (like third-party vendors), follows privacy policies and procedures. It allows an organization to monitor and enforce privacy compliance, ensuring that everyone handles personal data correctly.

Example: A company that processes personal data regularly audits its third-party service providers (such as cloud storage companies) to make sure they are following the correct privacy practices, such as encrypting personal data and not storing it for too long.

43
Q

Balance

A

A CONTROL to minimize privacy risk: Ensuring fairness when collecting or using personal data. This means informing people about what data is being collected, giving them some control over it, and making sure the benefits of collecting that data are proportional to the risks involved.

Example: When a mobile app collects users’ location data, it asks for clear consent and provides an option to disable location tracking. This ensures that users are informed and have control over whether they want to share this sensitive information.

44
Q

Verification

A

Ensures the resultant system performs the way it is supposed to perform

45
Q

Validation

A

Ensures the requirements satisfy the needs of the intended user base

46
Q

When to use a PIA or DPIA

A
  • New technology or system
  • Significant changes to existing systems
  • Data-sharing initiatives (sharing information with third parties or
    across organizational boundaries).
  • Large-scale data processing (processing large volumes of data that
    could impact individual privacy).
  • New and emerging technologies (biometrics, facial recognition, etc.)
  • Company mergers.
47
Q

Predictability

A

A privacy engineering objective which means that everyone involved with the system—whether it’s the company using the data or the individuals providing it—should have a clear understanding of how personal data is handled. This ensures that:

  • The system follows privacy rules that can be tracked and measured (like making users agree to a privacy notice).
  • Everyone can explain what’s happening with the personal data in simple terms.
  • There are privacy protections beyond just giving notices, like techniques that make it harder to identify someone from shared data.
  • Trust is built between users and system operators, allowing for improvements in service while keeping privacy intact.
48
Q

Manageability

A

A privacy engineering objective which is about having control over personal information—like being able to change, delete, or limit who sees it. It ensures:

  • Corrections to inaccurate data can be made.
  • Only the necessary personal data is collected and shared.
  • The right people are in charge of making changes to data and
    protecting privacy preferences.
49
Q

Disassociability

A

A privacy engineering objective that focuses on minimizing the connection between personal data and individuals, as much as possible without disrupting the system’s function. It means:

  • Keeping personal information separate or hidden when possible, for example, through anonymization or pseudonymization (where real identities are masked).
  • Developing new ways to separate individuals from their data to better protect their privacy.
50
Q

Cookies

A

Small files that websites store on your computer or device to remember certain information about you. These files can be helpful for things like keeping you logged into a website or saving your preferences so that the site works the way you like it.

For example, if you go shopping on a website, the website might use cookies to remember what items you added to your cart. Even if you leave the website and come back later, the cookies help the website remember that your cart still has those items, so you don’t have to start over.
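
A small sketch using Python's standard http.cookies module of roughly what the Set-Cookie header for that cart might look like (names and values are illustrative):

    from http.cookies import SimpleCookie

    cookie = SimpleCookie()
    cookie["cart_id"] = "abc123"
    cookie["cart_id"]["max-age"] = 7 * 24 * 3600   # remember the cart for a week
    cookie["cart_id"]["httponly"] = True           # not readable by page scripts

    # This header is sent to the browser, which returns the cookie on later visits.
    print(cookie.output())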

51
Q

Just-in-time notice

A

Privacy notice provided to users at the moment they are about to engage in an activity that involves data collection or processing.

52
Q

Data Mapping

A

Matching or connecting data from one source to a destination so the two align properly and can be used together. It ensures that data is accurately transferred or integrated between systems, databases, or formats. Note that data mapping is concerned with the transfer and processing of data.

53
Q

Data Schema

A

A blueprint or structure that defines how data is organized and stored in a database. It outlines the tables, fields, and the relationships between them.

Privacy Note: A data schema is important in data privacy, since it can be used to separate customer (personal) data. An example would be separating purchase history from personal details, as sketched below.
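
A small sketch using Python's built-in sqlite3 module (table and column names are illustrative) of a schema that keeps identifying details in one table and purchase history in another, linked only by an internal ID:

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # Identifying details live in one table...
    conn.execute("""CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name TEXT,
        email TEXT)""")

    # ...purchase history lives in another, holding only the internal ID.
    conn.execute("""CREATE TABLE purchase (
        purchase_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id),
        item TEXT,
        amount_usd REAL)""")

    conn.execute("INSERT INTO customer VALUES (1, 'Alice Smith', 'alice@example.com')")
    conn.execute("INSERT INTO purchase VALUES (1, 1, 'hammer', 12.99)")

    # Analysts can study purchase trends without touching the customer table.
    print(conn.execute("SELECT item, SUM(amount_usd) FROM purchase GROUP BY item").fetchall())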

54
Q

Data Pruning

A

Refers to the practice of removing outdated or unnecessary data, which reduces the amount of stored information, thereby lowering the risk of data breaches or unauthorized access.
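
A minimal Python sketch of pruning records past a retention window (the record layout and retention period are hypothetical):

    from datetime import datetime, timedelta

    RETENTION = timedelta(days=365)

    records = [
        {"id": 1, "email": "old@example.com", "created": datetime(2020, 1, 15)},
        {"id": 2, "email": "new@example.com", "created": datetime.now()},
    ]

    # Keep only records still inside the retention window; the rest are purged.
    cutoff = datetime.now() - RETENTION
    records = [r for r in records if r["created"] >= cutoff]
    print(records)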

55
Q

Digital Signature

A

Secure electronic method of verifying the authenticity and integrity of a digital document, message, or transaction. A virtual fingerprint. Here is how it works: when someone signs a document, they generate a unique encrypted code (the digital signature) using a private key. The recipient can then use the sender’s public key to verify the signature.
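
A short sketch of the sign-and-verify flow using the third-party cryptography package (assumed available); key handling is simplified for illustration:

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # The signer keeps the private key; the public key is shared with recipients.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    message = b"I agree to the contract terms."
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)

    signature = private_key.sign(message, pss, hashes.SHA256())

    # verify() raises an exception if the message or signature was tampered with.
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature verified")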

56
Q

Declared Data

A

Information that a user voluntarily provides to a social network or platform.

57
Q

Pharming

A

A tactic where attackers redirect users to fraudulent websites by altering the domain name system (DNS).

58
Q

ISO 27701

A

An IT framework that provides guidance on establishing, implementing, and continually improving a privacy information management system.

59
Q

BS 10012

A

Outlines requirements for a personal information management system.

60
Q

NIST Cybersecurity Framework

A

Outlines controls for cybersecurity risk management across infrastructure.

61
Q

NICE Framework

A

Provides guidance on cybersecurity education, training, and workforce development.

62
Q

FAIR Model

A

Assesses and quantifies risks related to information security in an organization. It expresses risk in financial terms using ALE (Annualized Loss Expectancy) and SLE (Single Loss Expectancy).
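
A worked sketch of the SLE/ALE arithmetic in Python (the dollar figures and rates are invented for illustration):

    # Single Loss Expectancy (SLE): expected cost of one loss event.
    asset_value = 200_000        # value of the affected asset, in dollars
    exposure_factor = 0.25       # fraction of that value lost per incident
    sle = asset_value * exposure_factor            # 50,000

    # Annualized Loss Expectancy (ALE): SLE scaled by yearly frequency.
    annual_rate_of_occurrence = 0.5                # one incident every two years
    ale = sle * annual_rate_of_occurrence          # 25,000
    print(f"SLE=${sle:,.0f}  ALE=${ale:,.0f}")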

63
Q

OECD

A

The OECD is an international organization that promotes policies aimed at improving economic and social well-being around the world. The OECD Privacy Guidelines were published in 1980 and revised in 2013, and they set foundational principles for data protection. The principles are similar to FIPPS, but the OECD guidelines differ in that they are internationally focused and provide global standards (as opposed to the US focus of FIPPS).

64
Q

Deep Inspection Monitoring System

A

Carefully examining the contents of data, looking at individual parts of network traffic, emails, and files for hidden threats like viruses or malware. It works by analyzing the details inside the data packets (which are small pieces of data that travel across the internet) to identify anything suspicious or harmful. This thorough examination helps catch more advanced threats that might slip through simpler security checks.

PRIVACY CHALLENGES: Because of the deep inspection, personal data may be collected. To minimize risk, use data masking techniques (anonymization or pseudonymization), ensure compliance with privacy regulations, apply data minimization (focusing on the specific types of traffic that pose the biggest risks), and use encryption.

65
Q

Design Thinking Process

A

Five Stages:
1. Empathize: Understand what the challenge is by engaging stakeholders and trying to put yourself in their shoes.
2. Define: Define the real problem you are trying to solve.
3. Ideate: Brainstorm and come up with solutions.
4. Prototype: Focus on a solution and create a mock up (prototype)
5. Test: Try it out with users and improve

66
Q

Software Process Models

A

Requirements Engineering: This is where we figure out what the software needs to do, including user goals, system functions, and other important aspects like privacy and performance.

Design: In this stage, we plan how the software will work. This involves creating components (like building blocks) and showing how they connect, such as how data moves between a client and server.

Implementation: Here, we write the code to build the software based on the design. We also set up everything needed to get the system running for the first time.

Testing: We check to make sure the software works as expected. This includes creating tests to verify specific functions and letting users try it to see how it performs in real life.

Deployment: This is when we install and set up the software for actual use. It may also involve training users so they know how to operate it.

Maintenance: After deployment, we continue fixing any bugs and adding new features over time to keep the software up-to-date and useful.

67
Q

Information Assets

A

Customer and employee data as well as backup copies of data stored either on-site or off-site

68
Q

Physical Assets

A

workstations, laptops, portable storage devices, backup media, paper files

69
Q

Intellectual Property

A

Software code, trade secrets, brand

70
Q

Asset Classifications

A

Confidential: Information that should remain secure and private: customer information, employee Social Security numbers, payment account information

Internal use: Business information intended for internal use only: company contact directories, business plans, sales forecasts, proprietary software codes

Public: Information that can be safely shared with the public: physical address, marketing materials, customer service information

71
Q

Incident Response Plan

A

Discovery: Actively monitoring system activity or suspicious changes to system activity is essential in detecting an incident that could lead to a breach. Monitoring activity on a system could detect tampering before any data is stolen. Users are also another line of defense in the detection of privacy incidents or data breaches, by reporting suspicious activity.

Containment: A response plan should contain guidance on how to terminate an ongoing incident while preserving any evidence of the affected data and origin of the incident. Containment is key to stopping the threat before more damage is done. Do not wipe system logs. Remove and preserve affected systems from the network. Fully document your investigation and include timestamps while working through an investigation. Finally, a predetermined contingency plan should be executed that allows the organization to continue functioning at some capacity while data or resources are locked down during a privacy incident investigation.

Analyze and notify: For data breaches and other types of privacy incidents, notification laws vary among jurisdictions. To be prepared, an organization should know what their notification obligations are in such an event. Once a privacy incident or a breach has been detected and determined, legal counsel should be involved to advise the response team regarding all legal matters, including notification—to law enforcement, individuals and/or the public. Some organizations contract with a vendor to provide consumer breach notification services as they are up-to-date on laws surrounding breaches and can provide additional resources as needed.

Repercussions: Fines, lawsuits and nonmonetary repercussions often follow privacy incidents or breaches. For example, media coverage of the incident may adversely affect an organization’s reputation, resulting in decreased business and loss of consumer trust. As part of the incident response team, a security analyst would handle an incident from start to finish including reporting to senior management. A privacy technologist would act as a subject matter expert to help diagnose the incident, mitigate the issue and provide information to the security analyst.

Prevention: Privacy incidents can be used as a learning tool to address holes in security and privacy procedures, review privacy policies to identify weaknesses and train employees as needed.

Third parties: Personal information in the hands of a third party still falls under the responsibility of the organization in the event of a breach, so vendor contracts should include provisions that describe the expectations and obligations of the vendor should an incident occur.

72
Q

Abstracting

A

Abstracting involves showing only the essential details and hiding unnecessary specifics. This helps protect sensitive information by focusing on general or high-level information instead of detailed data.

Example: Instead of displaying a person’s full birthdate, you might only show their birth year, keeping the exact day and month private.

Here are the key methods:

Grouping: Instead of processing data individually, it groups similar data together. For example, an online store might notice that people who buy hammers also buy nails, and use this to suggest related items to customers.

Summarizing: This involves putting detailed data into broader categories. For example, instead of knowing someone’s exact birthday, a company might summarize this data to only the month and day to offer special discounts.

Perturbing: This method adds a little “noise” or randomness to data, making it less specific. For example, an app that warns drivers of hazards might slightly delay the information to avoid sharing exact locations or personal details like speed or name.

73
Q

Data Hiding

A

Protect personal information by making it harder to access or connect to specific individuals, whether intentional or accidental. Here are simple explanations of the key hiding methods:

  1. Restrict:
    Only authorized people can access certain data. This is done by requiring things like logins, passwords, or encryption keys. Access may depend on a person’s role, attributes, or specific situation.

Example: Only a customer can see their own account details after logging in.

  2. Mix:
    Mixing data with others’ information makes it harder to trace specific details back to an individual. This helps protect privacy while still getting useful information.

Example: A TV company can analyze viewing trends without knowing exactly which customer watched what.

  3. Obfuscate:
    This method scrambles or hides information so it’s not easily readable or understandable. Encryption or coding techniques are commonly used to achieve this.

Example: Encrypting emails so only the intended recipient can read them.

  4. Dissociate:
    Dissociation separates a person’s identity from their data after it’s no longer needed for a specific purpose. This keeps personal information from being linked back to individuals.

Example: A restaurant needs customer details for food delivery but can delete the personal link once the delivery is complete and just keep general sales data.

  5. Masking:
    Masking changes or hides parts of data to protect it while still making it usable for analysis. This might involve removing or scrambling sensitive fields, or showing only part of the information.

Example: A database might hide part of a credit card number (showing only the last four digits) when a non-privileged user views it.
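
A tiny Python sketch of that masking example (purely illustrative):

    def mask_card(card_number: str) -> str:
        # Show only the last four digits to non-privileged users.
        digits = card_number.replace(" ", "")
        return "**** **** **** " + digits[-4:]

    print(mask_card("4111 1111 1111 1111"))   # **** **** **** 1111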

74
Q

Data Taxonomy

A

A way of organizing and labeling data so that everyone in an organization understands what it is and how it should be used. Think of it as a system of categories or a “filing system” for data, much like how a library organizes books into different sections (like fiction, non-fiction, etc.).

Key points about Data Taxonomy:

Classification: It helps to classify data into groups based on similarities. For example, all customer-related data might be grouped under “Customer Information.”

Consistency: It creates a common language across an organization, so different teams can easily understand and access the same data without confusion.

Efficiency: By organizing data well, it becomes easier to find, retrieve, and use the data for decision-making or analysis.

Governance: Having a clear data taxonomy helps ensure that data is used properly, aligns with rules and regulations, and maintains quality across the organization.

75
Q

Data Dictionary

A

Centralized repository or guide that defines and describes the structure, meaning, and usage of data elements within an organization. It’s like a reference manual that explains what each piece of data means, how it’s structured, and how it should be used.

Key elements of a Data Dictionary:

Data Element Definitions: It provides clear definitions for each data field or element, so everyone understands what the data represents. For example, it defines what “CustomerID” or “OrderDate” means.

Metadata: This includes technical details about the data, such as data type (e.g., string, number, date), size, allowable values, and formats.

Relationships: It shows how different data elements relate to each other. For instance, “EmployeeID” in one table might relate to “EmployeeName” in another.

Source: It often tracks where the data comes from, helping ensure accuracy and consistency.

Usage Guidelines: It explains how the data should be used, including any restrictions or rules that apply, such as privacy regulations or business rules.

76
Q

Technological Controls

A

In simpler terms, Technological Controls in privacy engineering are the tools and systems that ensure privacy rules (internal controls) are followed automatically by technology. These controls help protect data, manage how it’s accessed, used, and stored, and ensure compliance with privacy regulations.

Let’s break down the definition and examples provided:

Access Control Points: This is like setting up locks and keys for data. Only the people who truly need access to certain information can get to it. For example, only certain employees might have access to customer payment details.

Dataflow Control Points: Think of this as controlling the “pipes” through which data flows. It ensures that only the necessary data is collected and shared. If a company only needs a customer’s email, there’s no need to collect their phone number, limiting how much data you gather.

Retention Control Points: This means setting up automated systems to delete data once it’s no longer needed. For instance, if your company only needs to keep data for a year, these controls would ensure the system deletes that data after 12 months, preventing unnecessary storage.

How Technological Controls fit into Privacy Engineering:
Privacy Engineering is about designing technology systems with privacy in mind. The goal is to build privacy into the technology rather than adding it as an afterthought.
These technological controls translate privacy rules (like “only collect necessary data” or “delete after X time”) into automated processes within the system.
In essence, Technological Controls are automated privacy “guardrails” that help ensure your system operates in line with the privacy rules you’ve set up, making privacy a natural part of the technology itself.

77
Q

Solove’s Taxonomy of Privacy

A
  1. Collection
    * Surveillance: Observing or monitoring someone’s activities (e.g., website tracking).
    * Interrogation: Probing or asking for personal information (e.g., inappropriate interview questions).
  2. Information Processing
    * Aggregation: Combining different pieces of personal data to create a complete profile (e.g., a credit bureau).
    * Secondary Use: Using personal data for purposes other than the original intent (e.g., using census data for different purposes).
    * Exclusion: Denying an individual access to information held about them (e.g., not informing them about how their data is used).
    * Insecurity: Failing to safeguard information, leading to exposure (e.g., data leaks).
    * Identification: Linking personal information to a specific individual (e.g., identifying a person by ZIP code and date of birth).
  3. Information Dissemination
    * Disclosure: Revealing truthful but potentially harmful personal information (e.g., exposing a home address).
    * Exposure: Making private information public (e.g., revealing sensitive health conditions).
    * Breach of Confidentiality: Breaking a promise to keep someone’s information private (e.g., a doctor sharing confidential records).
    * Increased Accessibility: Making private information more accessible than it should be (e.g., unredacted court proceedings online).
    * Appropriation: Using someone’s identity for personal or business benefit without consent (e.g., using a customer’s likeness in advertising).
    * Distortion: Spreading misleading or false information about someone (e.g., a creditor falsely reporting unpaid debt).
  4. Invasion
    * Intrusion: Disrupting someone’s solitude or private space (e.g., directing unwanted players into a private space in augmented reality).
    * Decisional Interference: Interfering with someone’s personal decisions (e.g., blocking transactions for contraceptives).
78
Q

Data Protection Officer (DPO)

A

Responsible for overseeing the organization’s data protection strategy and ensuring compliance with data protection laws, such as the GDPR (General Data Protection Regulation) in Europe or other global privacy regulations. Their focus is on the legal and regulatory environment. They serve as a point of contact between the organization and regulatory authorities and also between the organization and data subjects (individuals whose data is being processed).

79
Q

Privacy Technologist

A

Focuses on the technical implementation of privacy measures within an organization. They work on embedding privacy by design and privacy-enhancing technologies (PETs) into products, services, and systems. Their role is more technical, ensuring that IT systems, software, and infrastructure comply with privacy requirements, and they often work closely with engineers and developers to implement privacy features such as encryption, anonymization, or secure data storage.

80
Q

IT Framework

A

A structured set of guidelines, standards, and best practices designed to help organizations effectively manage and govern their information technology processes. IT frameworks provide a blueprint for how IT services are delivered, managed, and aligned with business objectives. They guide organizations in achieving consistency, efficiency, security, and compliance within their IT operations.

81
Q

Accept the Risk

A

If the risk is small, it might be okay to accept it. For example, if the risk only involves consumer reviews being revealed, it may not be a big deal. Sometimes, accepting the risk is the best option if the cost to fix or avoid it is too high.

82
Q

Transfer Risk

A

This means passing the risk to someone else who can manage it better. For instance, a company could use a third-party service to handle payroll or payments securely. The company is still responsible, but the third party manages the risks day-to-day.

83
Q

Mitigate the Risk

A

Reducing the risk by implementing controls is a common response. This could be done by adding privacy features in software or changing business processes. For example, regularly backing up data can reduce the risk of data loss.

84
Q

Avoid the Risk

A

The company can change the system or process to completely avoid the risk. For example, instead of asking for a person’s date of birth, the system could just ask if the person is over 18, avoiding the need for more sensitive information.

85
Q

Quality Attribute

A

Nonfunctional requirements used to evaluate how a system is performing. More specifically, it refers to characteristics or features of a system or product that determine its performance and overall quality. These attributes help in assessing whether the system meets the expected requirements and delivers the necessary performance.

86
Q

Front End

A

Part of the system the user experiences. An example would be a web browser (client) and the web server (server). From a privacy standpoint, usability considerations include effective notification of privacy practices, obtaining consent, and simple tutorials or introductions to new features of a site.

87
Q

Back End

A

This is the part of a computer system/application that allows it to operate but cannot be accessed by a user. An example would be a web service and the database. From a privacy perspective, it is important to apply privacy principles pertaining to collected data, including when it is used, disclosed, and retained.

88
Q

High Level Design

A

Early phase of the software development process where the overall system structure and architecture are defined. It focuses on a broad view of the system on how the main components interact. Incorporating privacy in HLD involves:
* Data Minimization: Designing the system so it only collects and processes the minimal amount of personal data necessary.
* Access Control: Defining who can access what data and ensuring that only authorized users can view sensitive information.
* Encryption: Planning how sensitive data will be encrypted both when it's stored and when it's being transferred between system components.
* Privacy by Design: Ensuring that the design includes privacy features (like anonymization or data deletion mechanisms) as core parts of the system, not as afterthoughts.
* Compliance: Ensuring the design aligns with privacy regulations, so you don't face legal issues later.

89
Q

Low Level Design

A

The next step after High-Level Design (HLD). While HLD focuses on the overall architecture and components of the system, LLD dives into the specific details of how each component will work. It’s like a blueprint that outlines the exact implementation of the system’s components, including how data is processed, how algorithms are written, and how each function or module will operate. Incorporating privacy in LLD involves:
* Data Encryption: LLD will specify the exact encryption algorithms and methods used to protect sensitive data like passwords, personal identifiers, or credit card numbers. For example, it would define using AES-256 encryption for data at rest or TLS for secure communication.
* Access Control and Authentication: LLD describes how user authentication and access control are technically implemented. It defines the code that checks user permissions and ensures that only authorized users can access or modify sensitive information.
* Data Masking and Anonymization: LLD specifies how and where data should be anonymized or masked. For instance, in a data processing module, LLD might describe how personally identifiable information (PII) is replaced with anonymized identifiers.
* Logging and Auditing: The LLD includes details on how user actions will be logged for security audits while ensuring that logs don’t expose sensitive data unnecessarily. For example, it would describe logging user activity without storing full credit card numbers or personal data in the logs.
* Privacy by Default: LLD ensures that features like data retention policies or consent mechanisms are built into the code. For example, LLD could specify the implementation of automatic data deletion scripts after a certain period, as required by privacy laws.

90
Q

Control

A

Safeguard or mechanism that is put in place to ensure that the system operates correctly, securely, and in accordance with rules, regulations, or best practices. It is intended to protect the system from risks (unauthorized access, data breaches, system failures, etc.)

91
Q

Pseudonymized Data

A

Personal data that has been modified so that individuals cannot be directly identified without additional information. The data is still linked to individuals, but identifying details (like names or social security numbers) are replaced with pseudonyms, such as codes or other identifiers. However, it is still possible to re-identify the individuals if the pseudonym is matched with the original identifying information.

Example: Instead of using a patient’s name when collecting health data, a unique ID number is created, like Patient1567.
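
A minimal Python sketch of that example (field names are invented); the lookup table is the "additional information" that allows re-identification, so it must be stored separately and tightly access-controlled:

    import itertools

    _pseudonyms = {}              # real name -> pseudonym; keep this restricted
    _counter = itertools.count(1)

    def pseudonymize(name: str) -> str:
        if name not in _pseudonyms:
            _pseudonyms[name] = f"Patient{next(_counter):04d}"
        return _pseudonyms[name]

    record = {"patient": pseudonymize("Jane Doe"), "blood_pressure": "120/80"}
    print(record)                 # {'patient': 'Patient0001', 'blood_pressure': '120/80'}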

92
Q

Anonymized Data

A

Transforming personal data in such a way that individuals cannot be identified at all, either directly or indirectly, and re-identifying them is practically impossible.

Example: Suppose a company conducts a survey about employee job satisfaction. To anonymize the data, they remove all direct identifiers, like names, email addresses, and employee IDs. They also remove or generalize other information that could indirectly identify someone, like the exact job title or department if it is unique. Instead of “Manager in Sales Department,” the data might say “Mid-level employee.”
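
A small Python sketch of that generalization step (field names and grade bands are invented):

    def anonymize(response: dict) -> dict:
        # Drop direct identifiers entirely and generalize quasi-identifiers.
        level = "Mid-level employee" if response["grade"] in (5, 6, 7) else "Other"
        return {"role": level, "satisfaction": response["satisfaction"]}

    raw = {"name": "Alice Smith", "employee_id": "E-1042", "grade": 6,
           "department": "Sales", "satisfaction": 4}
    print(anonymize(raw))   # {'role': 'Mid-level employee', 'satisfaction': 4}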

93
Q

Aggregation

A

A way of protecting data, where personal information is expressed in a summary form that reduces the detail and precision of the data as well as the connection between the data and the individual. Some types of aggregation: 1) frequency and magnitude data, 2) differential privacy, and 3) differential identifiability.
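
A minimal Python sketch of frequency-style aggregation, using hypothetical individual-level records. Only the summary counts per group are released, not the underlying rows.

```python
from collections import Counter

# Hypothetical individual-level records (never released directly).
visits = [
    {"user": "u1", "age_band": "18-24"},
    {"user": "u2", "age_band": "25-34"},
    {"user": "u3", "age_band": "18-24"},
]

# Frequency data: how many visits per age band, with no link to individuals.
frequency = Counter(v["age_band"] for v in visits)
print(dict(frequency))  # {'18-24': 2, '25-34': 1}
```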

94
Q

Differential Privacy

A

Even when data is aggregated and personal identifiers are removed, it can still be possible to reverse-engineer individual records from the summarized data. This technique works by having an algorithm add random “noise” (small random alterations) to the data or to the results of the data analysis. The noise makes it harder to reverse-engineer individual data points, thus protecting privacy.

Example: A company wants to share data about how often users click on different links without revealing individual user behavior. By using differential privacy, they add small random variations to the data. For example, if a user clicked a link 5 times, it might be recorded as 4 or 6 clicks instead, making it harder to identify any one user’s exact actions.
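
A minimal Python sketch of the click-count example using the Laplace mechanism, one common way to implement differential privacy. The epsilon value and the assumption of a counting query with sensitivity 1 are illustrative choices, not a production DP library.

```python
import numpy as np

def noisy_count(true_count, epsilon=1.0, sensitivity=1):
    """Laplace mechanism: add noise scaled to sensitivity / epsilon."""
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

true_clicks = 5  # one user's real number of clicks on a link
print(round(noisy_count(true_clicks)))  # e.g. 4, 5, or 6 -- the exact value is hidden
```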

95
Q

Differential Identifiability

A

Even when data is aggregated and personal identifiers are removed, it can still be possible to reverse-engineer individual records from the summarized data. A problem with differential privacy is that it lacks clear guidelines on how much noise can be added before the quality of the summarized data becomes poor. This technique improves on differential privacy by setting parameters for the noise-generating algorithm based on the risk that an individual can be identified from their contribution to the data.

96
Q

Private Information Retrieval

A

A set of techniques that allow someone to retrieve data from a database without revealing to the database (or anyone else) what specific data they are looking for. The main goal of PIR is to protect the privacy of the person retrieving the information. Normally, when you ask a database for information (like searching for a book in an online library), the database knows exactly what you’re searching for. PIR prevents this by using special protocols that allow you to ask for the data in a way that hides your request. The database can still give you the correct answer, but it has no idea which part of its data you were actually interested in.

Example: Suppose you want to look up a medical record in a database, but you don’t want the database administrators to know whose record you’re looking at. With PIR, you could request information from the database in such a way that you receive the medical record you need, but the database doesn’t know which patient’s record you retrieved. It simply provides data without knowing what specific record you asked for.
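
A minimal Python sketch of one PIR approach, the classic two-server XOR scheme (an assumption here, since the card does not name a specific protocol). It requires the database to be replicated on two servers that do not collude; each server sees only a random-looking set of indices, yet the client recovers exactly the bit it wanted.

```python
import random

# Hypothetical database of bits, replicated on two non-colluding servers.
database = [0, 1, 1, 0, 1, 0, 0, 1]
n = len(database)

def server_answer(db, indices):
    """Each server XORs together the bits at the requested indices."""
    answer = 0
    for i in indices:
        answer ^= db[i]
    return answer

def pir_query(wanted_index):
    # The client picks a uniformly random subset of indices...
    subset_a = {i for i in range(n) if random.random() < 0.5}
    # ...sends it to server A, and sends the same subset with the wanted
    # index toggled to server B. Each set on its own looks random.
    subset_b = subset_a ^ {wanted_index}
    answer_a = server_answer(database, subset_a)
    answer_b = server_answer(database, subset_b)
    # XOR of the two answers cancels everything except the wanted bit.
    return answer_a ^ answer_b

print(pir_query(3), database[3])  # both print the same bit
```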

97
Q

Secure Multiparty Computation

A

A technique that allows multiple parties (computers, organizations, or individuals) to work together to perform a calculation or solve a problem without sharing their private data with each other. Each party contributes data to the computation, but no one can see the other parties’ data; only the final result is shared or known. In SMPC, each participant breaks their data into pieces and shares those pieces in a secure way with the other participants. They then perform computations using these pieces, but the system is designed so that no one can figure out the private data from the pieces they receive. After the computation is finished, they combine their results to get the final answer, without ever revealing their private data.

Example: Imagine three companies want to calculate the average salary of their employees but don’t want to share their individual salary data with each other. Using SMPC, they can each contribute their salary information, perform the necessary computation, and find out the average salary across all three companies without any company knowing the exact salaries of the other companies’ employees.
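
A minimal Python sketch of the average-salary example using additive secret sharing, one simple SMPC building block (chosen here as an illustration, since the card does not name a protocol). The company names, salary totals, and headcounts are made up; each company splits its total into random shares, so no single share reveals anything on its own.

```python
import random

PRIME = 2**61 - 1  # arithmetic is done modulo a large prime

def make_shares(secret, n_parties):
    """Split `secret` into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Hypothetical private inputs: each company's total salary bill.
salaries = {"CompanyA": 900_000, "CompanyB": 1_200_000, "CompanyC": 750_000}
headcount = 30  # assume total employee count is public knowledge

# Each company secret-shares its total; share j goes to party j.
all_shares = [make_shares(total, 3) for total in salaries.values()]

# Each party sums the shares it received (it learns nothing from them alone).
partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]

# Combining the partial sums reveals only the overall total, not any input.
total_salary = sum(partial_sums) % PRIME
print("Average salary:", total_salary / headcount)
```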

98
Q

Increased Accessibility

A

Making information searchable and findable, amplifying the accessibility of personal information.

99
Q

COPPA

A

The Children’s Online Privacy Protection Act (COPPA) Rule is a U.S. federal law designed to protect the privacy of children under 13 years old when they use the internet. Here’s a simple summary of its key points:

Who it applies to:
* Websites and online services that are directed toward children under 13.
* General audience sites that knowingly collect information from children under 13.

Key Requirements:
* Parental Consent: Websites must get verifiable parental consent before collecting personal information from children under 13.
* What is “Personal Information”: This includes things like name, address, email, phone number, geolocation, photos, and any other data that can identify a child.
* Privacy Policy: Websites must post a clear, easy-to-understand privacy policy explaining their data collection practices and how they handle children’s information.
* Parental Rights: Parents must be able to review the information collected about their children, delete the information if they want, and opt out of further data collection.
* Data Security: Companies must take reasonable steps to keep children’s data secure and confidential.
* Limitations on Data Usage: Companies can only collect data for the purpose it was intended for and should not keep it longer than necessary.

Penalties:
* Companies that violate COPPA can face fines and other enforcement actions from the Federal Trade Commission (FTC).

In short, COPPA ensures that websites and apps respect children’s privacy and involve parents in decisions about what data is collected and how it is used.

100
Q

Information Needs

A

Specific types of data or information required by organizations to fulfill tasks or goals.

101
Q

Labels

A

Characteristics that point to an individual.

102
Q

Identifiers

A

Codes or strings used to represent an individual, device or browser.

103
Q

Quasi-identifiers

A

Attributes that can be combined with external knowledge to identify an individual (for example, ZIP code, birth date, and gender).

104
Q

Identifiability

A

Extent to which a person can be identified.

105
Q

Distribution (or isolation)

A

A method of separation that protects collected information by either logically or physically segregating it. (For example, sending disability data to HR while salary data is sent to payroll.)

106
Q

Single Sign On (SSO) or Cross-Enterprise Authentication

A

Allowing access to multiple systems with one set of credentials.

107
Q

Single Factor Authentication

A

Uses only one factor (usually a password) to authenticate.

108
Q

Multi-factor Authentication

A

Using two or more factors for authentication.

109
Q

Augmented Reality

A

Overlays digital content on the real world.

110
Q

Virtual Reality

A

Puts the user into a completely virtual world, cutting off the real one.

111
Q

Mixed Reality

A

Combines the real and virtual worlds so they interact with each other.

112
Q

Machine Learning (ML)

A

A subset of AI that enables machines to learn from data; ML uses data to identify patterns and make predictions or decisions without human intervention.

Privacy Concern: Uses personal data in training, which can lead to unintended insights or predictions about individuals. Also, risks of data leaks and misuse of data (if not anonymized).

113
Q

Artificial Intelligence (AI)

A

Machines that simulate human intelligence to perform tasks, using rules and logic to mimic human-like behavior.

Privacy concern: Invasion of privacy and lack of control over personal data.

114
Q

Deep Learning (DL)

A

A specialized subset of ML that uses neural networks with multiple layers to process complex patterns in large amounts of data. The neural networks process data in a way that mimics the human brain’s structure and functioning. DL can recognize objects, faces, or scenes from raw data (photos, videos, etc.).

Privacy concern: Risk of data exposure and difficult to challenge how personal data is processed.

115
Q

Self-Representation

A

Also known as self-interference, it is when another person alters how an individual is represented or regarded. A type of interference.