Things I Don't Get That Need Review Flashcards
Cards needed since I am a complete dumb fuck
Collection Limitation
FIPPS Principle: Means that organizations should only collect personal data that is necessary for a specific purpose, and they should collect it in a lawful and fair manner. It also means they should ask for consent from the individual before collecting the data when possible.
Example: Imagine a company offering a free newsletter. If they ask for your email to send it to you, that’s fine. But if they also ask for your home address, birth date, and phone number without explaining why they need that information, they would be violating the Collection Limitation principle, because they’re collecting more information than necessary.
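As a rough sketch of how this principle can be enforced in code (the field names and function here are hypothetical), a signup handler can reject any data beyond what the stated purpose requires:

```python
# Hypothetical sketch: accept only the fields the newsletter purpose actually needs.
ALLOWED_NEWSLETTER_FIELDS = {"email"}

def validate_signup(form_data: dict) -> dict:
    """Return only the necessary fields; fail loudly if extra data was collected."""
    extra = set(form_data) - ALLOWED_NEWSLETTER_FIELDS
    if extra:
        raise ValueError(f"Collection Limitation violation: unnecessary fields {sorted(extra)}")
    return {k: v for k, v in form_data.items() if k in ALLOWED_NEWSLETTER_FIELDS}

# validate_signup({"email": "a@example.com"}) works; adding "home_address" raises an error.
```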
Openness Principle
FIPPS Principle: Means that organizations should be clear and transparent about their data practices. They should openly communicate how they collect, use, store, and share personal information. Individuals should be able to easily find information about what data is being collected and how it’s being used.
Example: If you’re using a social media app, the company should provide a clear privacy policy explaining what kind of personal information they collect (like your name, email, location, etc.), why they collect it, and who they might share it with (like advertisers). If they change how they use your data, they should let you know.
Individual Participation Principle
FIPPS Principle: People should have control over their personal information. This principle gives individuals the right to know what data organizations have about them, request access to it, correct any errors, and even ask for the information to be deleted or removed if it’s not necessary anymore.
Example: If a company collects your information when you sign up for a service, you should be able to log into your account later, see the information they have on file (like your name and email), and, if it’s wrong, request a correction. You should also be able to delete your account and have your personal data removed if you no longer want to use the service. If the company refuses to let you view or change your information, they’re violating the Individual Participation Principle.
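A minimal sketch of the three rights this principle implies, written as simple account operations (the data store and user ID are hypothetical):

```python
# Hypothetical in-memory account store.
accounts = {"user-123": {"name": "Alice Example", "email": "alice@example.com"}}

def view_my_data(user_id: str) -> dict:
    return dict(accounts[user_id])        # right to access what is held about you

def correct_my_data(user_id: str, field: str, value: str) -> None:
    accounts[user_id][field] = value      # right to have errors corrected

def delete_my_data(user_id: str) -> None:
    accounts.pop(user_id, None)           # right to have data removed

correct_my_data("user-123", "email", "alice@new.example")
print(view_my_data("user-123"))
delete_my_data("user-123")
```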
Purpose Specification Principle
FIPPS Principle: Means that organizations must clearly state the specific reasons why they are collecting personal data before or at the time of collection. They should also only use the data for those stated purposes and not for anything else unless they get permission.
Example: Imagine you sign up for a fitness app that collects your name, age, and workout details. If the app says it collects this data to give you personalized workout plans, that’s the purpose they specified. If later, they decide to sell your data to a marketing company without telling you or getting your consent, they would be violating the Purpose Specification principle because they are using your data for something they didn’t originally specify.
Data Quality Principle
FIPPS Principle: Means that organizations should ensure the personal information they collect is accurate, complete, and up to date. This helps prevent any errors or misunderstandings that could occur from incorrect or outdated data.
Example: If you apply for a credit card and provide your current income, the company should make sure to store that information accurately. If they accidentally enter the wrong amount and use that incorrect data to make decisions, they are violating the Data Quality Principle. Keeping data correct and up to date is key to preventing issues.
Use Limitation Principle
FIPPS Principle: Means that organizations should only use personal data for the purposes they have clearly stated and agreed to. They cannot use the data for any other reason unless they get permission from the individual.
Example: If you sign up for an online shopping site and they collect your address to send you products, they can’t use that same address to send marketing mail or sell it to another company unless they ask for and receive your permission.
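One way to picture this in code (a hedged sketch with hypothetical purpose names): record the purposes the person agreed to at collection and check them before any new use.

```python
# Hypothetical consent record: purposes the user agreed to when the address was collected.
consent_records = {"user-123": {"order_fulfillment"}}

def may_use(user_id: str, purpose: str) -> bool:
    """Allow a use only if the purpose was stated and consented to."""
    return purpose in consent_records.get(user_id, set())

assert may_use("user-123", "order_fulfillment") is True
assert may_use("user-123", "marketing_mail") is False   # needs fresh permission first
```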
Security Safeguards Principle
FIPPS Principle: Means that organizations must take steps to protect personal information from being lost, stolen, or accessed by unauthorized people. This includes using safeguards like encryption, strong passwords, and secure storage methods.
Example: If a hospital stores patient records electronically, they must use strong passwords, encrypt the data, and restrict access to only authorized personnel. If they fail to secure the system and an outsider hacks into the database, the hospital would be violating the Security Principle because they didn’t take proper steps to protect sensitive information.
Accountability Principle
FIPPS Principle: Means that organizations are responsible for ensuring that they follow the rules and practices they’ve set for handling personal information. They must make sure that their employees, systems, and processes all comply with privacy regulations, and they should take responsibility if something goes wrong.
Example: If a company promises in its privacy policy that it will protect your data but then experiences a data breach due to weak security, the company should take responsibility by notifying affected customers, fixing the security issue, and possibly offering help, like free credit monitoring. This shows they are accountable for their mistake and are actively working to resolve it. If they just ignore the problem, they are violating the Accountability Principle.
Service Level Agreement (SLA)
Is a contract between a service provider and a customer that clearly outlines what services will be provided, the quality or performance standards that must be met, and the responsibilities of both parties. It often includes things like how quickly issues will be resolved, how often the service will be available (uptime), and what happens if the provider doesn’t meet the agreement.
Why should a privacy professional care? It sets clear expectations about how data will be handled and protected by a service provider. For example, it can specify:
- Data Security Standards: The SLA may include requirements for encrypting sensitive information and regular security audits to prevent data breaches.
- Response Time for Security Incidents: It might state how quickly the provider must respond to a data breach or privacy issue, ensuring fast action to minimize harm.
- Data Privacy Compliance: The SLA can outline the provider’s responsibility to comply with privacy laws (like GDPR or HIPAA), ensuring that personal data is handled according to legal standards.
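These terms can also be captured as structured data so compliance is checkable; the sketch below uses hypothetical field names and thresholds:

```python
from dataclasses import dataclass

@dataclass
class PrivacySlaTerms:
    encryption_at_rest_required: bool
    breach_notification_hours: int     # maximum time to notify after a breach is detected
    uptime_percent: float              # e.g., 99.9
    required_regulations: tuple        # e.g., ("GDPR", "HIPAA")

def breach_response_on_time(terms: PrivacySlaTerms, hours_elapsed: float) -> bool:
    return hours_elapsed <= terms.breach_notification_hours

terms = PrivacySlaTerms(True, 72, 99.9, ("GDPR",))
print(breach_response_on_time(terms, 10))   # True: well within the agreed window
```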
Controller (Data Controller)
The individual or organization that determines the purposes and means for collecting, using, and processing personal data. Essentially, they decide why and how personal data will be used. The Data Controller has the responsibility to ensure that data is handled in compliance with privacy laws, such as the GDPR or other data protection regulations.
Example: A hospital that collects patients’ medical information is a Data Controller because it decides how the personal data (such as medical history and contact information) will be used (e.g., for medical treatments, billing, etc.). The hospital is responsible for ensuring that the data is protected and only used for the stated purposes.
In this case, the hospital must comply with privacy laws, protect the data, and inform patients about how their information will be used. They ensure patients’ privacy rights are respected by not using their data for unauthorized purposes, like marketing, unless they get explicit consent.
Data Processing Agreement (DPA)
A legal document between a company (Data Controller) and another party (Data Processor) that handles personal data on behalf of the company. The agreement outlines how the data should be processed, protected, and used. It ensures that both parties follow data protection laws and that the Data Processor handles the personal data securely and responsibly.
Conceptual
A value-sensitive design investigation that identifies the direct and indirect stakeholders and attempts to identify their values and how those values may be affected by the design.
Empirical
A value-sensitive design investigation that focuses on how stakeholders configure, use, or are otherwise affected by the technology.
Technical
A value-sensitive design investigation that focuses on how existing technology supports or hinders human values and how the technology might be designed to support the values identified in the conceptual investigation.
Privacy Audit
A check-up for a company’s data privacy practices. It involves reviewing how the company collects, stores, and uses personal information to make sure they are following privacy laws and policies. The goal is to find any weaknesses or areas where they might be putting people’s personal information at risk and then fix those issues.
Example: Imagine a social media company wants to make sure they are keeping users’ data safe. They hire a privacy auditor who looks at:
- What personal data (like emails or photos) the company collects.
- How the company is storing this data (whether it’s securely stored).
- Who has access to the data (to make sure only authorized people can see it).
If the audit finds that the company is not encrypting data properly, the company would need to improve its security to protect user privacy. This ensures that the company is not only following the law but also protecting users’ personal information from breaches or misuse.
Predictability Objective
Means that people should be able to clearly understand how their personal data will be collected, used, shared, and protected. This allows individuals to have confidence and control over their data because they know what to expect. The goal is to ensure there are no surprises when it comes to data handling.
Example: If you sign up for a streaming service, the company should tell you upfront what data they will collect (like your viewing habits) and how they will use it (e.g., to recommend shows or send you marketing emails). If they later decide to sell your viewing data to advertisers without telling you, they would be violating the Predictability objective because you wouldn’t have expected that.
In simple terms, predictability means you should always know what a company is doing with your personal information, so there are no hidden or unexpected uses.
Software Evolution Process
The ongoing process of making changes and updates to software after it has been released. This includes fixing bugs, adding new features, improving performance, and ensuring the software stays up to date with the latest technology. It’s like maintaining and upgrading a car—you need to keep it in good condition and make improvements over time.
As software evolves, it’s important to also update its data privacy features. This means making sure that as new features are added or changes are made, the software continues to protect users’ personal information and complies with current privacy laws. For example, if a company adds a new feature that collects more personal data, they need to ensure this data is handled securely and that users are informed about how it will be used.
Example: A mobile app might evolve by adding a location-tracking feature to improve user experience. In this case, the app developers must update the privacy settings to ensure users’ location data is protected, get users’ consent, and explain how the new data will be used. If they fail to do this, they could put users’ privacy at risk.
So, as the software changes, privacy protections must also evolve to keep personal data safe.
Holistic Data Protection
Looking at data privacy and security from all angles to ensure that personal information is fully protected throughout its entire life cycle. This approach involves considering not just how data is stored but also how it’s collected, used, shared, and disposed of, while making sure all systems, people, and processes that interact with the data are secure.
Example: Imagine a hospital that handles sensitive patient data. A holistic data protection strategy would involve:
- Securing the data when it’s collected (like ensuring patients’ health records are entered into secure systems).
- Limiting access so only authorized staff can view the data.
- Encrypting data to protect it while it’s stored.
- Properly disposing of the data when it’s no longer needed.
This ensures that every step—from data collection to deletion—is covered, preventing data leaks or misuse.
In simple terms, holistic data protection means taking care of personal data in every possible way, from start to finish.
Record of Processing Activity (RoPA)
Detailed document that lists all the ways an organization collects, stores, and uses personal data. It’s like a diary for how data is handled within the company. This record helps the company keep track of their data processing activities and ensures they are complying with data privacy laws like GDPR.
Example: If a retail company collects customer information for orders, the RoPA would include:
- What kind of data is collected (e.g., names, addresses, payment information).
- Why the data is collected (e.g., to process orders and ship products).
- Who has access to the data (e.g., customer service, shipping team).
- How long the data will be kept before it’s deleted.
This document helps the company understand where personal data is being used and allows authorities to review the company’s compliance with privacy regulations.
In simple terms, RoPA is a log that helps companies track how they handle personal data to ensure they are following privacy rules and keeping data secure.
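A minimal sketch of what one RoPA entry might look like as structured data (the field names are illustrative, not a legal template):

```python
from dataclasses import dataclass

@dataclass
class RopaEntry:
    activity: str            # e.g., "Order processing"
    data_categories: list    # what kind of data is collected
    purpose: str             # why it is collected
    access: list             # who can see it
    retention: str           # how long it is kept before deletion

ropa = [
    RopaEntry(
        activity="Order processing",
        data_categories=["name", "address", "payment information"],
        purpose="Process orders and ship products",
        access=["customer service", "shipping team"],
        retention="Deleted 2 years after the last purchase",
    ),
]

for entry in ropa:
    print(entry.activity, "->", entry.purpose)
```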
Defect
Flaw or mistake in the requirements, design, or implementation. It’s something that is wrong at the start, even before the system is used. It may or may not lead to problems later, but it’s there from the beginning.
Fault
A fault occurs when the system runs and encounters the defect. It’s the specific place in the system where something goes wrong due to the defect. More specifically, it is an incorrect step, process, or data definition in a computer program.
Error
The difference between the computed, observed or measured value and the true or theoretically correct value.
Failure
The inability of a system or component to perform its required function within specified performance requirements.
Harm
The actual or potential ill effect or danger to an individual’s personal privacy. (Sometimes called a hazard).
Information Collection
Solove’s Taxonomy that involves the following:
- Surveillance involves the observation and/or capturing of an individual’s activities. Example: An advertising website embeds HTML iframes into multiple third-party news, social networking and travel websites to track users by what pages they visit and what links they click on.
- Interrogation involves actively questioning an individual or otherwise probing for information. Example: A website requires a user to enter their mobile phone number as a condition of registration, although the website’s primary function does not require the phone number and there is no statutory or regulatory requirement to do so.
Information Processing
Solove’s Taxonomy that involves the following:
- Aggregation involves combining multiple pieces of information about an individual to produce a whole that is greater than the sum of its parts. Example: A retail company correlates purchases of unscented lotions, large tote bags and prenatal vitamins to infer that a customer is likely pregnant.
- Identification links information to specific individuals. Example: A website uses cookies, a recurring IP address or unique device identifier to link an individual’s browsing history to their identity.
- Insecurity results from failure to properly protect individuals’ information. Example: A website fails to encrypt private communications, thus exposing users to potential future harm.
- Secondary use involves using an individual’s information without consent for purposes unrelated to the original reasons for which it was collected. Example: A retailer uses an email address for marketing purposes when the address was originally collected to correspond about a purchase.
- Exclusion denies an individual knowledge of and/or participation in what is being done with their information. Example: A marketing firm secretly purchases consumer data to advertise to the customer under a different company name without their knowledge.
Information Dissemination
Solove’s Taxonomy that involves the following:
- Breach of confidentiality results from revealing an individual’s personal information, despite a promise not to do so. Example: A platform releases a user’s data to a third-party plug-in despite the platform’s privacy notice promising not to disclose the data to anyone.
- Disclosure involves revealing truthful information about an individual that negatively affects how others view them. Example: A private “lifestyle” service discloses a list of members, which is obtained by groups who disapprove of the lifestyle.
- Distortion involves spreading false and inaccurate information about an individual. Example: An employment history verification service incorrectly identifies a job applicant as a felon.
- Exposure results from the revelation of information that we normally conceal from most others, including private physical details about our bodies. Example: A person’s prior purchase of a urinary incontinence product is used as a promotional endorsement and sent to the person’s broader social network.
- Increased accessibility involves rendering an individual’s information more easily obtainable. Example: A children’s online entertainment service allows any adult to register and interact with child members, leaving these children accessible to strangers without parental consent.
- Blackmail is the threat to disclose an individual’s information against their will. Example: An overseas medical claims processor threatens to release patient data to the internet unless new employment conditions are met.
- Appropriation involves using someone’s identity for another person’s purposes. Example: An online dating service uses a customer’s personal history, including age, biography and education, to promote its website to new customers.
Invasion
Solove’s Taxonomy that involves the following:
- Intrusion consists of acts that disturb an individual’s solitude or tranquility. Example: A mobile alert notifies potential customers that they are within the proximity of a sale.
- Decisional interference involves others inserting themselves into a decision-making process that affects the individual’s personal affairs. Example: A website limits access to negative product reviews to bias a new user toward a specific product selection.
Traceability Matrix
Encodes relationships between requirements and other software artifacts. A simple checklist or table that helps keep track of what needs to be done in a project. It shows how every part of the project matches up with the original goals or requirements, making sure nothing is forgotten. In privacy, for example, a traceability matrix can link requirements to a privacy law or principle.
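A toy sketch of such a matrix (the requirement IDs and mappings are made up for illustration):

```python
# Hypothetical traceability matrix: requirement -> privacy principles/laws it satisfies.
trace_matrix = {
    "REQ-01: Show a privacy notice link on every page": ["FIPPs: Openness", "GDPR Art. 13"],
    "REQ-02: Encrypt personal data at rest":            ["FIPPs: Security Safeguards"],
    "REQ-03: Let users export and delete their data":   ["FIPPs: Individual Participation"],
}

def untraced(matrix: dict) -> list:
    """Requirements with no linked principle are exactly the gaps the matrix exposes."""
    return [req for req, sources in matrix.items() if not sources]

print(untraced(trace_matrix))   # [] means every requirement traces back to a source
```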
Standard Operating Procedure (SOP)
Detailed instruction manual for how to do a specific task or process in the same way every time. It explains step-by-step what needs to be done, by whom, and how to do it correctly to avoid mistakes.
In data privacy, an SOP ensures that everyone handling sensitive information, like personal data, follows the same rules to keep that information safe. This can include rules about how to collect, store, share, and protect data, making sure that the company complies with laws and regulations that protect people’s privacy.
Confidentiality
A quality attribute that ensures information is accessible only to authorized individuals, preventing unauthorized access or disclosure.
Integrity
A quality attribute that ensures information has not been unintentionally modified. It ensures that data remains accurate, consistent, and unaltered except by authorized people or processes.
Availability
A quality attribute that ensures information is readily available whenever needed. It refers to how easily and reliably a system or data can be accessed by authorized users when needed.
Privacy Design Pattern
A reusable solution in system or software design that helps protect user privacy by embedding privacy principles directly into the design process. It ensures that privacy is considered and integrated from the start, not as an afterthought, to minimize risks associated with personal data. Here are the four elements of a design pattern:
- Pattern Name (Context): A name that references the pattern and the context in which it applies.
- Pattern Description: This explains the privacy challenge or concern that the pattern addresses. It describes the potential risk to personal data or user privacy, highlighting why the pattern is needed in that specific context.
- Pattern Solution: This outlines the recommended approach or technique to mitigate the privacy concern. The solution provides a detailed description of how to design the system or process to protect privacy effectively.
- Consequence: This element explains the potential outcomes, both positive and negative, of applying the pattern. It describes the benefits of enhanced privacy and any trade-offs, such as increased complexity in implementation or reduced system performance.
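To make the four elements concrete, here is one pattern written out as a record (the pattern itself is a simplified, hypothetical example, not from a published catalog):

```python
# Hypothetical pattern documented with the four elements above.
minimal_disclosure_pattern = {
    "name": "Minimal Disclosure (collection forms)",
    "description": "Forms often ask for more personal data than the purpose requires, "
                   "increasing breach impact and user distrust.",
    "solution": "Collect only fields tied to the stated purpose; make everything else "
                "optional or remove it, and document the purpose next to each field.",
    "consequence": "Lower privacy risk and simpler compliance, at the cost of having less "
                   "data available for future, unspecified uses.",
}

for element, text in minimal_disclosure_pattern.items():
    print(f"{element}: {text}")
```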
Functional
Describes what the system is supposed to do. It refers to the specific operations or actions the system will perform to achieve its goals. For example, a functional requirement may be that the system shall provide a link to a privacy notice at the bottom of every page (this is the functionality of what must be built into the system).
Nonfunctional
Refers to the constraints or conditions that the system must meet. These often concern quality attributes or the standards of performance, security, or usability. An example would be a system shall not disclose personal information without authorization or consent. It focuses on how the system will perform or adhere to specific guidelines or constraints.
Functional versus Nonfunctional
Functional requirements are a to-do list for the system: the tasks it must perform to fulfill its purpose. Nonfunctional requirements are rules or constraints that affect its performance, security or user experience (for example, the system should load within 5 seconds for all users).
Identifiability
Quality attribute for privacy, which refers to how easily a user can be recognized or identified within a system. By controlling the types of information that are shared or logged, like using pseudonyms instead of real names, it becomes harder for unauthorized people to figure out someone’s identity. The goal is to reduce the chances of someone being identified by keeping certain information separate or hidden. An example of this attribute would be in a system where users sign in, instead of recording their full name in web server logs, the system might use a code or nickname. This way, even if someone accesses the logs, they won’t know the user’s real identity.
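A small sketch of that logging example (the salt handling is simplified and hypothetical): the log line carries a stable code derived from the username, never the name itself.

```python
import hashlib
import logging

logging.basicConfig(level=logging.INFO)

# Assumption: the salt is a secret stored separately from the logs and rotated periodically.
PSEUDONYM_SALT = b"store-this-secret-outside-the-logs"

def pseudonym(username: str) -> str:
    return hashlib.sha256(PSEUDONYM_SALT + username.encode()).hexdigest()[:12]

def log_sign_in(username: str) -> None:
    # The log never contains the real identity, only a stable code.
    logging.info("sign-in by user %s", pseudonym(username))

log_sign_in("Alice Example")   # logs something like "sign-in by user 3f1c9a0b52de"
```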
Network Centricity
Quality attribute for privacy referring to the extent to which personal information remains local to the client (information is retained on the client side and transferred only to complete a transaction).
Mobility
Quality attribute for privacy referring to the extent to which a system moves from one location to another, as with laptop and mobile phone capabilities.
Architecture
A CONTROL to minimize privacy risk: Focus on designing systems to minimize how easily personal data can be identified or misused. The goal is to make data less personal (pseudonymized or anonymous) and decentralize its storage and use, making it harder for threat actors to access or misuse it.
Example: Instead of storing sensitive personal data in one central location, it’s split and stored in different places. Also, instead of using real names, data could be pseudonymized by replacing names with unique codes or IDs. This way, even if someone gets access to the system, they can’t easily identify individuals.
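A minimal sketch of that idea (store names and purchase data split apart, keyed only by a random code; the store layout is hypothetical):

```python
import uuid

identity_store = {}   # pseudonym -> real identity (kept separate, tightly restricted)
profile_store = {}    # pseudonym -> non-identifying attributes the application works with

def register(name: str, purchases: list) -> str:
    code = uuid.uuid4().hex          # random code replaces the real name everywhere else
    identity_store[code] = name
    profile_store[code] = {"purchases": purchases}
    return code

code = register("Alice Example", ["lotion", "tote bag"])
print(profile_store[code])   # usable for analytics, but contains no name by itself
```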
Security
A CONTROL to minimize privacy risk: Protect data by hiding or encrypting it so that even if someone gains access to the system, they cannot read or misuse the data. This might involve encrypting data at different stages during its collection or storage.
Example: When a user submits their credit card information to an online store, the system encrypts that data as soon as it is entered, making it unreadable without a special decryption key. This prevents hackers from seeing the real credit card number even if they access the data.
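A short sketch of that encryption step, assuming the third-party cryptography package (pip install cryptography); a real system would keep the key in a key-management service rather than generating it inline.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()           # in practice, held by a key-management service
cipher = Fernet(key)

card_number = b"4111 1111 1111 1111"
token = cipher.encrypt(card_number)   # what gets stored: unreadable without the key
print(token)                          # e.g., b'gAAAAAB...' rather than the real number

assert cipher.decrypt(token) == card_number   # only key holders can recover the value
```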
Supervision
A CONTROL to minimize privacy risk: Ensure that an organization, and anyone it works with (like third-party vendors), follows privacy policies and procedures. It allows an organization to monitor and enforce privacy compliance, ensuring that everyone handles personal data correctly.
Example: A company that processes personal data regularly audits its third-party service providers (such as cloud storage companies) to make sure they are following the correct privacy practices, such as encrypting personal data and not storing it for too long.
Balance
A CONTROL to minimize privacy risk: Ensuring fairness when collecting or using personal data. This means informing people about what data is being collected, giving them some control over it, and making sure the benefits of collecting that data are proportional to the risks involved.
Example: When a mobile app collects users’ location data, it asks for clear consent and provides an option to disable location tracking. This ensures that users are informed and have control over whether they want to share this sensitive information.
Verification
Ensures the resultant system performs the way it is supposed to perform
Validation
Ensures the requirements satisfy the needs of the intended user base