Explain privacy and sensitive data concepts in relation to security Flashcards

1
Q

Organizational consequences of privacy and data breaches

A

Reputation damage
Identity theft
Fines
IP theft

2
Q

Reputation damage

A

Reputation damage—data breaches cause widespread negative publicity, and customers are less likely to trust a company that cannot secure its information assets.

3
Q

Identity theft

A

Identity theft—if the breached data is exploited to perform identity theft, the data subject may be able to sue for damages.

4
Q

Fines

A

Fines—legislation might empower a regulator to levy fines. These can be a fixed sum or, in the most serious cases, a percentage of turnover.

5
Q

IP theft

A

IP theft—loss of company data can lead to loss of revenue. This typically occurs when copyright material—unreleased movies and music tracks—is breached. The loss of patents, designs, trade secrets, and so on to competitors or state actors can also cause commercial losses, especially in overseas markets where IP theft may be difficult to remedy through legal action.

6
Q

Notifications of breaches

A

Escalation

Public notifications and disclosures

7
Q

Escalation

A

A breach may be detected by technical staff, and if the event is considered minor, there may be a temptation to remediate the system and take no further notification action. This could place the company in legal jeopardy. Any breach of personal data and most breaches of IP should be escalated to senior decision-makers, and any impacts from legislation and regulation properly considered.

8
Q

Public notifications and disclosures

A

Other than the regulator, notification might need to be made to law enforcement, to individuals and third-party companies affected by the breach, and publicly through press or social media channels. For example, the Health Insurance Portability and Accountability Act (HIPAA) sets out reporting requirements in legislation, requiring breach notification to the affected individuals, to the Secretary of the US Department of Health and Human Services, and, if more than 500 individuals are affected, to the media (hhs.gov/hipaa/for-professionals/breach-notification/index.html). The requirements also set out timescales for when these parties should be notified. For example, under GDPR, notification must be made within 72 hours of becoming aware of a breach of personal data (csoonline.com/article/3383244/how-to-report-a-data-breach-under-gdpr.html). Regulations will also set out disclosure requirements, or the information that must be provided to each of the affected parties. Disclosure is likely to include a description of what information was breached, details for the main point of contact, the likely consequences arising from the breach, and the measures taken to mitigate it.

GDPR offers stronger protections than most federal and state laws in the US, which tend to focus on industry-specific regulations, narrower definitions of personal data, and fewer rights and protections for data subjects. The passage of the California Consumer Privacy Act (CCPA) has changed the picture for domestic US legislation, however (csoonline.com/article/3292578/california-consumer-privacy-act-what-you-need-to-know-to-be-compliant.html).

9
Q

Data types

A
Classifications
Personally identifiable information (PII)
Health information
Financial information
Government data
Customer data
10
Q

Classifications

A
Public
Private
Sensitive
Confidential
Critical
Proprietary
11
Q

Public

A

Public (unclassified)—there are no restrictions on viewing the data. Public information presents no risk to an organization if it is disclosed but does present a risk if it is modified or not available.

12
Q

Private

A

Private/personal data—information that relates to an individual's identity.

13
Q

Sensitive

A

Sensitive—this label is usually applied to personal data that is privacy-sensitive: information about a subject that could harm them if made public and could prejudice decisions made about them if referred to by internal procedures. As defined by the EU's General Data Protection Regulation (GDPR), sensitive personal data includes religious beliefs, political opinions, trade union membership, gender, sexual orientation, racial or ethnic origin, genetic data, and health information (ec.europa.eu/info/law/law-topic/data-protection/reform/rules-business-and-organisations/legal-grounds-processing-data/sensitive-data/what-personal-data-considered-sensitive_en).

14
Q

Confidential

A

Confidential (secret)—the information is highly sensitive, for viewing only by approved persons within the owner organization, and possibly by trusted third parties under NDA.

15
Q

Critical

A

Critical (top secret)—the information is too valuable to allow any risk of its capture. Viewing is severely restricted.

16
Q

Proprietary

A

Proprietary—proprietary information or intellectual property (IP) is information created and owned by the company, typically about the products or services that they make or perform. IP is an obvious target for a company’s competitors, and IP in some industries (such as defense or energy) is of interest to foreign governments. IP may also represent a counterfeiting opportunity (movies, music, and books, for instance).

17
Q

Personally identifiable information (PII)

A

Personally identifiable information (PII) is data that can be used to identify, contact, or locate an individual. A Social Security Number (SSN) is a good example of PII. Others include name, date of birth, email address, telephone number, street address, biometric data, and so on. Some pieces of information, such as an SSN, may be unique on their own; others uniquely identify an individual only in combination (for example, full name with birth date and street address).

Some types of information may be PII depending on the context. For example, when someone browses the web using a static IP address, the IP address is PII. An address that is dynamically assigned by the ISP may not be considered PII. PII is often used for password reset mechanisms and to confirm identity over the telephone. For example, PII may be defined as responses to challenge questions, such as “What is your favorite color/pet/movie?” These are the sort of complexities that must be considered when laws are introduced to control the collection and storage of personal data.

18
Q

Health information

A

Personal health information (PHI)—or protected health information—refers to medical and insurance records, plus associated hospital and laboratory test results. PHI may be associated with a specific person or used as an anonymized or deidentified data set for analysis and research. An anonymized data set is one where the identifying data is removed completely. A deidentified set contains codes that allow the subject information to be reconstructed by the data provider.

PHI trades at high values on the black market, making it an attractive target. Criminals seek to exploit the data for insurance fraud or possibly to blackmail victims. PHI data is extremely sensitive and its loss has a permanent effect. Unlike a credit card number or bank account number, it cannot be changed. Consequently, the reputational damage that would be caused by a PHI data breach is huge.
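
As a rough illustration of the distinction above (the field names and subject codes are assumptions, not from the source), an anonymized copy drops the identifier entirely, while a deidentified copy substitutes a code whose mapping only the data provider retains. A minimal Python sketch:

    # Anonymized vs. deidentified data sets (illustrative sketch only).
    patients = [{"name": "Ann Example", "test_result": "negative"}]

    # Anonymized: identifying data is removed completely and cannot be reconstructed.
    anonymized = [{"test_result": p["test_result"]} for p in patients]

    # Deidentified: the identifier is replaced by a code; the provider keeps the
    # mapping and can reconstruct the subject information if required.
    code_map = {}
    deidentified = []
    for i, p in enumerate(patients, start=1):
        code = f"SUBJ-{i:04d}"
        code_map[code] = p["name"]            # retained only by the data provider
        deidentified.append({"subject_code": code, "test_result": p["test_result"]})

    print(anonymized)      # [{'test_result': 'negative'}]
    print(deidentified)    # [{'subject_code': 'SUBJ-0001', 'test_result': 'negative'}]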

19
Q

Financial information

A

Financial information refers to data held about bank and investment accounts, plus information such as payroll and tax returns. Payment card information comprises the card number, expiry date, and the three-digit card verification value (CVV). Cards are also associated with a PIN, but this should never be transmitted to or handled by the merchant. Abuse of the card may also require the holder’s name and the address the card is registered to. The Payment Card Industry Data Security Standard (PCI DSS) defines the safe handling and storage of this information (pcisecuritystandards.org/pci_security).

20
Q

Government data

A

Internally, government agencies have complex data collection and processing requirements. In the US, federal laws place certain requirements on institutions that collect and process data about citizens and taxpayers. This data may be shared with companies for analysis under strict agreements to preserve security and privacy.

21
Q

Customer data

A

Customer data can be institutional information, but also personal information about the customer’s employees, such as sales and technical support contacts. This personal customer data should be treated as PII. Institutional information might be shared under a nondisclosure agreement (NDA), placing contractual obligations on storing and processing it securely.

22
Q

Privacy enhancing technologies

A
Data minimization
Data masking
Tokenization
Anonymization
Pseudo-anonymization
23
Q

Data minimization

A

Data minimization is the principle that data should only be processed and stored if that is necessary to perform the purpose for which it is collected. In order to prove compliance with the principle of data minimization, each process that uses personal data should be documented. The workflow can supply evidence of why processing and storage of a particular field or data point is required. Data minimization affects the data retention policy. It is necessary to track how long a data point has been stored since it was collected and whether continued retention supports a legitimate processing function. Another impact is on test environments, where the minimization principle forbids the use of real data records.

Counterintuitively, the principle of minimization also includes the principle of sufficiency or adequacy. This means that you should collect the data required for the stated purpose in a single transaction to which the data subject can give clear consent. Collecting additional data later would not be compliant with this principle.
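
A minimal sketch of how that documentation requirement might be enforced at collection time (the field-to-purpose register and field names here are hypothetical, not from the source): fields without a documented processing purpose are simply never stored.

    # Data-minimization sketch (illustrative only): accept only fields for which a
    # documented processing purpose exists; anything else is dropped at collection.
    DOCUMENTED_PURPOSES = {   # hypothetical register of field -> lawful purpose
        "email": "order confirmation and delivery updates",
        "postal_address": "shipping the purchased goods",
    }

    def collect(submitted: dict) -> dict:
        minimized = {k: v for k, v in submitted.items() if k in DOCUMENTED_PURPOSES}
        rejected = sorted(set(submitted) - set(minimized))
        if rejected:
            print(f"Not collected (no documented purpose): {rejected}")
        return minimized

    print(collect({"email": "ann@example.com", "date_of_birth": "1990-01-01"}))
    # Not collected (no documented purpose): ['date_of_birth']
    # {'email': 'ann@example.com'}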

24
Q

Data masking

A

Data masking is a data security technique in which a dataset is copied but with sensitive data obfuscated. This benign replica is then used instead of the authentic data for testing or training purposes.
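
A minimal Python sketch of the idea (the masking rule and column names are assumptions for illustration): copy the records, then obfuscate the sensitive fields before handing the replica to testers or trainees.

    # Data-masking sketch (illustrative only): build a benign replica of a record
    # set with the sensitive fields obfuscated.
    import copy

    def mask_value(value: str, keep_last: int = 4) -> str:
        """Replace all but the last few characters with 'x'."""
        if len(value) <= keep_last:
            return "x" * len(value)
        return "x" * (len(value) - keep_last) + value[-keep_last:]

    def mask_records(records, sensitive_fields):
        masked = copy.deepcopy(records)   # never touch the authentic data
        for row in masked:
            for field in sensitive_fields:
                if field in row:
                    row[field] = mask_value(str(row[field]))
        return masked

    customers = [{"name": "Ann Example", "card_number": "4929123456781234"}]
    print(mask_records(customers, sensitive_fields=["card_number"]))
    # [{'name': 'Ann Example', 'card_number': 'xxxxxxxxxxxx1234'}]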

25
Q

Tokenization

A

Tokenization means that all or part of the data in a field is replaced with a randomly generated token. The token is stored with the original value on a token server or token vault, separate from the production database. An authorized query or app can retrieve the original value from the vault if necessary, so tokenization is a reversible technique. Tokenization is used as a substitute for encryption, because from a regulatory perspective an encrypted field is treated as the same value as the original data.
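
A minimal sketch of the token-vault pattern (here the vault is just an in-memory dictionary; a real deployment would use a hardened token server kept separate from the production database):

    # Tokenization sketch (illustrative only): substitute a random token for the
    # sensitive value and keep the token-to-value mapping in a separate vault.
    import secrets

    class TokenVault:
        def __init__(self):
            self._store = {}   # token -> original value, held apart from production data

        def tokenize(self, value: str) -> str:
            token = secrets.token_urlsafe(16)   # random, reveals nothing about the value
            self._store[token] = value
            return token

        def detokenize(self, token: str) -> str:
            return self._store[token]           # authorized lookup reverses the substitution

    vault = TokenVault()
    token = vault.tokenize("4929123456781234")
    print(token)                    # stored in the production database in place of the card number
    print(vault.detokenize(token))  # original value recovered from the vault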

26
Q

Anonymization

A
27
Q

Pseudo-anonymization

A

Removing personal information from a data set to make identification of individuals difficult, even if the data set is combined with other sources.
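
One common way to implement this (an assumption here, not stated on the card) is to replace direct identifiers with keyed-hash pseudonyms, so that re-linking records to individuals requires a secret held separately from the data set. A minimal sketch:

    # Pseudonymization sketch (illustrative only): direct identifiers are replaced
    # with keyed-hash pseudonyms; without the separately held key, linking the
    # records back to individuals is difficult.
    import hashlib
    import hmac

    SECRET_KEY = b"hypothetical-key-stored-apart-from-the-data"

    def pseudonymize(identifier: str) -> str:
        return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

    record = {"email": "ann@example.com", "diagnosis": "asthma"}
    analysis_copy = {"subject": pseudonymize(record["email"]), "diagnosis": record["diagnosis"]}
    print(analysis_copy)   # carries a pseudonym rather than the email address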

28
Q

Roles and responsibilities

A
Data owners
Data controller
Data processor
Data custodian/steward
Data protection officer (DPO)
29
Q

Data owners

A

Data owner—a senior (executive) role with ultimate responsibility for maintaining the confidentiality, integrity, and availability of the information asset. The owner is responsible for labeling the asset (such as determining who should have access and determining the asset's criticality and sensitivity) and for ensuring that it is protected with appropriate controls (access control, backup, retention, and so forth). The owner also typically selects a steward and custodian, directs their actions, and sets the budget and resource allocation for sufficient controls.

30
Q

Data controller

A

Data controller—the entity responsible for determining why and how data is stored, collected, and used and for ensuring that these purposes and means are lawful. The data controller has ultimate responsibility for privacy breaches, and is not permitted to transfer that responsibility.

31
Q

Data processor

A

Data processor—an entity engaged by the data controller to assist with technical collection, storage, or analysis tasks. A data processor follows the instructions of a data controller with regard to collection or processing.

32
Q

Data custodian/steward

A

Data steward—this role is primarily responsible for data quality. This involves tasks such as ensuring data is labeled and identified with appropriate metadata and that data is collected and stored in a format and with values that comply with applicable laws and regulations.

Data custodian—this role is responsible for managing the system on which the data assets are stored, including enforcing access control, encryption, and backup/recovery measures.

33
Q

Data protection officer (DPO)

A

Data protection officer (DPO)—also referred to as a data privacy officer—this role is responsible for oversight of any personally identifiable information (PII) assets managed by the company. The DPO ensures that the processing, disclosure, and retention of PII complies with legal and regulatory frameworks.

34
Q

Information life cycle

A

An information life cycle model identifies discrete steps to assist security and privacy policy design. Most models identify the following general stages:

Creation/collection—data may be generated by an employee or automated system, or it may be submitted by a customer or supplier. At this stage, the data needs to be classified and tagged.
Distribution/use—data is made available on a need-to-know basis for authorized uses by authenticated account holders and third parties.
Retention—data might have to be kept in an archive for regulatory reasons, even after it is no longer in active use.
Disposal—when data no longer needs to be used or retained, the media storing it must be sanitized to remove any remnants.

35
Q

Impact assessment

A
36
Q

Terms of agreement

A
37
Q

Privacy notice

A