CCSP Chapter 4: Cloud Data Security Flashcards

1
Q

Create
* Data Created Remotely
* What is PKI
* TLS vs SSL
* Data Created within the Cloud

A

Create
* Data Created Remotely - data should be encrypted (with FIPS 140-2 validated crypto) before uploading to the cloud to protect against attacks (e.g. MitM, insider threat); the upload connection should also be secured (TLS 1.2 or higher, or IPsec)
* PKI (public key infrastructure) - deals with keys and managing them; enables secure communication
* TLS replaces SSL; however, SSL is still encountered in legacy deployments
* Data Created within the Cloud - data should be encrypted during creation
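The "TLS 1.2 or higher" requirement for the upload connection can be enforced on the client side. A minimal sketch using Python's standard `ssl` module (the connection wiring itself is omitted):

```python
import ssl

# Build a client-side TLS context that refuses anything older than TLS 1.2,
# matching the "TLS 1.2 or higher" requirement for uploads to the cloud.
context = ssl.create_default_context()           # also enables cert verification
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Passing this context to e.g. http.client.HTTPSConnection(host, context=context)
# makes the handshake fail against SSLv3 / TLS 1.0 / TLS 1.1 endpoints.
```

Hostname checking and certificate validation are on by default with `create_default_context()`, which also covers the MitM concern above.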

2
Q

Storing data in cloud

A

Storage happens as data is created; use encryption at rest and encryption in transit to mitigate exposure to threats while data is being moved to the cloud data center

3
Q

Using Data in cloud

A
  • Operations in the cloud will necessitate remote access, so connections need to be secured with an encrypted tunnel
  • The platforms users connect from should be secure
  • Users should be trained
  • Data owners should limit access to users
  • Enable logging and audit trails

On the Provider Side:
* CSP must ensure data on virtualized host cannot be read or detected by other virtual hosts on the same device
* CSP will have to implement personnel and administrative controls so that data center personnel cannot access any raw customer data

4
Q

Sharing Data in Cloud

A
  • Sharing restrictions based on jurisdiction
  • IRM solutions
  • Encrypted files and communications
  • limit or prevent data being sent based on regulatory mandates
  • Egress monitoring
  • Export and import restrictions
5
Q

Export and Import Restrictions

A

Export Restrictions
* International Traffic in Arms Regulations (ITAR) (USA) - State Department prohibitions on defense-related exports; can include cryptography systems
* Export Administration Regulations (EAR) (USA) - DoC prohibitions on dual-use items (technologies that could be used for both commercial and military purposes)

Import Restrictions
* Cryptography - many countries have restrictions on importing cryptosystems or material that has been encrypted
* The Wassenaar Arrangement - a group of 41 countries that have agreed to inform each other about military shipments to nonmember countries; not a treaty and therefore not legally binding, but may require your organization to notify your government in order to stay compliant

6
Q

Archiving Data in the Cloud

A
  • This is the phase for long-term storage
  • Cryptography and key management - keys need to be stored and managed correctly; if they are lost, data will be lost or exposed
  • Also need to consider PHYSICAL SECURITY - location, format, staff, procedures
7
Q

Elliptic Curve Cryptography (ECC)

A

Uses smaller keys to provide the same level of security as traditional cryptography; based on the algebraic structure of elliptic curves
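The underlying arithmetic can be illustrated on a toy curve over a small field (illustrative only; real ECC uses standardized curves over ~256-bit primes, which is where the "smaller keys, same security" advantage comes from). The curve y² = x³ + 2x + 2 over GF(17) is a common textbook example:

```python
# Toy elliptic-curve arithmetic over GF(17): y^2 = x^3 + 2x + 2 (mod 17).
# Illustrative only -- NOT usable cryptography at this field size.
P_MOD, A, B = 17, 2, 2

def on_curve(pt):
    if pt is None:                       # None represents the point at infinity
        return True
    x, y = pt
    return (y * y - x ** 3 - A * x - B) % P_MOD == 0

def ec_add(p1, p2):
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                      # vertical line: point at infinity
    if p1 == p2:                         # tangent slope for doubling
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:                                # chord slope for addition
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (s * s - x1 - x2) % P_MOD
    return (x3, (s * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, point):
    # Double-and-add scalar multiplication: easy to compute, hard to invert,
    # which is the one-way property ECC key pairs rest on.
    result = None
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result
```

With generator G = (5, 1), a Diffie-Hellman-style exchange works because scalar multiplication commutes: `ec_mul(a, ec_mul(b, G)) == ec_mul(b, ec_mul(a, G))`.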

8
Q

Destroying data in the cloud

A
  • Cryptographic erasure (crypto-shredding) is the only feasible means to destroy data in the cloud
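The concept can be sketched in a few lines (the keystream here is a toy construction for illustration, NOT production crypto; `key_vault` stands in for a customer-side key store):

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Illustrative keystream (SHA-256 in counter mode) -- NOT production crypto.
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# Keys are held outside the data they protect (e.g. a customer-side vault).
key_vault = {"record-1": secrets.token_bytes(32)}

plaintext = b"customer PII"
ciphertext = xor_bytes(plaintext, keystream(key_vault["record-1"], len(plaintext)))
recovered = xor_bytes(ciphertext, keystream(key_vault["record-1"], len(ciphertext)))

# Crypto-shredding: destroy the key, and every replicated copy of the
# ciphertext the provider still holds becomes unrecoverable -- no need to
# locate and overwrite each copy across the cloud data center.
del key_vault["record-1"]
```

This is why crypto-shredding is feasible in the cloud when physical destruction and overwriting are not: the customer only has to destroy something they control.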
9
Q

Types of Cloud Storage Architectures (4)

A
  1. Volume Storage
  2. Object-Based Storage
  3. Databases
  4. Content Delivery Network (CDN)
10
Q
  1. Volume Storage
    * Define Volume Storage
    * Two types of Volume Storage
    * What Cloud Service Model is Volume Storage typically offered in?
A
  • Volume Storage - the customer is allocated storage space within the cloud; this storage space is presented as a drive attached to the user’s virtual machine
    1. File Storage - data is stored and displayed just as in a traditional file structure, as files and folders; popular with big data analytical tools
    2. Block Storage - a blank volume the customer can put anything into; allows more flexibility and higher performance but requires more administration; might entail installing an OS or other app to store, sort, and retrieve data; great for data of multiple types and kinds, such as enterprise backup services
  • Volume Storage is typically offered in IaaS
11
Q
  2. Object-Based Storage
    * What cloud service model is object-based storage offered in?
A
  • stores data as objects, not files or blocks
  • objects include the actual production content, metadata, and a unique object/address identifier
  • allows for a significant level of description and indexing
  • object-based storage is typically offered in IaaS
12
Q
  3. Databases
    * What cloud service model are databases typically offered in?
A
  • data is arranged according to characteristics and elements, including the specific trait used to index and retrieve the data, known as the primary key
  • usually the backend storage and stores data in fields
  • can be implemented on any cloud service model, but most often configured to work with PaaS and SaaS
13
Q
  4. Content Delivery Network (CDN)
A
  • a form of data caching, usually near geographic locations of high use, for copies of data commonly requested by users
  • i.e. online multimedia streaming services: instead of dragging data from the data center to users at variable distances, the streaming service provider can place copies of the most requested media near the specific areas where those requests are likely to be made, thereby improving bandwidth and delivery quality
14
Q

Cloud Data Security Foundational Strategies - Encryption

A
  • Encryption used to protect data at rest, in transit, and in use
  • Includes Key management and Masking, Obfuscation, Anonymization, and Tokenization
15
Q

Encryption

  1. Key Management (6)
A
  • How and where keys are stored can affect the overall risk of data
  • Things to note for Key management
    1. Level of Protection - the key must be protected at the same or higher level as the data it is protecting.
    2. Key Recovery - if a user is fired, passes away, or loses their key, there needs to be an SOP to recover that key; usually multiple people each hold a portion of the key
    3. Key Distribution - when keys are created, they need to be distributed; keys should never be distributed in the clear and should be passed along out of band, which is expensive; there need to be procedures in place for how keys are distributed
    4. Key Revocation - if a key is stolen, lost, etc., there needs to be a process for suspending that key or the user’s ability to use it
    5. Key Escrow - it is highly desirable to have a trusted 3rd party hold copies of keys in a secure environment
    6. Outsourcing Key Management - keys should not be stored with the data they are protecting, so they should be stored outside the CSP’s data center. One solution is for the cloud customer to retain the keys, which is cumbersome and expensive; the other is using a Cloud Access Security Broker (CASB). CASBs are 3rd-party providers that handle IAM and key management services for cloud customers; CASBs are much less expensive than self-maintenance
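The "multiple people have portions of the key" idea in Key Recovery can be sketched with a simple n-of-n XOR split (real deployments usually prefer a threshold scheme such as Shamir's secret sharing, so only k of n custodians are needed; the function names here are hypothetical):

```python
import secrets

def split_key(key: bytes, n: int) -> list:
    # n-of-n XOR split: all n shares are required to rebuild the key, and
    # any subset short of all n reveals nothing about it.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def recover_key(shares: list) -> bytes:
    # XOR all shares together to reconstruct the original key.
    key = bytes(len(shares[0]))
    for share in shares:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key
```

Each custodian keeps one share; recovery (e.g. after a termination) requires all of them to cooperate, which also enforces dual control.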
16
Q
  • Transparent Encryption
  • Hardware Security Module (HSM)
A
  • Transparent Encryption - encryption key for the database is stored in the database itself
  • Hardware Security Module (HSM) - a device that can safely store and manage encryption keys, used for servers, data transmission, and log files; far stronger than saving and storing keys in software
17
Q

Encryption

  1. Masking, Obfuscation, Anonymization, and Tokenization
    * Define
    * Reasons to use (3)
    * How to use (5)
A
  • hiding actual data and instead using a representation of that data;
    Define
  • Masking - XXX-XX-1234
  • Obfuscation - using any technique to make data less meaningful, detailed, or readable in order to protect it; two kinds (static and dynamic): static creates a new data set as a copy of the original, and that copy is used; dynamic obscures the data from the user as they access it (i.e. XXX-XX-1234)
  • Anonymization or De-identification - removing nonspecific identifiers, i.e. age, birthday, location; this can be difficult because sensitive data must be recognized and marked, which means attackers can identify which information is sensitive by its markers
  • Tokenization - it is the practice of having two databases, one with live actual sensitive data and the other with nonrepresentational tokens mapped to that data (attached image)
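The two-database tokenization scheme above can be sketched with in-memory dicts standing in for the two stores (`tokenize`/`detokenize` are hypothetical names):

```python
import secrets

# Two separate stores: tokens circulate through the application, while the
# vault mapping tokens back to live values stays under tighter control.
vault = {}                               # token -> real sensitive value

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)         # nonrepresentational: reveals nothing
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    return vault[token]                  # only privileged code reaches the vault
```

Systems that only need to reference the value (receipts, logs, test data) handle the token; only the tightly controlled vault can map it back.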

* Reasons to do this are:
1. Test environments - new SW should be tested in sandboxed environments before being deployed to the production environment; when this type of testing occurs, actual production data should not be used, but data that closely approximates its traits and characteristics should be
2. Enforcing Least Privilege - giving access to elements of a data set without revealing its entirety; i.e. showing a partial credit card #
3. Secure Remote Access - when a customer logs onto a web service, the customer’s account data might not be displayed out in the open, to avoid risks such as hijacked sessions, stolen credentials, or shoulder surfing

  • How to use
    1. Randomization - replacing the data or part of the data with random characters; as with other methods, you want to leave other traits intact, such as the length of the string and the character set (i.e. whether it was numerical, alphabetic, had special chars, etc.)
    2. Hashing - a ONE-WAY cryptographic function used to create a digest of the original data; this ensures the original is unrecoverable and allows an integrity check later; hashing converts variable-length input to a fixed length, so many properties of the original data are lost
    3. Shuffling - using different parts of the data within the same data set to represent the data; the con is that this still uses actual production data
    4. Masking - hiding data with useless characters; i.e. xxx-xx-1234
    5. Nulls - deleting the raw data from the display before it is represented or displaying null sets; Con to this is the data set will be reduced drastically;
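Three of the techniques above (masking, randomization, hashing) in a minimal sketch, using an SSN-shaped string as the example (helper names are hypothetical):

```python
import hashlib
import secrets
import string

def mask(ssn: str) -> str:
    # Masking: hide all but the last four digits (XXX-XX-1234 style).
    return "XXX-XX-" + ssn[-4:]

def randomize(value: str) -> str:
    # Randomization: swap each character for a random one from the same
    # character class, preserving length and format (separators kept).
    out = []
    for ch in value:
        if ch.isdigit():
            out.append(secrets.choice(string.digits))
        elif ch.isalpha():
            out.append(secrets.choice(string.ascii_letters))
        else:
            out.append(ch)
    return "".join(out)

def digest(value: str) -> str:
    # Hashing: one-way, fixed-length digest; original length and character
    # set are lost, but equal inputs always produce equal digests.
    return hashlib.sha256(value.encode()).hexdigest()
```

Note the trade-offs called out above: masking and randomization preserve format (good for test data), while hashing preserves only comparability and integrity.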
18
Q

Security Information and Event Management (SIEM)
* Define and what are goals of SIEM

A
  • SIEM - a set of tools used to collect, manage, analyze, and display log data
  • Goals of SIEM include:
  • Centralized Collection of Log Data - aggregating all logs into one area makes it easier for admins to monitor the environment; however, having all log data in one location also makes it an attractive target, so the SIEM requires additional layers of security
  • Enhanced Analysis Capabilities - SIEM allows automatic detection of obvious attacks but will not detect persistent threats drawn out over weeks or months; log analysis should not be a full-time role, as it is a repetitive, straining task, but having part-time log analysts hunt for persistent threats is useful
  • Dashboarding - management often does not understand security; SIEM allows creation of dashboards which make it easier for mgmt to understand;
  • Automated Response - some SIEMs allow for automated alert and response capabilities
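The "automatic detection for obvious attacks" goal can be illustrated with a toy correlation rule over centrally collected logs (the log format and function name are hypothetical):

```python
import re
from collections import Counter

def brute_force_ips(events, threshold=5):
    # Toy SIEM rule: flag any source IP with `threshold`+ failed logins
    # across all aggregated hosts -- an obvious brute-force pattern.
    failures = Counter()
    for line in events:
        match = re.search(r"FAILED login .*ip=(\S+)", line)
        if match:
            failures[match.group(1)] += 1
    return sorted(ip for ip, n in failures.items() if n >= threshold)

# Logs from multiple hosts, collected centrally (hypothetical format).
logs = (
    ["host1 FAILED login user=alice ip=10.0.0.7"] * 6
    + ["host2 FAILED login user=bob ip=10.0.0.9"] * 2
    + ["host1 OK login user=carol ip=10.0.0.8"]
)
```

Note what the rule illustrates about the limits above: it catches a burst from one IP, but an attacker spreading attempts below the threshold over weeks would evade it, which is why human analysis still matters.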
19
Q

Egress monitoring / DLP
* Define and what are goals of DLP
* Implementing DLP in cloud

A
  • examining data as it leaves the production environment; aka DLP - data loss (or leak) prevention and protection; DLP solutions identify data, monitor activity, and enforce policies; traditionally implemented at points of network egress (i.e. the DMZ), but in the cloud this means all public-facing devices or hosts that process data within the production environment (i.e. local agents installed on user workstations)
  • Goals of DLP or Egress monitoring include:
  • Additional Security - can be used as another control in layered defense
  • Policy Enforcement - users can be alerted by the DLP when they are attempting to perform an action that would violate the organization’s policy
  • Enhanced Monitoring - provides more log stream to organization’s monitoring suite
  • Regulatory Compliance - specific data can be identified by DLP and dissemination of that data can be controlled accordingly to better adhere to regulatory mandates
  • Implementing DLP in the cloud can be difficult because the cloud customer may not have sufficient administrative permissions; DLP also carries a lot of processing overhead from all that monitoring, so cost increases
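The "identify data and enforce policy" core of an egress monitor can be sketched as a pattern scan over outbound content (the patterns and function name here are hypothetical examples, not a production ruleset):

```python
import re

# Minimal DLP-style egress check: scan outbound text for sensitive-data
# signatures and block/alert on a match (hypothetical patterns).
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def egress_check(payload: str):
    # Return the policy decision plus which rules fired, for the log stream.
    hits = [name for name, rx in PATTERNS.items() if rx.search(payload)]
    return ("BLOCK", hits) if hits else ("ALLOW", hits)
```

Real DLP suites add file-type inspection, fingerprinting of known sensitive documents, and per-destination policies; this sketch only shows why the monitoring adds processing overhead to every egress path.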