11. Data Security and Encryption Flashcards
When considering data security controls in cloud environments, you need to consider three main components:
*Determine which data is allowed to be stored in the cloud, based on data classifications (covered in Chapter 5) that address your legal and regulatory compliance requirements. Pay particular attention to permitted jurisdictions and storage media.
*Protect and manage data security in the cloud. This will involve establishing a secure architecture, proper access controls, encryption, detection capabilities, and other security controls as needed.
*Ensure compliance, with proper audit logging established and backups and business continuity plans in place.
List the most common ways storage can be consumed by customers of a cloud provider.
How will they affect how you secure your data in a cloud environment?
- Object storage
- Volume storage
- Database
- Application/platform storage
Each of these storage types has different threats and data protection options, which can differ depending on the provider. For example, typically you can give individual users access to individual objects, but a storage volume is allocated to a virtual machine (VM) in its entirety. This means your approach to securing data in a cloud environment will be based on the storage model used.
Describe object storage
This storage type is presented like a file system and is usually accessible via APIs or a front-end interface (such as the Web). Files (that is, objects) can be made accessible to multiple systems simultaneously.
This storage type can be less secure, as object stores have often been discovered accidentally exposed to the public Internet.
Examples of common object storage include Amazon S3, Microsoft Azure Blob storage (blob: binary large object), and the Google Cloud Storage service.
NOTE: Blob storage is used to hold unstructured data such as video, audio, and other file types.
Describe volume storage
This is a storage medium such as a hard drive that you attach to your server instance. Generally a volume can be attached only to a single server instance at a time.
Describe database storage
Cloud service providers may offer customers a wide variety of database types, including commercial and open source options. Quite often, providers will also offer proprietary databases with their own APIs. These databases are hosted by the provider and use existing standards for connectivity.
Databases offered can be relational or nonrelational. Examples of nonrelational databases include NoSQL, other key/value storage systems, and file system–based databases such as Hadoop Distributed File System (HDFS).
Describe Application/platform storage
This storage is managed by the provider. Examples of application/platform storage include content delivery networks (CDNs), files stored in Software as a Service (SaaS) applications (such as a customer relationship management [CRM] system), caching services, and other options.
Regardless of the storage model, what are common practices a CSP will use when managing data?
Regardless of the storage model in use, most CSPs employ redundant, durable storage mechanisms that use data dispersion (also called "data fragmentation or bit splitting" in the CSA Guidance). This process takes data (say, an object), breaks it up into smaller fragments, makes multiple copies of these fragments, and stores them across multiple servers and multiple drives to provide high durability (resiliency). In other words, a single file is not located on a single hard drive but is spread across multiple hard drives.
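As a toy illustration of this dispersion process, the sketch below fragments an object, replicates each fragment across simulated drives, and reassembles the object even after a drive is lost. The fragment size, copy count, and placement rule are illustrative assumptions, not any real CSP's algorithm (production systems typically use erasure coding and far more sophisticated placement).

```python
def disperse(data: bytes, fragment_size: int, copies: int, drives: int):
    """Split data into fragments and place each copy on a different drive."""
    fragments = [data[i:i + fragment_size] for i in range(0, len(data), fragment_size)]
    placement = {d: [] for d in range(drives)}
    for idx, frag in enumerate(fragments):
        for c in range(copies):
            # Spread copies so no two copies of a fragment share a drive.
            placement[(idx + c) % drives].append((idx, frag))
    return placement


def reassemble(placement, total_fragments: int) -> bytes:
    """Recover the original object from whichever drives are still readable."""
    recovered = {}
    for frags in placement.values():
        for idx, frag in frags:
            recovered.setdefault(idx, frag)
    return b"".join(recovered[i] for i in range(total_fragments))


obj = b"example object payload"
placed = disperse(obj, fragment_size=4, copies=2, drives=5)
placed.pop(0)  # simulate losing one whole drive
# The object still reassembles from the surviving drives.
assert reassemble(placed, total_fragments=(len(obj) + 3) // 4) == obj
```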
Once acceptable storage locations are determined, you must monitor them for activity using tools such as database activity monitors (DAM) and file activity monitors (FAM). These controls are not only detective in nature but may also prevent large data migrations from occurring.
The following tools and technologies can be useful for monitoring cloud usage and any data transfers:
Cloud access security broker (CASB)
- CASB systems were originally built to protect SaaS deployments and monitor their usage, but they have recently expanded to address some concerns surrounding Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) deployments as well.
You can use CASB to discover your actual usage of cloud services through multiple means such as network monitoring, integration with existing network gateways and monitoring tools, or even monitoring Domain Name System (DNS) queries. This could be considered a form of a discovery service. Once the various services in use are discovered, a CASB can monitor activity on approved services either through an API connection or inline (man-in-the-middle) interception. Quite often, the power of a CASB is dependent on its data loss prevention (DLP) capabilities (which can be either part of the CASB or an external service, depending on the CASB vendor’s capabilities).
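As a small illustration of the DNS-based discovery mentioned above, the sketch below matches observed DNS queries against a catalog of known cloud service domains. The catalog and log entries are hypothetical examples, not a real CASB's service registry.

```python
# Hypothetical mapping of DNS suffixes to cloud services (illustrative only).
KNOWN_CLOUD_DOMAINS = {
    "s3.amazonaws.com": "Amazon S3",
    "blob.core.windows.net": "Azure Blob Storage",
    "storage.googleapis.com": "Google Cloud Storage",
}


def discover_services(dns_queries):
    """Map observed DNS queries to known cloud services (a discovery service)."""
    found = set()
    for query in dns_queries:
        for suffix, service in KNOWN_CLOUD_DOMAINS.items():
            if query == suffix or query.endswith("." + suffix):
                found.add(service)
    return found


log = ["mybucket.s3.amazonaws.com", "intranet.example.com",
       "acct.blob.core.windows.net"]
print(discover_services(log))  # e.g. service names for S3 and Azure Blob
```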
The following tools and technologies can be useful for monitoring cloud usage and any data transfers:
URL filtering
URL filtering (such as a web gateway) may help you understand which cloud services your users use (or try to use).
The problem with URL filtering, however, is that you are generally stuck in a game of "whack-a-mole" when trying to control which services are and are not allowed. URL filtering generally uses a whitelist or blacklist to determine whether users are permitted to access a particular website.
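The whitelist/blacklist decision can be sketched as follows; the policy lists and hostnames are hypothetical, and a real gateway matches far more than the bare hostname.

```python
from urllib.parse import urlparse

# Hypothetical policy lists.
BLACKLIST = {"unsanctioned-files.example"}
WHITELIST = {"approved-saas.example"}


def is_allowed(url: str, default_allow: bool = False) -> bool:
    host = urlparse(url).hostname or ""
    if host in BLACKLIST:
        return False
    if host in WHITELIST:
        return True
    # Anything unlisted falls back to the default policy -- the
    # "whack-a-mole" problem lives in keeping these lists current.
    return default_allow


assert is_allowed("https://approved-saas.example/login")
assert not is_allowed("https://unsanctioned-files.example/upload")
```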
What is the main difference between URL filtering and CASB?
The main difference between URL filtering and CASB is that, unlike traditional whitelisting or blacklisting of domain names, CASB can use DLP when it is performing an inline inspection of SaaS connections.
The following tools and technologies can be useful for monitoring cloud usage and any data transfers:
Data loss prevention
A DLP tool may help detect data migrations to cloud services. You should, however, consider a couple of issues with DLP technology.
First, you need to “train” a DLP to understand what is sensitive data and what is not. Second, a DLP cannot inspect traffic that is encrypted. Some cloud SDKs and APIs may encrypt portions of data and traffic, which will interfere with the success of a DLP implementation.
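A toy illustration of that "training" idea: pattern rules that flag data as sensitive. The two patterns below (US SSN-like and payment-card-like strings) are deliberately simplified assumptions; production DLP adds validation (for example, Luhn checks), context analysis, and document fingerprinting, and, as noted, none of it works on traffic it cannot decrypt.

```python
import re

# Simplified example rules -- not production-grade detection.
RULES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}


def classify(text: str):
    """Return the names of the rules that match, i.e. why a transfer is flagged."""
    return {name for name, pattern in RULES.items() if pattern.search(text)}


assert classify("ssn 123-45-6789 in export") == {"ssn"}
assert classify("pay with 4111 1111 1111 1111") == {"card"}
assert classify("nothing sensitive here") == set()
```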
To protect data as it is moving to a cloud, you need to focus on the security of data in transit. What are some examples?
Does your provider support Secure File Transfer Protocol (SFTP), or do they require you to use File Transfer Protocol (FTP), which sends clear-text credentials across the Internet? Your vendor may expose an API to you that has strong security mechanisms in place, so there is no requirement on your part to increase security.
Some data transfers may involve data that you do not own or manage, such as data from public or untrusted sources. You should ensure that you have security mechanisms in place to inspect this data before processing it or mixing it in with your existing data.
To protect data as it is moving to a cloud, you need to focus on the security of data in transit. What are some examples of encryption of data in transit?
As far as encryption of data in transit is concerned, many of the approaches used today are the same approaches that have been used in the past. This includes Transport Layer Security (TLS), Virtual Private Network (VPN) access, and other secure means of transferring data. If your provider doesn’t offer these basic security controls, get a different provider—seriously.
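As a quick illustration using Python's standard library, the snippet below shows the TLS protections a client should confirm before trusting a connection to a provider. The TLS 1.2 floor is an assumed policy choice; match it to your own policy and your provider's support.

```python
import ssl

# create_default_context() enables certificate verification and hostname
# checking by default -- the baseline for protecting data in transit.
context = ssl.create_default_context()

assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Refuse legacy protocol versions (TLS 1.2 floor is an assumed policy).
context.minimum_version = ssl.TLSVersion.TLSv1_2
```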
Another option for ensuring encryption of data in transit is that of a proxy (aka hybrid storage gateway or cloud storage gateway). The job of the proxy device is to encrypt data using your encryption keys prior to it being sent out on the Internet and to your provider. This technology, while promising, has not achieved the expected rate of adoption. Your provider may offer software versions of this technology, however, as a service to its customers.
When you're considering transferring very large amounts of data to a provider, do not overlook shipping hard drives to the provider if possible.
Why so?
Although data transfers across the Internet are much faster than they were ten years ago, I would bet that shipping 10 petabytes of data would be much faster than copying it over the Internet. Remember, though, that when you’re shipping data, your company may have a policy that states all data leaving your data center in physical form must be encrypted. If this is the case, talk with your provider regarding how best to do this.
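A back-of-the-envelope calculation makes the point; the 10 Gbps sustained link rate is an assumed figure, and real transfers rarely hold full line rate.

```python
# Time to copy 10 PB over a 10 Gbps link, ignoring overhead and retries.
PETABYTE = 10 ** 15            # bytes
link_bps = 10 * 10 ** 9        # 10 Gbps in bits per second
data_bits = 10 * PETABYTE * 8  # 10 PB in bits

seconds = data_bits / link_bps
days = seconds / 86_400
print(f"{days:.0f} days")  # roughly 93 days at full line rate
```

At about three months of sustained transfer, a shipment of encrypted drives wins easily.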
You need to be aware of only two security controls for your CCSK exam:
The core data security controls are access controls and encryption. Access controls are your single most important control: if you get them wrong, all the other controls fall apart. Once you get the basics right (meaning access controls), you can move on to implementing appropriate encryption of data at rest using a risk-based approach.
Access controls must be implemented properly in three main areas:
*Management plane - These access controls are used to restrict access to the actions that can be taken in the CSP's management plane. Most CSPs have deny-by-default access control policies in place for any new accounts that may be created.
*Public and internal sharing controls - These controls must be planned and implemented when data is shared externally to the public or to partners.
*Application-level controls - Applications themselves must have appropriate controls designed and implemented to manage access. This includes both your own applications built in PaaS as well as any SaaS applications your organization consumes.
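The deny-by-default model from the management-plane bullet can be sketched as follows; the policy structure, user names, and action names are hypothetical, loosely modeled on common CSP IAM policies.

```python
# Hypothetical grants: user -> set of (service, action) permissions.
policies = {
    "alice": {("storage", "read"), ("storage", "write")},
    "bob": {("storage", "read")},
}


def is_authorized(user: str, service: str, action: str) -> bool:
    # No matching grant means denied -- the default for any new account.
    return (service, action) in policies.get(user, set())


assert is_authorized("alice", "storage", "write")
assert not is_authorized("bob", "storage", "write")
assert not is_authorized("new-user", "storage", "read")
```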