CCSP Exam Flashcards

1
Q

Which of the following is also referred to as the control plane?

A. Forwarding Plane
B. Service Plane
C. Data Plane
D. Management Plane

A

D. Management Plane

Explanation:
The management plane, also called the control plane, is the plane on a device where changes are made and the device is managed. For example, applying an IP address or changing the name of the device are activities that occur on the management plane.

The data plane is responsible for the sending of packets and is also called the forwarding plane.

The forwarding plane is just another name for the data plane.

There is no service plane.

Objective:
Cloud Platform and Infrastructure Security

Sub-Objective:
Comprehend Cloud Infrastructure Components

2
Q

In which step of the SDLC are the possible user operations defined?

A. Defining
B. Development
C. Design
D. Testing

A

C. Design

Explanation:
In the Design step of the Software Development Life Cycle (SDLC), the possible user operations are defined and the interface is envisioned.

The Development step is where the programmers are most involved and is when code is written.

In the Testing step, activities such as initial penetration testing and vulnerability scanning against the application are performed.

In the Defining step, the business requirements of the software are determined, which will drive the design step.

3
Q

Which consideration is NOT shared by both the vendor and the cloud customer?

A. Security
B. Budget
C. Performance
D. Reversibility

A

B. Budget

Explanation:
Budget is a consideration of the customer only. The vendor has no say in how the organization allocates its resources for the cloud initiative.

However, that is one of the few considerations that are not shared by both the vendor and the customer. Among the considerations shared are:

    Interoperability
    Portability
    Reversibility
    Availability
    Security
    Privacy
    Resiliency
    Performance
    Governance
    Maintenance
    Versioning
    Service levels
    Service Level Agreements (SLA)
    Auditability
    Regulatory
4
Q

Which of the following is NOT a characteristic of the log collection process?

A. Difficult process, easy analysis
B. Often not a priority
C. Mundane and Repetitive
D. Sufficient Experience and training requirements

A

A. Difficult process, easy analysis

Explanation:
The logging process is not difficult, and analysis is not easy. Logging is fairly easy: most software and devices in modern enterprises can effectively log anything and everything that the organization might want to capture. Reading and analyzing these logs, however, can prove challenging.

Log reviewing is mundane and repetitive. This is not exciting work, and even the best analyst can become lax due to repetition.

Log review and analysis is often not a priority. Most organizations do not have the wherewithal to dedicate the personnel required to effectively analyze log data.

Log analysis requires someone experienced. The person needs to have sufficient experience and training to perform the activity in a worthwhile manner.

5
Q

During an annual risk assessment of your company’s cloud deployment, you obtain the SLA from the CSP and need to request certain metrics from the CSP for analysis. You need to determine the time required to perform a requested operation or task. Which measurement should you examine?

A. Mean-time to switchover
B. Response time
C. Completion Time
D. Instance Startup time

A

B. Response time

Explanation:
You should examine the response time, which is the time required to perform a requested operation or task.

Completion time is the time required to complete the initiated or requested task. Completion time measures from the time the request is made until the request fulfillment is completed, while response time measures from the time the request is made until the request fulfillment is started. Response time is always smaller than completion time.

Instance startup time is the time required to initialize a new instance.

Mean-time to switchover is the average time to switch over from a service failure to a replicated failover instance (backup).
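
To make the distinction concrete, here is a minimal Python sketch (not part of the original card) with hypothetical timestamps, showing that response time runs to the start of fulfillment while completion time runs to the end:

```python
from datetime import datetime

# Hypothetical timestamps for a single service request (illustrative values only).
request_made = datetime(2024, 1, 1, 10, 0, 0)    # customer submits the request
work_started = datetime(2024, 1, 1, 10, 0, 12)   # provider begins fulfillment
work_finished = datetime(2024, 1, 1, 10, 3, 45)  # provider completes fulfillment

response_time = work_started - request_made      # time until the task is started
completion_time = work_finished - request_made   # time until the task is finished

print("Response time:  ", response_time)         # 0:00:12
print("Completion time:", completion_time)       # 0:03:45
assert response_time <= completion_time          # response time is never larger
```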

6
Q

A data and media sanitization clause is being included as part of the service level agreement (SLA) with your company’s cloud service provider.
Which sanitization method is the preferred method of sanitization?

A. Physical Destruction
B. Cryptographic Erasure
C. Overwriting
D. Degaussing

A

A. Physical Destruction

Explanation:
Physical destruction ensures that data is physically unrecoverable and is the preferred method of sanitization.

Degaussing applies a strong magnetic field to the hardware and media, effectively making them blank. This method does not work with solid-state drives.

Overwriting applies multiple passes of random characters to the storage areas, ending with a final pass of all zeroes or ones. Overwriting can be extremely time-consuming for large storage areas. This method does ensure that data cannot be recovered. However, it is not the preferred method of sanitization.

Cryptographic erasure, also referred to as cryptoshredding, encrypts the data. Then it takes the keys generated in that process, encrypts them with a different encryption engine, and destroys the keys. As long as the keys are properly destroyed and a strong encryption algorithm is used, then the data is unrecoverable. However, if a strong encryption algorithm is not used, the encryption can be broken and the data retrieved.

7
Q

You have recently been hired by a company with a SaaS cloud that was launched earlier this year.
You discover that the appropriate risk assessment was not performed prior to the deployment.
As a result, a number of issues have been encountered since the cloud deployment that could have been mitigated.

What is the threat that has occurred in this scenario?

A. APTs
B. Insufficient due diligence
C. Denial of Service
D. Shared technology issues

A

B. Insufficient due diligence

Explanation:
In this scenario, the threat that has occurred is insufficient due diligence. If the company had performed a proper risk assessment, it would have likely identified many of the issues that occurred and could have taken actions to prevent or mitigate those threats.

A denial of service (DoS) threat occurs when attackers consume system resources, causing system slowdowns or outages.

Shared technology issues occur when the underlying components in the cloud model are not properly secured using defense-in-depth techniques. All layers of the cloud must be secured and monitored.

Advanced persistent threats (APTs) are attacks that infiltrate systems to gain a foothold within the targeted company. The attackers maintain a quiet presence in the company’s network and establish more doors into the network, using those doors to steal data.

8
Q

When performing a risk assessment of your cloud’s logical and physical infrastructure, which value provides an estimate of how often an identified risk will occur over the period of a year?

A. SLE
B. AV
C. EF
D. ARO

A

D. ARO

Explanation:
Annualized rate of occurrence (ARO) provides an estimate of how often an identified risk will occur over the period of a year. ARO is used in the calculation of annualized loss expectancy (ALE). ALE = ARO x SLE.

Asset value (AV) is the value of the asset at risk and is expressed as a monetary amount. Exposure factor (EF) is the estimated loss that will result if the risk occurs and is expressed as a percentage. Single loss expectancy (SLE) is the potential loss when a risk occurs and uses the following formula:

SLE = AV x EF

If an asset is worth $10,000 and the exposure factor is 20%, then SLE = $2,000. If the risk is expected to occur once every two years (ARO = 0.5), then ALE = $2,000 x 0.5 = $1,000.
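
For a quick check of this arithmetic, here is a minimal Python sketch using the card's example figures:

```python
def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """SLE = AV x EF."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """ALE = SLE x ARO."""
    return sle * aro

sle = single_loss_expectancy(asset_value=10_000, exposure_factor=0.20)  # $2,000
ale = annualized_loss_expectancy(sle, aro=0.5)  # once every two years -> $1,000
print(f"SLE = ${sle:,.0f}, ALE = ${ale:,.0f}")
```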

9
Q

Which of the following consists of managing information in accordance with who has rights to it?

A. PCI-DSS
B. IRM
C. DLP
D. PII

A

B. IRM

Explanation:
Information Rights Management (IRM) is the process of managing information in accordance with who has rights to it. It is also sometimes known by the name Digital Rights Management.

None of the other options manages information based on who has rights to it.

Data Loss Prevention (DLP) software can be used to monitor the edge of the network or to monitor a single device for the exfiltration of sensitive data.

The Payment Card Industry Data Security Standard (PCI-DSS) covers credit card data.

Personally identifiable information (PII) is any piece of information that can be tied uniquely to an individual, such as a name and Social Security number.

10
Q

Your company has a public cloud that provides SaaS through a CSP. Your company’s developers have worked with developers at the CSP to create a custom application that uses REST via APIs to interact with the cloud environment. Which of the following is NOT used in REST interactions?

A. Sessions
B. XML
C. Credentials
D. JSON

A

A. Sessions

Explanation:
Representational State Transfer (REST) interactions do NOT include sessions. The server does not need to store any temporary information about the client, so sessions are not required.

Credentials are used to allow authentication between clients and servers. A RESTful application programming interface (API) can support multiple data formats, including JavaScript Object Notation (JSON) and Extensible Markup Language (XML).
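
As an illustration only, the following minimal Python sketch shows a stateless REST call: the credentials travel with each request and the response comes back as JSON. The endpoint and token are hypothetical, and the third-party requests library is assumed to be available.

```python
import requests  # third-party HTTP library (assumed installed)

# Hypothetical endpoint and API token; substitute the CSP's real values.
BASE_URL = "https://api.example-csp.com/v1"
TOKEN = "example-api-token"

# Each call carries its own credentials; the server keeps no session state.
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json",   # JSON is one supported format; XML is another
}

response = requests.get(f"{BASE_URL}/instances", headers=headers, timeout=10)
response.raise_for_status()
print(response.json())  # parsed JSON body describing cloud resources
```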

11
Q

Which data classification process ensures that data that is considered sensitive in one environment is likewise treated as sensitive in another environment?

A. Cross-Referencing
B. Classification
C. Mapping
D. Labeling

A

C. Mapping

Explanation:
Mapping is the process that ensures that data that is considered sensitive in one environment is treated as sensitive in another environment.

Classification is the process that defines the sensitivity level of a resource. It is attached to the resource as a label and is communicated to other environments through mapping.

Labeling is the process of attaching a data classification level to a resource. The label can include many properties about the resource but, at a minimum, should describe its sensitivity level.

Cross-referencing is not a term used when discussing classification processes.

12
Q

You are employed by a CSP that must ensure compliance with Chinese national standards. Which CSA STAR level should the CSP pursue?

A. Level 2: CSA C-STAR Assessment
B. Level 1: CSA GDPR Code of Conduct Self-Assessment
C. Level 2: CSA STAR Attestation
D. Level 2: CSA STAR Certification

A

A. Level 2: CSA C-STAR Assessment

Explanation:
The Level 2: Cloud Security Alliance (CSA) C-STAR Assessment will ensure compliance with Chinese national standards. C-STAR is an independent third-party assessment for the Greater China market that harmonizes CSA Security, Trust, and Assurance Registry (STAR) best practices with Chinese national standards.

Level 1: CSA Global Data Protection Regulation (GDPR) Code of Conduct Self-Assessment only includes a self-assessment and does not ensure compliance with Chinese national standards.

Level 2: CSA STAR Attestation includes an independent third-party assessment against SOC 2 standards.

Level 2: CSA STAR Certification includes an independent third-party assessment against ISO/IEC 27001:2005.

CSA also includes Level 3: Continuous Monitoring, which is currently under development and should include third-party continuous monitoring.

13
Q

A financial institution has recently decided to engage a CSP to provide an IaaS deployment. The board of directors is concerned with the disclosure of personal financial information. Which US federal law should apply in this situation?

A. SCA
B. GLBA
C. SOX
D. HIPAA

A

B. GLBA

Explanation:
The Gramm-Leach-Bliley Act (GLBA) should apply in this scenario. GLBA regulates the collection and disclosure of private financial information, stipulates that financial institutions must implement security programs to protect such information, and prohibits accessing private information using false pretenses. It also requires financial institutions to give customers written privacy notices regarding information-sharing practices.

Health Insurance Portability and Accountability Act (HIPAA) would apply in a scenario where healthcare organizations are involved. Sarbanes-Oxley (SOX) protects individuals from accounting errors and fraudulent practices in publicly traded companies. The Stored Communications Act (SCA) protects certain electronic communication and computing services from unauthorized access or interception.

14
Q

What is the first phase of the cloud-secure software development life cycle?

A. Develop
B. Design
C. Test
D. Define

A

D. Define

Explanation:
This first phase of the cloud-secure software development life cycle is Define. During this phase, all the requirements for the application are defined.

The phases of the cloud-secure software development life cycle are:

    Define
    Design
    Develop
    Test
    Secure operations
    Disposal
15
Q

Your company is carrying out some testing of its business continuity plan (BCP). During this testing, all business unit managers and BCP process employees are present and discuss each individual’s responsibilities. Which type of test is most likely occurring?

A. Full-Interruption Test
B. Functional Drill
C. Tabletop exercise
D. Walk-through drill

A

C. Tabletop exercise

Explanation:
A tabletop exercise, also referred to as a structured walk-through test, is most likely occurring. During this test, business unit managers and BCP process employees are present. A discussion of each individual’s responsibilities is completed. Sometimes it includes individual and team training on the step-by-step procedures outlined in the BCP. The testing process provides clarification and highlighting of critical plan elements. Problems are also noted. This is the first type of test that is usually completed for a BCP.

None of the other tests is described in the scenario.

A walk-through drill, also referred to as a simulation test, is more complicated than a tabletop exercise. In a walk-through drill, a scenario is selected and applied to the BCP. Usually only the operational and support personnel for the BCP process attend this meeting. Attendees practice certain functional steps to ensure that they have the knowledge and skills needed to complete them. For this test, it is critical for employees to act out the critical steps, recognize difficulties, and resolve problems.

A functional drill, also referred to as a parallel test, involves moving personnel to the recovery site(s) to attempt to establish communications and perform real recovery processing. The drill will help the organization determine whether following the BCP will successfully recover critical systems at an alternate processing site. Because a functional drill fully tests the BCP, all employees are involved. It demonstrates emergency management capabilities and tests procedures for evacuation, medical response, and warnings.

A full-interruption test, also referred to as a full-scale test, is the most comprehensive BCP test. A real-life emergency is simulated as closely as possible. It is important to properly plan this type of test to ensure that business operations are not negatively affected. This usually includes processing data and transactions using backup media at the recovery site. All employees must participate in this type of test, and all response teams must be involved. This test has the highest level of simulation, including notifications, resource mobilization, and communications.

16
Q

Your company has a contract with a CSP to provide an IaaS model. This model has been deployed for two years now. Company developers have designed different APIs to manage and to provide access to the cloud resources. Management has recently become concerned that competitors who use the same CSP can gain access to your company’s information that is deployed on shared CSP resources. Which common pitfall of cloud deployment models is described in this scenario?

A. Integration Complexity
B. On-premises Transfer
C. Apps not cloud ready
D. Tenancy Separation

A

D. Tenancy Separation

Explanation:
This scenario is displaying the pitfall of tenancy separation. When an organization deploys a cloud solution, it must usually address the security concerns that accompany a multi-tenancy solution. Most companies will need to ensure that the CSP implements the proper countermeasures for access control, process isolation, and denial of guest/host escape attempts.

On-premises transfer is a pitfall that occurs when a company transfers its applications and configurations to the cloud. APIs that were developed to use on-premises resources may not function properly in the cloud or provide the appropriate security.

Integration complexity is a pitfall that occurs when new applications need to interface with old applications. Often, developers do not have unrestricted access to the supporting services. It can be difficult to design the integration of infrastructure, applications, and integration platforms managed by the CSP with the appropriate level of security.

Apps not being cloud ready is a pitfall that occurs because many applications were not originally developed while considering the security implications of a cloud deployment. These older applications may not work properly in a cloud model.

17
Q

Your company wants to deploy several virtual machines using resources provided by your cloud service provider (CSP). As part of this deployment, you need to install and configure the virtualization management tools on your network. Which of the following statements is FALSE about installing these tools?

A. Access to the virtualization management tools should be rule-based
B. Virtualization management should take place on an isolated management network
C. Audit and log all access to the virtualization management tools
D. Only a secure kernel based virtual machine should be used to access the hosts

A

A. Access to the virtualization management tools should be rule-based

Explanation:
Access to the virtualization management tool should be role-based, not rule-based.

All of the other statements regarding virtualization management are true. Virtualization management should take place on an isolated management network. You should audit and log all access to the virtualization management tools. Only a secure kernel-based virtual machine (KVM) should be used to access the hosts.

18
Q

Users are reporting issues when accessing data on the cloud. When you research the issue, you notice that reads and writes are slow. Which component is most likely causing this problem?

A. CPU
B. Disk
C. Memory
D. Network

A

B. Disk

Explanation:
When you are experiencing slow reads and writes, the problem is most likely the disk.

None of the other components is the problem. If the network were the problem, you would experience excessive dropped packets. If the memory were the problem, you would experience excessive memory usage or a paging file issue. If the CPU were the problem, you would experience excessive CPU utilization.

19
Q

With which cloud model is the lack of security surrounding app stores an issue?

A. SaaS
B. PaaS
C. IaaS
D. XaaS

A

A. SaaS

Explanation:
One of the most important security considerations with Software as a Service (SaaS) is the lack of security surrounding app stores. In many cases, these app stores make apps available that the cloud vendor did not develop but simply resells. These apps may not have been adequately tested from a security standpoint. Google Android has had issues with this in the past. It is possible, and advisable, to restrict access in the app store to only company-developed or approved apps.

None of the other models include the lack of security surrounding app stores as an issue.

Identity and access management (IAM) is important to all cloud deployments, but it is critical to Platform as a Service (PaaS). PaaS comprises a development platform that not only provides but encourages shared access. It is critical that IAM systems be robust. Otherwise, you lose both security and accountability, which are key needs in a shared environment.

The biggest security consideration for Infrastructure as a Service (IaaS), especially in a shared public cloud, is the proper segmentation of resources between tenants. This includes the isolation of VMs from various tenants.

Anything as a Service (XaaS) is a general term that applies to all forms of cloud service offerings and not to a specific type.

20
Q

Which of the following technologies or concepts has had the impact of making the cloud intelligent?

A. Quantum Computing
B. Containers
C. AI
D. IoT

A

C. AI

Explanation:
Artificial intelligence (AI) occurs when a machine system such as a computer is able to take in data and make decisions based on that data unaided by humans. Also sometimes called machine learning, it enables the cloud to “learn” from data or to become “intelligent”.

Quantum computing uses quantum-mechanical phenomena such as superposition and entanglement to perform computations. When quantum supremacy is realized (which a NASA and Google AI partnership claimed to have achieved in 2019), quantum computing will be able to solve problems that classical computers practically cannot. When quantum computing becomes available to the masses, it will likely do so through the cloud because of the massive computing resources required to perform quantum calculations.

Containers comprise an alternative method of providing virtualization. While container-based virtualization is not likely to completely replace server virtualization, because security is still better with a virtual machine, it has many advantages in a cloud environment. Containers can be deployed more quickly than a VM and let you pack more computing workloads onto a single server, resulting in fewer hardware purchases, lower facility costs for your data center, and fewer experts required to manage that equipment.

The Internet of Things (IoT) has had the effect of increasing the need for cloud computing services because it is usually preferable to manage IoT resources from the cloud.

21
Q

You are employed by a cloud service provider (CSP) that has globally distributed data centers and secure cloud computing environments. You decide to deploy remote access to allow authorized employees, customers, and third-party personnel to remotely manage cloud deployments. Which of the following is NOT a benefit of providing this solution?

A. Lower Administrative Overhead
B. Accountability
C. Secure Isolation
D. Session Control
E. Real Time Monitoring
A

A. Lower Administrative Overhead

Explanation:
Deploying remote access to allow authorized employees, customers, and third-party personnel to remotely manage cloud deployments does NOT lower administrative overhead. In fact, it may actually increase administrative overhead because administrators will need to ensure that the appropriate tools are installed for those needing remote access. In addition, training will need to be provided on the remote access tools.

All of the other options are benefits of providing remote access. Accountability provides information on who is accessing the data center remotely using a tamper-proof audit trail. Session control allows control of who can access the environment and enforcement of workflows. Real-time monitoring allows administrators to view privileged activities as they are happening or as a recorded playback for forensic analysis. Secure isolation between the remote user’s desktop and the target system is provided so that any potential malware does not spread to the target systems.

22
Q

In which data technique is data sliced into chunks that are encrypted along with parity bits and then written to various drives in the cloud cluster?

A. Data Encoding
B. RAID
C. Erasure Coding
D. Data Dispersion

A

D. Data Dispersion

Explanation:
While somewhat similar to RAID, data dispersion is the process of slicing the data into what are called shards. The data is then encrypted along with parity bits (a process called erasure coding when used in cloud data dispersal) and written to various drives in the cloud cluster.

While there are RAID levels that provide no fault tolerance, in most cases RAID uses multiple drives, strategically located data, and parity information to allow for immediate access to data that may have resided on a failed drive. RAID does not, however, slice the data into shards and then encrypt it along with parity bits.

Data encoding is the process of using coding techniques to prevent the introduction of malicious character strings into web applications. It has nothing to do with data dispersal.

Erasure coding is the process used by data dispersion to encrypt the data along with parity bits.
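
The following deliberately simplified Python sketch illustrates only the shard-plus-parity idea behind data dispersion. It uses a single XOR parity shard and omits the encryption step and real erasure-coding schemes such as Reed-Solomon, so it is a teaching aid rather than a production technique:

```python
from functools import reduce

def disperse(data, shard_count):
    """Split data into equal-size shards plus one XOR parity shard."""
    size = -(-len(data) // shard_count)             # ceiling division
    data = data.ljust(size * shard_count, b"\0")    # pad to a multiple of size
    shards = [data[i * size:(i + 1) * size] for i in range(shard_count)]
    parity = bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*shards))
    return shards + [parity]                        # each piece goes to a different drive

def rebuild_missing(pieces):
    """Recover a single missing piece by XOR-ing the surviving pieces."""
    surviving = [p for p in pieces if p is not None]
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*surviving))

pieces = disperse(b"cloud data dispersion demo", shard_count=3)
lost_index = 1                                      # simulate one failed drive
damaged = [p if i != lost_index else None for i, p in enumerate(pieces)]
assert rebuild_missing(damaged) == pieces[lost_index]   # lost shard recovered
```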

23
Q

Recently, an attacker was able to access several virtual machines deployed in the cloud. Initially, you were able to deploy a workaround. Once the incident was stopped, you were able to identify the root cause. As a result, you now need to deploy a fix for the known error. In which process are you currently working?

A. Configuration Management
B. Problem Management
C. Incident Management
D. Change Management

A

B. Problem Management

Explanation:
This issue was initially identified as a problem. Problem management is used to minimize the impact of problems. A problem is the root cause of an incident.

Change management manages all changes to configuration items, including any devices. All changes must be tested and formally approved prior to deployment in the live environment. Problem management should hand over the fix for the known error for proper testing and approval. Once the fix is approved, change management will deploy the fix, and the fix will be marked as completed in both the change management and problem management processes.

Incident management occurs when incidents are identified, analyzed, and corrected to prevent a future reoccurrence. The initial attack and its resolution was part of incident management. However, identifying the root cause and deploying a fix to a known error is part of problem management.

Configuration management occurs when the configuration of an item, such as a network device, must be changed.

24
Q

When secure settings have been implemented on your SIEM system, which system needs to be updated to reflect those changes?

A. WAF
B. SDLC
C. CMDB
D. API

A

C. CMDB

Explanation:
The Configuration Management Database (CMDB) needs to be updated to reflect any system changes. The CMDB is a repository that should contain all the configuration settings of the various systems and the history or versioning of changes to those settings.

The Software Development Life Cycle (SDLC) is a model to guide software development projects to ensure the timely delivery of systems that are both functional and secure.

A web application firewall (WAF) is a firewall that is designed to examine all traffic to a web server while looking for common web attacks in an attempt to prevent them.

An application programming interface (API) is a software entity created to communicate between applications or between components of an application.

25
Q

Which of the following cloud service categories is most likely to have web application security issues?

A. None of the categories
B. PaaS
C. IaaS
D. SaaS

A

D. SaaS

Explanation:
The Software as a Service (SaaS) cloud service is most likely to have web application security issues because software and OS vulnerabilities exist whether software is hosted in the cloud or on-premises.

None of the other listed categories is as likely to have web application security issues.

The security issues for the Infrastructure as a Service (IaaS) model are personnel threats, external threats, and lack of specific skillsets to manage the model.

The Platform as a Service (PaaS) model includes the security issues listed for the IaaS model, and also includes issues with interoperability, persistent backdoors, virtualization, and resource sharing.

The SaaS model includes all the security issues listed for the IaaS and PaaS models, and also includes issues with proprietary formats and web application security.

26
Q

Which of the following statements does NOT correspond to the Store phase of the cloud data lifecycle?

A. Data should be classified according to sensitivity and value
B. The Store phase typically occurs at the same time as the Create phase
C. Data should be protected according to its classification level
D. Access control lists (ACLs) should be created to control access to cloud data

A

A. Data should be classified according to sensitivity and value

Explanation:
While data should be classified according to sensitivity and value, this classification does NOT take place during the Store phase. It takes place during the Create phase.

All of the other statements correspond to the Store phase of the cloud data lifecycle. Data should be protected according to its classification level, which is assigned during the Create phase. ACLs should be created to control access to cloud data. The Store phase typically occurs at the same time as the Create phase.

27
Q

Which of the following, when used to switch between multiple devices connected to a unit, physically breaks the current connection before a new one is made?

A. Secure Data Ports
B. Fixed Firmware
C. Tamper labels
D. Air-gapped pushbuttons

A

D. Air-gapped pushbuttons

Explanation:
Air-gapped pushbuttons on KVM switches physically break the current connection before a new one is made.

Tamper labels are used to alert you that someone has physically accessed the system and torn the labels. They are applied to the cases of devices that you need to remain secure. While they do not prevent physical access, they alert you if physical access has occurred.

Fixed firmware is device software that cannot be erased or altered. Fixed firmware is installed on internal chips in the device.

Secure data ports reduce the likelihood of data leaking between computers that are connected through the KVM by protecting the ports. But they do not break a connection before a new one is made.

28
Q

Which of the following is the maximum amount of time you can continue without a resource?

A. MTD
B. RTO
C. ALE
D. RPO

A

A. MTD

Explanation:
Maximum tolerable downtime (MTD) is the maximum amount of time you can continue without a resource.

Recovery time objective (RTO) is the target time in which recovery occurs. It should be less than the MTD to provide extra time.

Recovery point objective (RPO) is measured in data loss, not time. It is the maximum allowable amount of data you can afford to lose during an issue. This value can be reduced by more frequent backups.

Annualized loss expectancy (ALE) is the financial loss expected from an event spread across the time period between events based on the event’s historical frequency. This yields a yearly average amount.
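
As a small worked illustration of how these values relate (the numbers are hypothetical, not from the card):

```python
from datetime import timedelta

# Hypothetical continuity targets for one business process.
mtd = timedelta(hours=24)   # maximum tolerable downtime
rto = timedelta(hours=8)    # target time to restore the resource
rpo = timedelta(hours=4)    # maximum tolerable data loss, expressed as time

# The RTO should leave headroom below the MTD.
assert rto < mtd, "RTO must be shorter than the MTD"

# The RPO drives backup frequency: backups must run at least this often.
backup_interval = rpo
print(f"Back up at least every {backup_interval} to honor an RPO of {rpo}.")
```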

29
Q

You are designing the encryption system to use for your cloud solution. What type of cryptography would be appropriate when most of the access to the cloud will be from smartphones and tablets?

A. ECC
B. AES
C. DES
D. Triple DES

A

A. ECC

Explanation:
Elliptic curve cryptography (ECC) is an approach to public key cryptography that uses much smaller keys than traditional cryptography to provide the same level of security. Smaller key sizes place a lighter load on the CPU of the devices, which is beneficial because smartphones and tablets have less processing power.

Advanced Encryption Standard (AES) is the most secure encryption currently, but it requires the use of large keys.

Data Encryption Standard (DES) is less secure than AES and also requires the use of larger keys.

Triple DES is more secure than DES, less secure than AES, and again, requires the use of larger keys.
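
To illustrate the key-size difference, here is a minimal sketch using the third-party cryptography package (assumed installed). It compares ECC against RSA for a like-for-like public-key comparison, rather than the symmetric ciphers named in the card; the curve and key sizes shown are common choices, not requirements:

```python
from cryptography.hazmat.primitives.asymmetric import ec, rsa

# A 256-bit elliptic curve key offers security roughly comparable to a
# 3072-bit RSA key, so a mobile device does far less work per operation.
ecc_key = ec.generate_private_key(ec.SECP256R1())
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

print("ECC key size:", ecc_key.curve.key_size, "bits")   # 256
print("RSA key size:", rsa_key.key_size, "bits")         # 3072
```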

30
Q

In which step of the SDLC are the business requirements of the software determined?

A. Defining
B. Design
C. Testing
D. Development

A

A. Defining

Explanation:
In the Defining step of the Software Development Life Cycle (SDLC), the business requirements of the software are determined, which will drive the design step.

In the Testing step, activities such as initial penetration testing and vulnerability scanning against the application are performed.

In the Design step, the possible user operations are defined and the interface is envisioned.

The Development step is where the programmers are most involved and is when code is written.

31
Q

According to ISO/IEC 27034, which two components are used to build the ASMP? (Choose two.)

A. DAST
B. ANF
C. ONF
D. SAST
E. RASP
A

B. ANF
C. ONF

Explanation:
The organizational normative framework (ONF) and the application normative framework (ANF) are used to build the application security management process (ASMP). The ONF defines the organizational security best practices for all application development and includes sections that cover business context, regulatory context, technical context, specifications, roles, processes, and the application security control (ASC) library. The ANF uses the applicable portions of the ONF for a specific application to achieve the needed security requirements or the target trust level.

The ONF and ANF have a one-to-many relationship, where one ONF is used as the basis to create multiple ANFs. An ASMP manages and maintains each ANF.

The steps in the ASMP process are as follows:

Specify the application requirements and environment.
Assess application security risks.
Create and maintain the ANF and ONF.
Provision and operate the application.
Audit application security.

All of the other options are security testing methods.

Static application security testing (SAST) is considered to be a white-box test. With SAST, an analysis of the application source code, byte code, and binaries is performed without executing the application code. It is used to detect coding errors. SAST is most often used during the development of the application and is more comprehensive than dynamic application security testing (DAST).

Dynamic application security testing (DAST) is usually considered a black-box test. DAST is used against applications that are running, rather than prior to their deployment like SAST.

Runtime application self-protection (RASP) prevents issues in applications by deploying self-protection capabilities in the runtime environment.

32
Q

Your company wants to migrate to a cloud solution for personnel access to Windows and Linux. Which cloud service category should you research?

A. PaaS
B. NaaS
C. CompaaS
D. SaaS
E. DSaaS
F. IaaS
A

A. PaaS

Explanation:
The Platform as a Service (PaaS) cloud service category allows access to operating systems, such as Windows, Linux, Unix, and Mac OS.

The three main cloud service categories are PaaS, Software as a Service (SaaS), and Infrastructure as a Service (IaaS). SaaS provides access to applications, including email. IaaS provides access to hardware, blades, connectivity, and utilities.

Other cloud service categories that you need to understand for the CCSP exam include:

Compliance as a Service (CompaaS or CaaS) - includes a variety of compliance services such as data encryption, disaster recovery, reporting, and vulnerability scanning.
Networking as a Service (NaaS) - includes network services from third parties to customers that do not want to build their own networking infrastructure.
Data Science as a Service (DSaaS) - involves an outside company providing advanced analytics applications (gathered using data science) to corporate clients for their business use.

Some publications will tell you that CompaaS, NaaS, and DSaaS will not be on the exam. However, these three categories are specifically listed in the CCSP Candidate Information Bulletin (CIB) from (ISC)2.

33
Q

What is the primary coverage area in ISO/IEC 28000:2007?

A. Information security management systems
B. Security management systems for the supply chain
C. Risk management
D. Security techniques for PII in public clouds

A

B. Security management systems for the supply chain

Explanation:
ISO/IEC 28000:2007 covers security management systems for the supply chain.

ISO/IEC 31000 covers risk management. ISO/IEC 27001 covers information security management systems. ISO/IEC 27018:2014 covers security techniques for personally identifiable information (PII) in public clouds.

34
Q

You want to be able to control the number of API requests made within a four-hour period. Which network functionality should you configure?

A. Rate Limiting
B. Filtering
C. Bandwidth allocation
D. Routing

A

A. Rate Limiting

Explanation:
You should configure rate limiting to control the number of API requests made within a four-hour period. Rate limiting controls the amount of traffic sent or received.

Routing directs the flow of traffic. Filtering allows or denies access to resources. Bandwidth allocation allows you to control the amount of bandwidth used.

You need to understand two other network functionalities:

Address allocation - provides one or more static or dynamic addresses to a cloud resource.
Access control - grants or denies access to a specific cloud resource.
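
Returning to rate limiting, the following minimal Python sketch shows one simple way to cap API requests per client within a four-hour window. It is a fixed-window counter kept in memory; real API gateways use their own mechanisms, and the quota shown is hypothetical:

```python
import time

WINDOW_SECONDS = 4 * 60 * 60   # the four-hour window from the question
MAX_REQUESTS = 1000            # hypothetical per-client quota

_windows = {}                  # client_id -> (window_start, request_count)

def allow_request(client_id, now=None):
    """Return True if the client is still under its quota for the current window."""
    now = time.time() if now is None else now
    start, count = _windows.get(client_id, (now, 0))
    if now - start >= WINDOW_SECONDS:      # window expired: start a new one
        start, count = now, 0
    if count >= MAX_REQUESTS:
        return False                       # reject: quota exhausted for this window
    _windows[client_id] = (start, count + 1)
    return True

# Example: the 1001st request inside the same window is rejected.
results = [allow_request("tenant-42", now=0.0) for _ in range(MAX_REQUESTS + 1)]
print(results.count(True), "allowed,", results.count(False), "rejected")  # 1000 allowed, 1 rejected
```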
35
Q

IBM Watson, AWS AI, and Microsoft Cognitive APIs are all examples of the effect on cloud computing of which of the following?

A. Machine Learning
B. Blockchain
C. IoT
D. Containers

A

A. Machine Learning

Explanation:
IBM Watson, AWS AI, and Microsoft Cognitive APIs are all examples of the implementation of cognitive computing or machine learning. These components learn from the massive amount of data available in the cloud.

The other options impact cloud computing but do not manifest themselves as tools that perform machine learning.

Blockchain is best known for its role in the rise of cryptocurrencies. One of the innovations coming out of blockchain is the smart contract. A smart contract is a computer protocol intended to digitally facilitate, verify, or enforce the negotiation or performance of a contract. Smart contracts allow the performance of credible transactions without the use of third parties (such as lawyers). This is just one example of the ways in which security can be made both more transparent and effective using blockchain.

The Internet of Things (IoT) has had the effect of increasing the need for cloud computing services because it is usually preferable to manage IoT resources from the cloud.

Containers comprise an alternative method of providing virtualization. While container-based virtualization is not likely to completely replace server virtualization, because security is still better with a virtual machine, it has many advantages in a cloud environment. Containers can be deployed more quickly than a VM and let you pack more computing workloads onto a single server, resulting in fewer hardware purchases, lower facility costs for your data center, and fewer experts required to manage that equipment.

36
Q

Which process establishes with adequate certainty the identity of an entity?

A. Identification
B. Authorization
C. Authentication
D. Integrity

A

C. Authentication

Explanation:
Authentication is the process that establishes with adequate certainty the identity of an entity.

Identification is the process of claiming an identity. Authorization is the process of granting access to resources. Integrity is the process of ensuring that data is real, accurate, and protected from unauthorized modification.

37
Q

Which PKI operation should be performed when a certificate is compromised?

A. Distribution
B. Recovery
C. Revocation
D. Escrow

A

C. Revocation

Explanation:
When a certificate is compromised, meaning that the private key has been discovered, the certificate should be revoked. The revocation is published in the Certificate Revocation List (CRL), so all systems learn of it and refuse to honor the certificate.

Recovery is an operation performed when a key is lost. It is the process of restoring it from a backup.

Distribution is the sharing of a key with those who need to use it.

Escrow is the process of placing a key in the possession of a third party to hold in case it is needed in the future.

38
Q

Which cloud vulnerability occurs when an application allows untrusted data to be sent to a web browser without proper validation or escaping?

A. CSRF
B. DoS
C. SQL Injection
D. XSS

A

D. XSS

Explanation:
Cross-site scripting (XSS) is one of the most widely seen application flaws, next to injection. XSS allows a malicious user to execute code or hijack sessions in the user’s browser.

Cross-site request forgery (CSRF) manipulates an authenticated user’s browser to send a forged HTTP request along with cookies and other authentication information in an effort to force the victim’s browser to generate a request that a vulnerable application thinks is a legitimate request from the user.

A Denial of Service (DoS) attack is one that makes the cloud application unavailable to legitimate users by bombarding the application with so many requests that performance is affected, often resulting in application unavailability.

SQL injection occurs when SQL commands are entered into an input field (such as a dialog box) and used to access data in a database.
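
To illustrate the escaping defense against XSS, here is a minimal Python sketch using only the standard library; the untrusted input is a hypothetical payload:

```python
from html import escape

# Untrusted data supplied by a user (hypothetical attack payload).
untrusted = '<script>alert("session hijack")</script>'

# Escaping before rendering turns markup into inert text, so the
# browser displays it instead of executing it.
safe_fragment = f"<p>Hello, {escape(untrusted)}!</p>"
print(safe_fragment)
# <p>Hello, &lt;script&gt;alert(&quot;session hijack&quot;)&lt;/script&gt;!</p>
```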

39
Q

Which of the following statements is NOT true of software supply chain management?

A. The API chain can spin out of control to the point that the organization does not know if they are sending confidential data to another party
B. Organizations have a clear understanding of the origin of the software or code being used
C. It is concerned with the issues of non-secure software beyond the organizational boundary
D. Organizations are consuming software that is being developed by a third party or accessed with or through third-party libraries

A

B. Organizations have a clear understanding of the origin of the software or code being used

Explanation:
As this is a negatively worded question, the correct option is the statement that organizations have a clear understanding of the origin of the software or code being used. In fact, with software supply chain management, organizations do NOT have a clear understanding of the origin of the software or code being used.

All of the other statements are true regarding software supply chain management:

Software supply chain management is concerned with the issues of non-secure software beyond the organizational boundary.
Organizations are consuming software that is being developed by a third party or accessed with or through third-party libraries.
The API chain can spin out of control to the point that the organization does not know if they are sending confidential data to another party.
40
Q

As part of your company’s cloud deployment, you must ensure that multi-factor authentication is implemented. Which of these would provide the BEST security?

A. Username, password and PIN
B. Username, smart card and hand scan
C. Username, password and smart card
D. Password, PIN and smart card

A

B. Username, smart card and hand scan

Explanation:
Of the options listed, the BEST security is provided by the username, smart card, and hand scan. This uses three factors of authentication: something you know (username), something you have (smart card), and something you are (hand scan).

The username, password, and smart card option only provides two factors of authentication: something you know (username and password) and something you have (smart card).

The username, password, and PIN option only provides one factor of authentication: something you know.

The password, PIN, and smart card option only provides two factors of authentication: something you know (password and PIN) and something you have (smart card).
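
The following small Python sketch applies the card's reasoning by counting distinct factor categories; the category mapping simply mirrors the explanation above:

```python
# Category mapping follows the card's reasoning above (note: many references
# treat a username as identification rather than an authentication factor).
FACTOR_CATEGORY = {
    "username": "know",
    "password": "know",
    "PIN": "know",
    "smart card": "have",
    "hand scan": "are",
}

def is_multi_factor(credentials):
    """True when the credentials span more than one factor category."""
    return len({FACTOR_CATEGORY[c] for c in credentials}) > 1

print(is_multi_factor(["username", "smart card", "hand scan"]))  # True  (3 categories)
print(is_multi_factor(["username", "password", "PIN"]))          # False (1 category)
print(is_multi_factor(["password", "PIN", "smart card"]))        # True  (2 categories)
```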

41
Q

You have been asked to ensure that data that is stored in your company’s cloud is encrypted during transmission. Which of the following should you deploy?

A. EFS
B. Database encryption
C. IPSec
D. Bitlocker

A

C. IPSec

Explanation:
Internet Protocol Security (IPSec) should be deployed. It is a protocol that will encrypt data as it is being transmitted. IPSec is used to encrypt data in motion (DIM).

BitLocker, database encryption, and Encrypting File System (EFS) are deployed to protect data at rest (DAR), not DIM.

42
Q

Which CSA Star program level is a complimentary offering that documents the security controls provided by various cloud computing offerings by performing a self-assessment?

A. Level 1
B. None of these
C. Level 2
D. Level 3

A

A. Level 1

Explanation:
The CSA Security, Trust, and Assurance Registry (STAR) Level 1 is a free designation that allows cloud computing providers to document their security controls by performing a self-assessment against CSA best practices. The results are made publicly available to customers.

Level 2 has three different designations, which provide attestation, certification, and assessment:

Level 2: CSA STAR Attestation - This level is a collaboration between CSA and the AICPA to provide guidelines for CPAs to conduct SOC 2 engagements using criteria from the AICPA (Trust Service Principles, AT 101) and the CSA Cloud Controls Matrix.
Level 2: CSA STAR Certification - This level is a technology-neutral certification that is based on a rigorous, independent third-party assessment of a cloud service provider's security. The certification leverages the requirements of the ISO/IEC 27001:2005 management system standard together with the CSA Cloud Controls Matrix.
Level 2: CSA C-STAR Assessment - This level is an assessment that is based on an independent third-party assessment of a cloud service provider's security for the Greater China market. The assessment harmonizes CSA best practices with Chinese national standards. 

Level 3 is a forthcoming program that will provide an ongoing automated confirmation of the current security practices of cloud providers. Providers will publish their security practices according to CSA formatting and specifications, including validation of Cloud Controls Matrix (CCM), Cloud Trust Protocol (CTP), and CloudAudit (A6) standards. Customers and tool vendors will retrieve the information in a variety of contexts.

43
Q

Which of the following tests of the BCP provides the highest level of simulation, including notification and resource mobilization?

A. Functional Drill
B. Tabletop exercise
C. Walk through drill
D. Full interruption test

A

D. Full interruption test

Explanation:
A full-interruption test provides the highest level of simulation, including notification and resource mobilization. A full-interruption test, also referred to as a full-scale test, is the most comprehensive BCP test. A real-life emergency is simulated as closely as possible. It is important to properly plan this type of test to ensure that business operations are not negatively affected. This usually includes processing data and transactions using backup media at the recovery site. All employees must participate in this type of test, and all response teams must be involved.

None of the other tests provides the same level of simulation as a full-interruption test.

A tabletop exercise, also referred to as a structured walk-through test, provides the least amount of simulation. During this test, business unit managers and BCP process employees are present. A discussion of each individual’s responsibilities is completed. Sometimes it includes individual and team training on the step-by-step procedures outlined in the BCP. The testing process provides clarification and highlighting of critical plan elements. Problems are also noted. This is the first type of test that is usually completed for a BCP.

A walk-through drill, also referred to as a simulation test, is more complicated than a tabletop exercise. In a walk-through drill, a scenario is selected and applied to the BCP. Usually only the operational and support personnel for the BCP process attend this meeting. Attendees practice certain functional steps to ensure that they have the knowledge and skills needed to complete them. For this test, it is critical for employees to act out the critical steps, recognize difficulties, and resolve problems.

A functional drill, also referred to as a parallel test, involves moving personnel to the recovery site(s) to attempt to establish communications and perform real recovery processing. The drill will help the organization determine whether following the BCP will successfully recover critical systems at an alternate processing site. Because a functional drill fully tests the BCP, all employees are involved. It demonstrates emergency management capabilities and tests procedures for evacuation, medical response, and warnings.

44
Q

A cloud service provider (CSP) is deploying a new cloud data center. As part of this deployment, new hardware has been purchased for the physical infrastructure for the data center. You are concerned about the secure configuration of hardware, particularly that the new hardware will not be updated properly. Which of the following is MOST likely to be overlooked as part of this secure configuration?

A. Application Updates
B. Operating System Updates
C. Driver Updates
D. Firmware Updates

A

D. Firmware Updates

Explanation:
Firmware updates are most likely to be overlooked as part of the secure configuration of hardware. Firmware is permanent software that is programmed into read-only memory. While firmware can be updated, firmware updates do not often garner as much attention as operating system, application, or driver updates.

Operating system, application, and driver updates are published by the hardware vendors and often announced by the vendor. In some cases, vendors, such as Microsoft, have a regular day on which all such updates are published.

CSPs must keep up with the hardware deployed and ensure that all updates are made in a timely manner.

45
Q

Which term is used for SaaS storage that exists only as long as its instance is up?

A. Raw Storage
B. Ephemeral Storage
C. Unstructured Storage
D. Long-term Storage

A

B. Ephemeral Storage

Explanation:
Ephemeral storage is Software as a Service (SaaS) storage that exists only as long as its instance is up.

Raw storage is an SaaS storage option in a virtualization environment that enables a storage logical unit number (LUN) to be directly connected to a virtual machine (VM) from the storage area network (SAN).

Long-term storage is an SaaS storage option tailored for use when data must be archived.

Unstructured storage is a Platform as a Service (PaaS) storage that does not store data in a traditional row-column database. The other type of PaaS storage is structured storage, which is stored in a traditional database.

46
Q

Which of the following provides guidelines that can be used to assess the security of an organization that handles credit card data?

A. FIPS 140-2
B. PCI DSS
C. ISO/IEC 27017
D. Common Criteria

A

B. PCI DSS

Explanation:
The Payment Card Industry Data Security Standard (PCI-DSS) is a standard for handling credit card data. It provides recommendations that, when followed and audited, can be used to provide assurance to customers and business partners that an organization is doing all they can to protect this information.

Common Criteria is a system developed to provide security assessment audits of computing products.

Federal Information Processing Standards (FIPS) 140-2 specifies standards for use in computer systems by non-military American government agencies and government contractors.

ISO/IEC 27017 is an addition to ISO/IEC 27002 and includes further controls with implementation guidance that specifically relate to cloud services.

47
Q

Which of the following industries faces specialized compliance requirements by the NERC/CIP?

A. Accounting Organizations
B. Medical Providers
C. Electric Utilities
D. Payment Card Industry

A

C. Electric Utilities

Explanation:
The North American Electric Reliability Corporation (NERC) is a non-profit that works with all stakeholders to develop Critical Infrastructure Protection (CIP) standards for power system operation and monitoring, and it enforces compliance with those standards.

Medical providers are regulated by the Health Insurance Portability and Accountability Act (HIPAA).

The payment card industry faces compliance requirements from the Payment Card Industry Data Security Standard (PCI DSS).

Accounting organizations are guided by the Generally Accepted Accounting Principles (GAAP). Financial reporting is regulated or standardized by organizations such as the Securities and Exchange Commission (SEC), the Financial Accounting Standards Board (FASB), the International Accounting Standards Board (IASB), and by the Sarbanes-Oxley (SOX) Act.

48
Q

Which of the following is an international standard that focuses on designing, implementing, and reviewing risk management processes and practices?

A. ISO/IEC 27010
B. ISO/IEC 27000
C. ISO/IEC 27006
D. ISO/IEC 31000

A

D. ISO/IEC 31000

Explanation:
ISO/IEC 31000 is an international standard that focuses on designing, implementing, and reviewing risk management processes and practices.

ISO/IEC 27000 covers information security management systems, including an overview and vocabulary.

ISO/IEC 27010 covers information security management for inter-sector and inter-organizational communications.

ISO/IEC 27006 covers the requirements for bodies providing audit and certification of information security management systems.

49
Q

While reviewing the audit logs related to your company’s cloud environment, you discover a series of events that you suspect was carried out by an unauthorized user. You need to determine which user or machine is responsible for these events. Which of the following attributes should you examine?

A. Source Address
B. Application Identifier
C. Destination Address
D. URL form data

A

A. Source Address

Explanation:
You should examine the source address to identify which user or machine is responsible for these events. The source address could include the device identifier, IP address, cell tower ID, or mobile telephone number, depending on the device used.

The destination address will reveal which assets were accessed as part of the event. The application identifier will tell you which application was used as part of the event. The URL form data will tell you which web address was involved in the attack. While the form data may provide information on the unauthorized user, unauthorized users often do not enter valid data into forms.

In this scenario, you want to identify who completed an event. To determine this information, you should examine the following attributes:

Source address
User identity (if authenticated or otherwise known)
Geolocation
Service name and protocol
Window, form, or page (such as URL address)
Application address
Application identifier
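
As an illustration of using the source address attribute, here is a minimal Python sketch that filters hypothetical audit events by a suspect source address; the field names are illustrative, not a specific CSP's log schema:

```python
# Hypothetical audit events; real field names depend on the CSP's log format.
events = [
    {"source_address": "203.0.113.7", "user": "unknown", "action": "delete_object"},
    {"source_address": "198.51.100.4", "user": "alice", "action": "read_object"},
    {"source_address": "203.0.113.7", "user": "unknown", "action": "modify_acl"},
]

suspect = "203.0.113.7"
related = [e for e in events if e["source_address"] == suspect]
for event in related:
    print(event["action"], "from", event["source_address"])
```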
50
Q

As part of the SLA agreement with your CSP, the CSP must ensure that every component is redundant. Which countermeasure does this provide?

A. Continuous Uptime
B. Access Controls
C. Automation of Controls
D. Physical Controls

A

A. Continuous Uptime

Explanation:
Ensuring that every component is redundant provides continuous uptime. This protects against component failure and allows updates to be performed without affecting performance.

Automation of controls is a countermeasure that ensures any controls are implemented immediately. For example, VM images should include anti-malware software, encryption, and audit logging. This ensures that any new VMs deployed will automatically have the controls needed.

Access controls are countermeasures that protect resources no matter how small or large the cloud deployment is. Strong access control policies ensure the privacy and confidentiality of data in the cloud.

Physical controls are countermeasures that ensure that access to the physical resources on which the cloud is deployed is limited.

51
Q

Which of the following functions is used to determine the user’s right to access a certain resource?

A. Federated Identity management
B. Access management
C. Single Sign-On
D. Authorization

A

D. Authorization

Explanation:
Authorization determines the user’s right to access a certain resource.

Access management ensures that users can access relevant resources based on their credentials and characteristics of their identity.

Single sign-on (SSO) ensures that a single user authentication process grants access to multiple information technology (IT) systems or even organizations.

Federated identity management (FIM) provides the policies, processes, and mechanisms that manage identity and trusted access to systems across organizations.

52
Q

You are an administrator for a cloud service provider (CSP). During negotiations with a new client, the client mentions several security concerns about the hosting environment. The client is particularly concerned that attackers might be able to alter the logic of a secure keyboard-video-mouse (KVM) switch. Which secure KVM feature will prevent this from happening?

A. Isolated Data Channels
B. Housing Intrusion Detection
C. Fixed Firmware
D. Safe Buffer Design

A

C. Fixed Firmware

Explanation:
Fixed firmware cannot be reprogrammed, which prevents modifications to the logic of a secure KVM switch. A secure KVM prevents data loss from the server to a connected device.

A safe buffer design for a secure KVM means that a memory buffer is not included and the keyboard buffer is cleared after data is transmitted. The memory buffer is different from the logic of the KVM.

Housing intrusion detection is a feature that alerts administrators if the housing for the secure KVM has been physically opened. In most cases, the secure KVM becomes inoperable if this occurs.

Isolated data channels prevent data from being transferred between connected devices and the secure KVM. But this does not protect KVM logic.

53
Q

Which activity will require a level of cooperation from the cloud provider?

A. Auditing
B. Adding Resources
C. Provisioning of VMs
D. Deleting Resources

A

A. Auditing

Explanation:
Auditing will require a level of cooperation from the provider, as providers are normally very reluctant to allow physical access to their facilities or to share network diagrams.

Provisioning of VMs is typically an activity that the customer can do themselves.

Both adding and removing computing resources can be done by the customer and are often done automatically based on workload.

54
Q

Your company is currently negotiating a service agreement with a cloud provider. Management has asked you to determine if collecting and making available necessary evidence related to the operation and use of the cloud is included as part of the service agreement. What is the term for this?

A. Auditability
B. Performance
C. Governance
D. Privacy

A

A. Auditability

Explanation:
Auditability is collecting and making available necessary evidence related to the operation and use of the cloud.

Performance includes several non-functional facets relating to the operation of a cloud service, including:

    Service availability
    Service request response time
    Service request transaction rate
    Service request latency
    Data throughput rate
    Number of concurrent service requests
    Data storage capacity

Governance is the system by which the provisioning and usage of cloud services are directed and controlled. Internal and external factors affect governance.

Privacy is the protection of personally identifiable information (PII).

55
Q

Which of the following statements is true regarding the physical environment of the cloud?

A. The power and pipe components limit the resources that can be deployed in the data center
B. The basic level of service is referred to as power, pipe and ping
C. All of the statements are true
D. To ensure that there is no single point of failure, all components that could break down should be replicated

A

C. All of the statements are true

Explanation:
All of these statements regarding the physical environment of the cloud are true.

The basic level of service is referred to as power, pipe, and ping. The power and pipe components limit the resources that can be deployed in the data center. The power component is electrical power supplied to the data center. The pipe component is the Internet connection used by the data center. The ping component is the ability to remotely connect to the data center’s servers.

To ensure that there is no single point of failure, all components that could break down should be replicated.

56
Q

In which of the following risk management responses does the organization do nothing?

A. Mitigation
B. Avoidance
C. Acceptance
D. Transfer

A

C. Acceptance

Explanation:
In some cases, all available mitigations may cost more than the expected loss from the issue. In that case, it makes sense to accept the risk.

Avoidance is the act of discontinuing the activity or process that is causing the risk.

Mitigation is the act of selecting a security control that reduces either the likelihood or the impact of a risk.

Transfer is the process of transferring the risk to a third party, such as an insurance company.

57
Q

According to the STRIDE threat model, which threat occurs when the attacker assumes the identity of a trusted subject?

A. Spoofing
B. Elevation of Privilege
C. Information Disclosure
D. Tampering

A

A. Spoofing

Explanation:
According to the Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege (STRIDE) threat model, a spoofing threat occurs when the attacker assumes the identity of a trusted subject. The model covers the six threat categories named in its acronym.

A tampering threat occurs when data is altered by the attacker. An information disclosure threat occurs when information is obtained by an unauthorized user. An elevation of privilege threat occurs when an attacker gains a higher privilege level.

58
Q

Your company’s IT department has decided to implement Apache CloudStack. What is Apache CloudStack?

A. A paid and open-source computer software for building Amazon Web Services (AWS)-compatible private and hybrid cloud computing environments
B. An open source cloud computing software for creating, managing, and deploying infrastructure
C. A third party entity or company that extends or enhances value to cloud services customers through relationships with multiple cloud service providers
D. A software application that is accessed via the Internet and may include an agent or applet installed locally on the user's device

A

B. An open source cloud computing software for creating, managing, and deploying infrastructure

Explanation:
Apache CloudStack creates, manages, and deploys clouds. It is an open-source application.

A cloud application or cloud app is a software application accessed via the Internet and which may include an agent or applet installed locally on the user’s device. Microsoft Office 365, Salesforce, and GSuite are examples of cloud applications.

A cloud service broker (CSB) is a third-party entity or company that extends or enhances value to cloud services customers through relationships with multiple cloud service providers. Examples of CSBs include Blue Coat Systems and Netskope.

Eucalyptus is a paid and open-source computer software for building Amazon Web Services (AWS)-compatible private and hybrid cloud computing environments.

59
Q

Which of the following processes will be required of open source software in a regulated environment?

A. Credentialed
B. Accreditation
C. Certification
D. Validation

A

D. Validation

Explanation:
In a regulated environment, open source applications must be validated to be acceptable. This may mean performing a security validation process.

Certification is a process that evaluates the software’s technical components for functionality. It does not validate the software’s use in a regulated environment.

Accreditation is the full approval and acceptance of the software by management.

While there is no credentialing process in software assessment, there are credentialed vulnerability scans that are run with administrator rights, resulting in a deeper scan.

60
Q

You have recently been hired by an organization to help them classify the sensitive data that they collect from their customers. While performing data discovery, you notice that sexual orientation and religious affiliation information are being collected. Within which data category does this information fit?

A. Internet Data
B. Personal Data
C. Biometric data
D. Sensitive Data

A

D. Sensitive Data

Explanation:
Sexual orientation and religious affiliation fit within the sensitive data category. Other information in this category includes health information and political beliefs.

Personal data includes address, phone number, date of birth, and gender. Personal data can usually be discovered with a minimal amount of investigation.

Biometric data includes fingerprints, finger scans, retina scans, and other biometric data that would need to be captured using a biometric scanner or software.

Internet data includes browsing habits, cookies, and other information regarding an individual’s Internet usage.

61
Q

As part of an audit of their cloud services, an organization is documenting the list of current services and resources utilized from its CSP. In which phase of the audit planning process is the organization involved?

A. Refine the audit process, and document lessons learned
B. Define the audit scope
C. Define the audit objectives
D. Conduct the audit

A

B. Define the audit scope

Explanation:
The organization is involved in the define the audit scope phase. This phase includes the following steps:

Document the core focus and boundaries of the audit.
Define the key components of services.
Define the cloud services to be audited.
Define the geographic locations that are permitted and required and those that are actually being audited.
Define the key stages to audit.
Document the CSP contacts.
Define the assessment criteria and metrics.
Document final reporting dates.

The define the audit objectives phase includes the following steps:

Document and define the audit objectives.
Define the audit outputs and format.
Define the frequency and the audit focus.

The conduct the audit phase includes ensuring that the appropriate personnel are engaged and that the appropriate tools are available.

The refine the audit process and document lessons learned phase includes changing the audit process based on feedback from the audits that have been completed. This includes factoring in any provider changes that have occurred.

62
Q

Your company’s cloud service provider (CSP) manages and allocates compute resources using either a per-guest OS basis or a per-host basis within the cluster. Which of the following is NOT used to manage compute resources?

A. Filters
B. Reservations
C. Limits
D. Shares

A

A. Filters

Explanation:
Filters are NOT used to manage compute resources. Filters are used to allow or deny content or access to resources. Filters are not related to the compute resources, but rather to the network and communication component of the cloud infrastructure.

Reservations, limits, and shares are used to manage compute resources. Reservations guarantee a minimum resource allocation to a guest and can be used for CPU or RAM resources. Limits create a maximum fixed or expandable resource allocation. Shares are values assigned to prioritize compute resource access for all guests assigned.
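
As a hedged sketch only (the class and attribute names below are assumptions for illustration, not any particular hypervisor's API), the following shows how reservations, limits, and shares could be represented for guest VMs:

    # Illustrative data model for per-guest compute allocation.
    # "reservation" = guaranteed minimum, "limit" = maximum, "shares" = relative priority.
    from dataclasses import dataclass

    @dataclass
    class GuestResourcePolicy:
        name: str
        cpu_reservation_mhz: int    # guaranteed minimum CPU
        cpu_limit_mhz: int          # maximum CPU the guest may consume
        memory_reservation_mb: int  # guaranteed minimum RAM
        memory_limit_mb: int        # maximum RAM
        shares: int                 # relative priority under contention

    web_vm = GuestResourcePolicy("web01", 1000, 4000, 2048, 8192, shares=2000)
    batch_vm = GuestResourcePolicy("batch01", 500, 2000, 1024, 4096, shares=500)

    # Under contention, web01's higher share value gives it proportionally
    # greater access to the pooled compute resources than batch01.
    print(web_vm, batch_vm, sep="\n")
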

63
Q

You are the security administrator for your company's cloud deployment that is physically located in Chicago, IL. Recently, several new laws and regulations have been enacted that affect your company's data. You have laws and regulations from the EU, the US, the state of IL, and the city of Chicago that affect the same type of data. Which statement is TRUE of how you should handle this?

A. The US laws and guidelines should take precedence over all others
B. The least strict laws and guidelines should be adopted
C. The strictest laws and guidelines should be adopted
D. The EU laws and guidelines should take precedence over all others
E. The IL laws and guidelines should take precedence over all others.

A

C. The strictest laws and guidelines should be adopted

Explanation:
In this scenario, the strictest laws and guidelines should be adopted. If multiple laws and regulations affect a company’s data, it is the responsibility of the organization to thoroughly review all the laws and regulations and determine which are the strictest. By adopting the strictest laws and regulations, you ensure that the company is thoroughly protected regardless of the entity.

You should not adopt the least strict laws and guidelines as this would leave you open to legal issues.

No single jurisdiction's laws and guidelines, whether international, federal, state, or local, automatically take precedence over the others.

Organizations may need to solicit legal advice to help determine which laws and regulations should apply to the organization’s data.

64
Q

You are hired as a security architect for a company. Your manager informs you that the company currently hosts the cloud environment on corporate resources. Based on the business continuity plan, the company uses a cloud provider to provide the backup solution. Which model best describes this solution?

A. Private Architecture, Cloud service backup
B. Cloud Operations, Cloud Service Backup
C. Cloud operations, third party cloud backup provider
D. Private Architecture, Private Backup

A

A. Private Architecture, Cloud service backup

Explanation:
The scenario given is best described as a private architecture, cloud service backup solution. A private architecture solution hosts the cloud environment on private, company-owned resources. A cloud service backup uses a cloud provider to provide the backup solution.

A cloud operations solution uses a cloud provider to host the cloud environment. A third-party cloud backup provider uses a different cloud provider than the operations provider to host the backup solution. A private backup uses private, company-owned resources for the backup solution.

65
Q

Management is concerned that deploying cloud resources will result in the failure of storage, memory, routing, and reputation separation between different tenants. Which risk class identified in European Network and Information Security Agency (ENISA) covers this?

A. Management Interface Compromise
B. Lock In
C. Isolation Failure
D. Loss of governance

A

C. Isolation Failure

Explanation:
The isolation failure risk class identified in ENISA's Cloud Computing: Benefits, Risks, and Recommendations for Information Security includes the failure of storage, memory, routing, and reputation separation between different tenants.

The management interface compromise class includes the risk that the management interface is accessible through the Internet. The lock-in class includes the risk that migrating from one service provider to another can be difficult because of the different tools. The loss of governance class includes the risk that the client cedes control to the cloud provider.

66
Q

Which of the following can be used to separate VMs in the cloud at both Layer 2 and Layer 3?

A. Blacklists
B. VPNs
C. VLANs
D. Subnets

A

C. VLANs

Explanation:
A virtual local area network (VLAN) provides logical isolation of particular segments of a network at both Layer 2 and Layer 3. Even though all the machines may be physically connected to the same switch, devices in a VLAN can only communicate directly with other members of the same VLAN. Traffic between VLANs must first pass through a router, where security access lists can be implemented to control access from one VLAN to another.

Virtual private networks (VPNs) are used for remote access to a network and are not used to separate VMs in the cloud at both Layer 2 and Layer 3.

Subnets can be used to separate devices in the cloud, but this separation is at Layer 3 only, not both Layer 2 and Layer 3.

Blacklists are used to deny access to a variety of resources, such as email addresses and websites, but they are not used to separate VMs in the cloud at both Layer 2 and Layer 3.
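
Purely as an illustrative toy model (not an actual switch or router configuration; the VLAN IDs and access list are invented), the snippet below captures the reachability rule described above: hosts in the same VLAN communicate directly, while traffic between VLANs must be permitted by a router access list.

    # Toy model of VLAN separation. Real separation is enforced by switches
    # and routers; this only illustrates the reachability rule.
    hosts = {
        "vm-a": {"vlan": 10},
        "vm-b": {"vlan": 10},
        "vm-c": {"vlan": 20},
    }

    # Hypothetical router access list: which inter-VLAN flows are permitted.
    inter_vlan_acl = {(10, 20): False, (20, 10): False}

    def can_communicate(src: str, dst: str) -> bool:
        src_vlan, dst_vlan = hosts[src]["vlan"], hosts[dst]["vlan"]
        if src_vlan == dst_vlan:
            return True  # same VLAN: direct Layer 2 communication
        # different VLANs: traffic must be routed and pass the access list
        return inter_vlan_acl.get((src_vlan, dst_vlan), False)

    print(can_communicate("vm-a", "vm-b"))  # True  - same VLAN
    print(can_communicate("vm-a", "vm-c"))  # False - blocked by the access list
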

67
Q

You are using encryption for the data at rest in your company’s public cloud deployment. Which of the following statements regarding the key management is FALSE?

A. Keys should be stored in secure dedicated hardware
B. Cryptographic keys should never be transmitted in the clear
C. Random number generation should be conducted as a trusted process
D. Key management functions should be coordinated with the CSP

A

D. Key management functions should be coordinated with the CSP

Explanation:
Key management functions should NOT be coordinated with the cloud service provider (CSP). They should be conducted separately from the CSP.

All of the other statements regarding key management are true. Random number generation should be conducted as a trusted process. Keys should be stored in secure dedicated hardware. Cryptographic keys should never be transmitted in the clear.

68
Q

Your company is working with a cloud service provider (CSP) to implement a new application for accessing the cloud. Management wants the new application to be developed according to ISO/IEC 27034’s application security guidelines.

Developers work with the CSP to document the Organization Normative Framework (ONF). The CSP has asked that you ensure that the regulatory section of the ONF is complete. What does this section contain?

A. The controls that are required to protect an application based on the identified threats, the context, and the targeted level of trust
B. The required and available technologies that are applicable to application security
C. The standards, laws and regulations that affect the company and its applications
D. The application security policies, standards, and best practices adopted by the company

A

C. The standards, laws and regulations that affect the company and its applications

Explanation:
The regulatory section of the ONF contains the standards, laws, and regulations that affect the company and its applications.

The business context section contains the application security policies, standards, and best practices adopted by the company.

The technical context section contains the required and available technologies that are applicable to application security.

The application security control library section contains the controls that are required to protect an application based on the identified threats, the context, and the targeted level of trust.

69
Q

Your company is deciding whether to use proprietary application programming interfaces (APIs) versus open source APIs. Which of the following is a benefit of open source APIs?

A. Ability to review code
B. Vendor liability
C. Formal change management
D. Formal patch management

A

A. Ability to review code

Explanation:
A benefit of open source APIs is the ability to review code. Another benefit is the reduced price of open source APIs.

All of the other options are benefits of proprietary (vendor) APIs. Vendor liability, formal patch management, and formal change management are typically available when you use proprietary APIs.

70
Q

Which of the following is an end-user advocacy group that is dedicated to accelerating cloud’s successful adoption and to drilling down into the standards, security, and interoperability issues that surround the transition to the cloud?

A. NIST
B. FIPS
C. TCI
D. CWG

A

D. CWG

Explanation:
The Cloud Working Group (CWG), which took over the mission of the Cloud Standards Customer Council™ (CSCC™), is an end-user advocacy group. It is dedicated to accelerating cloud’s successful adoption, as well as to drilling down into the standards, security, and interoperability issues that surround the transition to the cloud.

The National Institute of Standards and Technology (NIST) is a national organization that sets standards for multiple industries and fields, including U.S. government agencies. NIST 800-53 is a guidance document with the primary goal of ensuring that appropriate security requirements and controls are applied to all U.S. federal government information in information management systems.

The Federal Information Processing Standard (FIPS) is a set of standards that are published by NIST’s Computer Security Resource Center for use by U.S. government information systems. FIPS Publication 140-2, (FIPS PUB 140-2), is a U.S. government computer security standard used to approve cryptographic modules.

Established by the Cloud Security Alliance (CSA), the Trusted Cloud Initiative (TCI) helps cloud service providers (CSPs) develop identity, access, and compliance management guidelines.

71
Q

While monitoring your company's cloud solution, a technician notices that several hours' worth of entries in an event log have been erased. Upon research, you realize that this log erasure will allow a user to deny that certain events occurred. Which type of threat is this according to the STRIDE threat model?

A. Spoofing
B. Denial of Service
C. Elevation of Privilege
D. Repudiation

A

D. Repudiation

Explanation:
According to the Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege (STRIDE) threat model, this is a repudiation threat because users will be able to deny that certain events occurred. The model covers the six threat categories named in its acronym.

A denial of service threat occurs when an attacker overloads a system so that legitimate access is denied. An elevation of privilege threat occurs when an attacker gains a higher privilege level. A spoofing threat occurs when the attacker assumes the identity of a trusted subject.

72
Q

Which of the following is required to protect web-based applications per PCI-DSS guidelines?

A. NGFW
B. WAF
C. DAM
D. API Gateway

A

B. WAF

Explanation:
A web application firewall (WAF) is required to protect web-based applications per PCI-DSS guidelines. A WAF is a firewall specifically designed to assess all input to the applications. It looks for common web-based attacks and other web-specific compromises.

A next-generation firewall (NGFW) is one that performs deep packet inspection. It is not designed specifically for web applications and is not required to protect web-based applications per PCI-DSS.

A Database Activity Monitor (DAM) is used to protect a database application, not a web-based application.

An application programming interface (API) gateway is used to impose controls on API activity. It is not required to protect web-based applications per PCI-DSS guidelines.

73
Q

You have discovered that one of your firewalls, due to a misconfiguration, is allowing some unwanted traffic. You have corrected the issue. With which of the following methods did you treat the risk?

A. Avoid
B. Accept
C. Share
D. Mitigate

A

D. Mitigate

Explanation:
You have addressed this risk issue with a control that modified the configuration. This is known as risk mitigation.

You did not use avoidance to address the issue. Avoidance means to stop doing or using what is causing the issue. In this case, it would amount to decommissioning the firewall.

Risk acceptance means leaving things as they are. While it sounds risky, in many cases there are no mitigations possible or the mitigations are more expensive than the expected potential loss from a breach occurring as a result of the issue.

You have not shared the risk. Risk sharing means that two entities share the risk. An example would be implementing a cloud backup solution so that both your organization and the cloud provider share the risk of losing data.

74
Q

With which cloud model is identity and access management the most critical?

A. PaaS
B. XaaS
C. IaaS
D. SaaS

A

A. PaaS

Explanation:
While identity and access management (IAM) is important to all cloud deployments, it is most critical in Platform as a Service (PaaS) models. Because PaaS comprises a development platform that not only provides but encourages shared access, it is critical that the identity and access management system be robust. Otherwise you lose both security and accountability, which is a key need in a shared environment.

One of the most important security considerations with Software as a Service (SaaS) is the lack of security surrounding app stores. In many cases, these app stores make apps available that the cloud vendor did not develop but simply resells. These apps may not have been adequately tested from a security standpoint. Google Android has had issues with this in the past. It is possible and advisable to restrict access in the app store to only company developed or approved apps.

The biggest security consideration for Infrastructure as a Service (IaaS), especially in a shared public cloud, is the proper segmentation of resources between tenants. This includes the isolation of virtual machines (VMs) from various tenants.

Anything as a Service (XaaS) is a general term that applies to all forms of cloud service offerings and not to a specific type.

75
Q

Which aspect of systems and communications protections varies between being the responsibility of the enterprise, the cloud service provider (CSP), or both, depending on which cloud model is used?

A. Physical Security
B. Data Security
C. Platform Security
D. Infrastructure Security

A

C. Platform Security

Explanation:
Depending on which cloud model is being used, platform security may be the responsibility of the enterprise, the CSP, or a shared responsibility. In the Infrastructure as a Service (IaaS) model, it is an enterprise responsibility. In the Platform as a Service (PaaS) model, it is a shared responsibility. In the Software as a Service (SaaS) model, it is a CSP responsibility.

Physical security is a CSP responsibility no matter which model is used.

Data security is an enterprise responsibility no matter which model is used.

Infrastructure security is a shared responsibility in IaaS models, and a CSP responsibility in PaaS and SaaS models.

76
Q

Your colleague is confused about the difference between a CSP and an MSP. Which of the following would you include as a key differentiator in an explanation?

A. Control exerted over the data
B. Location of the hardware
C. Cost of the services consumed
D. Time required to provision a service

A

A. Control exerted over the data

Explanation:
The main difference between a cloud service provider (CSP) and a managed service provider (MSP) is the management influence and control exerted over the data and operational processes. In an MSP, the consuming enterprise sets the governance. In a CSP, the service provider sets the governance.

The location of hardware and the cost of services are not core differentiators between a CSP and MSP.

The time required to provision services is a function of technology selection and incorporated management processes. It is not a function of whether the service provider is an MSP or CSP.

77
Q

Which of the following is another term for de-identification?

A. De-Duplication
B. Data Loss Prevention
C. Hashing
D. Anonymization

A

D. Anonymization

Explanation:
Anonymization is a process that removes any identifying information, such as names and addresses, and is another term for de-identification.

De-duplication is a process that removes redundant data from a data set.

Hashing algorithms are used to verify the integrity of data.

Data loss prevention (DLP) software can be used to monitor the edge of the network or to monitor a single device for the exfiltration of sensitive data.
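
As a simplified, hedged sketch (the record and field names are assumptions for the example), the snippet below strips direct identifiers from a record, which is the essence of de-identification/anonymization:

    # Illustrative de-identification: remove direct identifiers from a record.
    # Real anonymization must also consider indirect identifiers and
    # re-identification risk, which this sketch does not address.
    DIRECT_IDENTIFIERS = {"name", "address", "phone", "email"}

    def de_identify(record: dict) -> dict:
        return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    customer = {
        "name": "Jane Doe",
        "address": "123 Main St",
        "email": "jane@example.com",
        "purchase_total": 42.50,
        "region": "Midwest",
    }

    print(de_identify(customer))  # identifying fields removed; analytic fields kept
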

78
Q

You will be storing confidential data in the cloud. You need to ensure that the data is protected while at rest. Which technologies could you deploy? (Choose two.)

A. IPSec
B. TLS
C. SSL
D. Volume Encryption
E. Drive Encryption

A

D. Volume Encryption
E. Drive Encryption

Explanation:
You could deploy drive or volume encryption to ensure that data is protected while at rest. This type of encryption encrypts the entire drive or volume and protects data at rest.

SSL, TLS, and IPSec are technologies that encrypt communication. They are used to protect data in transit.
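
Drive and volume encryption are normally handled at the OS, hypervisor, or storage layer (for example, full-disk encryption). Purely to illustrate the general principle of protecting data at rest, the hedged sketch below encrypts data before it is written to disk using the third-party "cryptography" package; it is not a substitute for actual volume encryption.

    # Simplified illustration of protecting data at rest: only ciphertext
    # ever reaches the disk. Requires: pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, keys live in dedicated secure hardware
    cipher = Fernet(key)

    plaintext = b"confidential customer data"
    ciphertext = cipher.encrypt(plaintext)

    with open("data.enc", "wb") as f:
        f.write(ciphertext)

    with open("data.enc", "rb") as f:
        recovered = cipher.decrypt(f.read())
    assert recovered == plaintext
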

79
Q

A broad definition of cloud computing includes the following characteristics:

Broad network access
On-demand services
Resource pooling
Measured or “metered” service

Which of the following statements best describes on-demand services?

A. Uses advanced routing and other technologies to prevent network bandwidth bottlenecks
B. Charges customers for only the resources they use
C. Allows customers to scale their needs in real time without involving the provider
D. Allows the cloud provider to meet customer demands by allowing assets to be shared by multiple customers

A

C. Allows customers to scale their needs in real time without involving the provider

Explanation:
On-demand services refers to allowing customers to scale their needs in real time without involving the provider.

Broad network access refers to using advanced routing and other technologies to prevent network bandwidth bottlenecks.

Resource pooling allows the cloud provider to meet customer demands by allowing assets to be shared by multiple customers. Resource pooling enables multi-tenancy.

Measured or “metered” service refers to charging cloud computing customers for only the resources they use.

According to (ISC)2, the key characteristics of cloud computing include on-demand self-service, broad network access, multi-tenancy, rapid elasticity and scalability, resource pooling, and metered service.

Self-service means that customers control the services that they need. Multi-tenancy means that multiple tenants may be using the same resources. Rapid elasticity means that resources can be flexibly allocated as needed for immediate usage, instead of requiring the customer to purchase resources. Rapid scalability means the cloud provider has the ability to quickly meet the increasing or decreasing needs of the tenant.

80
Q

You are currently working to ensure that the data retention policies for your company’s cloud deployment are correct. Which phase of the cloud data life cycle will these policies affect?

A. Store
B. Archive
C. Create
D. Destroy

A

B. Archive

Explanation:
The data retention policies will affect the Archive phase of the cloud data life cycle. The data archive policies will also affect the Archive phase.

The data destruction policies will affect the Destroy phase of the cloud data life cycle. The data audit policies will affect all phases of the cloud data life cycle.

81
Q

Which of the following is measured in data loss, not time?

A. ALE
B. RTO
C. RPO
D. MTD

A

C. RPO

Explanation:
Recovery point objective (RPO) is measured in data loss, not time. It is the maximum allowable amount of data you can afford to lose during an issue. This value can be reduced by more frequent backups.

Maximum tolerable downtime (MTD) is the maximum amount of time you can continue without a resource.

Recovery time objective (RTO) is the target time in which recovery occurs. It should be less than the MTD to provide extra time.

Annualized loss expectancy (ALE) is the expected yearly financial loss from an event. It is calculated by multiplying the single loss expectancy (SLE) by the annualized rate of occurrence (ARO), which is derived from the event's historical frequency.
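
To make the ALE relationship concrete, here is a quick worked calculation with assumed figures (the asset value, exposure factor, and ARO are illustrative only):

    # ALE = SLE x ARO, where SLE (single loss expectancy) = asset value x exposure factor
    # and ARO (annualized rate of occurrence) comes from historical frequency.
    asset_value = 200_000     # assumed value of the affected asset, in dollars
    exposure_factor = 0.25    # assumed fraction of the asset lost per event
    aro = 0.5                 # assumed: one event every two years

    sle = asset_value * exposure_factor   # 50,000 per event
    ale = sle * aro                       # 25,000 expected loss per year

    print(f"SLE = ${sle:,.0f}, ALE = ${ale:,.0f}")
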

82
Q

Which of the following is NOT a challenge with data discovery in the cloud?

A. Creating the data
B. Performing preservation and maintenance
C. Accessing the data
D. Identifying where the data resides

A

A. Creating the data

Explanation:
Creating the data is NOT a challenge with data discovery in the cloud.

All of the other listed options are challenges with data discovery in the cloud.

Identifying where the data resides is a challenge because you do not have access to the physical resources on which the data resides, and it is often hard to discover the actual physical location.

Accessing the data can be a challenge because of the rights and privileges that are configured and managed in the cloud. Granting the appropriate access permissions may require more time and effort.

Performing preservation and maintenance is a challenge because oftentimes this task is overlooked. Preservation and maintenance of data should be clearly spelled out in the service level agreement (SLA) with the cloud service provider (CSP).
