CompTIA Security+ (SY0-701) Exam 4 Flashcards
To ensure ongoing compliance with data protection laws, a company decides to implement a system for compliance monitoring. The most effective feature of this system would be:
A yearly manual audit by an external consultant.
Quarterly employee satisfaction surveys.
Automated alerts for any non-compliance issues.
An anonymous tip line for employees to report violations.
Automated alerts for any non-compliance issues offer real-time monitoring and quick response capabilities, making them the most effective feature for ensuring ongoing compliance. This proactive approach allows the company to identify and address compliance issues as they arise, reducing the risk of fines, sanctions, and reputational damage. While yearly audits, an anonymous tip line, and employee surveys can complement a compliance program, they do not provide the same level of immediacy and continuous oversight as automated monitoring.
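To make the idea concrete, here is a minimal sketch of automated compliance alerting, assuming a hypothetical set of configuration rules; the rule names and thresholds are illustrative, not exam-prescribed values.

    # Minimal sketch of automated compliance alerting (hypothetical rules).
    from datetime import datetime

    RULES = {
        "encryption_at_rest_enabled": lambda cfg: cfg.get("encryption_at_rest") is True,
        "password_min_length_12": lambda cfg: cfg.get("password_min_length", 0) >= 12,
        "audit_logging_enabled": lambda cfg: cfg.get("audit_logging") is True,
    }

    def check_compliance(system_config: dict) -> list[str]:
        """Return an alert message for every rule the configuration violates."""
        alerts = []
        for rule_name, rule_check in RULES.items():
            if not rule_check(system_config):
                alerts.append(f"[{datetime.utcnow().isoformat()}] NON-COMPLIANT: {rule_name}")
        return alerts

    # Example run: this configuration triggers an alert for the password rule.
    print(check_compliance({"encryption_at_rest": True,
                            "password_min_length": 8,
                            "audit_logging": True}))

Running such checks continuously, rather than once a year, is what gives automated alerting its advantage over periodic manual audits.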
For a company focused on minimizing operational costs while maintaining security, which architectural decision is advisable?
Implementation of thin clients
Adoption of an all-cloud approach
High investment in state-of-the-art data centers
Extensive use of open-source software
Extensive use of open-source software can help minimize operational costs while still maintaining security, provided that the software is well-maintained and patched regularly. Open-source software can be less costly than proprietary solutions and offers transparency, allowing for community-reviewed security. However, it requires diligent management to ensure it remains secure over time.
A security administrator notices that the company website is experiencing unusually high traffic, leading to service unavailability. After a recent update, users report slow internet speeds and new, unfamiliar software installations. The IT department also notes an increase in outbound traffic. What might be happening?
DDoS Reflected attack
DNS poisoning
Wireless intrusion
Malicious code infection
The symptoms described—slow internet speeds, unfamiliar software installations, and increased outbound traffic—strongly suggest a malicious code infection, such as malware or a virus, especially following a recent update that could have been compromised. This scenario does not align with a wireless intrusion, which typically affects wireless network security; a DDoS reflected attack, which involves overwhelming a system with external requests; or DNS poisoning, which redirects web traffic to malicious sites.
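As a rough illustration of the "increase in outbound traffic" indicator, the sketch below flags hosts whose outbound volume far exceeds a historical baseline; the host names, baselines, and threshold are assumptions made for the example.

    # Sketch: flag hosts whose outbound traffic far exceeds their usual baseline.
    baseline_mb = {"web01": 500, "db01": 120, "hr-laptop-07": 40}   # typical daily MB out
    observed_mb = {"web01": 520, "db01": 115, "hr-laptop-07": 2300}  # today's MB out

    THRESHOLD = 3.0  # alert when observed traffic is at least 3x the baseline

    for host, observed in observed_mb.items():
        baseline = baseline_mb.get(host, 0)
        if baseline and observed / baseline >= THRESHOLD:
            print(f"ALERT: {host} sent {observed} MB (baseline {baseline} MB) - possible infection")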
A responsible disclosure program encourages ethical hackers to report vulnerabilities. Which aspect is most critical for its success?
Offering the highest bounties for bug reports
Publicly disclosing all reported vulnerabilities immediately
Limiting the scope of the program to web applications
Ensuring reported vulnerabilities are promptly addressed
While offering bounties is an incentive, the prompt and effective remediation of reported vulnerabilities is crucial for the success of a responsible disclosure program. This ensures trust in the program and motivates ethical hackers to participate, knowing their efforts contribute to improving security. Limiting scope and immediate public disclosure are less critical and could, in certain contexts, undermine the program’s effectiveness or security.
A company is adopting a hybrid work model and requires a solution to securely connect multiple branch offices and remote workers to its central corporate network. Which technology is most appropriate?
SASE
Layer 7 Firewall
WAF
SD-WAN
Software-Defined Wide Area Network (SD-WAN) is the most suitable technology for connecting multiple branch offices and remote workers securely to a central network. It enables the use of multiple types of connections, including the internet, to create secure and high-performance networks. While SASE also provides secure connections, it is more about integrating networking and security into a cloud service, which might be more than needed for just connecting offices. Layer 7 Firewalls and WAFs do not specifically address the connectivity and network optimization needs of a hybrid work model.
A microservices architecture can improve security through isolation. However, what is a key security challenge in this architecture?
Complex data encryption
Single point of failure
Service discovery mechanisms
Increased monolithic design
In a microservices architecture, service discovery mechanisms can present a key security challenge. As services need to communicate with each other, the discovery process can become a potential attack vector if not properly secured. Ensuring that service discovery and communication are secure is critical to prevent unauthorized access and data breaches.
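A common mitigation is to require authentication before a service can register with or query the discovery mechanism. The sketch below uses a shared-token check as a simple stand-in for mutual TLS or signed service identities; the registry structure and token value are hypothetical.

    # Sketch: a service registry that rejects unauthenticated registration and lookup.
    import hmac

    REGISTRY_TOKEN = "example-shared-secret"   # stand-in for mTLS / signed identity
    _services: dict[str, str] = {}             # service name -> address

    def _authorized(token: str) -> bool:
        # Constant-time comparison to avoid timing side channels.
        return hmac.compare_digest(token, REGISTRY_TOKEN)

    def register(name: str, address: str, token: str) -> bool:
        if not _authorized(token):
            return False                        # unauthenticated services cannot register
        _services[name] = address
        return True

    def discover(name: str, token: str):
        if not _authorized(token):
            return None                         # unauthenticated callers learn nothing
        return _services.get(name)

    register("payments", "10.0.3.7:8443", "example-shared-secret")
    print(discover("payments", "wrong-token"))             # None - lookup denied
    print(discover("payments", "example-shared-secret"))   # 10.0.3.7:8443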
An organization wants to ensure that its mobile workforce can securely access its internal network from any device, anywhere, without compromising the security posture of the corporate network. Which solution should they implement?
VPN
SASE
UTM
NGFW
Secure Access Service Edge (SASE) is the best solution for providing secure network access to a mobile workforce from any device, anywhere. It combines comprehensive networking and security services, such as SWG, CASB, FWaaS, and ZTNA, into a single, integrated cloud service, ensuring both secure access and a consistent security posture regardless of location or device. While VPNs offer secure access, they do not provide the same level of integrated security features or the scalability and flexibility that SASE offers. UTM and NGFW are more site-centric and less equipped to handle the diverse access requirements of a mobile workforce.
A company is migrating from on-premise servers to a cloud-based infrastructure. What is the primary security implication of this transition?
Higher initial costs
Reduced control over patch management
Decreased resilience
Increased availability
When moving to cloud-based infrastructure, organizations often face reduced control over patch management since cloud service providers manage the infrastructure. While cloud environments can offer increased availability and resilience, the company relinquishes some control over when and how patches are applied, which can impact security if the service provider does not promptly address vulnerabilities.
A security administrator discovers that an attacker has exploited a vulnerability in the web server’s software, allowing the attacker to gain unauthorized access to the entire server directory. What type of attack has occurred?
Buffer overflow
Directory traversal
Injection
Privilege escalation
A directory traversal attack allows attackers to access restricted files and directories outside of the web server’s root directory. This differs from buffer overflow, injection, and privilege escalation, which involve memory corruption, malicious data input, and unauthorized elevation of access rights, respectively.
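A minimal sketch of the vulnerability and its fix, assuming a web handler that serves files from a document root (the paths are hypothetical):

    # Sketch: directory traversal and a safe alternative (hypothetical paths).
    from pathlib import Path

    DOC_ROOT = Path("/var/www/html").resolve()

    def serve_unsafe(requested: str) -> Path:
        # Vulnerable: "../" sequences escape the document root.
        return (DOC_ROOT / requested).resolve()

    def serve_safe(requested: str) -> Path:
        # Resolve the final path and confirm it stays inside the document root.
        target = (DOC_ROOT / requested).resolve()
        if not target.is_relative_to(DOC_ROOT):
            raise PermissionError("directory traversal attempt blocked")
        return target

    print(serve_unsafe("../../etc/passwd"))    # /etc/passwd - escaped the web root
    print(serve_safe("index.html"))            # /var/www/html/index.html
    try:
        serve_safe("../../etc/passwd")
    except PermissionError as err:
        print(err)                             # traversal attempt blocked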
Which data sanitization method is most environmentally friendly while ensuring data on SSDs is unrecoverable?
Physical destruction
Cryptographic erasure
Degaussing
Shredding
Cryptographic erasure, which involves using encryption to make data inaccessible and then destroying the encryption keys, is an environmentally friendly option as it allows the SSD to be reused. Physical destruction and shredding are effective at making data unrecoverable but are not environmentally friendly as they render the SSD unusable. Degaussing is ineffective on SSDs because they do not store data magnetically.
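Conceptually, cryptographic erasure works like the sketch below: data is only ever stored encrypted, so destroying the key renders it unrecoverable while the media stays usable. The example uses the third-party cryptography package purely for illustration; on real SSDs the drive controller performs this (as in self-encrypting drives).

    # Sketch of cryptographic erasure: destroy the key, and the ciphertext is useless.
    # Assumes the third-party "cryptography" package is installed (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()                               # key held by the drive/controller
    ciphertext = Fernet(key).encrypt(b"customer PII record")  # data at rest is ciphertext only

    print(Fernet(key).decrypt(ciphertext))                    # readable while the key exists

    key = None   # "crypto-erase": securely destroy the key
    # Without the key the ciphertext cannot be decrypted, yet the media remains reusable.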
When decommissioning a hard drive from a corporate environment, which method ensures no data can be recovered while allowing the drive to be reused?
Encryption
Overwriting
Degaussing
Physical destruction
Overwriting a hard drive with one or more patterns of data effectively renders the original data unrecoverable, making it a suitable method for decommissioning drives that are to be reused. Degaussing and physical destruction prevent the drive from being reused, making them less suitable for situations where reuse is desired. Encryption does not erase the data but rather makes it unreadable without the decryption key; however, if the encryption key is compromised, the data can still be accessed.
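A rough sketch of overwriting, applied to a single file for illustration; real drive-sanitization tools overwrite the entire block device, often with multiple passes.

    # Sketch: overwrite a file's contents with random data before reuse or deletion.
    import os
    from pathlib import Path

    def overwrite_file(path: Path, passes: int = 1) -> None:
        size = path.stat().st_size
        with open(path, "r+b") as fh:
            for _ in range(passes):
                fh.seek(0)
                fh.write(os.urandom(size))      # replace contents with random bytes
                fh.flush()
                os.fsync(fh.fileno())           # force the write to storage

    demo = Path("old_record.txt")
    demo.write_text("sensitive data to be retired")
    overwrite_file(demo, passes=3)
    print(demo.read_bytes()[:16])               # now unrelated random bytes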
An IT team is considering the security implications of different architectural models for a new online service. Which model should they prioritize for optimal responsiveness?
Traditional grid computing
Centralized server-client model
Cloud-based services with edge computing
Decentralized peer-to-peer network
Cloud-based services integrated with edge computing are optimal for responsiveness, as they process data closer to the end-users, reducing latency. While centralized models can provide control and peer-to-peer networks distribute loads, the combination of cloud and edge computing offers the best balance of speed, scalability, and security.
Your organization is implementing data protection measures for PII. Which method should you employ to ensure that sensitive data is replaced with pseudonyms to protect privacy?
Encryption
Data masking
Tokenization
Obfuscation
Tokenization replaces sensitive data with unique tokens, preserving data integrity while protecting privacy. This method allows for secure data processing without exposing the original PII, making it suitable for protecting sensitive information like personally identifiable data.
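A minimal sketch of tokenization, where each PII value is swapped for a random token and the real value is kept only in a separate, protected vault; the in-memory dict stands in for a hardened token vault.

    # Sketch: replace PII with random tokens; the mapping lives in a protected vault.
    import secrets

    _vault: dict[str, str] = {}   # token -> original value (in practice, a hardened store)

    def tokenize(value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        _vault[token] = value
        return token

    def detokenize(token: str) -> str:
        return _vault[token]       # only authorized systems should be able to do this

    record = {"name": tokenize("Alice Example"), "ssn": tokenize("123-45-6789")}
    print(record)                      # safe to process or store: contains only tokens
    print(detokenize(record["ssn"]))   # original value recoverable only via the vault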
In terms of data loss prevention, why is a higher frequency of backups recommended?
Simplifies the backup process
Increases data storage requirements
Minimizes potential data loss
Reduces the RTO
A higher backup frequency minimizes potential data loss between backups by providing more up-to-date recovery points; for example, hourly backups limit worst-case data loss to about one hour, whereas daily backups can lose up to a day of work. Increased storage requirements are a consequence of more frequent backups, not a benefit, and although frequent backups add some operational complexity, the reduced data loss outweighs that trade-off. Backup frequency primarily shortens the recovery point objective (RPO), not the recovery time objective (RTO).
Which of the following best describes the purpose of a system/process audit in vulnerability management?
To perform an in-depth analysis of network traffic
To identify unused applications and services for removal
To assess compliance with internal and external security policies
To update security tools and software to the latest versions
System/process audits are conducted to assess an organization’s compliance with established internal and external security policies, regulations, and standards. They are comprehensive evaluations that cover various aspects of security, including policies, procedures, and technical controls, ensuring that practices align with security requirements and identifying areas for improvement.
What is a false negative in the context of vulnerability confirmation?
A vulnerability that does not exist but is reported as a finding.
A vulnerability that is detected but is actually part of the system’s normal functionality.
A vulnerability detected as resolved without any intervention.
A vulnerability that exists but is not detected by the assessment tool.
A false negative occurs when a vulnerability assessment tool fails to detect an existing vulnerability. This situation is dangerous because it can lead organizations to believe their systems are secure when, in fact, vulnerabilities remain unaddressed. Conversely, a false positive, where a non-existent vulnerability is reported, may lead to unnecessary work but doesn’t directly leave systems exposed. The other options describe scenarios that don’t accurately define a false negative.
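The distinction becomes concrete when a scanner's findings are compared against the vulnerabilities that actually exist; the identifiers below are placeholders used only to illustrate the outcomes.

    # Sketch: classify scanner results against ground truth (placeholder vulnerability IDs).
    actually_present = {"VULN-A", "VULN-B"}        # vulnerabilities that really exist
    scanner_reported = {"VULN-B", "VULN-C"}        # what the assessment tool found

    true_positives  = scanner_reported & actually_present   # correctly detected
    false_positives = scanner_reported - actually_present   # reported but not real
    false_negatives = actually_present - scanner_reported   # real but missed - the dangerous case

    print("False negatives:", false_negatives)     # {'VULN-A'} remains unaddressed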
What is a crucial security measure for ICS and SCADA systems?
Connecting ICS/SCADA systems directly to the internet for real-time data access.
Implementing strong physical security around the ICS/SCADA devices.
Regularly updating the ICS/SCADA software to the latest version.
Using the latest IoT technologies to monitor ICS/SCADA systems.
Implementing strong physical security around ICS/SCADA devices is crucial to protect against unauthorized access and tampering, which can have severe consequences for critical infrastructure. While regularly updating software is important for addressing vulnerabilities, it must be done with caution due to the specialized nature of these systems. Using IoT technologies for monitoring can introduce additional security risks, and connecting ICS/SCADA systems directly to the internet without proper safeguards can expose them to cyber threats.
Your company has implemented a strict security policy requiring all remote access to the network to be authenticated and encrypted. However, they also need a solution that allows for device verification and the ability to enforce policies based on user and device identity. Which of the following would be the most appropriate solution?
SSL VPN
TLS
IPSec
NAC
Network Access Control (NAC) is the best solution for this scenario because it not only allows for authentication and encryption of remote access but also provides device verification and the ability to enforce policies based on user and device identity. While TLS and IPSec provide encryption and SSL VPN allows for secure remote access, none of these solutions offer the comprehensive access control and policy enforcement capabilities provided by NAC.
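The policy-enforcement idea behind NAC can be sketched as an admission decision that weighs both the user's identity and the device's verified posture; the attribute names and rules below are hypothetical examples, not a vendor's actual policy syntax.

    # Sketch of a NAC-style admission decision based on user and device identity/posture.
    def nac_decision(user: dict, device: dict) -> str:
        if not user.get("authenticated"):
            return "deny"
        if not device.get("certificate_valid"):
            return "deny"                        # device identity could not be verified
        if not device.get("patched") or not device.get("av_running"):
            return "quarantine"                  # allow only remediation VLAN access
        if user.get("role") == "contractor":
            return "restricted"                  # limited network segments
        return "full_access"

    print(nac_decision({"authenticated": True, "role": "employee"},
                       {"certificate_valid": True, "patched": True, "av_running": True}))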