CompTIA Security+ (SY0-701) Exam 4 Flashcards
To ensure ongoing compliance with data protection laws, a company decides to implement a system for compliance monitoring. The most effective feature of this system would be:
A yearly manual audit by an external consultant.
Quarterly employee satisfaction surveys.
Automated alerts for any non-compliance issues.
An anonymous tip line for employees to report violations.
Automated alerts for any non-compliance issues offer real-time monitoring and quick response capabilities, making them the most effective feature for ensuring ongoing compliance. This proactive approach allows the company to identify and address compliance issues as they arise, reducing the risk of fines, sanctions, and reputational damage. While yearly audits, an anonymous tip line, and employee surveys can complement a compliance program, they do not provide the same level of immediacy and continuous oversight as automated monitoring.
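As a rough illustration of the idea, a minimal compliance-monitoring loop might evaluate records against rules and raise an alert the moment a check fails; the rules, record fields, and alert channel below are all hypothetical placeholders:

```python
# Minimal sketch of automated compliance alerting; the rules, the record
# fields, and the alert channel are all hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ComplianceRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the record is compliant

RULES = [
    ComplianceRule("pii-encrypted-at-rest", lambda r: r.get("encrypted", False)),
    ComplianceRule("retention-under-limit", lambda r: r.get("age_days", 0) <= 365),
]

def send_alert(rule: ComplianceRule, record: dict) -> None:
    # A real system would page an on-call channel or open a ticket here.
    print(f"ALERT: {rule.name} violated by record {record.get('id')}")

def scan(records: list[dict]) -> None:
    # Alerts fire the moment a violation is observed, rather than waiting
    # for a periodic audit to uncover it.
    for record in records:
        for rule in RULES:
            if not rule.check(record):
                send_alert(rule, record)

scan([{"id": 1, "encrypted": True, "age_days": 400}])  # trips the retention rule
```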
For a company focused on minimizing operational costs while maintaining security, which architectural decision is advisable?
Implementation of thin clients
Adoption of an all-cloud approach
High investment in state-of-the-art data centers
Extensive use of open-source software
Extensive use of open-source software can help minimize operational costs while still maintaining security, provided that the software is well-maintained and patched regularly. Open-source software can be less costly than proprietary solutions and offers transparency, allowing for community-reviewed security. However, it requires diligent management to ensure it remains secure over time.
A security administrator notices that the company website is experiencing unusually high traffic, leading to service unavailability. After a recent update, users report slow internet speeds and new, unfamiliar software installations. The IT department also notes an increase in outbound traffic. What might be happening?
DDoS Reflected attack
DNS poisoning
Wireless intrusion
Malicious code infection
The symptoms described—slow internet speeds, unfamiliar software installations, and increased outbound traffic—strongly suggest a malicious code infection, such as malware or a virus, especially following a recent update that could have been compromised. This scenario does not align with a wireless intrusion, which typically affects wireless network security; a DDoS reflected attack, which involves overwhelming a system with external requests; or DNS poisoning, which redirects web traffic to malicious sites.
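As a loose illustration of spotting the "increased outbound traffic" symptom, a sketch like the following could compare the host's outbound byte rate against an assumed baseline; the baseline, spike factor, and sampling interval are made-up values, and psutil is a third-party package:

```python
# Illustrative check for an outbound-traffic spike; the baseline, spike
# factor, and sampling interval are made-up values, and psutil is a
# third-party package (pip install psutil).
import time
import psutil

BASELINE_BYTES_PER_SEC = 1_000_000  # assumed normal outbound rate
SPIKE_FACTOR = 10                   # alert when 10x over baseline

def outbound_rate(interval: float = 5.0) -> float:
    # Sample bytes sent over a short window and convert to a rate.
    start = psutil.net_io_counters().bytes_sent
    time.sleep(interval)
    end = psutil.net_io_counters().bytes_sent
    return (end - start) / interval

rate = outbound_rate()
if rate > BASELINE_BYTES_PER_SEC * SPIKE_FACTOR:
    print(f"Possible infection: outbound rate of {rate:.0f} B/s far exceeds baseline")
```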
A responsible disclosure program encourages ethical hackers to report vulnerabilities. Which aspect is most critical for its success?
Offering the highest bounties for bug reports
Publicly disclosing all reported vulnerabilities immediately
Limiting the scope of the program to web applications
Ensuring reported vulnerabilities are promptly addressed
While offering bounties is an incentive, the prompt and effective remediation of reported vulnerabilities is crucial for the success of a responsible disclosure program. This ensures trust in the program and motivates ethical hackers to participate, knowing their efforts contribute to improving security. Limiting scope and immediate public disclosure are less critical and could, in certain contexts, undermine the program’s effectiveness or security.
A company is adopting a hybrid work model and requires a solution to securely connect multiple branch offices and remote workers to its central corporate network. Which technology is most appropriate?
SASE
Layer 7 Firewall
WAF
SD-WAN
Software-Defined Wide Area Network (SD-WAN) is the most suitable technology for securely connecting multiple branch offices and remote workers to a central network. It can combine multiple types of connections, including the public internet, into a secure, high-performance network. SASE also provides secure connectivity, but it integrates networking and security into a cloud-delivered service, which may be more than is needed simply to connect offices. Layer 7 firewalls and WAFs do not address the connectivity and network-optimization needs of a hybrid work model.
A microservices architecture can improve security through isolation. However, what is a key security challenge in this architecture?
Complex data encryption
Single point of failure
Service discovery mechanisms
Increased monolithic design
In a microservices architecture, service discovery mechanisms can present a key security challenge. As services need to communicate with each other, the discovery process can become a potential attack vector if not properly secured. Ensuring that service discovery and communication are secure is critical to prevent unauthorized access and data breaches.
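As a toy sketch of the risk, the registry below refuses lookups that cannot prove knowledge of a shared secret; in real deployments this role is played by systems such as Consul or Kubernetes service discovery, typically secured with mutual TLS, and every name, URL, and secret here is hypothetical:

```python
# Toy sketch of an authenticated service registry; every name, URL, and
# secret here is hypothetical.
import hmac

REGISTRY = {"orders": "https://orders.internal:8443"}
SHARED_SECRET = b"rotate-me"  # would come from a secrets manager in practice

def lookup_token(service_name: str) -> str:
    return hmac.new(SHARED_SECRET, service_name.encode(), "sha256").hexdigest()

def discover(service_name: str, token: str) -> str | None:
    # Reject lookups that cannot prove knowledge of the shared secret, so an
    # attacker on the network cannot enumerate internal service endpoints.
    if not hmac.compare_digest(token, lookup_token(service_name)):
        return None
    return REGISTRY.get(service_name)

print(discover("orders", lookup_token("orders")))  # authorized -> endpoint URL
print(discover("orders", "forged-token"))          # rejected  -> None
```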
An organization wants to ensure that its mobile workforce can securely access its internal network from any device, anywhere, without compromising the security posture of the corporate network. Which solution should they implement?
VPN
UTM
NGFW
SASE
Secure Access Service Edge (SASE) is the best solution for providing secure network access to a mobile workforce from any device, anywhere. It combines comprehensive networking and security services, such as SWG, CASB, FWaaS, and ZTNA, into a single, integrated cloud service, ensuring both secure access and a consistent security posture regardless of location or device. While VPNs offer secure access, they do not provide the same level of integrated security features or the scalability and flexibility that SASE offers. UTM and NGFW are more site-centric and less equipped to handle the diverse access requirements of a mobile workforce.
A company is migrating from on-premise servers to a cloud-based infrastructure. What is the primary security implication of this transition?
Higher initial costs
Reduced control over patch management
Decreased resilience
Increased availability
When moving to cloud-based infrastructure, organizations often face reduced control over patch management since cloud service providers manage the infrastructure. While cloud environments can offer increased availability and resilience, the company relinquishes some control over when and how patches are applied, which can impact security if the service provider does not promptly address vulnerabilities.
A security administrator discovers that an attacker has exploited a vulnerability in the web server’s software, allowing the attacker to gain unauthorized access to the entire server directory. What type of attack has occurred?
Buffer overflow
Directory traversal
Injection
Privilege escalation
A directory traversal attack allows attackers to access restricted files and directories outside of the web server's root directory, typically by supplying path input containing sequences such as ../. This differs from buffer overflow, injection, and privilege escalation, which involve memory exploitation, malicious data input, and unauthorized elevation of access, respectively.
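A minimal sketch of the flaw and a common mitigation; the web root and request paths are illustrative:

```python
# Sketch of directory traversal and a common mitigation; the web root and
# request paths are illustrative.
from pathlib import Path

WEB_ROOT = Path("/var/www/html").resolve()

def serve_unsafe(requested: str) -> Path:
    # Vulnerable: nothing stops "../" sequences from escaping the web root.
    return (WEB_ROOT / requested).resolve()

def serve_safe(requested: str) -> Path:
    resolved = (WEB_ROOT / requested).resolve()
    # Mitigation: reject any path that resolves outside the web root.
    if not resolved.is_relative_to(WEB_ROOT):  # Python 3.9+
        raise PermissionError("path escapes web root")
    return resolved

print(serve_unsafe("../../../etc/passwd"))  # -> /etc/passwd
print(serve_safe("index.html"))             # -> /var/www/html/index.html
```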
Which data sanitization method is most environmentally friendly while ensuring data on SSDs is unrecoverable?
Physical destruction
Cryptographic erasure
Degaussing
Shredding
Cryptographic erasure, which involves using encryption to make data inaccessible and then destroying the encryption keys, is an environmentally friendly option as it allows the SSD to be reused. Physical destruction and shredding are effective at making data unrecoverable but are not environmentally friendly as they render the SSD unusable. Degaussing is ineffective on SSDs because they do not store data magnetically.
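A minimal sketch of the principle using the third-party cryptography package; on real self-encrypting SSDs this happens in the drive controller via a secure-erase command:

```python
# Minimal sketch of cryptographic erasure using the third-party
# `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                    # the drive's encryption key
ciphertext = Fernet(key).encrypt(b"sensitive record")

# "Erasure" means destroying every copy of the key. The ciphertext can stay
# on the drive, unrecoverable without the key, so the SSD can be reused.
key = None

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # wrong key
except InvalidToken:
    print("data is unrecoverable without the original key")
```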
When decommissioning a hard drive from a corporate environment, which method ensures no data can be recovered while allowing the drive to be reused?
Encryption
Overwriting
Degaussing
Physical destruction
Overwriting a hard drive with one or more patterns of data effectively renders the original data unrecoverable, making it a suitable method for decommissioning drives that are to be reused. Degaussing and physical destruction prevent the drive from being reused, making them less suitable for situations where reuse is desired. Encryption does not erase the data but rather makes it unreadable without the decryption key; however, if the encryption key is compromised, the data can still be accessed.
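As a file-level sketch of the idea (sanitizing a whole drive would instead target the block device, for example with a tool such as shred or nwipe, and a single random pass is generally considered sufficient for modern drives):

```python
# File-level sketch of overwriting; the pass count is configurable but one
# random pass is generally considered sufficient for modern drives.
import os

def overwrite_file(path: str, passes: int = 1) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())       # force the pass onto stable storage
    os.remove(path)
```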
An IT team is considering the security implications of different architectural models for a new online service. Which model should they prioritize for optimal responsiveness?
Traditional grid computing
Centralized server-client model
Cloud-based services with edge computing
Decentralized peer-to-peer network
Cloud-based services integrated with edge computing are optimal for responsiveness, as they process data closer to the end-users, reducing latency. While centralized models can provide control and peer-to-peer networks distribute loads, the combination of cloud and edge computing offers the best balance of speed, scalability, and security.
Your organization is implementing data protection measures for PII. Which method should you employ to ensure that sensitive data is replaced with pseudonyms to protect privacy?
Encryption
Data masking
Tokenization
Obfuscation
Tokenization replaces sensitive data with unique tokens, preserving data integrity while protecting privacy. This method allows for secure data processing without exposing the original PII, making it suitable for protecting sensitive information like personally identifiable data.
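A toy sketch of a tokenization vault, with hypothetical field values; in production the token-to-value map lives in a separate, access-controlled service:

```python
# Toy tokenization vault; the record fields and values are hypothetical.
import secrets

class TokenVault:
    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = secrets.token_urlsafe(16)  # random token, no PII embedded
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only authorized services should be able to reach this path.
        return self._vault[token]

vault = TokenVault()
record = {"name": vault.tokenize("Alice Example"),
          "ssn": vault.tokenize("078-05-1120")}
print(record)                            # safe to process downstream
print(vault.detokenize(record["ssn"]))   # original recoverable only via vault
```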
In terms of data loss prevention, why is a higher frequency of backups recommended?
Simplifies the backup process
Increases data storage requirements
Minimizes potential data loss
Reduces the RTO
A higher backup frequency minimizes potential data loss between backups, providing more up-to-date recovery points; in other words, it shortens the recovery point objective (RPO). Increased storage requirements are a consequence of frequent backups, not a benefit; backup frequency does not by itself reduce the recovery time objective (RTO); and although more frequent backups can complicate the backup process, the reduced data loss is worth the trade-off.
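A quick worked example of the relationship between backup interval and worst-case data loss (the intervals are illustrative):

```python
# Worked example: worst-case data loss (the recovery point objective)
# equals the time since the last backup, i.e. the backup interval.
for interval_hours in (24, 6, 1):
    print(f"backups every {interval_hours:>2}h -> up to {interval_hours}h of work lost")
```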