2.1 Information Gathering and Vulnerability Scanning Flashcards
Given a scenario, perform passive reconnaissance.
How can a PenTest team test the security posture of an organization?
The team can search for key contacts, general information, and technical data in sources such as online articles, news items, social media, and press releases to build a better understanding of the target organization's business operations and reputation.
What is the Common Vulnerabilities and Exposures (CVE)?
This is a catalog of publicly disclosed vulnerabilities, maintained by the MITRE Corporation. Each entry has a unique CVE ID and refers to a specific vulnerability in a particular product.
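CVE entries can be looked up programmatically. The sketch below queries the public NVD CVE API (version 2.0) for a single entry; the CVE ID shown is just an illustrative example, and unauthenticated requests to this API are rate-limited.

```python
# Minimal sketch: look up one CVE entry via the public NVD REST API (v2.0).
# The CVE ID below is only an example (the Log4Shell vulnerability).
import json
import urllib.request

cve_id = "CVE-2021-44228"
url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?cveId={cve_id}"

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for item in data.get("vulnerabilities", []):
    cve = item["cve"]
    # Each entry carries a unique ID and one or more descriptions.
    print(cve["id"], "-", cve["descriptions"][0]["value"][:120])
```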
What is the Common Weakness Enumeration (CWE)?
This is a community-developed list of common software and hardware weakness types, maintained by the MITRE Corporation. Where a CVE entry describes a specific vulnerability in a particular product, a CWE entry describes a general category of flaw, such as CWE-79 (cross-site scripting).
What are Public Source-Code Repositories?
Public source-code repositories, such as GitHub, Bitbucket, CloudForge, and SourceForge, promote code sharing and collaboration, speeding up development times. For a PenTest team they are also a passive reconnaissance source, since public repositories can leak credentials, internal hostnames, and other sensitive details.
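As a sketch of that reconnaissance angle, the snippet below enumerates an organization's public repositories through GitHub's REST API. The organization name "example-corp" is a placeholder, and unauthenticated API requests are rate-limited.

```python
# Sketch: enumerate public GitHub repositories for a target organization.
# "example-corp" is a placeholder organization name.
import json
import urllib.request

org = "example-corp"
url = f"https://api.github.com/orgs/{org}/repos?per_page=100"
req = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})

with urllib.request.urlopen(req) as resp:
    repos = json.load(resp)

for repo in repos:
    # Repository names, languages, and descriptions can hint at internal tooling.
    print(repo["full_name"], "-", repo.get("language"), "-", repo.get("description"))
```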
What is Google Hacking?
Google Hacking is a method PenTest teams use to optimize search results. It uses the Google search engine to identify potential security weaknesses in publicly available sources, such as an organization's website. Google hacking queries almost always include one or more special search operators to cut down on irrelevant results and focus on specific types of desired information, as in the examples below.
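The queries below illustrate common Google hacking ("dork") patterns; example.com stands in for the target's domain.

```
# Restrict results to the target's domain
site:example.com

# Find exposed documents on the target site
site:example.com filetype:pdf confidential

# Locate login pages or admin panels
site:example.com inurl:admin intitle:"login"

# Surface open directory listings
site:example.com intitle:"index of"
```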
What is a web cache viewer?
A web cache viewer lets you retrieve older versions of a website, which are snapshots of the raw HTML and some of the page contents. These snapshots can lead back to old press releases, directory listings, and even source code containing comments or other sensitive information.
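One widely used cache source is the Internet Archive's Wayback Machine. The sketch below queries its public "availability" API for the closest archived snapshot of a site; example.com is a placeholder target.

```python
# Sketch: check the Wayback Machine for a cached snapshot of a site.
# Uses the Internet Archive's public availability API; example.com is a placeholder.
import json
import urllib.request

target = "example.com"
url = f"https://archive.org/wayback/available?url={target}"

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

snapshot = data.get("archived_snapshots", {}).get("closest")
if snapshot:
    # "url" points at the archived copy; "timestamp" is YYYYMMDDhhmmss.
    print("Cached copy:", snapshot["url"], "captured", snapshot["timestamp"])
else:
    print("No archived snapshot found for", target)
```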
How would you evaluate a website?
You could evaluate a website using tools such as a web browser, Nmap, Metasploit, and DirBuster. Forced browsing is also an option: it requests unlinked URLs or paths on a website in order to reach unprotected resources. Lastly, OSINT tools such as Maltego, along with standard or Google hacking searches, can reveal the technologies in use.
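The snippet below is a minimal forced-browsing sketch, the same idea tools like DirBuster automate with large wordlists. The target URL and wordlist are placeholders; only run this against systems you are authorized to test.

```python
# Minimal forced-browsing sketch: probe a site for unlinked paths.
# Target and wordlist are placeholders; use only with authorization.
import urllib.error
import urllib.request

base = "https://example.com"
wordlist = ["admin", "backup", "config", "uploads", ".git"]

for path in wordlist:
    url = f"{base}/{path}"
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(resp.status, url)  # 200 suggests an accessible resource
    except urllib.error.HTTPError as e:
        if e.code != 404:
            print(e.code, url)  # 401/403 still reveal that the path exists
    except urllib.error.URLError:
        pass  # unreachable host or timeout
```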
What is the purpose of a robots.txt file?
The robots.txt file is a simple text file placed in a website's root directory that tells bots where to search and, more importantly, where NOT to search. It can also direct bots to the site's Extensible Markup Language (XML) sitemap file. Because its Disallow entries are requests rather than access controls, they often hand a PenTest team a list of interesting paths, as in the example below.
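An illustrative robots.txt; the paths and sitemap URL are placeholders:

```
User-agent: *
Disallow: /admin/
Disallow: /internal/
Allow: /public/

Sitemap: https://example.com/sitemap.xml
```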