CH13 Security Testing Flashcards
Four Bounds of Security Testing
Dynamic Testing
Static Testing
Automatic Testing
Manual Testing
Manual Dynamic Approach
Penetration Testing
Automatic Dynamic Approaches
DAST, IAST, Vulnerability Scanner
Automatic Static Approach
SAST
Manual Static Approach
Manual Code Review
Security Testing light definition
A systematic process for revealing flaws in information systems
Dynamic Testing: The Core Idea
The running application is being tested
Unexpected ("strange") inputs are supplied
The outputs are returned, and the program behaviour is observed and reported
Dynamic Testing operates similarly to black-box testing, in that you’re comparing outputs against inputs and not necessarily assessing the code itself.
- this means that hard-coded aspects of the program may easily be overlooked
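The loop above can be sketched in a few lines of Python; `parse_age` is a hypothetical system under test, and the input list is illustrative:

```python
# Minimal sketch of the dynamic-testing loop: supply strange inputs to a
# running component, observe its behaviour, report anything unexpected.
# `parse_age` is a hypothetical system under test (not from the text).
def parse_age(raw: str) -> int:
    value = int(raw)           # raises ValueError on non-numeric input
    assert 0 <= value <= 150   # raises AssertionError on absurd values
    return value

def dynamic_test(target, inputs):
    findings = []
    for raw in inputs:
        try:
            target(raw)           # we only see inputs and outputs,
        except Exception as exc:  # never the code itself (black-box)
            findings.append((raw, type(exc).__name__))
    return findings

strange_inputs = ["42", "", "-1", "NaN", "9" * 1000, "'; DROP TABLE--"]
print(dynamic_test(parse_age, strange_inputs))
```

Only "42" passes here; the other five inputs surface crashes that a tester would report as potential robustness or security findings.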
Static Security Testing Core Idea
Parse all source code and config files
Analyze (an abstract representation of) the parsed files
Report any problems found
Static Testing is similar in execution to program compilation
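The three steps map directly onto a toy analyzer built on Python's standard `ast` module; the list of insecure calls is illustrative, not from the text:

```python
import ast

# Toy static analyzer following the three steps above:
# parse the source, analyze the abstract representation (the AST),
# report problems. The code is inspected, never executed.
INSECURE_CALLS = {"eval", "exec"}   # illustrative, not exhaustive

def analyze(source: str):
    tree = ast.parse(source)                  # 1. parse
    findings = []
    for node in ast.walk(tree):               # 2. analyze the AST
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in INSECURE_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings                           # 3. report

print(analyze("x = eval(input())"))   # [(1, 'eval')]
```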
Vulnerability existence vs. tool report: the four possible outcomes
Weakness exists and reported = True Positive
Weakness exists and not reported = False Negative (DANGEROUS)
Weakness doesn’t exist and reported = False Positive (ANNOYING TIME WASTE)
Weakness doesn’t exist and not reported = True Negative
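A quick way to internalize the four outcomes is to score a tool report against known ground truth; the finding names below are hypothetical:

```python
# Scoring a (hypothetical) tool report against ground truth using the
# four outcomes above. False negatives hurt recall (dangerous),
# false positives hurt precision (annoying).
actual_weaknesses = {"sqli_login", "xss_search"}   # ground truth
reported          = {"sqli_login", "weak_hash"}    # tool output

tp = reported & actual_weaknesses    # exists and reported
fp = reported - actual_weaknesses    # reported but doesn't exist
fn = actual_weaknesses - reported    # exists but not reported

precision = len(tp) / len(reported)           # 1 / 2
recall    = len(tp) / len(actual_weaknesses)  # 1 / 2
print(sorted(fp), sorted(fn), precision, recall)
```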
For a fully automated security testing tool, we must make compromises regarding False Negatives and False Positives:
In case of doubt, report a potential vulnerability:
- we might “annoy” users with many findings that are not real issues
- we risk “the boy who cried wolf” phenomenon
In case of doubt, we stay silent:
- we might miss severe issues
Reasons and Recommendations for False Negatives
Fundamental: under-approximation by the tool
- missing language features (may interrupt data-flow analysis)
- missing support for the complete syntax (parsing errors)
Therefore: report to the tool vendor
Configuration: lacking knowledge of insecure frameworks
- insecure sinks (output) and sources (input)
Therefore: improve configuration
Unknown security threats
- XML verb tampering
Therefore: develop a new analysis for the tool (might require support from the tool vendor)
Security expert: “I want a tool with 0 false negatives!” False negatives increase the overall security risk
Reasons and Recommendations for False Positives
Fundamental: over-approximation by the tool, e.g.,
- pointer analysis
- call stack
- control-flow analysis
Therefore: report to the tool vendor
Configuration: lacking knowledge of security framework, e.g.,
- sanitization functions
- secure APIs
Therefore: improve configuration
Mitigated by attack surface: strictly speaking a true finding, e.g.,
- No external communication due to firewall
- SQL injections in a database admin tool
Therefore: should still be fixed; in practice often mitigated during an audit or via local analysis configuration
Developer: “I want a tool with 0 false positives!” False positives create unnecessary effort
Prioritization of Findings
A pragmatic solution for too many findings
Classification with clear instructions for each vulnerability has proven to be the easiest to understand.
Can clearly see:
- What needs to be audited
- What needs to be fixed
- as security issue
- quality issue
- Different rules for
- old code
- new code
Two main patterns cause security vulnerabilities
Local issues
- insecure functions
- secrets stored in the source code
Data-flow related issues
- XSS
- SQL injection
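The difference between the two patterns shows up in a minimal taint-tracking sketch; the source, sanitizer, and sink names (`read_param`, `escape_html`, `render_html`) are illustrative:

```python
import html

# Data-flow sketch: values from a source are marked tainted; reaching a
# sink without passing a sanitizer is reported. Local issues (e.g. a
# hard-coded secret) need no such flow tracking.
def read_param(raw):                 # source: attacker-controlled input
    return {"value": raw, "tainted": True}

def escape_html(data):               # sanitizer: clears the taint
    return {"value": html.escape(data["value"]), "tainted": False}

def render_html(data):                # sink: HTML output
    if data["tainted"]:
        raise ValueError("potential XSS: tainted data reached HTML sink")
    return "<p>%s</p>" % data["value"]

payload = read_param("<script>alert(1)</script>")
print(render_html(escape_html(payload)))   # sanitized flow: allowed
try:
    render_html(payload)                   # unsanitized flow: reported
except ValueError as exc:
    print(exc)
```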
Generic defects visible in the code
The static-analysis sweet spot: built-in rules make it easy for tools to find these without programmer guidance
e.g., buffer overflows
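Such built-in rules can be as simple as a pattern match over the source; a sketch, assuming we scan C code for a few overflow-prone library calls:

```python
import re

# Built-in rule sketch: flag C library calls that commonly cause buffer
# overflows. Pattern-based and generic, so the tool needs no guidance
# from the programmer. The banned list is illustrative.
BANNED = re.compile(r"\b(gets|strcpy|sprintf)\s*\(")

def scan(c_source: str):
    return [(lineno, match.group(1))
            for lineno, line in enumerate(c_source.splitlines(), 1)
            for match in BANNED.finditer(line)]

c_code = "char buf[8];\ngets(buf);\nstrcpy(buf, name);\n"
print(scan(c_code))   # [(2, 'gets'), (3, 'strcpy')]
```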