Software Testing and Assurance Flashcards
Software Testing - Definition and Scope
the process of executing software to find defects and evaluate its quality against requirements
Software Testing - Objectives
find defects as early as possible
verify conformance to the requirements specification
validate that the software meets user needs
design effective, high-yield test cases
White-Box / Structural Testing
testing by someone who knows the internal structure of the code, typically the developer
e.g.
control flow testing
cyclomatic complexity (worked example below)
data flow testing
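As a worked example of McCabe's cyclomatic complexity metric (graph sizes illustrative): a control-flow graph with E = 9 edges, N = 7 nodes, and P = 1 connected component has V(G) = E - N + 2P = 9 - 7 + 2 = 4, so four linearly independent paths must be covered.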
Black-Box / Behavioral Testing
test against the functional requirements, without knowledge of internal structure
edge-case / boundary-value testing (see the sketch after this list)
test with different input combinations (equivalence classes)
finite-state-machine-based testing
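A minimal boundary-value sketch in Python; the grade() function and its 50-point cutoff are assumptions for illustration, not from the course material:

    def grade(score):
        # Hypothetical unit under test: maps a 0-100 score to pass/fail.
        return "pass" if score >= 50 else "fail"

    # Test at and around each partition boundary.
    assert grade(0) == "fail"    # lower bound of the input range
    assert grade(49) == "fail"   # just below the cutoff
    assert grade(50) == "pass"   # exactly on the cutoff
    assert grade(100) == "pass"  # upper bound of the input range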
Finite State Machine
A finite state machine is a model that describes the dynamic behavior of an object over time.
Each object is treated as an isolated entity that communicates with the rest of the world by detecting events and responding to them (see the sketch below).
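A minimal sketch of FSM-based testing in Python, assuming a hypothetical turnstile with illustrative states and events:

    # Transition table: (state, event) -> next state.
    TRANSITIONS = {
        ("locked", "coin"): "unlocked",
        ("unlocked", "push"): "locked",
    }

    def step(state, event):
        # Undefined (state, event) pairs leave the state unchanged.
        return TRANSITIONS.get((state, event), state)

    # A test drives the machine through an event sequence and checks
    # the resulting state.
    state = "locked"
    for event in ["push", "coin", "push"]:
        state = step(state, event)
    assert state == "locked"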
Usage-Based Testing
test as close to the operational environment as possible
inputs are assigned a probability distribution that reflects expected field usage (an operational profile; see the sketch below)
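A sketch of operational-profile input selection in Python; the input classes and weights are illustrative assumptions:

    import random

    # Operational profile: probability of each input class in the field.
    operational_profile = {
        "view_page": 0.70,
        "search": 0.25,
        "checkout": 0.05,
    }

    # Draw test inputs with the same distribution as expected usage.
    test_sequence = random.choices(
        list(operational_profile),
        weights=list(operational_profile.values()),
        k=100,
    )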
System Testing types
Functional Testing
Performance Testing
Pilot Testing
Acceptance Testing
Installation Testing
Types of Performance Testing
stress
volume
configuration
compatibility
security
timing
environmental
quality
recovery
human factors
Software testing
assert(expected == actual)
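Expanded into a minimal runnable unit test with Python's unittest; the add() function is an illustrative stand-in for the unit under test:

    import unittest

    def add(a, b):
        # Hypothetical unit under test.
        return a + b

    class TestAdd(unittest.TestCase):
        def test_add(self):
            expected = 5
            actual = add(2, 3)
            self.assertEqual(expected, actual)  # assert(expected == actual)

    if __name__ == "__main__":
        unittest.main()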
Terminology
*Error: a human mistake
*Fault: a defect in the software, caused by a human error
*Failure: an observable deviation of the running software from its expected behavior
*Verification: finding faults against the requirements specification
*Validation: finding faults against user needs
*Acceptance Testing: validation in the user's environment
Testing vs Debugging
Testing:
Main goal – Cause failures
Process – Predefined, controlled
Outcome – Predictable
Debugging:
Main goal – Locate, correct faults
Process – Iterative
Outcome – Not predictable
Goals of testing
main goal is to cause failures (reveal faults)
show the software does what it should do
show the software does not do what it should not do
assume there is always one more fault
Principles of testing
complete (exhaustive) testing is impossible
testing is difficult, preventative, risk-based, and planned
Each level of testing is:
*Baselined for the next level
*Dependent on the lower level
*Ineffective for lower level fault detection
System test plan
describes the testing activities
identifies
items and features to test
how to test (approach)
who tests (responsibilities)
risks
cost, budget, and schedule
deliverables
environmental needs
staffing/training needs
approvals
Test design specification
features to be tested
how to test (approach refinements)
test conditions
pass/fail criteria
Controlled testing environments
resemble the development environment; use mocks, stubs, and drivers to isolate the code under test
Incremental integration
Repeat:
1. unit test a module
2. integrate it with the previously tested modules and test the combination
3. continue until the system is complete
Incremental integration strategy: top-down
*Integrate one level at a time, starting from top
*Stub all lower levels
*Advantages:
- stubs may be easier to build than drivers (see the stub sketch after this list)
- exercises most important control modules first
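A minimal stub sketch in Python; the module names are illustrative assumptions:

    def fetch_rate_stub(currency):
        # Stub: returns a canned value in place of the real lower-level
        # module, which is not integrated yet.
        return 1.5

    def convert(amount, currency, fetch_rate=fetch_rate_stub):
        # Top-level module under test; the stub stands in for its callee.
        return amount * fetch_rate(currency)

    assert convert(100, "EUR") == 150.0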
Incremental integration strategy: bottom-up
*Test lowest level modules first
*Use test drivers to invoke units (see the driver sketch after this list)
*Replace drivers with the next-highest-level unit when ready
*Advantages:
- can start testing critical or complex modules first
- can perfect external interfaces first
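A minimal driver sketch in Python; the unit and its inputs are illustrative assumptions:

    def parse_record(line):
        # Real low-level unit under test.
        key, _, value = line.partition("=")
        return key.strip(), value.strip()

    def driver():
        # Driver: stands in for the not-yet-integrated higher-level
        # module and feeds the unit representative inputs.
        assert parse_record("host = localhost") == ("host", "localhost")
        assert parse_record("port=8080") == ("port", "8080")

    driver()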
Incremental integration strategy: hybrid
*Start at the top and work down while building up
from the bottom
*Requires a strategy, for example:
- test for a particular function
- test for a particular flow or interface
- test for a particular hardware configuration
*Advantages:
- can stage the delivery of capabilities
- implement I/O modules first to use in testing
- can work around schedule problems
- test user interfaces early
- can hold off on volatile units
Integration testing
group modules sensibly, follow the requirements specification, and prioritize testing by risk
Other integration techniques
*Critical modules first
*Random order
*Build a working skeleton of the system first, then integrate and test the remaining modules
Function testing
tests the system against the functional requirements specification
Regression testing
Verifies that existing features continue to work after a change
Stress testing (*)
*Start stress testing early
*To overload the system
*To push the system
- to its limits
- beyond its limits
- back to normal
*To break the system
*Consider the worst things users can do to the system
Performance testing covers gradually varying loads;
stress testing covers sudden, increased loads
Background testing
Subjects the system to a realistic background load instead of no load, then gradually increases the load
Configuration testing
test under different hardware and software environments/configurations
Recovery testing
test that undo/redo logging restores the database after a crash
then test that all data entered afterwards is valid and none is lost
test under different stress levels and user-error scenarios
Compatibility testing
Verifies that the system meets its compatibility objectives
*To ensure that the software is compatible with
the operating system
*Usually executed on a duplicate of the
customer’s environment
Reliability testing
Determines how often the system will fail during
a given period of time
*Determine the expected duration between failures, i.e. mean time between failures (MTBF; worked example below)
*Statistical analysis
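As a worked example with illustrative numbers: 1,000 hours of operation with 4 observed failures gives MTBF = 1000 / 4 = 250 hours.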
Security testing
Determines whether the system is guarded against
unauthorized users
*White-hat (ethical) hacking: attempt to break into your own organization's system
Volume testing (*)
Verifies that the system can handle the volume
of data specified
*Subjects the system to heavy volumes of data
Guidelines for selecting a test site
*Feature usage
*Activity rate
Stress Testing
► Find how the system deals with overload
Reason 1: Determine failure behaviour: if the load goes above the intended level, how "gracefully" does the system fail?
Reason 2: Expose bugs that only occur under heavy loads, especially for
OS, middleware, servers, etc.
E.g. memory leaks, incorrect resource allocation and scheduling, race conditions (see the sketch below)
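A sketch of a stress test that hammers shared state from many threads to expose a race condition; the counter, loop count, and thread count are illustrative assumptions:

    import threading

    counter = 0

    def hammer():
        global counter
        for _ in range(100_000):
            counter += 1  # not atomic: read-modify-write can interleave

    threads = [threading.Thread(target=hammer) for _ in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Under load, lost updates often leave this below the expected 800000.
    print(counter)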
Regression Testing
► Rerun old tests to see if anything was “broken” by a change
Changes: bug fixes, module integration, maintenance enhancements, etc.
► Need test automation tools
Load the tests, execute them, check correctness of the results (see the sketch below)
Everything has to be completely automatic
► Could happen at any time: during initial development or after deployment
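A sketch of an automated regression check in Python; normalize() and its golden cases are illustrative assumptions:

    def normalize(s):
        # Unit under regression test.
        return " ".join(s.split()).lower()

    def test_regression_suite():
        # Golden input/expected pairs captured from earlier releases;
        # rerun after every change to catch anything that "broke".
        cases = [
            ("  Hello   World ", "hello world"),
            ("FOO\tbar", "foo bar"),
        ]
        for given, expected in cases:
            assert normalize(given) == expected

    test_regression_suite()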