flashcards
verify T basis early in the SDLC will …
prevent defects
T control in fundament.T process - when?
always
design and prioritization of high-level TCs - when?
T analysis and design
developer makes a __ which causes a __ when code is dynamically tested
mistake, failure (the mistake introduces a defect; executing the defect causes the failure)
exhaustive T is __
impossible
Func T can be conducted at ___ levels
all
Non-func T can be conducted at ___ levels
all
triggers for Maintenance T
a component in production is modified, migrated or retired
V-model. Design docs (DD) available. What Testers do?
create func/non-func TCs + review DD
Formal review. Role which documents issues
scribe
static analysis best finds
dead code
best T tech for: determine/improve code maintainability
static
document specifies input/output for test
TC specs
what is a test condition?
the item or event a TC targets; i.e., each TC tests a test condition
reason for use experience-based tech?
can find defects missed by more formal techniques
error guessing is used in …
experience-based T
how calc decision (D) coverage?
num of D outcomes executed / total num of D outcomes in module
how calc statement (S) coverage?
num of S executed / total num of S in module
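The two coverage formulas above as a minimal Python sketch; the module sizes and executed counts are made-up numbers for illustration:

```python
# Hypothetical example: coverage = items exercised / items in the module.

def coverage(executed, total):
    """Coverage ratio for any coverage item (statements, decision outcomes)."""
    return executed / total

# Suppose a module has 10 statements and 4 decision outcomes
# (2 if-statements, each with a True and a False outcome).
# A test run executes 8 statements and 3 decision outcomes:
statement_cov = coverage(8, 10)  # 0.8  -> 80% statement coverage
decision_cov = coverage(3, 4)    # 0.75 -> 75% decision coverage
```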
equivalence partitioning requires
one TC for each partition, one for too low and one for too high
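A sketch of the EP rule above, assuming a hypothetical input field that accepts integers 1..100 (the field and its range are invented):

```python
# Equivalence partitioning: the input domain splits into partitions;
# one representative test value per partition is enough.
partitions = {
    "invalid_low": range(-10**9, 1),    # below the valid range ("too low")
    "valid": range(1, 101),             # the accepted range 1..100
    "invalid_high": range(101, 10**9),  # above the valid range ("too high")
}

def pick_representatives():
    """One TC per partition: too low, in range, too high."""
    return {"invalid_low": 0, "valid": 50, "invalid_high": 101}

# Each chosen value really belongs to its partition:
for name, value in pick_representatives().items():
    assert value in partitions[name]
```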
boundary value analysis (BVA) requires
for each partition boundary: the last value of the previous partition + the first value of the next, e.g. 0;1 + 49;50 + 59;60 + 69;70 + 79;80
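The boundary pairs above, generated programmatically; the partition edges 1/50/60/70/80 are the hypothetical bands implied by the example:

```python
# 2-value BVA: for each boundary b between two partitions,
# test the value just below it (b-1) and the boundary itself (b).

def bva_values(lower_edges):
    """Boundary test values for the given lower edges of each partition."""
    values = []
    for b in lower_edges:
        values += [b - 1, b]
    return values

print(bva_values([1, 50, 60, 70, 80]))
# -> [0, 1, 49, 50, 59, 60, 69, 70, 79, 80]
```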
T design specs contains …
T conditions (what to test) + T approach
TC specs contains …
test cases
T procedures contain …
test steps
full statement coverage
each executable statement executed at least once
what T leader does ..
writes a T strategy
T planning should be … (when?)
not when. It’s a continuous activity
Risk-based T is an … approach
analytical (risk analysis)
T summary report contains a ‘variances’ section which describes …
diff btw what was planned for T and what was ACTUALLY tested
system always become more reliable after debugging: T/F?
F
what fund. T principle helps to find as many bugs as possible?
defect clustering
3 activities of T implem and execution:
1. develop and prioritize TCs, create T data, write T procedures; 2. group TCs into test suites (TS); 3. verify the T env
V-model includes the verif of …
design
acceptance T is required for …
confidence
M Testing requires …
both re-testing (confirmation) and regression testing
M Testing is difficult to scope =>
req careful risk and impact analysis
S. and D. Testing are complementary because …
share the aim of ident defects but differ in the defect types found
reviews are a cost-effective …
early static test
use case testing is good for …
acceptance T; covers main business processes; finds defects in component interactions
which fundamental T activity do the test data prep tools support?
T analysis and design
if disagreement w dev …
remind about the common goal: creating quality systems
inside SDLC, testing role is …
provide decision-making info
sometimes T is required for legal reasons because …
contracts may specify T reqs
root cause analysis helps …
to better identify and correct the defects root cause
pesticide paradox is …
running the same T over and over -> reduce the chance of finding new defects
well-managed test level should have …
a T objective
black-box T us based on …
req. docs
experience-based T is used …
in conj w more formal tech
TC tests T cond by …
following T procedures
bva =
2 values per valid partition (min and max) + 1 below the minimum + 1 above the maximum
risk level is determined by …
likelihood and impact
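A sketch of the likelihood × impact rule; the 1..3 scales and the level thresholds are illustrative assumptions, not a standard:

```python
# Risk level = f(likelihood, impact), each rated 1 (low) .. 3 (high).

def risk_level(likelihood, impact):
    """Map a likelihood/impact pair to a coarse risk level."""
    score = likelihood * impact  # 1..9
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_level(3, 3))  # high   (likely and severe)
print(risk_level(1, 2))  # low    (unlikely, minor impact)
```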
defect density is used for …
determine which areas of sw have the highest number of defects -> re-evaluate risk/priority
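A sketch of using defect density to spot high-defect areas; the component names, defect counts, and sizes are invented for illustration:

```python
# Defect density = defects / size (here: defects per KLOC).
# The component with the highest density gets its risk/priority re-evaluated.

components = {
    "billing": {"defects": 42, "kloc": 12.0},
    "reports": {"defects": 5,  "kloc": 20.0},
    "auth":    {"defects": 18, "kloc": 3.0},
}

density = {name: c["defects"] / c["kloc"] for name, c in components.items()}
riskiest = max(density, key=density.get)
print(riskiest, round(density[riskiest], 1))  # auth 6.0
```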
T exec tool purpose is …
execute T objects using automated T scripts
pilot project objectives are …
learn, evaluate the fit in the organization, decide standard usage, assess benefits
T contributes to the quality of delivered software by …
identifying root causes of defects from past projects and applying the lessons learned -> improve processes -> reduce defect count
T planning assigns resources and …
sets the level of detail for T procedures
acceptance T test basis is …
risk analysis report, system reqs, business use cases
objective for T is …
finding defects
objectives for acceptance T are …
confidence + assess readiness for deployment and use
debugging process:
T ident defect, Dev locate and fix, T confirm
verify the T env is ready - done during this fundamental T process:
Planning and Control
choice of SDLC model depends on …
product and project characteristics
what T metrics provides the best indication of T progress?
failure rate of executed tests
Integration T test level. Test basis:
software and system design
Integration T test level. Test objects:
interfaces
independent T is important because …
independent T can verify assumptions made during specification and implementation of the system
functional and structural T can be used together at __ T levels
ALL
M Testing is triggered by …
changes to delivered sw and uses impact analysis to min regression T
Formal review. One of roles -
moderator
review process success factors are …
1. predefined objectives; 2. right people involved; 3. emphasis on learning and process improvement
experience-based T: TC are derived from …
knowledge of the testers
what most affects the testing effort -
product reqs for reliability and security
T planning - when
continuously in all life cycle processes and activities
execution tools examples:
test harness, test comparators
pilot project main reason:
assess cost-effectiveness
T planning - major tasks:
determine scope, risks, and objectives
evaluating reqs testability is part of which T phase?
T analysis/design
acceptance TC are based on …
output of requirement analysis/req.specs
validation =
helps to check that we have built the right product
impact analysis helps to decide …
how much testing should be done
functional system testing is …
end-to-end func of the system as a whole
technical review AKA
peer review
formal review kick-off =
explain objectives
low level design -> what level of T?
integration
business reqs -> what level of T?
acceptance
high level design -> what level of T?
system
review success factors:
1. defects found are welcomed and expressed objectively; 2. mgmt support; 3. emphasis on learning and process improvement
static analysis tools can find defects:
vars never used, security vulnerabilities, programming standards violations, uncalled functions
T cond derive from …
specs
regression T - when:
after sw changed, environment changed
T leader tasks:
1. interact w T tool vendor; 2. write T summary report; 3. decide what should be automated and how
typical exit criteria …
Thoroughness measures, reliability measures, cost, schedule, tester availability and residual risks.
when to stop T ?
when T completion crit have been met
formal review phases:
plan, kick-off, prep, review meeting, rework, follow-up
T objectives during dev
provoke as many failures as possible
T objectives during delivery
confirm that system works as expected and assess the quality for stakeholders
QA?
prevents defects
static T is …
examining work products without executing code; removes ambiguities and errors
dynamic T is …
execute program with some test data
7 T principles
- T shows the presence of bugs
- exhaustive T is impossible
- early T
- defect clust
- pesticide paradox
- T is context-dependent
- absence-of-errors fallacy - no errors doesn’t mean good product
if risk is low and acceptable ->
stop T and ship
T should provide enough info for whom?
stakeholders
risk analysis answers:
- what to test 1st
- what to test most
- how thoroughly to test
- what not to test
- how much time to allocate for T
fundamental T process steps:
1a. Plan = def T obj and T activities
1b. Control = compare actual progress against the plan and report status
2. Analysis/Design = tangible T cond and T cases, test-bed
3. Implem/Execute = write T proc, TC->TS, priority, check test-env, run, log, bugrep
4. Evaluate exit crit and summary report to stakehold (what planned/achieved)
5. T closure
T design tech list
bb, wb, exp-based
bb types
decision-table, state_transition, use_case(actors/activities/system), bva, equiv_part
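One of the bb techniques above, decision-table testing, sketched for a hypothetical discount rule (the conditions, actions, and numbers are invented; each rule/column of the table becomes one TC):

```python
# Decision table: every combination of condition values maps to an action.
# Conditions: (is_member, order_over_100); action: discount %.
decision_table = [
    # (is_member, over_100) -> discount
    ((True,  True),  15),
    ((True,  False), 10),
    ((False, True),   5),
    ((False, False),  0),
]

def discount(is_member, over_100):
    """Implementation under test (built to match the table)."""
    if is_member:
        return 15 if over_100 else 10
    return 5 if over_100 else 0

# One TC per rule: run the conditions, check the expected action.
for conds, expected in decision_table:
    assert discount(*conds) == expected
```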
T design process parts:
identify T cond / T cases / T data
typical test design strategy
- func (bb)
- non-func
- wb - check statement/decision cov and create new TCs if necessary
- exp-based T
test types
functional; non-functional; structural (= wb); related to changes (re-test, regression, maintenance)
validation =
doing the right thing - the product meets user needs (sw can be built to spec yet still be the wrong product)
verif =
doing things in the right way - the product conforms to specs (good code that doesn’t match the specs fails verification)
V&V for Testers
verif = detect faults in work products; valid = product complies with user needs
V&V for Analysts
verif = reqs are unambiguous and complete; valid = confirm with the customer that the reqs capture what they actually want
signs of good T for any model
each T level has clear T obj
for every dev act -> T act
review drafts as soon as they’re ready
T exec tools types:
T comparators
coverage measurement tools
security T tools
test harness / unit test framework