Software Testing and Assurance Flashcards

1
Q

Software Testing - Definition and Scope

A

Executing software with the intent of finding defects and assessing its quality against the requirements

2
Q

Software Testing - Objectives

A

*Find as many defects as possible, as early as possible
*Verify conformance to the requirements specification
*Validate that the software meets user needs
*Design effective, high-quality test cases

3
Q

White-Box / Structural Testing

A

Testing by a developer who knows the internal structure of the code, e.g.:
*control flow testing
*cyclomatic complexity
*data flow testing
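
A minimal sketch (the classify function and its thresholds are assumed examples, not from the course) of how structural testing targets paths: two decision points give a cyclomatic complexity of 3, so the tests below exercise each independent path.

# Hypothetical function with two decision points: cyclomatic complexity = 2 + 1 = 3
def classify(score):
    if score < 0:                     # decision 1
        raise ValueError("negative score")
    if score >= 60:                   # decision 2
        return "pass"
    return "fail"

# White-box tests chosen so that each independent path through the code is executed
def test_classify_paths():
    try:
        classify(-1)                  # path through the error branch
        assert False, "expected ValueError"
    except ValueError:
        pass
    assert classify(75) == "pass"     # true branch of the second decision
    assert classify(30) == "fail"     # false branch of the second decision

test_classify_paths()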

4
Q

Black-Box / Behavioral Testing

A

Testing against the functional requirements, without looking at the internal structure, e.g.:

*edge-case (boundary) testing
*testing with different input combinations
*finite-state-machine-based testing (see the next card)
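
A small illustration of boundary-value (edge-case) testing, assuming a hypothetical requirement that is_eligible(age) is true only for ages 18 to 65; the tests probe just inside and just outside each boundary without looking at the implementation.

# Hypothetical requirement: is_eligible(age) is True only for 18 <= age <= 65
def is_eligible(age):
    return 18 <= age <= 65

# Black-box tests pick values at and around each boundary, ignoring the implementation
def test_boundaries():
    assert is_eligible(17) is False   # just below the lower boundary
    assert is_eligible(18) is True    # on the lower boundary
    assert is_eligible(65) is True    # on the upper boundary
    assert is_eligible(66) is False   # just above the upper boundary

test_boundaries()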

5
Q

Finite State Machine

A

A finite state machine is a model that describes the dynamic behavior of an object over time.

Each object is treated as an isolated entity that communicates with the rest of the world by detecting events and responding to them.
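
A toy sketch (the turnstile states and events are assumed, not from the deck) of an FSM-driven object: a transition table maps (state, event) pairs to next states, and FSM-based tests exercise every transition.

# Hypothetical turnstile FSM: (state, event) pairs map to the next state
TRANSITIONS = {
    ("locked", "coin"): "unlocked",
    ("unlocked", "push"): "locked",
}

class Turnstile:
    def __init__(self):
        self.state = "locked"

    def handle(self, event):
        # Respond to a detected event; ignore events with no defined transition
        self.state = TRANSITIONS.get((self.state, event), self.state)

# FSM-based tests exercise every transition in the table
t = Turnstile()
t.handle("coin"); assert t.state == "unlocked"
t.handle("push"); assert t.state == "locked"
t.handle("push"); assert t.state == "locked"   # undefined (state, event) pair: no change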

6
Q

Usage-Based Testing

A

*Test in conditions as close to the operational environment as possible
*Inputs are assigned a probability distribution that reflects expected usage
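
A minimal sketch of usage-based input selection, with assumed operation names and usage probabilities: test inputs are drawn from the usage distribution rather than uniformly.

import random

# Hypothetical usage profile: "search" is used far more often than "export"
OPERATIONS = ["search", "view", "export"]
WEIGHTS = [0.70, 0.25, 0.05]          # assumed usage probabilities

def next_test_operation():
    # Sample the next operation to exercise, in proportion to expected real-world usage
    return random.choices(OPERATIONS, weights=WEIGHTS, k=1)[0]

print([next_test_operation() for _ in range(10)])   # mostly "search", rarely "export"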

7
Q

System Testing types

A

*Functional Testing
*Performance Testing
*Pilot Testing
*Acceptance Testing
*Installation Testing
(mnemonic: "AP FAP Sys Test")

8
Q

Types of Performance Testing

A

stress
volume
configuration
compatibility
security
timing
environmental
quality
recovery
human factors

9
Q

Software testing

A

assert(expected == actual)
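
Expanding that one-liner into a minimal runnable unit test (the add function is a hypothetical example, not from the course):

# Hypothetical unit under test
def add(a, b):
    return a + b

def test_add():
    expected = 5
    actual = add(2, 3)
    assert expected == actual   # the test fails (raises AssertionError) on a mismatch

test_add()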

10
Q

Terminology

A

*Error: a human mistake
*Fault: a defect in the software, caused by a human error
*Failure: incorrect behavior of the running software, caused by a fault
*Verification: finding faults against the requirements specification
*Validation: finding faults against user needs
*Acceptance Testing: validation performed in the user's environment

11
Q

Testing vs Debugging

A

Testing:
Main goal – Cause failures
Process – Predefined, controlled
Outcome – Predictable

Debugging:
Main goal – Locate, correct faults
Process – Iterative
Outcome – Not predictable

12
Q

Goals of testing

A

The main goal is to cause failures.

*Show that the program does what it is supposed to do
*Show that it does not do what it is not supposed to do

Assume there is always at least one more fault to find.

13
Q

Principles of testing

A

*Complete (exhaustive) testing is impossible
*Testing is difficult, preventative, risk-based, and must be planned

14
Q

Each level of testing is:

A

*Baselined for the next level
*Dependent on the lower level
*Ineffective for lower level fault detection

15
Q

System test plan

A

Describes the testing activities.

Identifies:
*items to be tested
*how to test them
*who does the testing
*risks
*cost, budget, and schedule
*deliverables
*environmental needs
*staffing/training needs
*approvals

16
Q

Test design specification

A

*which features to test
*how to test them
*test conditions
*pass/fail criteria

17
Q

Controlled testing environments

A

An environment similar to the development environment, in which external dependencies are replaced with mocks or stubs (a stub sketch follows below).
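
A small sketch (Checkout and GatewayStub are hypothetical names) of stubbing a dependency so the unit runs in a controlled environment, with no external systems involved:

# Hypothetical unit that depends on an external payment gateway
class Checkout:
    def __init__(self, gateway):
        self.gateway = gateway

    def pay(self, amount):
        return "ok" if self.gateway.charge(amount) else "declined"

# Stub: a controlled stand-in for the real gateway; no network is involved
class GatewayStub:
    def charge(self, amount):
        return True

def test_pay_with_stub():
    checkout = Checkout(GatewayStub())
    assert checkout.pay(10) == "ok"

test_pay_with_stub()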

18
Q

Incremental integration

A

Loop:
*unit test one module
*integrate the next module with the previously tested ones and retest
*continue until the whole system is integrated

19
Q

Incremental integration strategy: top-down

A

*Integrate one level at a time, starting from top
*Stub all lower levels
*Advantages:
- stubs may be easier to build than drivers
- exercises most important control modules first

20
Q

Incremental integration strategy: bottom-up

A

*Test lowest level modules first
*Use test drivers to invoke the units (a driver sketch follows below)
*Replace drivers with the next highest level unit when ready
*Advantages:
- can start testing critical or complex modules first
- can perfect external interfaces first
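
A minimal sketch of a bottom-up test driver, with assumed module names: the driver stands in for the higher-level caller that has not been integrated yet.

# Hypothetical low-level unit that is integrated and tested first
def parse_record(line):
    name, value = line.split(",")
    return {"name": name.strip(), "value": int(value)}

# Test driver: stands in for the higher-level module that will eventually call parse_record
def driver():
    samples = ["alpha, 1", "beta, 2"]
    results = [parse_record(s) for s in samples]
    assert results[0] == {"name": "alpha", "value": 1}
    assert results[1] == {"name": "beta", "value": 2}

driver()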

21
Q

Incremental integration strategy: hybrid

A

*Start at the top and work down while building up from the bottom
*Requires a strategy, for example:
- test for a particular function
- test for a particular flow or interface
- test for a particular hardware configuration
*Advantages:
- can stage the delivery of capabilities
- implement I/O modules first to use in testing
- can work around schedule problems
- test user interfaces early
- can hold off on volatile units

22
Q

Integrating testing

A

Group modules sensibly, follow the requirements specification, and focus testing on the areas of highest risk.

23
Q

Other integration techniques

A

*Critical modules first
*Random order
*Build a working skeleton of the system first, then integrate and test the remaining modules

24
Q

Function testing

A

Verifies that the system performs the functions specified in the requirements

25
Q

Regression testing

A

Verifies that existing features continue to work after a change
26
Q

Stress testing (*)

A

*Start stress testing early
*Overload the system
*Push the system to its limits, beyond its limits, and back to normal
*Try to break the system
*Consider the worst things a user could possibly do
*Performance testing covers varying loads; stress testing covers sudden, increased loads
27
Q

Background testing

A

Subjects the system to realistic loads instead of no load, then gradually ramps the load up
28
Q

Configuration testing

A

Tests the system under different hardware and software configurations/environments
29
Q

Recovery testing

A

*Verify that undo/redo logging works if the database crashes
*Verify that all data processed afterwards is valid and that none is lost
*Test under different stress levels and user mistakes
30
Q

Compatibility testing

A

*Verifies that the system meets its compatibility objectives
*Ensures that the software is compatible with the operating system
*Usually executed on a duplicate of the customer's environment
31
Q

Reliability testing

A

*Determines how often the system will fail during a given period of time
*Determines the expected duration between failures
*Uses statistical analysis
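
A tiny illustration with assumed failure data: estimating the expected duration between failures (mean time between failures, MTBF) from observed inter-failure times.

# Hypothetical observed times between successive failures, in hours
inter_failure_hours = [120, 95, 200, 150, 85]

# Simple point estimate of MTBF: the mean of the observed inter-failure times
mtbf = sum(inter_failure_hours) / len(inter_failure_hours)
print(f"Estimated MTBF: {mtbf:.1f} hours")   # 130.0 hours for this sample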
32
Q

Security testing

A

*Determines whether the system is guarded against unauthorized users
*Includes white-hat hacking: trying to break into your own organization's system
33
Q

Volume testing (*)

A

*Verifies that the system can handle the volume of data specified
*Subjects the system to heavy volumes of data
34
Q

Guidelines for selecting a test site

A

*Feature usage
*Activity rate
35
Q

Stress Testing

A

► Find out how the system deals with overload
- Reason 1: Determine failure behaviour. If the load goes above what was intended, how "gracefully" does the system fail?
- Reason 2: Expose bugs that only occur under heavy loads, especially in the OS, middleware, servers, etc.
- E.g. memory leaks, incorrect resource allocation and scheduling, race conditions
36
Q

Regression Testing

A

► Rerun old tests to see if anything was "broken" by a change
- Changes: bug fixes, module integration, maintenance enhancements, etc.
► Needs test automation tools
- Load tests, execute them, check correctness
- Everything has to be completely automatic
► Can happen at any time: during initial development or after deployment
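
A minimal sketch of the automation idea using Python's unittest (the add function and test names are assumed examples): the old tests are loaded, executed, and checked automatically after every change.

import unittest

# Hypothetical unit under test, modified by the latest change
def add(a, b):
    return a + b

class RegressionSuite(unittest.TestCase):
    # Old tests that are rerun automatically after every change
    def test_add_small(self):
        self.assertEqual(add(2, 3), 5)

    def test_add_negative(self):
        self.assertEqual(add(-1, 1), 0)

if __name__ == "__main__":
    unittest.main()   # loads the tests, executes them, and checks correctness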