F6 Flashcards

1
Q

• Structural testing (white-box, glass-box)

A

– Uses code/detailed design to develop test cases
– Typically used in unit testing
– Approaches:
• Coverage-based testing
• Symbolic execution
• Data flow analysis
• …

2
Q

• Functional testing (black-box)

A

– Uses function specifications to develop test cases
– Typically used in system testing
– Approaches:
• Equivalence partitioning
• Border case analysis
• …

3
Q

BLACK-BOX TESTING

A

Test generation without knowledge of software structure
• Also called specification-based or functional testing
– Equivalence partitioning
– Boundary-value analysis
– Error guessing
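The boundary-value technique listed above can be sketched in a few lines; a minimal Python example, where `valid_pin_length` and the 4–6 character rule are hypothetical stand-ins for a real specification:

```python
# Hypothetical unit under test: PINs must be 4 to 6 characters long.
def valid_pin_length(pin: str) -> bool:
    return 4 <= len(pin) <= 6

# Boundary-value analysis: test at each boundary and just beyond it,
# since off-by-one errors cluster at the edges of valid ranges.
cases = {
    "123": False,      # just below lower boundary
    "1234": True,      # lower boundary
    "123456": True,    # upper boundary
    "1234567": False,  # just above upper boundary
}
for pin, expected in cases.items():
    assert valid_pin_length(pin) == expected, pin
```

Note that the test values are derived from the specification alone, not from the code — the defining property of black-box testing.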

4
Q

EQUIVALENCE PARTITIONING

A

Input and output data can be grouped into classes where all members of a class behave in a comparable way.
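The idea above can be sketched as picking one representative value per class; a minimal Python example, where `is_valid_age` and its 0..120 range are hypothetical:

```python
# Hypothetical unit under test: accepts ages in the range 0..120.
def is_valid_age(age: int) -> bool:
    return 0 <= age <= 120

# One representative value per equivalence class; members of a class
# are assumed to behave comparably, so one test per class suffices.
partitions = {
    "below range (invalid)": (-5, False),
    "within range (valid)": (30, True),
    "above range (invalid)": (200, False),
}
for name, (value, expected) in partitions.items():
    assert is_valid_age(value) == expected, name
```

Three tests cover the three classes instead of testing every possible integer.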

5
Q

WHITE-BOX TESTING

A

Methods based on internal structure of code
Approaches:
– Coverage-based testing
• Statement coverage
• Branch coverage
• Path coverage
• Data-flow coverage
– Symbolic execution
– Data flow analysis
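The coverage criteria above differ in how much of the code's structure a test suite must exercise. A minimal sketch of branch coverage, using a hypothetical `classify` function:

```python
# Hypothetical unit: flag a temperature reading.
def classify(temp: float) -> str:
    if temp > 100.0:
        return "overheat"
    return "normal"

# Statement coverage: every statement executed at least once.
# Branch coverage (stronger): both outcomes of the `if` exercised.
assert classify(120.0) == "overheat"  # true branch
assert classify(25.0) == "normal"     # false branch
```

A single test with `temp = 120.0` would reach the `return "overheat"` statement but miss the false branch; branch coverage forces the second test.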

6
Q

COVERAGE-BASED TESTING advantages

A

– Systematic way to develop test cases
– Measurable results (the coverage)
– Extensive tool support
• Flow graph generators
• Test data generators
• Bookkeeping
• Documentation support

7
Q

COVERAGE-BASED TESTING disadvantages

A

Disadvantages
– Code must be available
– Does not (yet) work well for data-driven programs

8
Q

PRE-IMPLEMENTATION TESTING

A
• Inspections
– Evaluation of documents and code prior to technical review or testing
• Walkthroughs
– Done in teams
– Examine source code/detailed design
• Reviews
– More informal
– Often done by document owners
9
Q

PRE-IMPLEMENTATION TESTING advantages

A

• Effective
– High learning effect
– Distributes system knowledge

10
Q

PRE-IMPLEMENTATION TESTING Disadvantages

A

• Expensive

11
Q

UNIT TESTING

A

Tests the smallest individually executable code units.
• Objective: Find faults in the units; assure correct functional behavior of units.
• By: Usually programmers.
• Tools:
– Test driver/harness
– Coverage evaluator
– Automatic test generator
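A test driver/harness for a unit can be sketched with Python's standard `unittest` module; `safe_div` is a hypothetical unit under test:

```python
import unittest

# Hypothetical smallest executable unit under test.
def safe_div(a: float, b: float) -> float:
    if b == 0:
        raise ValueError("division by zero")
    return a / b

# Test cases exercising the unit in isolation: normal and error behavior.
class SafeDivTest(unittest.TestCase):
    def test_normal_case(self):
        self.assertEqual(safe_div(10, 2), 5.0)

    def test_zero_divisor(self):
        with self.assertRaises(ValueError):
            safe_div(1, 0)

# The runner acts as the test driver/harness: it sets up, invokes, and
# checks the unit without any other part of the system being present.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SafeDivTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A coverage evaluator (e.g. a line/branch coverage tool) would then measure how much of `safe_div` these cases exercise.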

12
Q

INTEGRATION TESTING

A

• Testing two or more units or components
• Objectives
– Interface errors
– Functionality of combined units; look for problems with functional threads
• By: Developers or testing group
• Tools: Interface analysis; call pairs
• Issues: Strategy for combining units

13
Q

INTEGRATION TESTING

How to integrate & test the system

A
• Top-down
• Bottom-up
• Critical units first
• Functionality-oriented (threads)
Implications of build order:
• Top-down => stubs; more thorough top-level testing
• Bottom-up => drivers; more thorough bottom-level testing
• Critical first => stubs & drivers
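In top-down integration, a not-yet-integrated lower-level unit is replaced by a stub that returns canned answers. A minimal sketch; `report_generator` and `fetch_orders_stub` are hypothetical names:

```python
# Top-level unit under test: real code, but its dependency is injected
# so a stub can stand in until the real unit is integrated.
def report_generator(fetch_orders):
    orders = fetch_orders()
    total = sum(o["amount"] for o in orders)
    return f"{len(orders)} orders, total {total}"

# Stub: canned answer replacing the not-yet-integrated lower-level unit.
def fetch_orders_stub():
    return [{"amount": 10}, {"amount": 15}]

assert report_generator(fetch_orders_stub) == "2 orders, total 25"
```

Bottom-up integration inverts this: the lower-level unit is real, and a driver (like the harness in the unit-testing card) calls it in place of the missing upper layers.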
14
Q

SYSTEM TESTING

A

Test the functionality, performance, reliability, and security of the entire system.
• By: Separate test group.
• Objective: Find errors in the overall system behavior. Establish confidence in system functionality. Validate that the system achieves its desired non-functional attributes.
• Tools: User simulator. Load simulator.

15
Q

ACCEPTANCE TESTING

A
• Operate system in user environment, with standard user input scenarios.
• By: End user.
• Objective: Evaluate whether the system meets customer criteria. Determine if the customer will accept the system.
• Tools: User simulator. Customer test scripts/logs from operation of previous system.
16
Q

REGRESSION TESTING

A
• Defn: Test of modified versions of a previously validated system.
• By: System or regression test group.
• Objective: Assure that changes to the system have not introduced new errors.
• Tools: Regression test base, capture/replay.
• Issues: Minimal regression suite, test prioritization.
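A regression test base can be sketched as recorded (input, expected-output) pairs replayed against each modified version; `add`, `run_regression`, and the recorded cases are hypothetical:

```python
# Regression test base: recorded (input, expected-output) pairs from the
# previously validated system, replayed after every change.
regression_base = [
    ((2, 3), 5),
    ((0, 0), 0),
    ((-1, 1), 0),
]

def add(a, b):  # hypothetical unit, in its modified version
    return a + b

def run_regression(fn, base):
    # Collect every recorded case whose replayed output differs.
    return [(args, exp, fn(*args)) for args, exp in base if fn(*args) != exp]

assert run_regression(add, regression_base) == []  # no regressions introduced
```

The "minimal suite" and "prioritization" issues above concern which of these recorded cases to keep and in what order to replay them when time is short.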
17
Q

THE KEY PROBLEMS OF SOFTWARE TESTING

A

• Selecting or generating the right test cases.
• Knowing when a system has been tested enough.
• Knowing what has been discovered/demonstrated by execution of a test suite.

18
Q

REALITIES OF SYSTEM TESTING

A

• Available time for testing is short
– Compressing development risks introducing problems
– Compressing testing risks missing critical problems
• Testers want to start testing early
• System testing requires an available system
• Developers resist testing until the system is “ready”
To optimize use of the existing resources, use risk analysis.