Chapter 3 & 4 Flashcards

1
Q

Entry criteria

A

The set of generic and specific conditions for permitting a process to go forward with a defined task, e.g. test phase. The purpose of entry criteria is to prevent a task from starting which would entail more wasted effort compared to the effort needed to remove the failed entry criteria.

2
Q

Formal review

A

A review characterized by documented procedures and requirements, e.g. inspection

3
Q

Informal review

A

A review not based on a formal documented procedure.

4
Q

Inspection

A

A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development standards and non-conformance to higher-level documentation. The most formal review technique and therefore always based on a documented procedure.

5
Q

Moderator/inspection leader

A

The leader and main person responsible for an inspection or other review process.

6
Q

Peer review

A

A review of a software work product by colleagues of the producer of the product for the purpose of identifying defects and improvements. Examples are inspection, technical review and walkthrough.

7
Q

Review

A

An evaluation of a product or project status to ascertain discrepancies from planned results and to recommend improvements. Examples include management review, informal review, technical review, inspection, and walkthrough.

8
Q

Reviewer

A

The person involved in the review who identifies and describes anomalies in the product or project under review. Reviewers can be chosen to represent different viewpoints and roles in the review process.

9
Q

Scribe

A

The person who records each defect mentioned and any suggestions for process improvement during the review meeting, on a logging form. The scribe should ensure that the logging form is readable and understandable.

10
Q

Static analysis

A

Analysis of software development artifacts, e.g. requirements or code, carried out without execution of the software development artifacts. Static analysis is usually carried out by means of a supporting tool.

11
Q

Static testing

A

Testing of a software development artifact, e.g., requirements, design or code, without execution of these artifacts, e.g. reviews or static analysis.

12
Q

Technical review

A

A peer group discussion activity that focuses on achieving consensus on the technical approach to be taken.

13
Q

Walkthrough

A

A step-by-step presentation by the author of a document in order to gather information and to establish a common understanding of its content.

14
Q

Black box test design technique

A

Procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.

15
Q

Boundary value analysis

A

A black box test design technique in which test cases are designed based on boundary values.
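
As an illustration (the age validator and its 18-65 range are hypothetical, not part of the glossary), boundary value analysis picks the values at and just outside each boundary:

    def is_valid_age(age: int) -> bool:
        return 18 <= age <= 65

    def test_boundary_values():
        assert not is_valid_age(17)  # just below the lower boundary
        assert is_valid_age(18)      # on the lower boundary
        assert is_valid_age(65)      # on the upper boundary
        assert not is_valid_age(66)  # just above the upper boundary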

16
Q

Compiler

A

A software tool that translates programs expressed in a higher order language into their machine language equivalents.

17
Q

Complexity

A

The degree to which a component or system has a design and/or internal structure that is difficult to understand, maintain and verify.

18
Q

Control flow

A

A sequence of events or paths in the execution through a component or system.

19
Q

Dataflow

A

An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction.
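
A minimal illustration (hypothetical, phrased in Python terms) of the three data states for one variable:

    def demo():
        x = 42        # creation: x comes into existence with a value
        print(x + 1)  # usage: the value of x is read
        del x         # destruction: x ceases to exist

    demo()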

20
Q

Decision coverage

A

The percentage of decision outcomes that have been exercised by a test suite. 100% decision coverage implies both 100% branch coverage and 100% statement coverage.
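
A small sketch (the discount function is hypothetical): the single decision below has two outcomes, so 100% decision coverage needs at least one test for each:

    def apply_discount(total: float) -> float:
        if total > 100:        # decision: both outcomes must be exercised
            total = total * 0.9
        return total

    def test_true_outcome():
        assert apply_discount(200) == 180.0  # condition is True

    def test_false_outcome():
        assert apply_discount(50) == 50      # condition is False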

21
Q

Decision table testing

A

A black box test design technique in which test cases are designed to execute the combinations of inputs and/or stimuli (causes) shown in a decision table.
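
For instance (a hypothetical loan check), each row of the decision table is one rule, i.e. one combination of conditions with its expected outcome, and each rule yields a test case:

    def loan_approved(good_credit: bool, employed: bool) -> bool:
        return good_credit and employed

    # Decision table rows: (good_credit, employed, expected outcome).
    DECISION_TABLE = [
        (True,  True,  True),
        (True,  False, False),
        (False, True,  False),
        (False, False, False),
    ]

    def test_every_rule():
        for good_credit, employed, expected in DECISION_TABLE:
            assert loan_approved(good_credit, employed) == expected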

22
Q

Equivalence partitioning

A

A black box test design technique in which test cases are designed to execute representatives from equivalence partitions. In principle, test cases are designed to cover each partition at least once.
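
For example (reusing the hypothetical age validator), the input domain splits into three partitions, and in principle one representative per partition suffices:

    def is_valid_age(age: int) -> bool:
        return 18 <= age <= 65

    def test_one_representative_per_partition():
        assert not is_valid_age(5)   # "too young" partition
        assert is_valid_age(40)      # valid partition (18-65)
        assert not is_valid_age(80)  # "too old" partition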

23
Q

Error guessing

A

A test design technique where the experience of the tester is used to anticipate what defects might be present in the component or system under test as a result of errors made, and to design tests specifically to expose them.
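
A sketch (the normalizer and the guessed inputs are hypothetical): a tester might probe empty, padded, and very long strings because experience suggests such inputs often expose defects:

    def normalize_name(name: str) -> str:
        return " ".join(name.split()).title()

    def test_guessed_edge_cases():
        assert normalize_name("") == ""                     # empty input
        assert normalize_name("  ada   lovelace ") == "Ada Lovelace"
        assert len(normalize_name("x" * 10_000)) == 10_000  # very long input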

24
Q

Experience based test design technique

A

Procedure to derive and/or select test cases based on the tester's experience, knowledge and intuition.

25
Q

Fault attack

A

A directed and focused attempt to evaluate the quality, especially reliability, of a test object by attempting to force specific failures to occur.

26
Q

State transition testing

A

A black box test design technique in which test cases are designed to execute valid and invalid state transitions.
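
An illustrative sketch (the two-state door model is hypothetical): tests drive the valid open/close transitions and confirm that an invalid transition is rejected:

    class Door:
        def __init__(self):
            self.state = "closed"

        def open(self):
            if self.state != "closed":
                raise ValueError("invalid transition")
            self.state = "open"

        def close(self):
            if self.state != "open":
                raise ValueError("invalid transition")
            self.state = "closed"

    def test_valid_transitions():
        door = Door()
        door.open()
        assert door.state == "open"
        door.close()
        assert door.state == "closed"

    def test_invalid_transition():
        door = Door()      # starts closed
        try:
            door.close()   # closing a closed door is invalid
            assert False, "expected ValueError"
        except ValueError:
            pass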

27
Q

Structure based testing

A

Testing based on an analysis of the internal structure of the component or system.

28
Q

Test case

A

A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.
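
To make the parts of the definition concrete (a hypothetical withdrawal test), each element maps onto one line:

    def test_withdrawal():
        account = {"balance": 100}       # execution precondition
        amount = 30                      # input value
        account["balance"] -= amount     # exercise the behavior under test
        assert account["balance"] == 70  # expected result
        assert account["balance"] >= 0   # execution postcondition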

29
Q

Test condition

A

An item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element.

30
Q

Test data

A

Data that exists (for example, in a database) before a test is executed, and that affects or is affected by the component or system under test.

31
Q

Dynamic Testing

A

Testing that involves the execution of the software of a component or system.

32
Q

Traceability

A

The ability to identify related items in documentation and software, such as requirements with associated tests.