1.4.2 Test Activities and Tasks Flashcards

1
Q

A test process consists of the following main groups of activities:

A
  • Test planning
  • Test monitoring and control
  • Test analysis
  • Test design
  • Test implementation
  • Test execution
  • Test completion
2
Q

Each constituent activity of the test process consists of multiple individual tasks, which may vary

A

from one project or release to another.

3
Q

Although many of these test process activity groups may appear logically sequential, they are often implemented

A

iteratively

4
Q

many of these main activity groups are tailored

A

within the context of the system and the project

5
Q

Test planning involves activities that define

A

the objectives of testing and the approach for meeting test objectives within constraints imposed by the context

6
Q

examples of test planning activities performed within constraints imposed by the context:

A

specifying suitable test techniques and tasks, formulating a test schedule for meeting a deadline

7
Q

Test plans may be revisited based on

A

feedback from monitoring and control activities.

8
Q

Test monitoring involves

A

the ongoing comparison of actual progress against planned progress using any test monitoring metrics defined in the test plan.

9
Q

Test control involves

A

taking actions necessary to meet the objectives of the test plan (which may be updated over time).

10
Q

Test monitoring and control are supported by

A

the evaluation of exit criteria

11
Q

referred to as the definition of done in some software development lifecycle models

A

exit criteria

12
Q

evaluation of exit criteria for test execution as part of a given test level may include:

A

Checking test results and logs against specified coverage criteria
Assessing the level of component or system quality based on test results and logs
Determining if more tests are needed
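The three checks in this card could be sketched as follows (a hypothetical illustration; the result format and the coverage threshold are assumptions, not syllabus content):

```python
# Hypothetical sketch of evaluating exit criteria for test execution.
# The record shape (condition ID -> "pass"/"fail"/None) and the
# required_coverage parameter are illustrative assumptions.

def evaluate_exit_criteria(results, required_coverage=1.0):
    """Check logged results against a specified coverage criterion.

    results maps a test condition ID to "pass", "fail", or None
    (None = condition not yet covered by an executed test).
    """
    total = len(results)
    covered = sum(1 for outcome in results.values() if outcome is not None)
    coverage = covered / total if total else 0.0
    return {
        "coverage": coverage,                              # level achieved
        "criterion_met": coverage >= required_coverage,    # exit criterion
        "more_tests_needed": coverage < required_coverage, # write/run more
    }

report = evaluate_exit_criteria(
    {"COND-1": "pass", "COND-2": "fail", "COND-3": None},
    required_coverage=0.9,
)
```

Here one of three conditions is uncovered, so coverage falls short of the 90% criterion and the evaluation flags that more tests are needed.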

13
Q

example when more tests are needed

A

if tests originally intended to achieve a certain level of product risk coverage failed to do so, requiring additional tests to be written and executed

14
Q

Test progress against the plan is communicated to stakeholders in test progress reports, including

A

deviations from the plan and information to support any decision to stop testing.

15
Q

Test progress against the plan is communicated to stakeholders in

A

test progress reports

16
Q

test progress reports includes

A

deviations from the plan and information to support any decision to stop testing

17
Q

During test analysis, the test basis is analyzed to

A

identify testable features and define associated test conditions.

18
Q

test analysis determines

A

“what to test” in terms of measurable coverage criteria.

19
Q

Test analysis includes the following major activities:

A

Analyzing the test basis appropriate to the test level being considered
Evaluating the test basis and test items to identify defects of various types
Identifying features and sets of features to be tested
Defining and prioritizing test conditions for each feature
Capturing bi-directional traceability between each element of the test basis and the associated test conditions
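The last activity, capturing bi-directional traceability, can be sketched as a simple data structure (a minimal illustration with made-up requirement and condition IDs; not a prescribed implementation):

```python
# Minimal sketch of bi-directional traceability between test basis
# elements and test conditions. IDs are illustrative assumptions.

class TraceabilityMatrix:
    def __init__(self):
        self.basis_to_conditions = {}  # basis element -> test conditions
        self.condition_to_basis = {}   # test condition -> basis elements

    def link(self, basis_element, test_condition):
        # Record the link in both directions so it can be followed
        # forward (basis -> conditions) and backward (condition -> basis).
        self.basis_to_conditions.setdefault(basis_element, set()).add(test_condition)
        self.condition_to_basis.setdefault(test_condition, set()).add(basis_element)

matrix = TraceabilityMatrix()
matrix.link("REQ-001", "COND-login-valid")
matrix.link("REQ-001", "COND-login-invalid")
matrix.link("REQ-002", "COND-login-invalid")
```

Keeping both directions lets you ask "which conditions cover REQ-001?" and "which requirements does COND-login-invalid trace back to?" with a single lookup each.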

20
Q

Defining and prioritizing test conditions for each feature based on:

A

analysis of the test basis, considering functional, non-functional, and structural characteristics, other business and technical factors, and levels of risk

21
Q

Analyzing the test basis appropriate to the test level being considered

A

Requirement specifications
Design and implementation information
The implementation of the component or system itself
Risk analysis reports

22
Q

Requirement specifications, such as

A

business requirements,
functional requirements,
system requirements,
user stories,
epics,
use cases

23
Q

Requirement specifications, which specify

A

desired functional and non-functional component or system behavior

24
Q

Design and implementation information, which

A

specifies component or system structure

25
Q

Design and implementation information, such as

A

system or software architecture diagrams or documents,
design specifications,
call flow graphs,
modelling diagrams,
interface specifications

26
Q

the implementation of the component or system itself, which includes:

A

code,
database metadata and queries,
interfaces

27
Q

Risk analysis reports, may consider

A

functional, non-functional and structural aspects of the component or system

28
Q

Evaluating the test basis and test items to identify defects of various types, such as

A

Ambiguities
Omissions
Inconsistencies
Inaccuracies
Contradictions
Superfluous statements

29
Q

The application of black-box, white-box, and experience-based test techniques can be useful in the
process of test analysis to

A

reduce the likelihood of omitting important test conditions
and
to define more precise and accurate test conditions.

30
Q

In some cases, test analysis produces test conditions which are to

A

be used as test objectives in test charters.

31
Q

Test charters are typical work products in some types of

A

experience-based testing

32
Q

When these test objectives are traceable to the test basis, coverage achieved (during such experience-based testing)

A

can be measured.

33
Q

The identification of defects during test analysis is an important potential benefit, especially where

A

no other review process is being used
and/or
the test process is closely connected with the review process.

34
Q

test analysis activities verify whether the requirements are

A

consistent,
properly expressed,
complete,
valid (properly capturing customer, user, or other stakeholder needs).

35
Q

behavior driven development (BDD) and acceptance test driven development (ATDD), involve

A

generating test conditions and test cases from user stories
and
acceptance criteria prior to coding.

36
Q

behavior driven development (BDD) and acceptance test driven development (ATDD), verify, validate, and

A

detect defects in
the user stories
and
acceptance criteria

37
Q

During test design, the test conditions are elaborated into

A

high-level test cases,
sets of high-level test cases,
other testware.

38
Q

test analysis answers the question

A

“what to test?”

39
Q

test design answers the question

A

“how to test?”

40
Q

Test design includes the following major activities:

A

Designing and prioritizing test cases and sets of test cases
Identifying necessary test data to support test conditions and test cases
Designing the test environment and identifying any required infrastructure and tools
Capturing bi-directional traceability between the test basis, test conditions, and test cases

41
Q

The elaboration of test conditions during test design often involves

A

the use of test techniques

42
Q

test design may also result in the identification of similar types of defects in

A

the test basis.

43
Q

the identification of defects during test design is an

A

important potential benefit.

44
Q

During test implementation,

A

the testware necessary for test execution is created and/or completed, including sequencing the test cases into test procedures.

45
Q

test implementation answers the question

A

“do we now have everything in place to run the tests?”

46
Q

Test implementation includes the following major activities:

A

Developing and prioritizing test procedures and, potentially, creating automated test scripts
Creating test suites from the test procedures and (if any) automated test scripts
Arranging the test suites within a test execution schedule to obtain an efficient test execution
Building the test environment and verifying that everything needed has been set up correctly
Preparing test data and ensuring it is properly loaded in the test environment
Verifying and updating bi-directional traceability

47
Q

in test implementation, bi-directional traceability is verified and updated between:

A

test basis, test conditions, test cases, test procedures and test suites

48
Q

the test environment potentially includes:

A

test harnesses,
service virtualization,
simulators
other infrastructure items

49
Q

Test design and test implementation tasks are often

A

combined.

50
Q

In exploratory testing and other types of experience-based testing, test design and implementation

A

may occur, and may be documented, as part of test execution.

51
Q

Exploratory testing may be based on

A

test charters (produced as part of test analysis)

52
Q

exploratory tests are executed immediately as they are

A

designed and implemented

53
Q

During test execution, test suites are run in accordance with

A

the test execution schedule.

54
Q

Test execution includes the following major activities:

A

Recording the IDs and versions of the test item(s) or test object, test tool(s), and testware
Executing tests either manually or by using test execution tools
Comparing actual results with expected results
Analyzing anomalies to establish their likely causes
Reporting defects based on the failures observed
Logging the outcome of test execution
Repeating test activities (as a result of action taken for an anomaly or as part of the planned testing)
Verifying and updating bi-directional traceability
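Several of the activities above (executing tests, comparing actual with expected results, logging outcomes, and reporting defects for observed failures) could be sketched as a simple execution loop (a hypothetical illustration; the test-case tuples and log format are assumptions):

```python
# Illustrative sketch of a test execution loop. Test case IDs,
# the (id, callable, expected) shape, and the log/defect format
# are assumptions for the example, not a standard interface.

def execute_suite(test_cases):
    """test_cases: list of (test_id, run_callable, expected_result)."""
    log, defects = [], []
    for test_id, run, expected in test_cases:
        actual = run()                                   # execute the test
        outcome = "pass" if actual == expected else "fail"  # compare results
        log.append({"id": test_id, "expected": expected,
                    "actual": actual, "outcome": outcome})  # log the outcome
        if outcome == "fail":
            # A defect report would be raised for the observed failure.
            defects.append(test_id)
    return log, defects

log, defects = execute_suite([
    ("TC-1", lambda: 2 + 2, 4),
    ("TC-2", lambda: "admin".upper(), "Admin"),
])
```

The second case fails because the actual result ("ADMIN") does not match the expected result, so it is logged as a failure and queued for defect reporting; the anomaly would then be analyzed to establish its likely cause.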

55
Q

in test execution, bi-directional traceability is verified and updated between:

A

test basis, test conditions, test cases, test procedures and test results

56
Q

examples of repeating test activities as part of the planned testing

A

execution of a corrected test,
confirmation testing,
regression testing

57
Q

Test completion activities collect data from completed test activities to consolidate

A

experience,
testware
any other relevant information.

58
Q

Test completion activities occur at project milestones such as

A

when a software system is released,
a test project is completed (or cancelled),
an Agile project iteration is finished,
a test level is completed
a maintenance release has been completed.

59
Q

Test completion includes the following major activities:

A

Checking whether all defect reports are closed
Creating a test summary report to be communicated to stakeholders
Finalizing and archiving the test environment for later reuse
Handing over the testware to other teams who could benefit from its use
Analyzing lessons learned from the completed test activities
Using the information gathered to improve test process maturity

60
Q

Checking whether all defect reports are closed, entering

A

change requests or product backlog items for any defects that remain unresolved at the end of test execution

61
Q

test environment includes:

A

the test data,
the test infrastructure
other testware

62
Q

teams who could benefit from the handed-over testware

A

the maintenance teams,
other project teams
other stakeholders

63
Q

What is the importance of analyzing lessons learned from the completed test activities?

A

to determine changes needed for future iterations, releases, and projects