Glossary Flashcards

1
Q

defect, bug, fault

A

An imperfection or deficiency in a work product where it does not meet its requirements or specifications.

2
Q

error, mistake

A

A human action that produces an incorrect result.

3
Q

failure

A

An event in which a component or system does not perform a required function within specified limits.

4
Q

quality

A

The degree to which a component, system or process meets specified requirements and/or user/customer needs and expectations.

5
Q

risk

A

A factor that could result in future negative consequences.

6
Q

debugging

A

The process of finding, analyzing and removing the causes of failures in software.

7
Q

requirement

A

A provision that contains criteria to be fulfilled.

8
Q

review

A

A type of static testing during which a work product or process is evaluated by one or more individuals to detect issues and to provide improvements.

9
Q

test case

A

A set of preconditions, inputs, actions (where applicable), expected results and postconditions, developed based on test conditions.
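As an illustration (the withdraw() function and all names here are hypothetical, not from any standard), a test case can be written down directly as preconditions, inputs, expected result and postconditions:

```python
# A minimal sketch of a test case as data: preconditions, inputs,
# expected result and postconditions for a toy withdraw() function.

def withdraw(balance, amount):
    """Toy component under test: withdraw amount if funds allow."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

test_case = {
    "preconditions": {"balance": 100},
    "inputs": {"amount": 30},
    "expected_result": 70,
    "postconditions": "balance reduced by the withdrawn amount",
}

actual = withdraw(test_case["preconditions"]["balance"],
                  test_case["inputs"]["amount"])
assert actual == test_case["expected_result"]
```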

10
Q

testing

A

The process consisting of all lifecycle activities, both static and dynamic, concerned with planning, preparation and evaluation of software products and related work products to determine that they satisfy specified requirements, to demonstrate that they are fit for purpose and to detect defects.

11
Q

test objective

A

A reason or purpose for designing and executing a test.

12
Q

exhaustive testing, complete testing

A

A test approach in which the test suite comprises all combinations of input values and preconditions.
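A sketch of why exhaustive testing is generally infeasible (add8() is an invented toy component): even two 8-bit inputs already require 65,536 test cases, and realistic input domains grow far beyond that.

```python
# Exhaustive testing of a tiny input space: all combinations of two
# 8-bit inputs. Feasible here, combinatorially explosive in practice
# (two 32-bit integers would already need ~1.8e19 combinations).
from itertools import product

def add8(a, b):
    return (a + b) % 256  # toy component under test

combinations = list(product(range(256), repeat=2))
assert len(combinations) == 256 * 256  # 65,536 cases for two bytes

for a, b in combinations:
    assert add8(a, b) == (a + b) % 256
```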

13
Q

confirmation testing, re-testing

A

Dynamic testing conducted after fixing defects with the objective to confirm that failures caused by those defects do not occur anymore.

14
Q

exit criteria, completion criteria, test completion criteria, definition of done

A

The set of conditions for officially completing a defined task.

15
Q

incident, deviation, software test incident, test incident

A

An event occurring that requires investigation.

16
Q

regression

A

A degradation in the quality of a component or system due to a change.

17
Q

regression testing

A

Testing of a previously tested component or system, following modification, to ensure that defects have not been introduced or uncovered in unchanged areas of the software as a result of the changes made.

18
Q

test basis

A

The body of knowledge used as the basis for test analysis and design.

19
Q

test condition, test requirement, test situation

A

An aspect of the test basis that is relevant in order to achieve specific test objectives.

20
Q

coverage, test coverage

A

The degree to which specified coverage items have been determined or exercised by a test suite, expressed as a percentage.

21
Q

test data

A

Data created or selected to satisfy the execution preconditions and inputs to execute one or more test cases.

22
Q

test execution

A

The process of running a test on the component or system under test, producing actual result(s).

23
Q

test log, test record, test run log

A

A chronological record of relevant details about the execution of tests.

24
Q

test plan

A

Documentation describing the test objectives to be achieved and the means and the schedule for achieving them, organized to coordinate testing activities.

25
Q

test procedure

A

A sequence of test cases in execution order, together with any associated actions that may be required to set up the initial preconditions and any wrap-up activities after execution.

26
Q

test policy, organizational test policy

A

A high-level document describing the principles, approach and major objectives of the organization regarding testing.

27
Q

test suite, test case suite, test set

A

A set of test cases or test procedures to be executed in a specific test cycle.

28
Q

test summary report, test report

A

A test report that provides an evaluation of the corresponding test items against exit criteria.

29
Q

testware

A

Work products produced during the test process for use in planning, designing, executing, evaluating and reporting on testing.

30
Q

error guessing

A

A test technique in which tests are derived on the basis of the tester’s knowledge of past failures, or general knowledge of failure modes.

31
Q

independence of testing

A

Separation of responsibilities, which encourages the accomplishment of objective testing.

32
Q

commercial off-the-shelf (COTS), off-the-shelf software

A

A software product that is developed for the general market, i.e. for a large number of customers, and that is delivered to many customers in identical format.

33
Q

iterative development model

A

A development lifecycle where a project is broken into a usually large number of iterations. An iteration is a complete development loop resulting in a release (internal or external) of an executable product, a subset of the final product under development, which grows from iteration to iteration to become the final product.

34
Q

incremental development model

A

A development lifecycle model in which the project scope is generally determined early in the project lifecycle, but time and cost estimates are routinely modified as the project team's understanding of the product increases. The product is developed through a series of repeated cycles, each delivering an increment which successively adds to the functionality of the product.

35
Q

validation

A

Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled.

36
Q

verification

A

Confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled.

37
Q

V-model

A

A sequential development lifecycle model describing a one-for-one relationship between major phases of software development from business requirements specification to delivery, and corresponding test levels from acceptance testing to component testing.

38
Q

alpha testing

A

Simulated or actual operational testing conducted in the developer’s test environment, by roles outside the development organization.

39
Q

beta testing, field testing

A

Simulated or actual operational testing conducted at an external site, by roles outside the development organization.

40
Q

component

A

A minimal part of a system that can be tested in isolation.

41
Q

component testing, module testing, unit testing

A

The testing of individual hardware or software components.

42
Q

driver, test driver

A

A software component or test tool that replaces another component and takes care of the control and/or the calling of a component or system.

43
Q

functional requirement

A

A requirement that specifies a function that a component or system must be able to perform.

44
Q

non-functional requirement

A

A requirement that describes how the component or system will do what it is intended to do.

45
Q

robustness, error-tolerance, fault-tolerance

A

The degree to which a component or system can function correctly in the presence of invalid inputs or stressful environmental conditions.

46
Q

robustness testing

A

Testing to determine the robustness of the software product.

47
Q

stub

A

A skeletal or special-purpose implementation of a software component, used to develop or test a component that calls or is otherwise dependent on it. It replaces a called component.
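A minimal sketch of a stub (RateServiceStub and convert() are invented for illustration): the stub replaces a called dependency with canned answers so the calling component can be tested in isolation.

```python
# A stub replaces a called component (here a hypothetical exchange-rate
# service) with hard-coded responses.

class RateServiceStub:
    """Special-purpose replacement for the real rate service."""
    def get_rate(self, currency):
        return {"EUR": 0.5, "GBP": 0.25}[currency]  # canned answers

def convert(amount, currency, rate_service):
    # Component under test: depends on a rate service it calls.
    return amount * rate_service.get_rate(currency)

# The caller is exercised without the real service being available.
assert convert(100, "EUR", RateServiceStub()) == 50.0
```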

48
Q

system testing

A

Testing an integrated system to verify that it meets specified requirements.

49
Q

test environment, test bed, test rig

A

An environment containing hardware, instrumentation, simulators, software tools, and other support elements needed to conduct a test.

50
Q

test level, test stage

A

A specific instantiation of a test process.

51
Q

test-driven development (TDD)

A

A way of developing software where the test cases are developed, and often automated, before the software is developed to run those test cases.
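A small TDD-style sketch (fizzbuzz() is a hypothetical example, not a prescribed exercise): the test is written first, and the implementation is written only to make that test pass.

```python
# TDD order: the test exists before the implementation it drives.

def test_fizzbuzz():
    assert fizzbuzz(3) == "Fizz"
    assert fizzbuzz(5) == "Buzz"
    assert fizzbuzz(15) == "FizzBuzz"
    assert fizzbuzz(7) == "7"

# Implementation written afterwards, driven by the test above.
def fizzbuzz(n):
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

test_fizzbuzz()  # passes once the implementation satisfies the test
```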

52
Q

user acceptance testing, acceptance testing

A

Acceptance testing conducted in a real or simulated operational environment by intended users, focusing on their needs, requirements and business processes.

53
Q

acceptance criteria

A

The criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity.

54
Q

acceptance testing

A

Formal testing with respect to user needs, requirements, and business processes, conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customer, or other authorized entity to determine whether or not to accept the system.

55
Q

black-box testing, specification-based testing

A

Testing, either functional or non-functional, without reference to the internal structure of the component or system.

56
Q

black-box test technique, black-box technique, specification-based technique, specification-based test technique

A

A procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.

57
Q

code coverage

A

An analysis method that determines which parts of the software have been executed (covered) by the test suite and which parts have not been executed, e.g., statement coverage, decision coverage or condition coverage.

58
Q

functional testing

A

Testing conducted to evaluate the compliance of a component or system with functional requirements.

59
Q

non-functional testing

A

Testing conducted to evaluate the compliance of a component or system with non-functional requirements.

60
Q

interoperability

A

The degree to which two or more components or systems can exchange information and use the information that has been exchanged.

61
Q

interoperability testing, compatibility testing

A

Testing to determine the interoperability of a software product.

62
Q

load testing

A

A type of performance testing conducted to evaluate the behavior of a component or system under varying loads, usually between anticipated conditions of low, typical, and peak usage.

63
Q

maintainability

A

The degree to which a component or system can be modified by the intended maintainers.

64
Q

maintainability testing

A

Testing to determine the maintainability of a software product.

65
Q

performance efficiency, time behavior, performance

A

The degree to which a component or system uses time, resources and capacity when accomplishing its designated functions.

66
Q

performance testing

A

Testing to determine the performance of a software product.

67
Q

portability

A

The ease with which the software product can be transferred from one hardware or software environment to another.

68
Q

portability testing, configuration testing

A

Testing to determine the portability of a software product.

69
Q

reliability

A

The degree to which a component or system performs specified functions under specified conditions for a specified period of time.

70
Q

reliability testing

A

Testing to determine the reliability of a software product.

71
Q

security

A

The degree to which a component or system protects information and data so that persons or other components or systems have the degree of access appropriate to their types and levels of authorization.

72
Q

security testing

A

Testing to determine the security of the software product.

73
Q

stress testing

A

A type of performance testing conducted to evaluate a system or component at or beyond the limits of its anticipated or specified workloads, or with reduced availability of resources such as access to memory or servers.

74
Q

white-box testing, clear-box testing, code-based testing, glass-box testing, logic-coverage testing, logic-driven testing, structural testing, structure-based testing

A

Testing based on an analysis of the internal structure of the component or system.

75
Q

usability

A

The degree to which a component or system can be used by specified users to achieve specified goals in a specified context of use.

76
Q

usability testing

A

Testing to evaluate the degree to which the system can be used by specified users with effectiveness, efficiency and satisfaction in a specified context of use.

77
Q

impact analysis

A

The identification of all work products affected by a change, including an estimate of the resources needed to accomplish the change.

78
Q

maintenance

A

The process of modifying a component or system after delivery to correct defects, improve quality attributes, or adapt to a changed environment.

79
Q

maintenance testing

A

Testing the changes to an operational system or the impact of a changed environment to an operational system.

80
Q

dynamic testing

A

Testing that involves the execution of the software of a component or system.

81
Q

static testing

A

Testing a work product without code being executed.

82
Q

entry criteria

A

The set of conditions for officially starting a defined task.

83
Q

formal review

A

A form of review that follows a defined process with a formally documented output.

84
Q

informal review

A

A type of review without a formal (documented) procedure.

85
Q

inspection

A

A type of formal review to identify issues in a work product, which provides measurement to improve the review process and the software development process.

86
Q

metric

A

A measurement scale and the method used for measurement.

87
Q

moderator, inspection leader

A

A neutral person who conducts a usability test session.

88
Q

peer review

A

A form of review of work products performed by others qualified to do the same work.

89
Q

reviewer, checker, inspector

A

A participant in a review, who identifies issues in the work product.

90
Q

scribe, recorder

A

A person who records information during the review meetings.

91
Q

technical review

A

A type of formal review performed by a team of technically qualified personnel that examines the suitability of a work product for its intended use and identifies discrepancies from specifications and standards.

92
Q

walkthrough, structured walkthrough

A

A type of review in which an author leads members of the review through a work product and the members ask questions and make comments about possible issues.

93
Q

compiler

A

A computer program that translates programs expressed in a high-order language into their machine language equivalents.

94
Q

complexity

A

The degree to which a component or system has a design and/or internal structure that is difficult to understand, maintain and verify.

95
Q

control flow

A

The sequence in which operations are performed during the execution of a test item.

96
Q

data flow

A

An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of creation, usage, or destruction.

97
Q

static analysis

A

The process of evaluating a component or system without executing it, based on its form, structure, content, or documentation.

98
Q

test case specification

A

Documentation of a set of one or more test cases.

99
Q

test design

A

The activity of deriving and specifying test cases from test conditions.

100
Q

test execution schedule

A

A schedule for the execution of test suites within a test cycle.

101
Q

test procedure specification, test procedure, test scenario

A

Documentation specifying one or more test procedures.

102
Q

test script

A

A sequence of instructions for the execution of a test.

103
Q

traceability

A

The degree to which a relationship can be established between two or more work products.

104
Q

experience-based testing

A

Testing based on the tester’s experience, knowledge and intuition.

105
Q

experience-based test technique, experience-based technique

A

A procedure to derive and/or select test cases based on the tester’s experience, knowledge and intuition.

107
Q

test design specification

A

Documentation specifying the features to be tested and their corresponding test conditions.

108
Q

test technique, test case design technique, test specification technique, test design technique

A

A procedure used to derive and/or select test cases.

109
Q

boundary value

A

A minimum or maximum value of an ordered equivalence partition.

110
Q

boundary value analysis

A

A black-box test technique in which test cases are designed based on boundary values.
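A sketch of the technique (is_valid_age() is a hypothetical test item accepting ages 18..65): test values sit on and just outside each boundary of the valid partition.

```python
# Boundary value analysis: exercise the values at and adjacent to the
# edges of the ordered partition 18..65.

def is_valid_age(age):
    return 18 <= age <= 65  # toy test item

# value -> expected verdict, one case per boundary neighbour
boundary_cases = {17: False, 18: True, 65: True, 66: False}

for value, expected in boundary_cases.items():
    assert is_valid_age(value) == expected
```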

111
Q

decision table, cause-effect decision table

A

A table used to show sets of conditions and the actions resulting from them.

112
Q

decision table testing

A

A black-box test technique in which test cases are designed to execute the combinations of inputs and/or stimuli (causes) shown in a decision table.
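A sketch of the technique using a hypothetical loan rule: each row below is one column of a decision table (condition combination plus expected action), and the loop executes every one.

```python
# Decision table testing: execute every combination of conditions
# and check the resulting action.

def approve_loan(employed, good_credit):
    return employed and good_credit  # toy business rule under test

# (employed, good_credit) -> expected action (approve?)
decision_table = [
    (True,  True,  True),
    (True,  False, False),
    (False, True,  False),
    (False, False, False),
]

for employed, good_credit, expected in decision_table:
    assert approve_loan(employed, good_credit) == expected
```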

113
Q

equivalence partitioning, partition testing

A

A black-box test technique in which test cases are designed to exercise equivalence partitions by using one representative member of each partition.
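A sketch of the technique for a hypothetical age check: one representative value is drawn from each partition instead of testing every possible input.

```python
# Equivalence partitioning: one representative per partition
# (below range, in range, above range).

def is_valid_age(age):
    return 18 <= age <= 65  # toy test item

partitions = {
    "below": (10, False),   # representative of ages < 18
    "valid": (30, True),    # representative of 18..65
    "above": (70, False),   # representative of ages > 65
}

for name, (representative, expected) in partitions.items():
    assert is_valid_age(representative) == expected
```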

114
Q

equivalence partition, equivalence class

A

A portion of the value domain of a data element related to the test object for which all values are expected to be treated the same based on the specification.

115
Q

state transition

A

A transition between two states of a component or system.

116
Q

state transition testing, finite state testing

A

A black-box test technique using a state transition diagram or state table to derive test cases to evaluate whether the test item successfully executes valid transitions and blocks invalid transitions.
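A sketch with an invented two-state machine (a door): a transition table drives the tests, which check that valid transitions succeed and invalid ones are blocked.

```python
# State transition testing against a tiny state table.

TRANSITIONS = {
    ("closed", "open_cmd"):  "open",
    ("open",   "close_cmd"): "closed",
}

def next_state(state, event):
    # Invalid transitions are blocked: the state stays unchanged.
    return TRANSITIONS.get((state, event), state)

assert next_state("closed", "open_cmd") == "open"    # valid
assert next_state("open", "close_cmd") == "closed"   # valid
assert next_state("open", "open_cmd") == "open"      # invalid: blocked
```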

117
Q

use case

A

A sequence of transactions in a dialogue between an actor and a component or system with a tangible result, where an actor can be a user or anything that can exchange information with the system.

118
Q

use case testing, scenario testing, user scenario testing

A

A black-box test technique in which test cases are designed to execute scenarios of use cases.

119
Q

decision

A

A type of statement in which a choice between two or more possible outcomes controls which set of actions will result.

120
Q

decision coverage

A

The coverage of decision outcomes.

121
Q

decision outcome

A

The result of a decision that determines the next statement to be executed.

122
Q

statement coverage

A

The percentage of executable statements that have been exercised by a test suite.
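An illustrative calculation (instrumentation is done by hand here, where a real coverage tool would do it automatically): count which statements of a toy function a test suite exercises.

```python
# Statement coverage as a percentage of exercised statements.

covered = set()

def classify(n):
    covered.add("s1")          # statement 1: the decision itself
    if n < 0:
        covered.add("s2")      # statement 2: negative branch
        return "negative"
    covered.add("s3")          # statement 3: non-negative branch
    return "non-negative"

classify(5)                    # suite with a single test case
statement_coverage = len(covered) / 3 * 100
assert round(statement_coverage) == 67   # 2 of 3 statements exercised

classify(-1)                   # a second case reaches 100%
assert len(covered) == 3
```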

123
Q

statement, source statement

A

An entity in a programming language, which is typically the smallest indivisible unit of execution.

124
Q

exploratory testing

A

An approach to testing whereby the testers dynamically design and execute tests based on their knowledge, exploration of the test item and the results of previous tests.

125
Q

tester

A

A skilled professional who is involved in the testing of a component or system.

126
Q

test leader, lead tester

A

On large projects, the person who reports to the test manager and is responsible for project management of a particular test level or a particular set of testing activities.

127
Q

test manager

A

The person responsible for project management of testing activities and resources, and evaluation of a test object. The individual who directs, controls, administers, plans and regulates the evaluation of a test object.

128
Q

test approach

A

The implementation of the test strategy for a specific project.

129
Q

test strategy, organizational test strategy

A

Documentation that expresses the generic requirements for testing one or more projects run within an organization, providing detail on how testing is to be performed, and is aligned with the test policy.

130
Q

defect density, fault density

A

The number of defects per unit size of a work product.

131
Q

failure rate

A

The ratio of the number of failures of a given category to a given unit of measure.

132
Q

test control

A

A test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned.

133
Q

test monitoring

A

A test management activity that involves checking the status of testing activities, identifying any variances from the planned or expected status, and reporting status to stakeholders.

134
Q

configuration

A

The composition of a component or system as defined by the number, nature, and interconnections of its constituent parts.

135
Q

configuration management

A

A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements.

136
Q

product risk

A

A risk impacting the quality of a product.

137
Q

project risk

A

A risk that impacts project success.

138
Q

risk-based testing

A

Testing in which the management, selection, prioritization, and use of testing activities and resources are based on corresponding risk types and risk levels.

139
Q

incident management

A

The process of recognizing and recording incidents, classifying them, investigating them, taking action to resolve them, and disposing of them when resolved.

140
Q

incident report, deviation report, software test incident report, test incident report

A

Documentation of the occurrence, nature, and status of an incident.

141
Q

configuration management tool

A

A tool that provides support for the identification and control of configuration items, their status over changes and versions, and the release of baselines consisting of configuration items.

142
Q

coverage tool, coverage measurement tool

A

A tool that provides objective measures of which structural elements (e.g., statements, branches) have been exercised by a test suite.

143
Q

debugging tool, debugger

A

A tool used by programmers to reproduce failures, investigate the state of programs and find the corresponding defect. Debuggers enable programmers to execute programs step by step, to halt a program at any program statement and to set and examine program variables.

144
Q

dynamic analysis tool

A

A tool that provides run-time information on the state of the software code. These tools are most commonly used to identify unassigned pointers, check pointer arithmetic and to monitor the allocation, use and de-allocation of memory and to flag memory leaks.

145
Q

incident management tool

A

A tool that facilitates the recording and status tracking of incidents.

146
Q

modeling tool

A

A tool that supports the creation, amendment and verification of models of the software or system.

147
Q

monitoring tool

A

A software tool or hardware device that runs concurrently with the component or system under test and supervises, records and/or analyzes the behavior of the component or system.

148
Q

performance testing tool

A

A test tool that generates load for a designated test item and that measures and records its performance during test execution.

149
Q

requirements management tool

A

A tool that supports the recording of requirements, requirements attributes (e.g., priority, knowledge responsible) and annotation, and facilitates traceability through layers of requirements and requirements change management. Some requirements management tools also provide facilities for static analysis, such as consistency checking and violations to pre-defined requirements rules.

150
Q

review tool

A

A tool that provides support to the review process. Typical features include review planning and tracking support, communication support, collaborative reviews and a repository for collecting and reporting of metrics.

151
Q

security tool

A

A tool that supports operational security.

152
Q

static analyzer, analyzer, static analysis tool

A

A tool that carries out static analysis.

153
Q

stress testing tool

A

A tool that supports stress testing.

154
Q

probe effect

A

The effect of the measurement instrument on the component or system being measured, e.g., by a performance testing tool or monitor. For example, performance may be slightly worse when performance testing tools are being used.

155
Q

test comparison

A

The process of identifying differences between the actual results produced by the component or system under test and the expected results for a test. Test comparison can be performed during test execution (dynamic comparison) or after test execution.

156
Q

test comparator, comparator

A

A test tool to perform automated test comparison of actual results with expected results.

157
Q

test data preparation tool, test generator

A

A type of test tool that enables data to be selected from existing databases or created, generated, manipulated and edited for use in testing.

158
Q

test design tool

A

A tool that supports the test design activity by generating test inputs from a specification that may be held in a CASE tool repository, e.g., requirements management tool, from specified test conditions held in the tool itself, or from code.

159
Q

test harness

A

A test environment comprised of stubs and drivers needed to execute a test.
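A minimal sketch of a harness (all names invented): a driver sets up inputs and calls the component under test, while a stub stands in for its dependency.

```python
# Test harness = driver + stub around the component under test.

def price_lookup_stub(item):
    """Stub: replaces the real price service with canned prices."""
    return {"apple": 2, "pear": 3}[item]

def total(items, lookup):
    """Component under test: sums prices via an injected lookup."""
    return sum(lookup(i) for i in items)

def driver():
    """Driver: prepares inputs and invokes the component."""
    return total(["apple", "pear", "apple"], price_lookup_stub)

assert driver() == 7
```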

160
Q

test execution tool

A

A test tool that executes tests against a designated test item and evaluates the outcomes against expected results and postconditions.

161
Q

test management

A

The planning, scheduling, estimating, monitoring, reporting, control and completion of test activities.

162
Q

test management tool

A

A tool that provides support to the test management and control part of a test process. It often has several capabilities, such as testware management, scheduling of tests, the logging of results, progress tracking, incident management and test reporting.

163
Q

unit test framework

A

A tool that provides an environment for unit or component testing in which a component can be tested in isolation or with suitable stubs and drivers. It also provides other support for the developer, such as debugging capabilities.

164
Q

data-driven testing

A

A scripting technique that stores test input and expected results in a table or spreadsheet, so that a single control script can execute all of the tests in the table. Data-driven testing is often used to support the application of test execution tools such as capture/playback tools.
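A minimal data-driven sketch (to_celsius() and the CSV columns are hypothetical): a single control loop executes every row of a test-data table against the test item.

```python
# Data-driven testing: one control script, many table rows.
import csv
import io

def to_celsius(f):
    return (f - 32) * 5 / 9  # toy test item

# Test data as it might come from a spreadsheet/CSV export.
table = io.StringIO("fahrenheit,expected\n32,0\n212,100\n-40,-40\n")

for row in csv.DictReader(table):
    assert to_celsius(float(row["fahrenheit"])) == float(row["expected"])
```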

165
Q

keyword-driven testing, action word-driven testing

A

A scripting technique that uses data files to contain not only test data and expected results, but also keywords related to the application being tested. The keywords are interpreted by special supporting scripts that are called by the control script for the test.
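A sketch of the technique (the keywords and supporting functions are invented): the data holds keywords plus arguments, supporting scripts interpret each keyword, and a control script walks the table.

```python
# Keyword-driven testing: keyword table interpreted by supporting
# scripts, driven by one control loop.

state = {"logged_in": False, "cart": []}

def do_login(user):            # supporting script for keyword "login"
    state["logged_in"] = True

def do_add_item(item):         # supporting script for "add_item"
    state["cart"].append(item)

KEYWORDS = {"login": do_login, "add_item": do_add_item}

# Keyword table: what a tester edits, no scripting needed.
steps = [("login", "alice"), ("add_item", "book"), ("add_item", "pen")]

for keyword, arg in steps:     # control script
    KEYWORDS[keyword](arg)

assert state["logged_in"] and state["cart"] == ["book", "pen"]
```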