Chapter 1 Flashcards

1
Q

Coverage

A

The degree to which specified coverage items have been exercised by a test suite, expressed as a percentage.

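The percentage in this definition is just exercised items over specified items; a minimal Python sketch (the branch names are invented for illustration):

```python
# Hypothetical illustration: coverage as a percentage of exercised items.
covered_items = {"branch_1", "branch_2", "branch_3"}          # items the test suite exercised
all_items = {"branch_1", "branch_2", "branch_3", "branch_4"}  # all specified coverage items

coverage = 100 * len(covered_items & all_items) / len(all_items)
print(f"Coverage: {coverage:.0f}%")  # Coverage: 75%
```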
2
Q

Debugging

A

The process of finding, analyzing and removing the causes of failures in software.

Confirmation testing checks whether the fixes resolved the defects.

3
Q

Defect

A

An imperfection or deficiency in a work product where it does not meet its requirements or specifications. This may cause a failure but not in all circumstances. It may require specific inputs or preconditions.

4
Q

Error

A

A human action that produces an incorrect result

5
Q

Failure

A

An event in which a component or system does not perform a required function within specified limits

6
Q

Quality

A

The degree to which a component, system or process meets specified requirements and/or user/customer needs and expectations.

7
Q

Quality Assurance

A

Part of quality management focused on providing confidence that quality requirements will be fulfilled

8
Q

Root Cause

A

A source of a defect such that if it is removed, the occurrence of the defect type is decreased or removed.

9
Q

Test analysis

A

The activity that identifies test conditions by analyzing the test basis.

The process of transforming defined test objectives into tangible test conditions and test cases.

10
Q

Test basis

A

The body of knowledge used as the basis for test analysis and design.

11
Q

Test case

A

A set of preconditions, inputs, actions (where applicable), expected results and postconditions, developed based on test conditions.

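The parts of this definition (preconditions, inputs/actions, expected results, postconditions) map naturally onto a unit-test skeleton. A hedged sketch using Python's unittest; the discount function and its behavior are invented for illustration:

```python
import unittest

def apply_discount(total, code):
    """Toy function under test (assumed behavior: SAVE10 gives 10% off)."""
    return total * 0.9 if code == "SAVE10" else total

class DiscountTestCase(unittest.TestCase):
    def setUp(self):
        # Precondition: a known starting state.
        self.total = 100.0

    def test_valid_code_gives_ten_percent_off(self):
        # Input/action: apply the discount code.
        actual = apply_discount(self.total, "SAVE10")
        # Expected result: compare actual vs. expected.
        self.assertAlmostEqual(actual, 90.0)

    def tearDown(self):
        # Postcondition: restore/clean up state (nothing needed here).
        pass

# Run programmatically (normally: python -m unittest <file>).
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTestCase)
)
```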
12
Q

Test completion

A

The activity that makes test assets available for later use, leaves test environments in a satisfactory condition and communicates the results of testing to relevant stakeholders.

13
Q

Test condition

A

An aspect of the test basis that is relevant in order to achieve specific test objectives

14
Q

Test control

A

A test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned.

Take necessary actions to get testing back on track

15
Q

Test data

A

Data created or selected to satisfy the execution preconditions and inputs to execute one or more test cases.

16
Q

Test design

A

The activity of deriving and specifying test cases from test conditions.

17
Q

Test execution

A

The process of running a test on the component or system under test, producing actual result(s).

18
Q

Test execution schedule

A

A schedule for the execution of test suites within a test cycle

19
Q

Test implementation

A

The activity that prepares the testware needed for test execution based on test analysis and design

20
Q

Test monitoring

A

A test management activity that involves checking the status of testing activities, identifying any variances from the planned or expected status, and reporting status to stakeholders.

Running and reviewing reports

21
Q

Test object

A

The component or system to be tested

22
Q

Test objective

A

A reason or purpose for designing and executing a test, e.g.:
Prevent defects
Validation of user requirements
Verification of fulfillment of requirements
Build confidence in the level of quality
Compliance

23
Q

Test oracle

A

A source to determine expected results to compare with the actual result of the system under test.

24
Q

Test planning

A

The activity of establishing or updating a test plan.

25
Test procedure
A sequence of test cases in execution order, and any associated actions that may be required to set up the initial preconditions and any wrap up activities post execution
26
Test process
The set of interrelated activities comprising test planning, test monitoring and control, test analysis, test design, test implementation, test execution, and test completion. Activities are not necessarily sequential (they may be iterative or overlap). Context matters in determining the test process: testing for a research POC vs. a medical device.
27
Test suite
A set of test cases or test procedures to be executed in a specific test cycle
28
Testing
The process consisting of all lifecycle activities, both static and dynamic, concerned with planning, preparation, and evaluation of software products and related work products to determine that they satisfy specified requirements, to demonstrate they are fit for purpose and to detect defects. (test planning, test analysis, test design, test implementation, test reporting, test evaluation, test execution)
29
Testware
Work products produced during the test process for use in planning, designing, executing, evaluating, and reporting on testing.
30
Traceability
The degree to which a relationship can be established between two or more work products
31
Validation
Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled. Comparing a work product to actual user needs (acceptance testing). Dynamic testing.
32
Verification
Confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled. Comparing the work product of one level to the previous one (user requirements, software requirements, architecture, design). Also checks each work product for consistency and completeness. Uses static analysis and review techniques: walkthroughs, inspections, traceability.
33
Difference between Validation and Verification
System will meet user and other stakeholder needs (Dynamic) vs System performs as per specified requirements (Static)
34
Testing
test planning, test analysis, test design, test implementation, test reporting, test evaluation, test execution
35
Static Testing
Testing a work product without code being executed
36
Dynamic Testing
Testing that involves the execution of the software of a component or system
37
Role of Testing
Reduce the risk of failures or the occurrence of problems during operation. Contribute to the overall quality of software. Increase the likelihood of meeting stakeholder needs.
38
Quality Assurance
Adherence to proper processes in order to provide confidence that the appropriate level of quality will be achieved.
39
Quality Management
Activities that direct and control an organization with regard to quality. Includes both Quality Assurance and Quality Control.
40
Quality Control
Activities that support the achievement of appropriate levels of quality, including test activities.
41
Error vs Defect vs Failure
Person vs Code/Work Product vs Execution
42
Root cause
Source of a defect such that if it is removed, the occurrence of the defect type is decreased or removed.
43
Testing Principle 1
Testing shows the presence of defects, not their absence
44
Testing Principle 2
Exhaustive testing is impossible (except in trivial cases)
45
Testing Principle 3
Early testing saves time and money - begin with the requirements
46
Testing Principle 4
Defects can cluster - a few modules will have the most defects, and testing should be focused appropriately
47
Testing Principle 5
Pesticide Paradox: running the same tests over and over will eventually find no new defects, because the defects they could detect have already been removed; tests need to be reviewed and updated regularly.
48
Testing Principle 6
Testing is context dependent; what you test and how thoroughly you test it varies depending on the risk and life cycle. (safety critical vs consumer app)
49
Testing Principle 7
Absence-of-errors fallacy: fixing all the located defects does not guarantee the software will work or meet user needs. Merely finding and fixing a large number of defects does not by itself ensure the success of a system.
50
Exhaustive Testing
A test approach in which the test suite comprises all combinations of input values and preconditions (13 fields with 3 values each yields 3^13 = 1,594,323 tests).
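The parenthetical arithmetic can be checked directly; a small Python sketch (the 13-fields-by-3-values example comes from the card itself):

```python
# Why exhaustive testing explodes: combinations grow exponentially.
from itertools import product

fields = 13
values_per_field = 3
total = values_per_field ** fields
print(total)  # 1594323

# Enumerating all combinations (3 fields here, to keep it small):
small = list(product(range(3), repeat=3))
print(len(small))  # 27
```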
51
Test Analysis Steps
Step 1: Identify the test basis
Step 2: Evaluate the test basis for defects
Step 3: Identify features to be tested
Step 4: Design and prioritize test conditions (functional, non-functional, structural, experience-based, other; consider business and technical factors, and risk)
Step 5: Establish bidirectional traceability between test basis and test conditions
52
Test Analysis Defects
Ambiguities
Omissions
Inconsistencies
Inaccuracies
Contradictions
Superfluous statements
53
Test Design Tasks
Prioritize high-level test cases
Identify necessary test data
Specify test environment set-up, including tools and infrastructure
Ensure traceability between test basis, conditions, and cases
Find defects in the test basis
54
Test Implementation Tasks
Implement and prioritize test cases
Develop and prioritize test procedures
Create test data
Organize test cases into test suites and an execution schedule
Build and verify the test environment
Update and verify bidirectional traceability
55
Test Execution Tasks
Record versions of test items, test objects, tools, and testware used
Execute manual/automated tests
Compare actual vs. expected results
Analyze causes of anomalies
Log the outcome of executions (pass/fail/blocked)
Report defects
Re-execute failed tests
Verify traceability
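The compare-and-log steps in test execution can be sketched as a tiny helper (all names and the dict shape are invented for illustration):

```python
# Hedged sketch: execute one test, compare actual vs. expected, log the outcome.
def execute(test, run_fn):
    """Run one test case and return a logged outcome dict."""
    try:
        actual = run_fn(*test["inputs"])
    except Exception as exc:
        # An environment or setup problem means the test is blocked, not failed.
        return {"id": test["id"], "status": "blocked", "reason": str(exc)}
    status = "pass" if actual == test["expected"] else "fail"
    return {"id": test["id"], "status": status, "actual": actual}

log = execute({"id": "TC-1", "inputs": (2, 3), "expected": 5}, lambda a, b: a + b)
print(log["status"])  # pass
```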
56
Test Completion Tasks
Lessons learned
Turn over planned deliverables
Closure of defect reports
Create test summary report
Finalize testware for later use
Improve the test process
57
Test Monitoring and Control Work Products
Test progress reports
Test summary reports
58
Test Analysis Work Products
Test conditions traced to the test basis
Prioritized test conditions
Defects
59
Test Design Work Products
High-level test cases
Test data
Test environment
60
Test Implementation Work Products
Test data
Test cases
Test procedures
Test suites
Test execution schedule
61
Test Execution Work Products
Test case/procedure status
Test logs
Defect reports
62
Test Completion Work Products
Test summary report
Closed defect reports
Action items for improvement
Change requests / backlog items
Finalized testware