Intro Flashcards

1
Q

BS 7925-1

A

Testing involves executing software with the intent to identify errors and ensure the application meets specified requirements.

2
Q

IEEE Definition

A

Testing evaluates systems or components, manually or automatically, to verify that they satisfy requirements or identify discrepancies between expected and actual outcomes.

3
Q

ISEB Syllabus

A

Testing measures software quality by identifying defects, encompassing both functional (what the software does) and non-functional (how it performs) aspects.

4
Q

Entities in the Testing Problem

A
  • P (Implementation): The program or system as actually built.
  • S (Specification): Requirements or goals defining the system’s correct behavior.
  • O (Observation): Outputs and effects of the system under test.
  • T (Testing): Process of verifying that P conforms to S by examining O.
5
Q

Testing Axioms

A
  1. Bugs can never be fully eliminated
  2. Exhaustive testing is impossible
  3. Testing is risk-based and context-dependent
  4. Testing starts early in the SDLC
  5. Finding bugs often uncovers more
6
Q

Testing is risk-based and context-dependent

A

Testing strategies depend on risk: a safety-critical system such as medical software demands far more rigorous testing than a non-critical application.

7
Q

Testing starts early in the SDLC

A

Begin testing early in the software development life cycle: bugs caught early cost far less time and effort to fix than bugs found late.

8
Q

Testing Principles

A
  • Traceability: Every test should correspond to a requirement.
  • Risk Prioritization: Focus testing on high-risk, high-impact areas first.
  • Pareto Principle: Roughly 80% of issues stem from 20% of the code.
  • Diversity of Techniques: No single method can uncover all types of bugs, so use a mix of strategies (e.g., white-box and black-box testing).
  • Bug Fix Timing: Fixing defects early reduces cost and complexity.
  • Pesticide Paradox: Regularly change and enhance tests to keep them effective at finding new bugs.
9
Q

Additional Principles

A
  1. Sensitivity: A test is more effective if it fails consistently whenever the issue it targets exists.
  2. Intentions: Clearly define what each test aims to achieve, to prevent ambiguity.
  3. Partition: Break large problems into smaller, manageable sections.
  4. Feedback: Continuously improve the testing process based on past and present outcomes.
10
Q

Error

A

A human mistake that leads to an incorrect implementation.

11
Q

Fault/Defect/Bug

A

A specific problem in the program (e.g., incorrect logic or data).

12
Q

Failure

A

The visible manifestation of a fault during program execution.

13
Q

Levels of Testing

A
  1. Unit testing - Verifies individual components or modules in isolation
  2. Integration testing - Ensures different components work together correctly
  3. System testing - Evaluates the entire application as a whole against requirements
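A minimal sketch of the lowest level, unit testing, in Python. The function `classify_triangle` and its tests are hypothetical illustrations, not part of the deck:

```python
import unittest

def classify_triangle(a, b, c):
    """Hypothetical unit under test: classify a triangle by its side lengths."""
    # Reject non-positive sides and degenerate triangles.
    if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class TestClassifyTriangle(unittest.TestCase):
    # Unit tests verify this one module in isolation; no other components involved.
    def test_equilateral(self):
        self.assertEqual(classify_triangle(3, 3, 3), "equilateral")

    def test_degenerate_is_invalid(self):
        self.assertEqual(classify_triangle(1, 2, 3), "invalid")

if __name__ == "__main__":
    unittest.main(exit=False)  # run the unit tests without exiting the process
```

Integration and system testing would then combine such verified units and exercise the assembled application against its requirements.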
14
Q

The Testing Process

A
  1. Test Planning
  2. Test Design
  3. Environment Setup
  4. Execution
  5. Problem Reporting
  6. Exit Criteria
15
Q

Test Planning

A

Establish entry/exit criteria; choose test strategies, tools, schedules, and resources; and set up problem-tracking and reporting mechanisms.

16
Q

Test Design

A

Review system requirements, architecture, and testability. Define test conditions, required test data, and specific test cases with preconditions and steps.

17
Q

Environment Setup

A

Configure hardware, software, network and database environments needed for testing

18
Q

Execution

A

Run test cases, record outcomes (Pass/Fail/Not executed) and perform regression testing to make sure fixes don’t break existing functionality.

19
Q

Problem Reporting

A

Document each problem with a description, steps to reproduce, severity and priority, and any potential new test ideas.

20
Q

Exit Criteria

A

Define the conditions for ending testing: coverage threshold met, no critical faults remaining, and testing completed within time and budget constraints.

21
Q

Test Case

A

A specified input, execution steps, and the expected outcome.
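A minimal sketch of how these three parts might be captured as a structured record; the field names and the sample login scenario are illustrative, not a standard:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    # The three elements of a test case per the definition above.
    input_data: dict          # input supplied to the system
    steps: list               # execution steps to perform
    expected_outcome: str     # outcome the tester compares against

# Hypothetical example: a failed-login scenario.
login_case = TestCase(
    input_data={"username": "alice", "password": "wrong-pass"},
    steps=["open login page", "enter credentials", "press submit"],
    expected_outcome="error message: invalid credentials",
)
# During execution, the tester runs the steps with the input and records
# Pass/Fail by comparing the actual outcome to expected_outcome.
```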

22
Q

Test Specification

A

Requirements satisfied by one or more test cases

23
Q

Test Suite

A

Collection of test cases

24
Q

Adequacy Criterion

A

Measure of how effectively a test suite meets testing requirements.

25
Q

Types of Coverage Analysis

A
  • Statement: every line of code executes
  • Branch: every branch of every decision point is tested
  • Condition: logical conditions are tested for all true/false combinations
  • Path: all potential execution paths are tested
  • Data: assesses how input data affects execution
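A sketch of the gap between statement and branch coverage, using a hypothetical function: one test can execute every statement yet still leave a branch untested:

```python
def apply_discount(price, is_member):
    # Hypothetical function used only to illustrate coverage criteria.
    total = price
    if is_member:
        total = price * 0.9  # 10% member discount
    return total

# One test executes every statement (100% statement coverage)...
assert apply_discount(100, True) == 90.0
# ...but the False side of the `if` was never taken, so branch coverage
# is incomplete. A second test closes that gap:
assert apply_discount(100, False) == 100
```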
26
Q

Balance Between Effort and Effectiveness

A

Testing effort yields diminishing returns: if 40 hours of testing finds 10 bugs and 80 hours finds 12, the second 40 hours found only 2 more bugs, so the 40-hour effort is the better use of resources.