Chapter 2. PART 2 Test Levels Flashcards

1
Q

TEST LEVELS

A

A group of test activities that are organized and managed together. Each test level is an instance of the test process, consisting of the activities performed for software at a given level of development, from individual components to complete systems or systems of systems. Test levels are linked to other activities performed as part of the software development lifecycle.

2
Q

ATTRIBUTES OF TEST LEVELS

A

1. TEST OBJECT
2. TEST OBJECTIVE
3. TEST BASIS
4. DEFECTS AND FAILURES
5. SPECIFIC APPROACHES AND RESPONSIBILITIES

3
Q

TEST TYPE

A

Testing associated with a specific quality characteristic of a test object
Difference between test level and test type: the latter takes into account the test objective and the reason for testing, while the former refers to the phases of the development lifecycle and the stage of advancement in product development
Test levels and test types are independent of each other: each test type can be performed at any test level

4
Q

TEST LEVELS - MOST COMMON CLASSIFICATION

A
  1. COMPONENT TESTING
  2. COMPONENT INTEGRATION TESTING
  3. SYSTEM TESTING
  4. SYSTEM INTEGRATION TESTING
  5. ACCEPTANCE TESTING
5
Q

DTAP, THE MINIMAL SET OF TEST ENVIRONMENTS - WHAT’S THAT?

A

DEVELOPMENT ENVIRONMENT
TEST ENVIRONMENT
ACCEPTANCE ENVIRONMENT
PRODUCTION ENVIRONMENT

Each test level requires an appropriate test environment:

For component testing - a unit test environment, i.e. xUnit-type frameworks (e.g. JUnit) and libraries for creating mock objects (e.g. EasyMock)
For acceptance testing - an environment as similar as possible to the production environment, because a large part of field defects (those reported by users after the software is released) are defects created by the interaction of the system and its environment
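
To make the unit test environment concrete, here is a minimal sketch of a component test in JUnit 5 (one of the xUnit frameworks named above); Calculator is a hypothetical class invented for this example, not part of the syllabus.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    class CalculatorTest {
        // Hypothetical component under test, kept inline so the example is self-contained.
        static class Calculator {
            int add(int a, int b) { return a + b; }
        }

        @Test
        void addReturnsTheSumOfItsArguments() {
            Calculator calc = new Calculator();   // the component is tested in isolation
            assertEquals(5, calc.add(2, 3));      // expected result taken from the specification
        }
    }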

6
Q

COMPONENT TESTING AND ITS OBJECTIVES

A

• Other names: UNIT TESTING OR PROGRAM TESTING
• Focuses on modules that can be tested SEPARATELY
• GOALS:
- mitigating component-related risk
- checking the compliance of the functional and nonfunctional behaviour of the component with the design and the specifications
- building confidence in the quality of the component
- detecting defects in the component
- preventing defects from reaching higher test levels
• usually performed by the DEVELOPER (the author of the code) right after the code is written; sometimes as automated tests

• in the case of an INCREMENTAL or ITERATIVE SDLC -> automated component regression tests are key to ensuring that the code changes which are unavoidable in those types of SDLC did not cause any malfunction

7
Q

ISOLATION IN COMPONENT TESTING

A

• the degree of isolation MOST OFTEN depends on the type of SDLC
• this means it may be necessary to use service virtualization, test harnesses or mock objects (stubs and drivers)
• component tests should be executable in any order, and their results should be the same regardless of the execution order

8
Q

MOCK OBJECTS

A

TO ALLOW ACTUAL EXECUTION OF PIECES OF CODE THAT REFER TO OTHER OBJECTS (e.g. a database) THAT DO NOT YET EXIST

• TO TEST THE FUNCTIONALITY of a component when code it depends on is not yet finished but is crucial for testing that component
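
A minimal sketch of this idea using EasyMock (the mocking library named in the environments card) together with JUnit 5; UserRepository and UserService are hypothetical names invented for the example.

    import static org.easymock.EasyMock.*;
    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    class UserServiceTest {
        interface UserRepository { String findName(int id); }    // dependency that does not exist yet

        static class UserService {                               // component under test
            private final UserRepository repo;
            UserService(UserRepository repo) { this.repo = repo; }
            String greeting(int id) { return "Hello, " + repo.findName(id); }
        }

        @Test
        void greetsUserByName() {
            UserRepository repo = createMock(UserRepository.class);
            expect(repo.findName(42)).andReturn("Alice");        // programmed canned answer
            replay(repo);                                        // switch the mock to replay mode
            assertEquals("Hello, Alice", new UserService(repo).greeting(42));
            verify(repo);                                        // check the expected call really happened
        }
    }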

9
Q

STUB VS DRIVER

A

MOCK OBJECTS
• both perform the same role: they simulate the operation of other objects that do not yet exist

STUB - dummy object THAT IS CALLED BY OUR COMPONENT UNDER TEST

DRIVER - dummy object THAT CALLS COMPONENT UNDER TEST
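
A small sketch of the distinction, with hypothetical names (TaxRateProvider, PriceService and friends are invented for illustration):

    interface TaxRateProvider { double rateFor(String country); }

    class PriceService {                                   // component under test
        private final TaxRateProvider rates;
        PriceService(TaxRateProvider rates) { this.rates = rates; }
        double gross(double net) { return net * (1 + rates.rateFor("PL")); }
    }

    class TaxRateStub implements TaxRateProvider {         // STUB: called BY the component under test
        public double rateFor(String country) { return 0.23; }  // canned answer instead of a real lookup
    }

    class PriceServiceDriver {                             // DRIVER: CALLS the component under test
        public static void main(String[] args) {
            PriceService service = new PriceService(new TaxRateStub());
            System.out.println(service.gross(100.0));      // prints 123.0
        }
    }

Since a stub and a driver are used here together, this small setup is already the test harness described in the next card.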

10
Q

TEST HARNESS

A

When BOTH STUBS AND DRIVERS ARE USED FOR TESTING, this type of test environment is called a TEST HARNESS

11
Q

WHAT COMPONENT TESTING MAY COVER

A

• FUNCTIONALITY (e.g. correctness of calculations)
• NONFUNCTIONAL CHARACTERISTICS (performance, reliability)
• STRUCTURAL PROPERTIES (statement or decision testing)

12
Q

TYPICAL TEST BASIS FOR COMPONENT TESTING

A

• DETAILED DESIGN
• CODE
• DATA MODEL
• COMPONENT SPECIFICATION

13
Q

TYPICAL TEST OBJECTS OF COMPONENT TESTING:

A

• MODULES, UNITS, COMPONENTS
• CODE AND DATA STRUCTURES
• CLASSES AND METHODS (in object-oriented programming)
• DATABASE COMPONENTS

14
Q

TYPICAL DEFECTS AND FAILURES IN COMPONENT TESTING

A

• INCORRECT FUNCTIONALITY (e.g. inconsistent with the design specifications)
• DATA FLOW PROBLEMS
• INCORRECT CODE AND LOGIC

DEFECTS - usually fixed as soon as they are detected
Data from this level of testing is usually not collected

15
Q

COMPONENT INTEGRATION TESTING

A

Focuses on INTERACTION and INTERFACES between INTEGRATED COMPONENTS.
COMPONENT -> CLASS, FILE, FUNCTION, PROCEDURE, PACKAGE etc.
Performed after component testing
Usually AUTOMATED
In an ITERATIVE SDLC - part of the CONTINUOUS INTEGRATION process
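
A minimal sketch of a component integration test in JUnit 5: two real components are wired together and the test exercises their interface, not their internals. Formatter and ReportBuilder are hypothetical example components.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    class ReportIntegrationTest {
        static class Formatter {
            String money(double amount) {
                return String.format(java.util.Locale.ROOT, "%.2f PLN", amount);
            }
        }
        static class ReportBuilder {                      // calls Formatter through its interface
            private final Formatter formatter;
            ReportBuilder(Formatter formatter) { this.formatter = formatter; }
            String line(String item, double amount) { return item + ": " + formatter.money(amount); }
        }

        @Test
        void builderAndFormatterAgreeOnTheDataFormat() {
            ReportBuilder builder = new ReportBuilder(new Formatter()); // real component, no stub
            assertEquals("Coffee: 12.50 PLN", builder.line("Coffee", 12.5));
        }
    }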

16
Q

SYSTEM INTEGRATION TESTING

A

• Focuses on INTERACTIONS AND INTERFACES between SYSTEMS, PACKAGES AND MICROSERVICES
• ! May include interactions with interfaces provided by EXTERNAL ORGANIZATIONS (e.g. web service providers)
Because developers cannot control those external interfaces, a whole range of testing problems can arise,
e.g. getting issues fixed in the external organization’s code that block testing, or preparing the test environment
• system integration testing can take place after system testing, or in parallel with ongoing system testing

17
Q

INTEGRATION TESTING GOALS

A

TYPICAL OBJECTIVES:
1. MITIGATING RISKS arising from component/system interactions
2. CHECKING COMPLIANCE of the functional and nonfunctional behaviour of component/system interfaces with the design and specifications
3. Building CONFIDENCE IN THE QUALITY OF THE INTERFACES used by components/systems
4. Detecting DEFECTS IN INTERFACES and COMMUNICATION PROTOCOLS
5. PREVENTING DEFECTS from reaching higher test levels

18
Q

TEST BASIS OF INTEGRATION TESTING

A

• any kind of DOCUMENTATION DESCRIBING THE INTERACTION OR COOPERATION OF INDIVIDUAL COMPONENTS OR SYSTEMS

Examples of work products that can be used as test basis for integration testing:
1. Software and system design
2. Sequence diagrams
3. Interface/communication protocol specifications
4. Use cases
5. Architecture at the component and system level
6. Workflows
7. Definitions of external interfaces

19
Q

TEST OBJECTS OF INTEGRATION TESTING

A

Typical objects:
1. Application programming interfaces (APIs) that provide communication between components
2. Interfaces that provide communication between systems (e.g. system-to-system, system-to-database, microservice-to-microservice, etc.)
3. Communication protocols between components and systems

20
Q

TYPICAL DEFECTS AND FAILURES DETECTED IN COMPONENT INTEGRATION TESTING

A
  1. Incorrect or missing data, or incorrect data encoding
  2. Incorrect sequencing or incorrect synchronization of interface calls
  3. Incompatible interfaces
  4. Communication errors between components
  5. Failure to handle or incorrect handling of communication errors between components
  6. Incorrect assumptions about the meaning, units, or boundaries of data transferred between components
21
Q

TYPICAL DEFECTS AND FAILURES DETECTED IN SYSTEMS INTEGRATION TESTING

A
  1. Inconsistent message structures sent between systems
  2. Incorrect or missing data, or incorrect data encoding
  3. Incompatible interfaces
  4. Inter-system communication errors
  5. Failure to handle or improper handling of inter-system communication errors
  6. Incorrect assumptions about the meaning, units, or boundaries of data transferred between systems
  7. Failure to comply with mandatory security regulations
22
Q

SPECIFIC APPROACHES AND RESPONSIBILITIES IN INTEGRATION TESTING

A

• BOTH COMPONENT AND SYSTEM INTEGRATION TESTING FOCUS ON THE INTEGRATION ITSELF
- i.e. on the communication between module A and module B, not on the functionality of the individual modules; the same applies to systems
• component integration testing - usually performed BY DEVELOPERS; system integration testing - BY TESTERS who are familiar with the system architecture

23
Q

TEST ENVIRONMENT FOR INTEGRATION TESTING

A

AS SIMILAR AS POSSIBLE TO THE TARGET OR PRODUCTION ENVIRONMENT
• a very large number of failures arise from the system’s interactions with the environment in which it is embedded

24
Q

INTEGRATION STRATEGIES and INTEGRATION TESTING STRATEGIES

A

• PLANNING - both the tests and the integration strategy should be planned PRIOR to building the components or systems
-> to maximise efficiency
According to the older ISTQB syllabus, these strategies can be listed as:
1. STRATEGIES BASED ON THE SYSTEM ARCHITECTURE
2. FUNCTIONAL TASK-BASED STRATEGIES
3. STRATEGY BASED ON SEQUENCES OF TRANSACTION PROCESSING
4. STRATEGY BASED ON OTHER ASPECTS OF THE SYSTEM

25
Q

INTEGRATION TESTING STRATEGIES BASED ON THE SYSTEM ARCHITECTURE

A

TOP-DOWN
the integration of the core component with the components it calls is tested first, then those components with the components they call; integration proceeds by „levels” of nesting of the called components
BOTTOM-UP
the opposite direction of integration: we start at the bottom and sequentially include components lying at higher levels
* stubs and drivers are used frequently, because in general the called and calling modules are not yet written
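
A sketch of one top-down step, with hypothetical names: the top-level OrderFlow is integrated first, with a stub standing in for the not-yet-written lower-level PaymentModule.

    import static org.junit.jupiter.api.Assertions.assertTrue;
    import org.junit.jupiter.api.Test;

    class TopDownIntegrationTest {
        interface PaymentModule { boolean charge(double amount); }  // lower level, not written yet

        static class OrderFlow {                                    // core (top-level) component
            private final PaymentModule payments;
            OrderFlow(PaymentModule payments) { this.payments = payments; }
            boolean placeOrder(double total) { return payments.charge(total); }
        }

        @Test
        void orderFlowIntegratesWithTheLevelBelowIt() {
            PaymentModule stub = amount -> true;          // stub: canned success, no real payment
            assertTrue(new OrderFlow(stub).placeOrder(19.99));
            // In a later top-down step the stub is replaced by the real PaymentModule;
            // in a bottom-up strategy PaymentModule would be tested first instead, via a driver.
        }
    }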

26
Q

FUNCTIONAL TASK-BASED STRATEGIES

A

Testing first the integration of a group of components that together are responsible for some functionality; hence certain functionalities are tested first, followed by the integration of the remaining modules

27
Q

STRATEGY BASED ON SEQUENCES OF TRANSACTION PROCESSING

A

Testing the PATH OF INFORMATION FLOW THROUGH THE SYSTEM (e.g. the path from a specific input to a specific output, to achieve system observability as early as possible): the integration of the components lying on this path is tested first, then the integration of the remaining components

28
Q

STRATEGY BASED ON OTHER ASPECTS OF THE SYSTEM

A

We choose a criterion based on what is important for the particular software

29
Q

SYSTEM TESTING OBJECTIVES

A

• TYPICALLY THE TESTERS’ ACTIVITY; component and integration testing are usually done by developers, acceptance testing by the customer
• focuses on the BEHAVIOUR AND CAPABILITIES OF AN ENTIRE, ALREADY INTEGRATED SYSTEM OR PRODUCT, TAKING INTO ACCOUNT THE ENTIRETY OF THE TASKS IT CAN PERFORM AND THE NONFUNCTIONAL BEHAVIOURS IT EXHIBITS WHILE PERFORMING THOSE TASKS

30
Q

GOALS OF SYSTEM TESTING

A
  1. REDUCING THE RISK OF SYSTEM MALFUNCTION
  2. CHECKING THE COMPLIANCE OF THE FUNCTIONAL AND NONFUNCTIONAL BEHAVIOUR OF THE SYSTEM WITH THE DESIGN AND SPECIFICATIONS
  3. CHECKING THE COMPLETENESS OF THE SYSTEM AND THE CORRECTNESS OF ITS OPERATION
  4. BUILDING CONFIDENCE IN THE QUALITY OF THE SYSTEM AS A WHOLE
  5. DETECTING DEFECTS IN THE SYSTEM
  6. PREVENTING DEFECTS FROM REACHING THE ACCEPTANCE TESTING LEVEL OR PRODUCTION
  7. VERIFYING DATA QUALITY

• often involves automated system regression testing
• may be necessary to meet the requirements of applicable regulations and/or norms/standards
• the test environment should match the target or production environment as closely as possible
• generates most of the feedback produced by the test process
• VERIFICATION OF THE SYSTEM: comparing the behaviour of the system as described in the requirements with its actual behaviour

31
Q

TEST BASIS FOR SYSTEM TESTING - EXAMPLES OF WORK PRODUCTS

A

REFERRING TO THE SYSTEM AS A WHOLE
• specifications of requirements (functional and nonfunctional) for the system and software
• risk analysis report
• use cases
• User stories and epics
• models of system behaviour
• state diagrams
• system operation instructions and user manuals

32
Q

TEST OBJECTS OF SYSTEM TESTING

A

• THE SYSTEM UNDER TEST (THE SYSTEM AS A WHOLE)
• SYSTEM CONFIGURATION AND CONFIGURATION DATA

33
Q

TYPICAL DEFECTS AND FAILURES IN SYSTEM TESTING

A

• INCORRECT CALCULATIONS
• INCORRECT OR UNEXPECTED FUNCTIONAL OR NONFUNCTIONAL BEHAVIOURS OF THE SYSTEM
• INCORRECT CONTROL FLOW AND/OR DATA FLOWS IN THE SYSTEM
• PROBLEMS WITH THE CORRECT AND COMPLETE PERFORMANCE OF OVERALL FUNCTIONAL TASKS
• PROBLEMS WITH THE PROPER OPERATION OF THE SYSTEM IN A PRODUCTION ENVIRONMENT
• INCONSISTENCY OF THE SYSTEM’S OPERATION WITH THE DESCRIPTIONS CONTAINED IN THE SYSTEM AND USER MANUALS

34
Q

SPECIFIC APPROACHES AND RESPONSIBILITIES IN SYSTEM TESTING

A
• focuses on the overall behaviour of the system as a whole, in both its functional and nonfunctional aspects
• uses the techniques that are most appropriate for the particular aspects of the system under test
• usually performed by independent testers
• defects in specifications can lead to misunderstandings and unexpected system behaviour, so false-positive and false-negative results can occur; that is why testers should be involved at every stage of software development
35
Q

NONFUNCTIONAL SYSTEM TESTS THAT MAY REQUIRE REPRESENTATIVE PRODUCTION ENVIRONMENT

A
  1. PERFORMANCE TESTING
    e.g. the environment should reflect the actual network throughput, and components shouldn’t be replaced with mock objects, as this can distort the measured response time of a module
  2. SECURITY TESTING
    security attacks depend on the specific system configuration and the environment in which the system is located
  3. USABILITY TESTING
    final interface usability tests should be conducted on the actual interface
36
Q

ACCEPTANCE TESTING OBJECTIVES

A

Behaviour and capabilities of the entire system or product - FROM THE USER’S PERSPECTIVE, NOT THE DEVELOPMENT TEAM’S
• VALIDATION rather than verification

Objectives:
1. BUILDING CONFIDENCE IN THE SYSTEM
2. CHECKING THE COMPLETENESS OF THE SYSTEM AND ITS PROPER OPERATION FROM THE POINT OF VIEW OF ACHIEVING BUSINESS OBJECTIVES

• assessing readiness for deployment
• defects may be detected, but the main objective is validation of the system

37
Q

FORMS OF ACCEPTANCE TESTING

A

1ST TYPE OF DIVISION:
1. USER ACCEPTANCE TESTING (UAT)
2. OPERATIONAL ACCEPTANCE TESTING (OAT)
3. ACCEPTANCE TESTING FOR CONTRACTUAL AND LEGAL COMPLIANCE

DIVISION BY WHERE IT IS PERFORMED:
1. ALPHA TESTING - testing by the customer at the producer’s site, in a test environment
2. BETA TESTING - field testing, i.e. testing by customers in their own target environments

38
Q

USER ACCEPTANCE TESTING

A

• IN A SIMULATED PRODUCTION ENVIRONMENT
• MAIN GOAL - building confidence that the system meets the users’ needs
• validation of requirements
• checking that the system supports the business processes with a minimum of problems, cost and risk

39
Q

OPERATIONAL ACCEPTANCE TESTING

A

• TESTS RUN BY OPERATORS AND ADMINISTRATORS IN A SIMULATED PRODUCTION ENVIRONMENT
• FOCUS ON OPERATIONAL ASPECTS
• IT MAY INCLUDE:
- testing backup and recovery mechanisms
- installing, uninstalling and updating software
- failure recovery
- user management
- maintenance activities
- data loading and migration activities
- checking for security vulnerabilities
- performance testing
MAIN GOAL: making sure that operators and system administrators can keep the system operating correctly for its users in the production environment, even under exceptional and difficult conditions

40
Q

ACCEPTANCE TESTING FOR CONTRACTUAL AND LEGAL COMPLIANCE

A

• CONTRACTUAL ACCEPTANCE TESTING
- in accordance with the acceptance criteria written in the contract
- performed by users or independent testers
• LEGAL COMPLIANCE TESTING
- in the context of legislation (laws, regulations, safety standards)
- performed by users or independent testers; the results may be audited by regulators

MAIN OBJECTIVE: obtaining assurance that compliance with the requirements of applicable contracts or regulations has been achieved

41
Q

ALPHA AND BETA TESTING

A

ALPHA - performed on the premises of the software development organization, but the tests are carried out not by the development team but by potential or current clients and/or operators and/or testers

BETA - tests are performed by current or potential customers at their own locations

OBJECTIVES
• building confidence that the product can be used under normal conditions
• detecting defects related to the conditions and environment in which the system is used, especially when those conditions are hard for the project team to reproduce

42
Q

COMMERCIAL OFF-THE-SHELF (COTS)

A

Software for general sale
Developers often want feedback from potential or existing customers before the software hits the market - so they conduct alpha and beta testing

43
Q

TEST BASIS OF ACCEPTANCE TESTING

A

• BUSINESS PROCESSES
• USER AND BUSINESS REQUIREMENTS
• REGULATIONS, AGREEMENTS, NORMS, STANDARDS
• USE CASES
• SYSTEM DOCUMENTATION OR USER MANUALS
• INSTALLATION PROCEDURES
• RISK ANALYSIS REPORTS

FOR OPERATIONAL (OAT)
• BACKUP AND RESTORATION PROCEDURES
• FAILURE RECOVERY PROCEDURES
• OPERATIONAL DOCUMENTATION
• DEPLOYMENT AND INSTALLATION DOCUMENTATION
• PERFORMANCE ASSUMPTIONS
• NORMS, STANDARDS, REGULATIONS IN THE FIELD OF SECURITY

44
Q

TEST OBJECTS OF ACCEPTANCE TESTING

A

• SYSTEM UNDER TEST
• SYSTEM CONFIGURATION
• BUSINESS PROCESSES PERFORMED ON A FULLY INTEGRATED SYSTEM
• BACKUP SYSTEMS AND HOT SITES (to test business continuity and failure recovery mechanisms)
• PROCESSES RELATED TO OPERATIONAL USE AND MAINTENANCE
• FORMS
• REPORTS
• EXISTING AND CONVERTED PRODUCTION DATA

45
Q

TYPICAL DEFECTS AND FAILURES IN ACCEPTANCE TESTING

A

• SYSTEM WORKFLOWS ARE INCOMPATIBLE WITH BUSINESS AND USER REQUIREMENTS
• INCORRECTLY IMPLEMENTED BUSINESS RULES
• FAILURE OF THE SYSTEM TO MEET CONTRACTUAL OR LEGAL REQUIREMENTS
• NONFUNCTIONAL FAILURES - security vulnerabilities, insufficient performance under heavy load, malfunctioning on a supported platform

46
Q

SPECIFIC APPROACHES AND RESPONSIBILITIES IN ACCEPTANCE TESTING

A

• ROLE IN THE SDLC - often the last test level
- BUT it can also take place during installation or integration
- e.g. acceptance testing of a new functional enhancement may take place before system testing begins
- in an ITERATIVE SDLC: various types of acceptance testing at the end of each iteration,
e.g. tests focused on validating a new piece of functionality
• responsibility rests with customers, business users, product owners, system operators and other stakeholders