Performance Testing Flashcards
User-based objectives
- focus primarily on end-user satisfaction and business goals.
- are less concerned about feature types or how a product gets delivered.
Technical objectives
focus on operational aspects and answer questions about a system's ability to scale, or about the conditions under which degraded performance becomes apparent.
Key objectives of performance testing include
- identify potential risks
- find opportunities for improvement
- identify necessary changes
Questions to stakeholders about the test
- What transactions will be executed in the performance test and what average response time is expected?
- What system metrics are to be captured (e.g., memory usage, network throughput) and what values are expected? (A capture sketch follows this list.)
- What performance improvements are expected from these tests compared to previous test cycles?
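To make the second question concrete, here is a minimal sketch of sampling such metrics during a test run, assuming the third-party psutil library; the sampling interval and metric selection are illustrative choices, not prescribed by any test plan.

```python
import time
import psutil  # third-party: pip install psutil

def sample_system_metrics(duration_s: float = 10.0, interval_s: float = 1.0):
    """Sample memory usage and network throughput at a fixed interval."""
    samples = []
    last_net = psutil.net_io_counters()
    for _ in range(int(duration_s / interval_s)):
        time.sleep(interval_s)
        net = psutil.net_io_counters()
        samples.append({
            "memory_used_pct": psutil.virtual_memory().percent,
            # bytes transferred since the previous sample, i.e., throughput
            "net_bytes_sent_per_s": (net.bytes_sent - last_net.bytes_sent) / interval_s,
            "net_bytes_recv_per_s": (net.bytes_recv - last_net.bytes_recv) / interval_s,
        })
        last_net = net
    return samples
```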
Performance Test Plan (PTP) Content
- Objective
- Test Objectives
- System Overview
- Types of Performance Testing to be Conducted
- Acceptance Criteria
- Test Data
- System Configuration
- Test Environment
- Test Tools
- Profiles
- Relevant Metrics
- Risks
Objective (PTP)
- describes the goals, strategies and methods for the performance test
- enables a quantifiable answer to the central question of whether the system is adequate and ready to perform under load.
Acceptance Criteria (PTP)
- response time is a user concern, throughput is a business concern, and resource utilization is a system concern
- Acceptance criteria should be set for all relevant measures and related back to the following, as applicable (a checking sketch follows this list):
a. overall objectives
b. SLAs
c. Baseline values
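A minimal sketch of checking one run's measurements against such criteria; all metric names and threshold values below are hypothetical.

```python
# Hypothetical acceptance criteria: values are illustrative only.
ACCEPTANCE_CRITERIA = {
    "avg_response_time_ms": 500,   # user concern: must be <= this
    "throughput_tps": 100,         # business concern: must be >= this
    "cpu_utilization_pct": 80,     # system concern: must be <= this
}

def evaluate(measured: dict) -> dict:
    """Return pass/fail per criterion for one test run."""
    return {
        "avg_response_time_ms":
            measured["avg_response_time_ms"] <= ACCEPTANCE_CRITERIA["avg_response_time_ms"],
        "throughput_tps":
            measured["throughput_tps"] >= ACCEPTANCE_CRITERIA["throughput_tps"],
        "cpu_utilization_pct":
            measured["cpu_utilization_pct"] <= ACCEPTANCE_CRITERIA["cpu_utilization_pct"],
    }

print(evaluate({"avg_response_time_ms": 420, "throughput_tps": 130, "cpu_utilization_pct": 85}))
# -> {'avg_response_time_ms': True, 'throughput_tps': True, 'cpu_utilization_pct': False}
```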
Baseline values
a set of metrics used to compare current and previously achieved performance measurements. This enables particular performance improvements to be demonstrated and/or the achievement of test acceptance criteria to be confirmed.
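A minimal sketch of the baseline comparison itself; metric names and values are hypothetical.

```python
def compare_to_baseline(current: dict, baseline: dict) -> dict:
    """Percentage change per metric relative to the baseline run.
    Negative values mean the metric decreased: an improvement for
    response time, a regression for throughput."""
    return {
        metric: 100.0 * (current[metric] - baseline[metric]) / baseline[metric]
        for metric in baseline
    }

baseline = {"avg_response_time_ms": 480, "throughput_tps": 95}
current = {"avg_response_time_ms": 420, "throughput_tps": 110}
print(compare_to_baseline(current, baseline))
# -> {'avg_response_time_ms': -12.5, 'throughput_tps': 15.78...}
```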
Test Data can include:
- User account data
- User input data
- Database contents (e.g., a database pre-populated with the data needed for testing)
User Data creation process should address the following aspects:
- data extraction from production data
- importing data into the SUT
- creation of new data
- creation of backups that can be used to restore the data when new cycles of testing are performed
- data masking or anonymizing, which adds risk to the performance tests because masked data may not have the same characteristics as data seen in real-world use (see the masking sketch below)
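A minimal sketch of one masking approach, deterministic hashing of identifying fields so that relationships between extracted production records stay consistent; the field names are hypothetical.

```python
import hashlib

def mask_user_record(record: dict) -> dict:
    """Replace identifying fields with deterministic pseudonyms.
    The same input always maps to the same pseudonym, so foreign-key
    style relationships between records are preserved."""
    masked = dict(record)
    for field in ("name", "email"):  # hypothetical identifying fields
        digest = hashlib.sha256(record[field].encode()).hexdigest()[:12]
        masked[field] = f"user_{digest}"
    return masked

print(mask_user_record({"id": 42, "name": "Alice", "email": "alice@example.com"}))
```

Note the risk called out above: hashed values lose real-world characteristics such as length and value distribution, which can alter, for example, index and cache behavior under load.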
System Configuration (PTP)
- A description of the specific system architecture, including servers (e.g., web, database, load balancer)
- Definition of multiple tiers
- Specific details of the computing hardware (e.g., CPU cores, RAM, Solid State Drives (SSD), Hard Disk Drives (HDD)), including versions
- Specific details of the software (e.g., applications, operating systems, databases, services used to support the enterprise), including versions
- External systems that operate with the SUT, including their configuration and version (e.g., an e-commerce system with integration to NetSuite)
- SUT build / version identifier
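One way to keep this section machine-readable is a simple structured record; a sketch with entirely hypothetical values:

```python
# Hypothetical example of recording the System Configuration section.
SYSTEM_CONFIGURATION = {
    "architecture": {"web": 2, "database": 1, "load_balancer": 1},  # server counts
    "tiers": ["presentation", "application", "data"],
    "hardware": {"cpu_cores": 16, "ram_gb": 64, "storage": "SSD"},
    "software": {"os": "Ubuntu 22.04", "database": "PostgreSQL 15"},
    "external_systems": [{"name": "NetSuite", "version": "2024.1"}],  # version hypothetical
    "sut_build": "release-4.2.1+build.137",
}
```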
Test Environment
The test environment is often a separate environment that mimics production, but at a smaller scale. How will the results be extrapolated to production? (A sketch follows.)
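A sketch of the extrapolation arithmetic the plan should make explicit. It assumes throughput scales linearly with server count, discounted by an efficiency factor; real systems rarely scale linearly, which is exactly why the extrapolation model and its assumptions must be stated.

```python
def extrapolate_throughput(measured_tps: float,
                           test_servers: int,
                           production_servers: int,
                           scaling_efficiency: float = 1.0) -> float:
    """Naive linear extrapolation from a scaled-down test environment.
    scaling_efficiency < 1.0 discounts for contention and coordination
    overhead; the chosen value is an assumption that must be justified."""
    return measured_tps * (production_servers / test_servers) * scaling_efficiency

# 2-server test environment measured 150 TPS; production has 8 servers.
print(extrapolate_throughput(150.0, 2, 8, scaling_efficiency=0.8))  # -> 480.0
```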
Operational profiles
- provide a repeatable step-by-step flow through the application for a particular usage of the system.
- Aggregating these operational profiles results in a load profile, commonly referred to as a scenario (see the Locust sketch below).
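A minimal sketch in Locust (a Python load-testing tool) showing the relationship, assuming hypothetical endpoints: each task method is one operational profile, and the @task weights aggregate them into a load profile (scenario).

```python
from locust import HttpUser, task, between

class ShopperUser(HttpUser):
    """Each @task is a repeatable step-by-step operational profile;
    the weights (3:1 here) aggregate them into a load profile."""
    wait_time = between(1, 5)  # think time between actions, in seconds

    @task(3)  # browsing is three times as common as buying
    def browse_catalog(self):
        self.client.get("/products")      # hypothetical endpoint
        self.client.get("/products/42")   # hypothetical endpoint

    @task(1)
    def checkout(self):
        self.client.post("/cart", json={"product_id": 42, "qty": 1})
        self.client.post("/checkout")
```

Running this file with, for example, `locust -f profiles.py --users 100 --spawn-rate 10` executes the aggregated scenario against the SUT.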
Risks (PTP)
- areas not measured as part of the performance testing
- limitations to the performance testing
- limitations of the test environment
Examples of Recommended Actions
- change physical components (e.g., hardware, routers)
- change software (e.g., optimizing applications and database calls)
- alter the network (e.g., load balancing, routing)