Test Automation Engineering Flashcards
Certified Tester Advanced Level Test Automation Engineering (CTAL-TAE) v2.0
Test Case
A set of preconditions, inputs, actions (where applicable), expected results and postconditions, developed based on test conditions.
Test Step
A single interaction between an actor and a test object consisting of an input, an action, and an expected result.
Test Condition
A testable aspect of a component or system identified as a basis for testing.
Functional Testing
Testing performed to evaluate if a component or system satisfies functional requirements.
Non-Functional Testing
Testing performed to evaluate that a component or system complies with non-functional requirements.
System Under Test (SUT)
A type of test object that is a system.
Test Object
The work product to be tested.
Defect Report
Documentation of the occurrence, nature, and status of a defect.
Software Development Lifecycle (SDLC)
The activities performed at each stage in software development, and how they relate to one another logically and chronologically.
Waterfall Model
SDLC model that is both linear and sequential
V-Model
SDLC model where a process is executed in a sequential manner. Test levels: Component, Component Integration, System, System Integration & Acceptance
Component Testing
A test level that focuses on individual hardware or software components.
System Testing
A test level that focuses on verifying that a system as a whole meets specified requirements.
Integration Testing
A test level that focuses on interactions between components or systems.
Test Automation Framework (TAF)
A set of test harnesses and test libraries for test automation.
Test Harnesses
A collection of drivers and test doubles needed to execute a test suite.
Driver
A component or tool that temporarily replaces another component and controls or calls a test item in isolation.
Test Plan
Documentation describing the test objectives to be achieved and the means and the schedule for achieving them, organized to coordinate testing activities.
Test Suite
A set of test scripts or test procedures to be executed in a specific test run.
Test Run
The execution of a test suite on a specific version of the test object.
Test Log
A chronological record of relevant details about the execution of tests.
Test Automation
The conversion of test activities to automatic operation.
Test Automation Engineer
A person who is responsible for the design, implementation and maintenance of a test automation architecture as well as the technical evolution of the resulting test automation solution.
API Testing
A test approach performed by submitting requests to a test object using its application programming interface.
GUI Testing
A test approach performed by interacting with a test object using a graphical user interface.
Testability
The degree to which test conditions can be established for a component or system, and tests can be performed to determine whether those test conditions have been met.
A configuration needed for better testability of the SUT; these can currently be generated by various development frameworks, or developers can set them manually…
Accessibility Identifiers
A configuration needed for better testability of the SUT, related to application parameters…
System Environment Variables
A configuration needed for better testability of the SUT, similar to system environment variables but set before deployment starts…
Deployment Variables
Designing for testability of a SUT consists of which aspects?
- Observability
- Controllability
- Architecture Transparency
The observability aspect consists of…
The SUT needs to provide interfaces that give insight into the SUT.
The controllability aspect consists of…
The SUT needs to provide interfaces that can be used to perform actions on the SUT (i.e. UI elements, function calls, TCP/IP, and others)
Architecture Transparency
The documentation of an architecture needs to provide clear, understandable components and interfaces that give observability and controllability at all test levels and foster quality.
Test Automation Solution (TAS)
The implementation of a test automation architecture for a test automation assignment, based on an understanding of the functional, non-functional, and technical requirements of the SUT.
Test Process
The set of interrelated activities comprising test planning, test monitoring, test control, test analysis, test design, test implementation, test execution, and test completion.
Test Level
A specific instantiation of a test process.
Test Type
A group of test activities based on specific test objectives aimed at specific characteristics of a component or system.
Test Data
Data needed for test execution.
Behavior-Driven Development (BDD)
A collaborative approach to development in which the team is focusing on delivering expected behavior of a component or system for the customer, which forms the basis for testing.
Capture/Playback
A test automation approach in which inputs to a test object are recorded during manual testing to generate automated test scripts that can be executed later.
Data-Driven Testing (DDT)
A scripting technique that uses data files to contain the test data and expected results needed to execute the test scripts.
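The data-driven technique above can be sketched in a few lines. Here the "data file" is inlined as a CSV string for self-containment, and the `add` function stands in for a hypothetical SUT operation:

```python
import csv
import io

# Hypothetical SUT function, used only for illustration.
def add(a, b):
    return a + b

# Test data and expected results live outside the test logic;
# in a real TAS this would be a separate CSV file.
TEST_DATA = "a,b,expected\n1,2,3\n10,-4,6\n0,0,0\n"

def run_data_driven_tests(data=TEST_DATA):
    results = []
    for row in csv.DictReader(io.StringIO(data)):
        actual = add(int(row["a"]), int(row["b"]))
        results.append(actual == int(row["expected"]))
    return results

print(run_data_driven_tests())  # → [True, True, True]
```

Adding a new test case means adding a data row, not writing a new script — which is the point of the technique.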
Generic Test Automation Architecture (gTAA)
A representation of the layers, components, and interfaces that allows for a structured and modular approach to implement test automation.
Keyword-Driven Testing
A scripting technique in which test scripts contain high-level keywords and supporting files that contain low-level scripts that implement those keywords.
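A minimal sketch of keyword-driven execution. The keywords `open_page`, `enter_text`, and `verify_title`, and the shared `state` dictionary, are hypothetical stand-ins, not from any real framework:

```python
# Low-level scripts that implement the keywords.
state = {}

def open_page(url):
    state["page"] = url
    state["fields"] = {}

def enter_text(field, text):
    state["fields"][field] = text

def verify_title(expected):
    return state["page"] == expected

KEYWORDS = {"open_page": open_page, "enter_text": enter_text,
            "verify_title": verify_title}

# The test script itself contains only high-level keywords plus arguments.
script = [
    ("open_page", ["login"]),
    ("enter_text", ["username", "alice"]),
    ("verify_title", ["login"]),
]

def execute(script):
    result = None
    for keyword, args in script:
        result = KEYWORDS[keyword](*args)
    return result

print(execute(script))  # → True
```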
Linear Scripting
A simple scripting technique without any control structure in the test scripts.
Model-Based Testing
Testing based on or involving models.
Structured Scripting
A scripting technique that builds and utilizes a library of reusable (parts of) scripts.
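The contrast between linear and structured scripting can be sketched as follows: instead of each test script inlining every low-level step (linear), tests compose reusable library functions. The `login` and `add_to_cart` helpers are hypothetical examples:

```python
# Reusable library parts shared by many test scripts.
def login(session, user):
    session["user"] = user
    return session

def add_to_cart(session, item):
    session.setdefault("cart", []).append(item)
    return session

# A test script composed from the library, instead of
# repeating the low-level steps inline as linear scripting would.
def test_checkout_flow():
    session = login({}, "alice")
    session = add_to_cart(session, "book")
    return session["user"] == "alice" and session["cart"] == ["book"]

print(test_checkout_flow())  # → True
```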
Test Adaptation Layer
The layer in a test automation architecture which provides the necessary code to adapt test scripts on an abstract level to the various components, configuration or interfaces of the SUT.
Test Script
A sequence of instructions for the execution of a test.
Testware
Work products produced during the test process for use in planning, designing, executing, evaluating and reporting on testing.
Test Step
A single interaction between an actor and a test object consisting of an input, an action, and an expected result.
Test-Driven Development
A software development technique in which the test cases are developed, automated and then the software is developed incrementally to pass those test cases.
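The TDD cycle above — test first, then just enough code to pass — can be sketched with a hypothetical `slugify` function:

```python
# Step 1: write the test first; it fails until slugify exists.
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Trim Me  ") == "trim-me"

# Step 2: implement just enough code to make the test pass.
def slugify(text):
    return "-".join(text.strip().lower().split())

test_slugify()  # raises AssertionError on failure
print("all tests passed")
```

Step 3 (refactor while keeping the tests green) would follow in a real cycle.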
Generic Test Automation Architecture (gTAA) Interfaces…
- SUT
- Project Management
- Configuration Management
- Test Management
SUT Interface
Describes the connection between the SUT and the TAF
Project Management Interface
Describes the test automation development progress
Test Management Interface
Describes the mapping of test case definitions and automated test cases
Configuration Management Interface
Describes the CI/CD pipelines, environments and testware
Core Test Automation Capabilities
- Test Generation
- Test Definition
- Test Execution
- Test Adaptation
Test Generation
Supports the automated design of test cases based on a test model. Is an optional capability.
Test Definition
Supports the definition and implementation of test cases and/or test suites, which optionally can be derived from a test model.
Test Execution
Test execution and test logging.
Test Adaptation
Provides the necessary functionality to adapt the automated tests for the various components or interfaces of the SUT. It provides different adaptors for connecting to the SUT via APIs, protocols, and services.
Define distinct groups of classes that have similar purposes, such as test cases, test reporting, test logging, encryption, and test harnesses…
TAF Layers
Test Scripts - TAF Layer
Purpose: Provides a test case repository for the SUT and test suite annotations. No direct calls should be made from test scripts to the core libraries.
Business Logic - TAF Layer
All the SUT-dependent libraries are stored in this layer. It is used to set up the TAF to run against the SUT and its additional configurations.
Core Libraries
All the libraries that are independent of any SUT. These core libraries can be reused.
Encapsulation
Hiding the internal details of an object and only exposing necessary parts.
Abstraction
Hiding complex implementation details and showing only essential features.
Inheritance
One class can inherit properties and behavior from another class.
Polymorphism
The ability to use the same method name with different implementations.
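The four OOP concepts above can be shown together in one small sketch (the `Shape` hierarchy is an illustrative example, not from the syllabus):

```python
from abc import ABC, abstractmethod

# Abstraction: Shape exposes only area(), hiding how it is computed.
class Shape(ABC):
    def __init__(self, name):
        self._name = name  # Encapsulation: internal detail, not exposed

    @abstractmethod
    def area(self):
        ...

# Inheritance: Square and Circle inherit from Shape.
class Square(Shape):
    def __init__(self, side):
        super().__init__("square")
        self._side = side

    def area(self):
        return self._side ** 2

class Circle(Shape):
    def __init__(self, radius):
        super().__init__("circle")
        self._radius = radius

    def area(self):
        return 3.14159 * self._radius ** 2

# Polymorphism: the same area() call works on different implementations.
def total_area(shapes):
    return sum(s.area() for s in shapes)

print(total_area([Square(2), Circle(1)]))  # → 7.14159
```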
Single Responsibility Principle (SRP)
A class should have only one reason to change (one responsibility).
Open/Closed Principle (OCP)
A class should be open for extension but closed for modification.
Liskov Substitution Principle (LSP)
Subclasses should be replaceable with their base classes without breaking functionality.
Interface Segregation Principle (ISP)
A class should not be forced to implement methods it does not use.
Dependency Inversion Principle (DIP)
High-level modules should not depend on low-level modules. Both should depend on abstractions.
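A short sketch of DIP in practice, which also touches OCP: the high-level `ReportService` depends on the `Logger` abstraction, so new loggers can be added without modifying it. All names here are hypothetical:

```python
from abc import ABC, abstractmethod

# The abstraction both sides depend on.
class Logger(ABC):
    @abstractmethod
    def log(self, message): ...

# A low-level detail; a FileLogger could be added without
# touching ReportService (open for extension, closed for modification).
class MemoryLogger(Logger):
    def __init__(self):
        self.lines = []

    def log(self, message):
        self.lines.append(message)

# High-level module: depends on Logger, not on a concrete class.
class ReportService:
    def __init__(self, logger: Logger):
        self._logger = logger  # injected abstraction

    def generate(self):
        self._logger.log("report generated")
        return "report"

logger = MemoryLogger()
print(ReportService(logger).generate(), logger.lines)
```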
Risk
A factor that could result in future negative consequences.
Test Fixture
The predefined data and test environment to test software in a repeatable manner.
Deployment Risks
- Firewall issues
- Resource utilization
- Network connection
- Reliability
Technical Deployment Risks
- Packaging
- Logging
- Test structuring
- Updating
Packaging
Needs to be considered as version control of test automation is just as important as for the SUT.
Test Logging Levels
- Fatal
- Error
- Warn
- Info
- Debug
- Trace
Test Logging: FATAL
Is used to log events that may lead to aborting test execution
Test Logging: ERROR
Is used when a condition or interaction fails and therefore fails the test case as well
Test Logging: WARN
Is used when an unexpected condition/action occurs but does not break the flow of the TC
Test Logging: INFO
Is used to show basic info about a TC and what happens during test execution
Test Logging: DEBUG
Is used to store execution specific details that generally are not required for basic logs, but useful during investigation of a test failure
Test Logging: TRACE
Similar to debug, but has even more information
Test Structuring
Concerns the most important part of the TAS: the test harnesses and the test fixtures included in them.
Updating
The technical risk of the automatic updates on the test harnesses (e.g. agents) and version changes on devices.
Hardcoding
Is the process of embedding values directly in the software so that they cannot be changed without modifying the source code.
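A small contrast between a hardcoded value and one made configurable via an environment variable, which ties back to the deployment-variable testability cards above (`SUT_BASE_URL` is a hypothetical variable name):

```python
import os

# Hardcoded: the URL cannot be changed without editing the source.
def get_base_url_hardcoded():
    return "http://localhost:8080"

# Configurable: a deployment/environment variable overrides the default.
def get_base_url():
    return os.environ.get("SUT_BASE_URL", "http://localhost:8080")

os.environ["SUT_BASE_URL"] = "http://staging.example:8080"
print(get_base_url())  # → http://staging.example:8080
```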
Pipeline (CI/CD Pipeline)
A pipeline is an automated sequence of steps used in software development to build, test, and deploy applications efficiently.
Application Programming Interface (API)
A type of interface in which the components or systems involved exchange information in a defined formal structure.
API Connections
Understanding the business logic that can be tested automatically, and the relationships between APIs
API Documentation
Serves as a baseline for test automation with all relevant information (e.g. parameters, headers, and distinct types of request-response of objects)
Contract Testing
Is a type of integration testing verifying that services can communicate with each other, and that the data shared between the services is consistent with a specified set of rules.
Measurement
The process of assigning a number or category to an entity to describe an attribute of that entity.
Metric
A measurement scale and the method used for measurement.
Test Progress Report
A type of periodic test report that includes the progress of test activities against a baseline, risks, and alternatives requiring a decision.
Must contain the test results, SUT info and documentation of the test environment.
A unique ID is usually added to an interaction, and the same ID accompanies each subsequent call and integration in the system…
Trace ID / Correlation ID
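A minimal sketch of correlation-ID propagation: one ID is generated per interaction and carried in the headers of every subsequent call, so all hops can be traced together. The `X-Correlation-ID` header name is a common convention, not a standard, and `call_service` is a hypothetical stand-in:

```python
import uuid

def new_correlation_id():
    return str(uuid.uuid4())

def call_service(name, headers):
    # Each hop records the same ID, making the whole flow traceable.
    return f"{name} handled request {headers['X-Correlation-ID']}"

cid = new_correlation_id()
headers = {"X-Correlation-ID": cid}
logs = [call_service(s, headers) for s in ("gateway", "orders", "billing")]
print(all(cid in line for line in logs))  # → True
```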
Static Analysis
The process of evaluating a component or system without executing it, based on its form, structure, content, or documentation.
Verification
The process of confirming that a work product fulfills its specification.
Validation
Confirmation by examination that a work product matches a stakeholder’s needs.
Debugging
The process of finding, analyzing and removing the causes of failures in a component or system.
Root Cause
A source of a defect such that if it is removed, the occurrence of the defect type is decreased or removed.
Root Cause Analysis
An analysis technique aimed at identifying the root causes of defects.
Schema Validation
A type of static analysis based on a database schema.
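A minimal, hand-rolled sketch of schema validation against an expected structure; real projects would typically use a JSON Schema validator instead. The `SCHEMA` and record contents are illustrative assumptions:

```python
# Expected field names and types for a record.
SCHEMA = {"id": int, "name": str, "active": bool}

def validate(record, schema=SCHEMA):
    # The record must contain every field, with the declared type.
    missing = [k for k in schema if k not in record]
    wrong = [k for k, t in schema.items()
             if k in record and not isinstance(record[k], t)]
    return not missing and not wrong

print(validate({"id": 1, "name": "alice", "active": True}))  # → True
print(validate({"id": "1", "name": "alice"}))                # → False
```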
Test Histogram
A visual representation that shows the distribution of test results.