ISTQB CT-TAE Exam Flashcards
Success Factors in Test Automation
- Test Automation Architecture (TAA)
- SUT Testability
- Test Automation Strategy
- Test Automation Framework (TAF)
SUT Factors Influencing Test Automation
- SUT interfaces
- Third party software
- Levels of intrusion
- Different SUT architectures
- Size and complexity of the SUT
SUT Design for Testability
- Observability
- Controllability
- Clearly defined architecture
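A minimal sketch of what observability and controllability hooks can look like in practice; the SUT class and its test interface below are hypothetical, not taken from the syllabus.

```python
# Hypothetical SUT exposing a dedicated test interface.
# Controllability: tests can put the SUT into a defined state directly.
# Observability: tests can read internal state and events without scraping the GUI.

class PaymentService:
    def __init__(self):
        self._balance = 0
        self._events = []          # internal event log, exposed for observability

    def deposit(self, amount: int) -> None:
        self._balance += amount
        self._events.append(("deposit", amount))

    # --- test interface (would typically be enabled only in test builds) ---
    def test_set_balance(self, amount: int) -> None:
        """Controllability hook: establish a precondition without replaying a UI flow."""
        self._balance = amount

    def test_get_events(self) -> list:
        """Observability hook: inspect internal events directly."""
        return list(self._events)


# Usage in an automated test: arrange state directly, act, then observe.
svc = PaymentService()
svc.test_set_balance(100)                            # controllable precondition
svc.deposit(50)
assert svc.test_get_events() == [("deposit", 50)]    # observable result
```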
The TAA of a TAS complies with the following principles
- Single responsibility
- Extension (see e.g., open/closed principle by B. Meyer)
- Replacement (see e.g., substitution principle by B. Liskov)
- Component segregation (see e.g., interface segregation principle by R.C. Martin)
- Dependency inversion
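An illustrative, non-normative sketch of how the replacement and dependency inversion principles can look in a TAS: the test logic depends only on an abstract driver interface, so a concrete GUI or API driver can be substituted without touching the tests. All class and function names here are hypothetical.

```python
from abc import ABC, abstractmethod

class SutDriver(ABC):
    """Abstraction the test layer depends on (dependency inversion)."""
    @abstractmethod
    def login(self, user: str, password: str) -> bool: ...

class GuiDriver(SutDriver):
    """Concrete adapter for a GUI tool; interchangeable with other drivers."""
    def login(self, user: str, password: str) -> bool:
        # ... drive the GUI here; stubbed for the sketch
        return True

class ApiDriver(SutDriver):
    """Alternative adapter; substitutable per the replacement principle."""
    def login(self, user: str, password: str) -> bool:
        # ... call the SUT's API here; stubbed for the sketch
        return True

def test_login(driver: SutDriver) -> None:
    # The test case is written against the abstraction only.
    assert driver.login("alice", "secret")

test_login(GuiDriver())   # same test, different adapter
test_login(ApiDriver())
```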
gTAA layers
- Test generation
- Test definition
- Test execution
- Test adaptation
- Interfaces for project management, configuration management and test management
TAA Design
- Capture requirements needed to define an appropriate TAA
- Compare and contrast different design/architecture approaches
- Identify areas where abstraction can deliver benefits
- Understand SUT technologies and how these interconnect with the TAS
- Understand the SUT environment
- Time and complexity for a given testware architecture implementation
- Ease of use for a given testware architecture implementation
Approaches for Automating Test Cases
- Capture/playback approach
- Linear scripting
- Structured scripting
- Data-driven testing
- Keyword-driven testing
- Process-driven scripting
- Model-based testing
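To make the data-driven and keyword-driven approaches concrete, here is a minimal, hypothetical sketch: the test data and keyword table are separated from the script, and a small interpreter maps each keyword to its implementation function.

```python
# Hypothetical keyword-driven sketch: keywords name business-level actions,
# the table is the test definition, and the functions are the implementation.

def open_account(name):
    print(f"open account for {name}")

def deposit(name, amount):
    print(f"deposit {amount} into {name}'s account")

def check_balance(name, expected):
    print(f"check that {name}'s balance is {expected}")

KEYWORDS = {
    "open_account": open_account,
    "deposit": deposit,
    "check_balance": check_balance,
}

# Data-driven aspect: actions and their data live in a table, not in the script.
test_table = [
    ("open_account", "alice"),
    ("deposit", "alice", 100),
    ("check_balance", "alice", 100),
]

for keyword, *args in test_table:
    KEYWORDS[keyword](*args)      # the interpreter dispatches each keyword
```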
Technical considerations of the SUT
- Interfaces of the SUT
- SUT data
- SUT configurations
- SUT standards and legal settings
- Tools and tool environments used to develop the SUT
- Test interfaces in the software product
Considerations for Development/QA Processes
- Test execution control requirements
- Reporting requirements
- Roles and access rights
- Established tool landscape
TAS Development
Analyze -> Design -> Develop -> Test -> Deploy -> Evolve
Compatibility between the TAS and the SUT
- Process compatibility
- Team compatibility
- Technology compatibility
- Tool compatibility
Synchronization between TAS and SUT
- Synchronization of requirements
- Synchronization of development phases
- Synchronization of defect tracking
- Synchronization of SUT and TAS evolution
Pilot Project
- Identify a suitable project
- Plan the pilot
- Conduct the pilot
- Evaluate the pilot
Types of Test Automation Maintenance
- Preventive maintenance
- Corrective maintenance
- Perfective maintenance
- Adaptive maintenance
TAS Metrics
- Automation benefits
- Effort to build automated tests
- Effort to analyze SUT failures
- Effort to maintain automated tests
- Ratio of failures to defects
- Time to execute automated tests
- Number of pass and fail results
- Number of false-fail and false-pass results
- Code coverage
- Tool scripting metrics
- Automation code defect density
- Speed and efficiency of TAS components
- Trend metrics
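A small worked example of how some of these metrics could be derived from raw run data; the counts, field names, and formulas below are illustrative assumptions, not values mandated by the syllabus.

```python
# Illustrative metric calculations from hypothetical run data.
results = {
    "passed": 180,
    "failed": 20,        # reported failures
    "false_fails": 6,    # failures caused by the TAS, not the SUT
    "defects_found": 12, # confirmed SUT defects behind the failures
    "execution_minutes": 95,
}

total = results["passed"] + results["failed"]
false_fail_rate = results["false_fails"] / results["failed"]        # 0.30
failures_per_defect = results["failed"] / results["defects_found"]  # ~1.67

print(f"pass rate:            {results['passed'] / total:.1%}")
print(f"false-fail rate:      {false_fail_rate:.1%}")
print(f"failures per defect:  {failures_per_defect:.2f}")
print(f"avg minutes per test: {results['execution_minutes'] / total:.2f}")
```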
Implementation of Measurement
- Features of automation that support measurement and report generation
- Integration with other third party tools
- Visualization of results
Test Automation Reporting
- Content of the reports
- Publishing the reports
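A hedged sketch of report content and publishing: the report fields and the idea of writing a machine-readable summary to a shared location for a CI job or dashboard to pick up are assumptions for illustration.

```python
import json
from datetime import datetime, timezone

# Hypothetical report content: what ran, against which SUT build, and the results.
report = {
    "generated": datetime.now(timezone.utc).isoformat(),
    "sut_version": "2.4.1",        # assumed identifier
    "environment": "staging",
    "total": 200, "passed": 180, "failed": 20,
}

# "Publishing" here simply means writing the report where stakeholders can read it.
with open("tas_report.json", "w") as fh:
    json.dump(report, fh, indent=2)
```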
Criteria for Automation
- Frequency of use
- Complexity to automate
- Compatibility and tool support
- Maturity of test process
- Suitability of automation for the stage of the software product lifecycle
- Sustainability of the environment
- Controllability of the SUT (preconditions, setup and stability)
- Technical planning in support of ROI analysis
To adequately prepare for transitioning to an automated environment, the following areas need to be addressed
- Availability of tools in the test environment for test automation
- Correctness of test data and test cases
- Scope of the test automation effort
- Education of the test team on the paradigm shift
- Roles and responsibilities
- Cooperation between developers and test automation engineers
- Parallel effort
- Automation reporting
Steps Needed to Implement Automation within Regression Testing
- Frequency of test execution
- Test execution time
- Functional overlap
- Data sharing
- Test interdependency
- Test preconditions
- SUT coverage
- Executable tests
- Large regression test sets
Verifying Automated Test Environment Components
- Test tool installation, setup, configuration, and customization
- Test scripts with known passes and failures
- Repeatability in setup/teardown of the test environment
- Configuration of the test environment and components
- Connectivity to internal and external systems/interfaces
- Intrusiveness of automated test tools
- Framework component testing
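One common check from this list is running scripts with known outcomes against the freshly set-up environment to confirm the TAS itself reports correctly; a minimal, hypothetical sketch:

```python
# Hypothetical smoke check: run one test that must pass and one that must fail,
# and confirm the environment reports each outcome correctly.

def known_pass():
    assert 1 + 1 == 2          # must be reported as PASS

def known_fail():
    assert 1 + 1 == 3          # must be reported as FAIL

def run(test):
    try:
        test()
        return "PASS"
    except AssertionError:
        return "FAIL"

assert run(known_pass) == "PASS", "environment misreports a passing test"
assert run(known_fail) == "FAIL", "environment misreports a failing test"
print("test environment reports known passes and failures correctly")
```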
Verifying the Automated Test Suite
- Executing test scripts with known passes and failures
- Checking the test suite
- Verifying new tests that focus on new features of the framework
- Considering repeatability of tests
- Checking that there are enough verification points in the automated test suite and/or test cases
Options for Improving Test Automation
- Scripting
- Test Execution
- Verification
- Architecture
- Pre- and post-processing
- Documentation
- TAS features
- TAS updates and upgrades
Planning the Implementation of Test Automation Improvement
- Identify changes in the test environment components
- Increase efficiency and effectiveness of core TAS function libraries
- Target multiple functions that act on the same control type for consolidation
- Refactor the TAA to accommodate changes in the SUT
- Naming conventions and standardization
- Evaluation of existing scripts for revision/elimination as the SUT changes
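As an example of consolidating multiple functions that act on the same control type, several near-duplicate helpers can be folded into one parameterized function; the helper names below are hypothetical.

```python
# Before: one helper per field, each doing the same thing to the same control type.
# def enter_user_name(value): ...
# def enter_password(value): ...
# def enter_search_term(value): ...

# After: a single consolidated helper for text-box controls.
def enter_text(field_locator: str, value: str) -> None:
    """Clear the text box identified by the locator and type the given value."""
    # ... locate the control via the GUI driver and send the keystrokes (stubbed)
    print(f"typing {value!r} into {field_locator}")

# Callers now pass the locator, so maintenance happens in one place.
enter_text("user_name", "alice")
enter_text("password", "secret")
```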