CHAPTER 3 STATIC TESTING Flashcards
STATIC TESTING
• set of testing methods and techniques in which the component or system under test IS NOT RUN/EXECUTED
• can also be applied to non-executable work products other than software, such as:
- DESIGN
- DOCUMENTATION
- SPECIFICATIONS, etc.
GOALS OF STATIC TESTING
• QUALITY IMPROVEMENT
• DEFECT DETECTION
• EVALUATION OF CHARACTERISTICS LIKE:
- readability, completeness, correctness, testability, and consistency of the work products under review
BOTH VERIFICATION AND VALIDATION
• in agile software development, during requirements development, the whole team makes sure that the requirements and related work products meet THE DEFINITION OF READY
• ENSURING THAT REQUIREMENTS ARE
COMPLETE
UNDERSTANDABLE
TESTABLE
USE CASES CONTAIN TESTABLE ACCEPTANCE CRITERIA
STATIC ANALYSIS - TECHNIQUES
Evaluating the work product under test (usually code, requirements or design documents) using tools
Examples of static analysis techniques:
1. CODE MEASUREMENTS (e.g. measuring its size or cyclomatic complexity)
2. CONTROL FLOW ANALYSIS
3. DATA FLOW ANALYSIS
4. CHECKING THE COMPATIBILITY OF VARIABLE TYPES AND VERIFYING THE CORRECT APPLICATION OF CODING STANDARDS
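A minimal sketch of the first technique above: estimating cyclomatic complexity by counting decision points with Python's standard ast module. The node list and the M = decisions + 1 rule are simplifications of what real measurement tools (e.g. radon, lizard) do.

```python
# Sketch: count decision points in a source string to estimate
# cyclomatic complexity (simplified as M = decisions + 1).
import ast

SOURCE = '''
def grade(score):
    if score < 0 or score > 100:   # the 'or' is a decision point too
        raise ValueError("out of range")
    if score >= 90:
        return "A"
    elif score >= 75:
        return "B"
    return "C"
'''

# Node types that add a branch to the control flow graph (simplified list).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.BoolOp,
                  ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

print(cyclomatic_complexity(SOURCE))  # 5: three if/elif nodes + one 'or' + 1
```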
Static analysis - included in CI frameworks as one step of THE automated deployment pipeline
• assessing MAINTAINABILITY, PERFORMANCE AND VULNERABILITY OF CODE TO SECURITY ATTACKS
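A minimal sketch of such a pipeline step, assuming flake8 is installed and the code lives in a hypothetical src/ directory; real pipelines usually call the analyzer straight from the CI configuration, but the gate logic is the same: a non-zero exit code fails the stage.

```python
# Sketch: run a static analyzer and propagate its verdict to the CI runner.
import subprocess
import sys

result = subprocess.run(["flake8", "src/"], capture_output=True, text=True)
print(result.stdout)          # the findings, if any
sys.exit(result.returncode)   # non-zero when flake8 reported issues -> stage fails
```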
WORK PRODUCTS SUBJECTED TO STATIC ANALYSIS (any work product can be, but these are the most common)
- All kinds of SPECIFICATIONS (business, functional and non-functional requirements, etc.)
- EPICS, USER STORIES, ACCEPTANCE CRITERIA, and other types of documentation used in agile projects
- ARCHITECTURE DESIGN
- SOURCE CODE (other than code written by third parties)
- ALL KIND OF TESTWARE (TEST PLANS, TEST PROCEDURES, TEST CASES, AUTOMATED TEST SCRIPTS, TEST DATA, RISK ANALYSIS DOCUMENTS, TEST CHARTERS)
- USER MANUALS, including built-in online help, system operator manuals, installation instructions, release notes
- WEBSITES (in terms of their content, structure, usability)
- PROJECT DOCUMENTS (contracts, project plans, schedules, budgets)
WORK PRODUCTS REVIEWED BY STATIC ANALYSIS THAT HAVE A FORMAL STRUCTURE WE TEST AGAINST
• SOURCE CODE (against standards and grammar of certain language)
• MODELS (e.g. UML diagrams)
• TEXT DOCUMENTS
VALUE OF STATIC TESTING
• EFFECTIVE AND EFFICIENT, but not cheap
• helps catch DEFECTS EARLY
- ESPECIALLY WITH DESIGN DEFECTS
• building confidence in the product
• shared understanding with stakeholders
• IDENTIFYING DEFECTS THAT ARE DIFFICULT TO DETECT IN SUBSEQUENT DYNAMIC TESTING
• identifying defects impossible to detect in dynamic testing, such as:
- INFEASIBLE CODE (unreachable; see the sketch after this list)
- UNUSED CODE
- INCORRECT USE OR LACK OF USE OF DESIGN PATTERNS IN CODE
- DEFECTS IN NON-EXECUTABLE PRODUCTS, like documentation
- detecting ambiguities, contradictions, omissions, oversights, redundant info or inconsistencies in documentation (requirement specification or architecture design) -> preventing defects this way
- increased efficiency of programming by improving the design and code maintainability through uniform standards
- reducing the cost and time of software development
- reducing the cost of quality throughout the software development lifecycle by reducing the costs of the maintenance phase
- improving communication among team members by conducting reviews
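A minimal sketch (hypothetical function names) of the first two defect classes in the list above: these lines never execute, so no dynamic test can make them fail, while linters such as pylint or vulture flag them statically.

```python
def apply_discount(price: float) -> float:
    if price < 0:
        raise ValueError("negative price")
        price = 0.0    # INFEASIBLE CODE: can never run after the raise
    markup = 1.2       # UNUSED variable: assigned but never read
    return price * 0.9

def legacy_rounding(price: float) -> float:
    # UNUSED CODE: this function is never called anywhere in the program
    return round(price, 2)

print(apply_discount(100.0))  # 90.0 - dynamic tests pass despite the defects
```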
COST OF QUALITY
TOTAL COST INCURRED FOR QUALITY ACTIVITIES, THAT IS COST OF:
- PREVENTATIVE ACTIVITIES (e.g. cost of training)
- DETECTION (e.g. cost of testing)
- INTERNAL FAILURES (e.g. cost of fixing defects found before release)
- EXTERNAL FAILURES (e.g. cost of fixing field defects found by users)
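A worked sketch with purely hypothetical figures: the cost of quality is the sum of the four categories, and spending more on prevention and detection (e.g. static testing) is what shrinks the two failure terms.

```python
# Hypothetical yearly figures for the four cost-of-quality categories.
prevention = 10_000          # e.g. training, coding standards
detection = 25_000           # e.g. reviews and testing
internal_failures = 15_000   # fixing defects caught before release
external_failures = 50_000   # fixing field defects reported by users

cost_of_quality = prevention + detection + internal_failures + external_failures
print(cost_of_quality)  # 100000
```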
DIFFERENCE BETWEEN STATIC AND DYNAMIC TESTING
• same goal (evaluate product quality and identify defects) but different types of defects
• static testing - DEFECTS FOUND DIRECTLY IN THE WORK PRODUCT
we do not find failures because the software is not executed
• dynamic testing- first sign of malfunction is FAILURE
• static - INTERNAL QUALITY AND CONSISTENCY OF WORK PRODUCTS
• dynamic - EXTERNAL, VISIBLE BEHAVIOUR
• static -> applied to non-executable work products
Dynamic - performed against a running work product; can measure performance, e.g. response time
TYPICAL DEFECTS THAT ARE EASIER AND CHEAPER TO DETECT AND FIX WITH STATIC TESTING
- DEFECTS IN REQUIREMENTS (inconsistencies, ambiguities, omissions, contradictions, inaccuracies, repetitions, redundant elements)
- DESIGN DEFECTS (e.g. inefficient algorithms or database structures, high coupling, low cohesion, poor code modularization)
- SPECIFIC TYPES OF CODE DEFECTS (e.g. variables with undefined values, variables declared but never used, inaccessible code, duplicated code, inefficiently implemented algorithms with too high time or memory complexity)
- DEVIATIONS FROM STANDARDS (e.g. lack of compliance with code development standards)
- INCORRECT INTERFACE SPECIFICATIONS (e.g. use of different units of measure in the calling and called systems, incorrect type or incorrect order of parameters passed to API function call)
- SECURITY VULNERABILITIES (e.g. susceptibility to buffer overflow attacks, SQL injection, XSS (cross-site scripting), DDoS attacks) - see the sketch after this list
- GAPS OR INACCURACIES IN TRACEABILITY OR COVERAGE (e.g. no tests that match acceptance criteria for a given user story)
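A minimal sketch of the SQL injection item above, with a hypothetical users table: security-oriented static analyzers such as bandit flag queries built by string interpolation; the parameterized variant is the fix.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # Flagged statically: 'name' is spliced into the SQL text, so input
    # like "x' OR '1'='1" rewrites the query's meaning.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # Parameter binding keeps 'name' as data, never as SQL syntax.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")
print(find_user_safe(conn, "x' OR '1'='1"))    # [] - input treated as data
print(find_user_unsafe(conn, "x' OR '1'='1"))  # returns every row
```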
REVIEWS
Form of providing early feedback to the team
• can be done early in the SDLC
BENEFITS OF EARLY AND FREQUENT STAKEHOLDER FEEDBACK
• info about potential quality issues
• staying aligned with the stakeholders' vision, avoiding costly rework
• delivering what’s of most value to stakeholders
TYPES OF REVIEWS - DEPENDING ON FORMALITY
• GENERIC REVIEW - structured but flexible framework
Planning —> review initiation —> individual review —> communication and analysis —> fixing and reporting
5 GENERIC ACTIVITIES IN THE WORK PRODUCT REVIEW PROCESS
1. PLANNING
2. REVIEW INITIATION
3. INDIVIDUAL REVIEW
4. COMMUNICATION AND ANALYSIS
5. FIXING AND REPORTING
STATUSES FOR ANOMALIES - MOST COMMON CATEGORIZATION
- PROBLEM REJECTED
- PROBLEM RECORDED WITHOUT TAKING ANY ACTION
- PROBLEM TO BE SOLVED BY THE WORK PRODUCT AUTHOR
- PROBLEM UPDATED AS A RESULT OF FURTHER ANALYSIS
- PROBLEM ASSIGNED TO AN EXTERNAL STAKEHOLDER
PLANNING - REVIEW PROCESS #1
- Defining the scope of work
- setting the boundaries of the review: WHO, WHAT, WHERE, WHEN, WHY
- creating checklists
- role assignment
In FORMAL REVIEWS - like INSPECTION - formal entry and exit criteria are defined
REVIEW INITIATION - REVIEW PROCESS #2
- sending the work product under review to the participants, together with the necessary materials:
- checklists, statements of procedures, defect report templates
- explaining the process, the participants' roles, and the schedule (TIME/PLACE/ROLE)
- arranging review training
INDIVIDUAL REVIEW - REVIEW PROCESS #3
• CENTRAL PHASE
• review activities using chosen techniques
• taking notes - any comments, questions, recommendations, concerns, relevant observations
- documenting everything in a problem log - often supported by a defect management or review support tool
COMMUNICATION AND ANALYSIS - REVIEW PROCESS #4
• meetings, calls
• analysis of ANOMALIES (found problems) reported by reviewers
• categorizing ANOMALIES AS DEFECTS OR FALSE POSITIVES
• if a defect is confirmed - DELEGATING PEOPLE TO FIX THE DEFECT and DEFINING ATTRIBUTES SUCH AS status, priority, severity
• evaluating and documenting the level of quality characteristics that were defined in the planning phase as those being reviewed
• conclusions of the review are evaluated against exit criteria to decide what to do next
FIXING AND REPORTING - REVIEW PROCESS #5
• final stage
• defect reports are created for detected defects that require changes
• the author of the work product under review carries out defect removal
• changes are confirmed
• Review report created
MODERN CODE REVIEW (MCR)
• quality control technique - improves software quality and customer satisfaction by identifying defects, improving code, and speeding up the development process
• asynchronous and lightweight review process with tools like Gerrit
ROLES AND RESPONSIBILITIES IN REVIEWS
- MANAGER
- AUTHOR
- MODERATOR (facilitator)
- SCRIBE (recorder)
- REVIEWER
- REVIEW LEADER
MANAGER
•responsible for scheduling the review
•decides to conduct the review
• designates staff and sets a budget and timeframe
• monitors the cost-effectiveness of the review on an ongoing basis
• executes control decisions in case of unsatisfactory results
AUTHOR
• creates the work product under review
• removes defects in the work product under review (if necessary)
• prepares the materials for review, though they may be distributed by the review leader
• may provide technical explanations
• can evaluate the reviewers' work in terms of the substantive value of their comments
MODERATOR (facilitator)
• ensures the smooth running of review meetings (if they take place)
• acts as a mediator if necessary
• ensures that a safe atmosphere of mutual trust and respect is created