CHAPTER 3 STATIC TESTING Flashcards
STATIC TESTING
• set of testing methods and techniques in which the component or system under test IS NOT RUN/EXECUTED
• can be applied to non-executable work products other than software, like
- DESIGN
- DOCUMENTATION
- SPECIFICATION, etc
GOALS OF STATIC TESTING
• QUALITY IMPROVEMENT
• DEFECT DETECTION
• EVALUATION OF CHARACTERISTICS LIKE:
- readability, completeness, correctness, testability, and consistency of the work products under review
BOTH VERIFICATION AND VALIDATION
• in agile software development, during requirements development the whole team makes sure that the requirements and related work products meet THE DEFINITION OF READY
• ENSURING THAT REQUIREMENTS ARE
COMPLETE
UNDERSTANDABLE
TESTABLE
USER STORIES CONTAIN TESTABLE ACCEPTANCE CRITERIA
STATIC ANALYSIS - TECHNIQUES
Evaluating the work product under test (usually code, requirements or design documents) using tools
Examples of static analysis techniques:
1. CODE MEASUREMENTS (e.g. measuring its size or cyclomatic complexity; see the sketch below)
2. CONTROL FLOW ANALYSIS
3. DATA FLOW ANALYSIS
4. CHECKING THE COMPATIBILITY OF VARIABLE TYPES, VERIFYING THE CORRECT APPLICATION OF CODING STANDARDS
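A minimal sketch of technique 1 in Python, assuming nothing beyond the standard library: it approximates McCabe's cyclomatic complexity by counting decision points in the parsed syntax tree. Real tools (e.g. radon, lizard, SonarQube) handle many more cases; this only illustrates the idea.

```python
import ast

# Simplified cyclomatic complexity: 1 + number of decision points.
# Real static analysis tools cover far more node types; this is a sketch.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Count 1 + decision points over the whole parsed source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(tree))

sample = '''
def classify(x):
    if x < 0:
        return "negative"
    for i in range(x):
        if i % 2 == 0 and i > 2:
            print(i)
    return "done"
'''
# if + for + if + and -> 4 decision points, complexity 5
print(cyclomatic_complexity(sample))  # 5
```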
Static analysis - included in CI frameworks as one step of the automated deployment pipeline
• assessing MAINTAINABILITY, PERFORMANCE AND VULNERABILITY OF CODE TO SECURITY ATTACKS
WORK PRODUCTS SUBJECTED TO STATIC ANALYSIS (any can, but most common)
- All kinds of SPECIFICATIONS (business, functional requirements, non-functional requirements, etc.)
- EPICS, USER STORIES, ACCEPTANCE CRITERIA, and other types of documentation used in agile projects
- ARCHITECTURE DESIGN
- SOURCE CODE (usually excluding code written by third parties)
- ALL KINDS OF TESTWARE (TEST PLANS, TEST PROCEDURES, TEST CASES, AUTOMATED TEST SCRIPTS, TEST DATA, RISK ANALYSIS DOCUMENTS, TEST CHARTERS)
- USER MANUALS, including built-in online help, system operator manuals, installation instructions, release notes
- WEBSITES (in terms of their content, structure, usability)
- PROJECT DOCUMENTS (contracts, project plans, schedules, budgets)
WORK PRODUCTS REVIEWED BY STATIC ANALYSIS HAVE A FORMAL STRUCTURE THAT WE CHECK THEM AGAINST
• SOURCE CODE (against coding standards and the grammar of the programming language)
• MODELS (e.g. UML diagrams)
• TEXT DOCUMENTS
VALUE OF STATIC TESTING
• EFFECTIVE AND EFFICIENT but not cheap
• helps catching DEFECTS EARLY
- ESPECIALLY WITH DESIGN DEFECTS
• building confidence in the product
• shared understanding with stakeholders
• IDENTIFYING DEFECTS THAT ARE DIFFICULT TO DETECT IN SUBSEQUENT DYNAMIC TESTING
• identifying defects impossible to detect in dynamic testing (see the sketch after this list)
- INFEASIBLE CODE (unreachable)
- UNUSED CODE
- INCORRECT USE OR LACK OF USE OF DESIGN PATTERNS IN CODE
- DEFECTS IN NON-EXECUTABLE PRODUCTS, like documentation
- detecting ambiguities, contradictions, omissions, oversights, redundant information or inconsistencies in documentation (requirements specification or architecture design) -> preventing defects this way
- increased efficiency of programming by:
• improving the design and code maintainability by imposing uniform standards
- reducing the cost and time of software development
- reducing the cost of quality throughout the software development cycle by reducing the costs of the maintenance phase
- improving communication among team members by conducting reviews
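To make the statically-only-detectable defects above concrete, here is a small Python fragment (invented for illustration) whose defects never cause a failure at run time, so dynamic testing cannot expose them, while static analysis tools such as pylint or flake8 report them immediately:

```python
def compute_discount(price: float) -> float:
    rate = 0.1
    unused_threshold = 100  # unused variable: assigned but never read
    return price * (1 - rate)
    print("applying discount")  # unreachable (infeasible) code after return

# The function still behaves correctly for every input, so no dynamic
# test fails - only static analysis reveals the two defects above.
print(compute_discount(200.0))  # 180.0
```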
COST OF QUALITY
TOTAL COST INCURRED FOR QUALITY ACTIVITIES (see the worked example after this list), THAT IS THE COST OF:
- PREVENTION ACTIVITIES (e.g. cost of training)
- DETECTION (e.g. cost of testing)
- INTERNAL FAILURES (e.g. cost of fixing defects found before release)
- EXTERNAL FAILURES (e.g. cost of fixing field defects found by users)
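A purely illustrative sum (all figures hypothetical) showing that the cost of quality is simply the total across the four categories:

```python
# Hypothetical figures for illustration only.
costs = {
    "prevention": 10_000,         # e.g. training
    "detection": 25_000,          # e.g. testing
    "internal_failures": 15_000,  # fixing defects found before release
    "external_failures": 50_000,  # fixing field defects found by users
}
print("cost of quality:", sum(costs.values()))  # 100000
```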
DIFFERENCE BETWEEN STATIC AND DYNAMIC TESTING
• same goal (evaluate product quality and identify defects) but different types of defects
• static testing - DEFECTS FOUND DIRECTLY IN THE WORK PRODUCT
we do not find failures, because the software is not executed
• dynamic testing- first sign of malfunction is FAILURE
• static - INTERNAL QUALITY AND CONSISTENCY OF WORK PRODUCTS
• dynamic - EXTERNAL, VISIBLE BEHAVIOUR
• static -> applied to non-executable work products
Dynamic - performed against a running work product; can measure performance, e.g. response time
TYPICAL DEFECTS THAT ARE EASIER AND CHEAPER TO DETECT AND FIX WITH STATIC TESTING
- DEFECTS IN REQUIREMENTS (inconsistencies, ambiguities, omissions, contradictions, inaccuracies, repetitions, redundant elements)
- DESIGN DEFECTS (e.g. inefficient algorithms or database structures, high coupling, low cohesion, poor code modularization)
- SPECIFIC TYPES OF CODE DEFECTS (e.g. variables with undefined values, variables declared but never used, inaccessible code, duplicated code, inefficiently implemented algorithms with too high time or memory complexity)
- DEVIATIONS FROM STANDARDS (e.g. lack of compliance with code development standards)
- INCORRECT INTERFACE SPECIFICATIONS (e.g. use of different units of measure in the calling and called systems, incorrect type or incorrect order of parameters passed to an API function call)
- SECURITY VULNERABILITIES (e.g. susceptibility to buffer overflow attacks, SQL injection, XSS (cross-site scripting), DDoS attacks) - see the sketch after this list
- GAPS OR INACCURACIES IN TRACEABILITY OR COVERAGE (e.g. no tests that match the acceptance criteria for a given user story)
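Two of the defect types above, sketched in Python with invented function names. The swapped API parameters run without crashing, so dynamic tests can miss them, but a static type checker such as mypy reports the incompatible argument types; the string-concatenated SQL is the kind of construct a security analyzer such as bandit flags as an injection risk.

```python
def schedule_job(job_name: str, priority: int) -> None:
    print(f"scheduling {job_name} at priority {priority}")

# Incorrect order of parameters: executes without an error, but mypy
# statically reports both arguments as having incompatible types.
schedule_job(5, "nightly-backup")

# SQL built by string concatenation: vulnerable to SQL injection and
# flagged by static security analyzers; the parameterized form is safe.
def find_user_unsafe(cursor, name: str):
    cursor.execute("SELECT * FROM users WHERE name = '" + name + "'")

def find_user_safe(cursor, name: str):
    cursor.execute("SELECT * FROM users WHERE name = ?", (name,))
```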
REVIEWS
Form of providing early feedback to the team
• can be done early in the SDLC
BENEFITS OF EARLY AND FREQUENT STAKEHOLDER FEEDBACK
• info about potential quality issues
• staying aligned with their vision, avoiding costly rework
• delivering what’s of most value to stakeholders
TYPES OF REVIEWS - DEPENDING ON FORMALITY
• GENERIC REVIEW - structured but flexible framework
Planning —> review initiation —> individual review —> communication and analysis —> fixing and reporting
5 GENERIC ACTIVITIES IN THE WORK PRODUCT REVIEW PROCESS
1. PLANNING
2. REVIEW INITIATION
3. INDIVIDUAL REVIEW
4. COMMUNICATION AND ANALYSIS
5. FIXING AND REPORTING
STATUSES FOR ANOMALIES - MOST COMMON CATEGORIZATION
- PROBLEM REJECTED
- PROBLEM RECORDED WITHOUT TAKING ANY ACTION
- PROBLEM TO BE SOLVED BY THE WORK PRODUCT AUTHOR
- PROBLEM UPDATED AS A RESULT OF FURTHER ANALYSIS
- PROBLEM ASSIGNED TO AN EXTERNAL STAKEHOLDER
PLANNING - #1 REVIEW PROCESS
- Defining the scope of work
- setting boundaries of the review:
WHO - WHAT - WHERE - WHEN - WHY - creating checklists
- role assignment
In FORMAL REVIEWS - like INSPECTION - formal entry and exit criteria are defined
REVIEW INITIATION - #2 REVIEW PROCESS
- sending the work product under review to the review participants with the necessary materials
- checklists - statements of procedures - defect report templates
- explaining the process, participants' roles, and schedules - TIME/PLACE/ROLE
- arranging review training
INDIVIDUAL REVIEW - #3 REVIEW PROCESS
• CENTRAL PHASE
• review activities using chosen techniques
• taking notes - any comments, questions, recommendations, concerns, relevant observations
- documenting everything in a problem log - often supported by a defect management or review support tool
COMMUNICATION AND ANALYSIS - #4 REVIEW PROCESS
• meetings, calls
• analysis of ANOMALIES (found problems) reported by reviewers
• categorizing ANOMALIES AS DEFECTS OR FALSE POSITIVES
• for each confirmed defect - DELEGATING PEOPLE TO FIX IT, DEFINING ATTRIBUTES SUCH AS status, priority, severity
• evaluating and documenting the level of quality characteristics that were defined in the planning phase as those being reviewed
• conclusions of the review are evaluated against exit criteria to decide what to do next
FIXING AND REPORTING - #5 REVIEW PROCESS
• final stage
• creates defect reports on detected defects that require changes
• the author of the work product carries out defect removal
• changes are confirmed
• Review report created
MODERN CODE REVIEW (MCR)
• quality control technique - improving software quality and customer satisfaction by identifying defects, improving code, and speeding up the development process
• asynchronous and lightweight review process with tools like Gerrit
ROLES AND RESPONSIBILITIES IN REVIEWS
- MANAGER
- AUTHOR
- MODERATOR (facilitator)
- SCRIBE (recorder)
- REVIEWER
- REVIEW LEADER
MANAGER
• responsible for scheduling the review
• decides to conduct the review
• designates staff and sets a budget and timeframe
• monitors the cost-effectiveness of the review on an ongoing basis
• executes control decisions in case of unsatisfactory results
AUTHOR
• creates the work product under review
• removes defects in the work product under review (if necessary)
• prepares the material for review, though they might be distributed by the leader
• may provide technical explanations
• can evaluate the work of the reviewers in terms of the merit of their comments
MODERATOR (facilitator)
• ensures the smooth running of review meetings (if they take place)
• acts as a mediator if necessary
• ensures that a safe atmosphere of mutual trust and respect is created
SCRIBE (recorder)
• gathers potential anomalies detected and reported as part of individual review
• records new potential defects found during the review meeting, as well as decisions made at the meeting
• should be invisible ("transparent") to the participants
REVIEWER
• conducts a review, identifying potential defects in the work product under review
• can be a subject expert - person working at the project - stakeholder - person with specific technical or business experience
• can represent different viewpoints (of a tester, developer, user, operator, business analyst, usability specialist)
REVIEW LEADER
• bears overall responsibility for the review process
• decides who is to participate in the review, determines the place and date of the review
• responsible for organising meetings
4 TYPES OF REVIEW
• INSPECTION
most formal type
- documented procedures
- participation of a pre-established team
- mandatory documentation of results
• TECHNICAL REVIEW
less formal
• WALKTHROUGH
less formal
• INFORMAL REVIEW
- no need for a defined process, info obtained doesn’t need to be formally documented
INFORMAL REVIEW
• BUDDY CHECK - PAIR REVIEW
MOST COMMON IN AGILE PROJECTS
E.G. A CONVERSATION IN THE KITCHEN
MAIN GOALS
detect potential defects
POTENTIAL ADDITIONAL GOALS
• generate new ideas, quickly solve simple problems
FORMAL PROCESS
• No formal process
• can be run by the author, the author's peer, or a group of people
DOCUMENTATION
documentation is optional
WALKTHROUGH
SEQUENTIAL REVIEW OF WORK PRODUCT
Usually used when the team can't identify the cause of a software failure
DRY RUNS
E.g. manual simulation of the code execution - analyzing the code line by line (see the sketch below)
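A dry run means executing the code mentally, line by line. The toy function below (made up for illustration) hides an off-by-one defect that a walkthrough with a small input exposes on paper:

```python
def sum_first_n(values, n):
    """Intent: sum the first n elements of values."""
    total = 0
    for i in range(1, n):   # dry run with values=[3, 5, 7], n=3:
        total += values[i]  # i=1 -> total=5; i=2 -> total=12
    return total            # 12 instead of 15: range(1, n) skips
                            # index 0 - the off-by-one is found on paper

print(sum_first_n([3, 5, 7], 3))  # 12, expected 15
```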
Often used for CODE REVIEW
• MAIN GOALS:
detect potential defects, improve quality, consider alternatives, evaluate compliance with standards
POTENTIAL ADDITIONAL GOALS
exchange information, train participants, teach consensus
FORMAL PROCESS
optional individual preparation before the meeting
ROLES
meeting usually chaired by the author, scribe present
DOCUMENTATION OF RESULTS
optional - defect logs and review reports may be created
LEVEL OF FORMALISM
from informal to formal; can take the form of scenarios, dry runs, or simulations
TECHNICAL REVIEW
Performed by a technically qualified person
"Expert panel" making a design or technical decision that has a major impact on further development
MAIN GOAL
obtain consensus, detect potential defects
POTENTIAL ADDITIONAL GOALS
assess the quality of a work product, increase confidence in it, generate new ideas, motivate authors to improve future work products, evaluate alternatives
FORMAL PROCESS
mandatory individual preparation before the review meeting
ROLES
moderator conducts the meeting, not the author, scribe present
DOCUMENTATION OF RESULTS
defect logs and review reports
LEVEL OF FORMALISM
from formal to informal, review meeting optional
INSPECTION
MAIN GOALS
detect potential defects, assess the quality of the work product, increase confidence in it, prevent similar defects from occurring in the future
POTENTIAL ADDITIONAL GOALS
motivate authors to improve future work products and the software development process, create conditions for it, reach consensus
FORMAL PROCESS
formal process based on rules and checklists, mandatory individual preparation
ROLES
strictly defined; the moderator leads the meetings, not the author; scribe present; reader role optional
DOCUMENTATION OF RESULTS
defect logs and review reports
LEVEL OF FORMALISM
formal, entry and exit criteria, checklists. Measures are collected that are used to improve the process
INSPECTION MEASUREMENTS
DISTRIBUTION OF DEFECTS ACCORDING TO A 1992 STUDY
- 40.51% documentation defects
- 23.20% non-compliance with standards
- 7.22% defects in logic
- 6.57% functional defects
- 4.79% syntax defects
- 4.62% data defects
- 4.09% maintainability defects
METRICS THAT CAN BE COLLECTED AS PART OF THE INSPECTION PROCESS (a worked example follows this list)
• TIME TO PREPARE PER DEFECT
• TIME TO PREPARE PER MAJOR DEFECT
• NUMBER OF CRITICAL DEFECTS PER 1000 LINES OF CODE
• NUMBER OF NON-CRITICAL DEFECTS PER 1000 LINES OF CODE
• NUMBER OF LINES OF CODE CHECKED IN 1H
• NUMBER OF DEFECTS PER SESSION
• EFFORT TO PREPARE VERSUS EFFORT TO INSPECT
• NUMBER OF LINES CHECKED IN ONE SESSION
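A short worked example (hypothetical numbers) showing how some of these metrics fall out of raw inspection data:

```python
# Hypothetical data from one inspection session, for illustration only.
lines_inspected = 1500
session_hours = 2.5
critical_defects = 6
non_critical_defects = 18

kloc = lines_inspected / 1000
print("critical defects per KLOC:", critical_defects / kloc)          # 4.0
print("non-critical defects per KLOC:", non_critical_defects / kloc)  # 12.0
print("lines checked per hour:", lines_inspected / session_hours)     # 600.0
```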
QUALITIES THAT INSPECTION CHECKS FOR IN A WORK PRODUCT
• COMPLETENESS
• CORRECTNESS
• COHERENCE
• STYLE
• CONSTRUCTION RULES
TYPICAL INSPECTION PROCESS TEMPLATE
PLANNING —> REVIEW PREPARATIONS —> CHECKING ENTRY CRITERIA —> INSPECTION —> CHECKING EXIT CRITERIA —> REPORT PREPARATION —> FINAL ACTIONS
WALKTHROUGH VS INSPECTION
WALKTHROUGH
Overall objective: do the right job
Specific objective: education, consensus, understanding
Trigger: author’s request
Measurement: walkthrough instances
INSPECTION
Overall objective: do the job right
Specific objective: defect detection, compliance checking
Trigger: phase exit criteria
Measurement: product and process measurements
SUCCESS FACTORS FOR REVIEWS
• selection of a proper review type and review technique for the given situation, suited to the team
• having clear objectives defined during planning phase that can serve as measurable exit criteria
• various review techniques used effectively to identify defects present in a work product
• checklists used are up to date and address the main risks
• large documents are written and reviewed in batches - quick and frequent feedback
• sufficient time to prepare for review - review scheduled in advance
• authors are given feedback from reviewers
• management support the review process
• reviews are considered a natural process supporting progress and learning
• involving people whose participation is conducive to achieving goals (different skills and viewpoints)
• involving testers and their viewpoint
• allocation of sufficient time by participants
• conducted in small pieces
• detected defects are acknowledged, confirmed, and dealt with objectively
• no time-wasting activities
• mutual trust and respect - judging product not a person
• no gestures and behaviours suggesting boredom, hostility, irritation
• due training
• atmosphere supporting expanding knowledge and improvement
5 REVIEW TECHNIQUES (not in syllabus)
- AD HOC REVIEW
unstructured - each reviewer finds defects with little guidance - sequential reading of the work product
- CHECKLIST-BASED
different reviewers use different checklists to increase coverage - checklists max 10 items
- SCENARIOS AND DRY RUNS
based on use cases - dry runs: checking the product's functionality and whether it is described correctly - taking risk analysis into account
- ROLE-BASED REVIEW
review from a particular perspective, e.g. that of a certain stakeholder - taking into account different types of end users or roles in the organization (user, administrator, performance tester, etc.) - roles modeled by personas
- PERSPECTIVE-BASED READING
most effective - taking the perspective of a user, business analyst, designer, tester, system administrator, technical support engineer, regulator, etc. - considering multiple views - different uses of a work product - balancing viewpoints