Q&A3 Flashcards

1
Q

Severity of the impact on the system and/or the product stakeholders is usually determined by what?

A

Technical behavior of the system

2
Q

Priority to fix the problem is usually determined by what?

A

Business impact of the failure

3
Q

What technique identifies the subsystem or component in which the defect lies?

A

Defect cluster analysis
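
As a rough illustration (not from the source card), defect cluster analysis can be as simple as tallying defects per component from the defect tracker; the records below are invented:

    from collections import Counter

    # Invented defect records; in practice these come from the defect tracking tool.
    defects = [
        {"id": 1, "component": "billing"},
        {"id": 2, "component": "billing"},
        {"id": 3, "component": "ui"},
        {"id": 4, "component": "billing"},
    ]

    # Count defects per subsystem/component to reveal clusters.
    clusters = Counter(d["component"] for d in defects)
    for component, count in clusters.most_common():
        print(f"{component}: {count} defects")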

4
Q

Distributed model

A

Alignment of methodologies

5
Q

Purpose of process improvements

A

?

6
Q

Structure of unorthodox (i.e., not following the norms; not normal, not commonly practiced)

A

?

7
Q

What demotivates a tester?

A
  • working late on a successful project
  • testing cut short, resulting in production defects

8
Q

Documentation standards are determined by whom?

A

Test Managers should be aware of standards and policy, and whether or not they are useful for them to use.

Test Managers should determine the usefulness of the different standards for the testing that occurs.

9
Q

Requirements-based model?

A

?

10
Q

Which tool lifecycle stage covers porting of a performance tool to new software?

A

Acquisition
Support and Maintenance
Evolution
Retirement

11
Q

Model failure modes of product?

A

?

13
Q

Test Policy?

A

Tbd

  • methodical
  • standard-compliant
  • model
14
Q

Software characteristic of an open source application tool

A
  • concurrency threshold
  • time limits?
  • memory leaks
  • coding standard issue
15
Q

Who would have the best skills to test the new software?

A

Tbd

  • someone who worked in customer support and maintenance of the previously released software?
  • a former test manager?
16
Q

Who decides which test tool to use?

A

Tbd

  • test manager
  • project manager
  • stakeholder
  • developer
17
Q

Current test progress: 100% of smoke tests automated, 50% of regression tests, 25% of functional tests. The goal is to automate all of them by the end of the release. What should be done?

A

?

18
Q

Requirement: the wiper will increase speed with slow/medium/fast intermittent and slow/medium/fast constant settings
Test Condition: when moisture increases, the wiper speed moves from slow to fast
Test cases:
1.
2.
3.

  • the requirement is met by the test cases 100%
  • the test condition is met by the test cases 100%
  • per the test cases, the test condition is not met
  • per the test cases, the requirements are not met
A

Tbd

19
Q

Technique to use to reduce the number of defects

A

Tbd

  • cost of quality analysis
  • defect triage
20
Q

For distributed testing, the division of the test work across multiple locations MUST BE?

A

explicit and intelligently decided

21
Q

A REVIEW can eliminate what?

A

an issue at the requirements level before the problem is implemented into the code

22
Q

Which technique can help to enforce coding standards and check for problems that might be too laborious for the team to find by examination of the work product?

A

STATIC ANALYSIS
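
As one hedged sketch of static analysis (the rule and limit below are assumptions, not any specific tool's checks), Python's standard ast module can flag functions that break a length limit without ever running the code:

    import ast

    MAX_FUNCTION_LINES = 50  # assumed coding-standard limit, for illustration only

    def check_function_length(source: str) -> list[str]:
        """Report functions longer than the allowed number of lines."""
        findings = []
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef):
                length = node.end_lineno - node.lineno + 1
                if length > MAX_FUNCTION_LINES:
                    findings.append(f"{node.name}: {length} lines (limit {MAX_FUNCTION_LINES})")
        return findings

    if __name__ == "__main__":
        sample = "def tiny():\n    return 42\n"
        print(check_function_length(sample) or "no violations")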

23
Q

Test Metrics that show what percentage has been tested: passed/failed/executed?

A

Project Metrics

24
Q

Test Metrics: defects detected by testing?

A

Process Metrics

25
Q

Test Metrics: capability of an individual/group, implementation of test cases according to schedule

A

People Metrics

26
Q

Test Metrics: defect density

A

Product Metrics
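
A small sketch, with made-up numbers, of how the metrics on these cards might be computed (execution and pass percentages as project metrics, defect density as a product metric):

    # Assumed example figures for illustration only.
    tests_planned = 200
    tests_executed = 150
    tests_passed = 120
    defects_found = 30
    size_kloc = 12.5  # size of the product in thousands of lines of code

    executed_pct = 100 * tests_executed / tests_planned  # project metric
    passed_pct = 100 * tests_passed / tests_executed     # project metric
    defect_density = defects_found / size_kloc           # product metric (defects per KLOC)

    print(f"Executed: {executed_pct:.1f}%  Passed: {passed_pct:.1f}%")
    print(f"Defect density: {defect_density:.2f} defects/KLOC")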

27
Q

Requirements-based testing may utilize what techniques?

A
  1. Ambiguity reviews
  2. Test condition analysis
  3. Cause-effect graphing
28
Q

It can reduce an extremely large testing problem to a manageable number of test cases

Provides 100% functional coverage of the test basis

Identifies gaps in the test basis during test case design, which can help identify defects early in the SDLC when test design is started against the draft requirements

A

Cause-effect graphing
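
To make the idea of collapsing a huge combination space concrete, here is a rough sketch; the causes and the decision rule are invented, and a real cause-effect graph would prune this table further:

    from itertools import product

    causes = ["valid_account", "sufficient_funds", "card_not_expired"]

    def effect(valid_account, sufficient_funds, card_not_expired):
        # Effect: the withdrawal is approved only when all causes hold.
        return valid_account and sufficient_funds and card_not_expired

    # Enumerate every cause combination and record the resulting effect;
    # cause-effect graphing then reduces this table to a manageable set of test cases.
    for combo in product([True, False], repeat=len(causes)):
        print(dict(zip(causes, combo)), "->", "approved" if effect(*combo) else "rejected")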

29
Q

Model-based approach

A

Operational profiles

Mix of use cases, users (personas), inputs, and outputs
Depicts the real-world use of the system

Functionality
Usability 
Interoperability 
Reliability 
Security
Performance
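
A minimal sketch of how an operational profile might drive test selection; the use cases and weights below are assumptions for illustration:

    import random

    # Assumed operational profile: relative frequency of each use case in real-world use.
    profile = {
        "search_catalogue": 0.55,
        "place_order": 0.25,
        "track_delivery": 0.15,
        "update_account": 0.05,
    }

    random.seed(7)  # fixed seed so the illustration is reproducible
    # Draw the next 10 test scenarios in proportion to real-world usage.
    scenarios = random.choices(list(profile), weights=list(profile.values()), k=10)
    print(scenarios)
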
30
Q

Methodical approach

A
  • Checklists:
    — what to test
    — how much
    — in what order
31
Q
  • Bug clusters are the focus of testing
  • Tends to miss major areas that are important but are not suffering from a large number of bugs
  • Dynamic testing
A

Reactive approach

32
Q

Test team:

  • selects tests,
  • allocates test effort, and
  • initially prioritizes tests during the requirements phase, with periodic adjustments
A

Sequential V-model

33
Q

Breakdown in the test process occurs during which SDLC process?

A

Design and Implementation

34
Q

Allocation and prioritization are determined when?

A

Test Planning

35
Q

As tests are run and defects are found, testers can examine the residual level of risk

A

Risk-based testing
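
A rough sketch of tracking residual risk as tests run; the risk items, scores, and statuses are invented for illustration:

    # Invented risk items: likelihood and impact on a 1-5 scale, plus test status.
    risks = [
        {"item": "payment timeout", "likelihood": 4, "impact": 5, "tested": True,  "defects_open": 0},
        {"item": "report rounding", "likelihood": 2, "impact": 3, "tested": True,  "defects_open": 1},
        {"item": "login lockout",   "likelihood": 3, "impact": 4, "tested": False, "defects_open": 0},
    ]

    def score(risk):
        return risk["likelihood"] * risk["impact"]

    # Residual risk: items not yet tested, or tested but still carrying open defects.
    residual = [r for r in risks if not r["tested"] or r["defects_open"] > 0]
    print("Residual risk score:", sum(score(r) for r in residual),
          "of", sum(score(r) for r in risks))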

36
Q

Test Manager can measure the degree to which testing is complete during which process?

A

Results Reporting and Exit Criteria Evaluation

37
Q

The Test Manager should evaluate metrics and success criteria that are pertinent to the needs and expectations of the testing stakeholders, including the customers' and users' needs and expectations in terms of quality

A

Test closure

38
Q

Test Policy

A

Test objectives

39
Q

Test Strategy

A

Test methodology

40
Q

Master Test Plan or Project Test Plan

A

Implementation of test strategy

41
Q

Level Test Plan

A

Particular activities to be carried out within each test level

42
Q

Analytical strategies

A

Risk-based testing

Test team analyzes the test basis to identify test conditions to cover

43
Q

Model based strategies

A

Operational profiling

44
Q

Methodical strategies

A

Uses a predetermined set of test conditions

ISO 25000

45
Q

Process or standard-compliant strategies

A

Scrum Agile management technique

  • each iteration, testers analyze the user stories that describe features,
  • estimate the test effort for each feature as part of the planning process,
  • identify test conditions for each user story,
  • execute tests covering those conditions, and
  • report the status of each user story (untested, failing, passing) during test execution
46
Q

Reactive strategies

A

Defect-based attacks

  • the team waits to design and implement tests until the software is received
  • EXPLORATORY TESTING
47
Q

Consultative strategies

A

User-directed testing

  • inputs from stakeholders to determine test conditions to cover
  • pairwise testing (high priority option)
  • equivalence partitioning (lower priority options)
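
To make the two techniques on this card concrete, here is a hedged sketch with invented parameters: one representative value per equivalence partition for the lower-priority options, and an enumeration of the value pairs that pairwise testing of the high-priority options must cover (a pairwise tool would then pick a small test set covering all of them):

    from itertools import combinations, product

    # Equivalence partitioning: one representative value per partition of an assumed "age" input.
    age_partitions = {"invalid (<0)": -1, "minor (0-17)": 10, "adult (18-64)": 40, "senior (65+)": 70}
    print("equivalence-partition tests:", list(age_partitions.values()))

    # Pairwise testing: every pair of values drawn from any two parameters must
    # appear together in at least one test case.
    params = {
        "browser": ["Chrome", "Firefox", "Safari"],
        "os": ["Windows", "macOS"],
        "locale": ["en", "de"],
    }
    pairs_to_cover = [
        ((p1, v1), (p2, v2))
        for p1, p2 in combinations(params, 2)
        for v1, v2 in product(params[p1], params[p2])
    ]
    print(len(pairs_to_cover), "value pairs to cover")
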
48
Q

Regression-averse strategies

A

GUI based test automation tool

Regression automation
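
A minimal sketch of an automated regression check using Python's built-in unittest; the function under test is invented, and a GUI-based tool would drive the application's interface instead of calling a function directly:

    import unittest

    def apply_discount(price: float, percent: float) -> float:
        """Invented function under test."""
        return round(price * (1 - percent / 100), 2)

    class RegressionSuite(unittest.TestCase):
        # Re-run on every build to catch regressions in previously working behaviour.
        def test_standard_discount(self):
            self.assertEqual(apply_discount(100.0, 15), 85.0)

        def test_zero_discount(self):
            self.assertEqual(apply_discount(100.0, 0), 100.0)

    if __name__ == "__main__":
        unittest.main()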

49
Q

Overall governance of the testing effort

A

Master Test Plan

50
Q

Less formal projects

A

Test plan

- with all informational elements