F5 Flashcards

1
Q

Verification

A

The software should conform to its requirements specification.
– Are we building the product right?

2
Q

Validation

A

The software should do what
the user really requires.
– Are we building the right product?

3
Q

REQUIREMENTS VALIDATION

A

– Ensuring that we have elicited and documented the right requirements
– ”Will we build the right system with these requirements?”

4
Q

Requirements validation methods

A

– Reviews (inspections)
– Tests
– Model-based simulation
– Derivation using mathematical models

5
Q

INSPECTIONS

A

– A systematic evaluation technique in which documents are examined manually, by people other than the author, in order to detect defects
• General goals of inspections:
– Detect defects
– Spread knowledge within the project
– Make decisions based on inspection data
– Learn lessons from inspection data

6
Q

THE INSPECTION PROCESS

A

planning - preparation - inspection meeting - correction - follow up

7
Q

DIFFERENT WAYS TO DETECT DEFECTS (READING TECHNIQUES)

A
• Ad hoc
– To the best of one's ability (no guidelines)
• Checklist
– A list of questions guides the reading
• Perspective-based
– Different perspectives are combined, e.g. users, designers, testers
– Start from use cases
• N-fold
– N-fold inspection is a variant of inspections in which several teams with different individuals are used to find defects.
8
Q

EXPERIENCES WITH REVIEWS

A
• average number of hours per found error
• number of errors found per review unit (e.g. page of design)
• efficiency: reviews are 2-10 times more efficient than tests
• reviews take time: 4-15% of a project's resources
• can we justify this cost?
• reviews improve quality
• reviews improve productivity
• example: an error in deployment costs $10,000, which can pay for many hours of review
• reviews can replace testing
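The cost argument above can be checked with simple arithmetic. A minimal sketch, using the card's illustrative $10,000 deployment-error cost plus an assumed hourly review rate (the rate is an assumption, not a figure from the source):

```python
# Back-of-the-envelope break-even for reviews.
# Both numbers are illustrative assumptions, not measurements.
error_cost = 10_000    # $ cost of one error that escapes to deployment
review_rate = 100      # $ per hour of review effort (assumed)

# Hours of review that preventing one deployment error pays for:
break_even_hours = error_cost / review_rate
print(break_even_hours)    # 100.0
```

Under these assumptions, catching a single deployment-level error in review pays for 100 hours of review effort.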
9
Q

REQUIREMENTS VALIDATION THROUGH DIFFERENT TESTS

A
– Manual ”simulation” (walk-through) based on scenarios/use cases/task descriptions
– Paper prototypes
– Executable prototypes
– Pilot tests
10
Q

Checking and validation

A
Check that all parts match & everything is included
Validate that stakeholders are happy
(customer, user, developer)
Where are the major risks?
Quality product = meeting the spec?
11
Q

Classic: a good requirements spec is:

A
Correct: each requirement reflects a need.
Complete: all necessary requirements included.
Unambiguous: all parties agree on meaning.
Consistent: all parts match, e.g. E/R model and event list.
Ranked for importance and stability: priority and expected changes per requirement.
Modifiable: easy to change while maintaining consistency.
Verifiable: possible to see whether a requirement is met.
Traceable: to goals/purposes, to design/code.
Additional:
Traceable from goals to requirements.
Understandable by customer and developer.
12
Q

Contents check

A

Does the spec contain:
• Customer, sponsor, background
• Business goals + evidence of tracing
• Data requirements
(database, I/O formats, comm. state, initialization)
• System boundaries & interfaces
• Domain-level reqs (events & tasks)
• Product-level reqs (events & features)
• Design-level reqs (prototype or comm. protocol)
• Specification of non-trivial functions
• Stress cases & special events & task failures
• Quality reqs (performance, usability, security . . .)
• Other deliverables (documentation, training . . .)
• Glossary (definition of domain terms . . .)

13
Q

Consistency checks

A

[Diagram: cross-checks between artefacts — virtual windows (VW), tasks, E/R model, event list, function list]
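Some of these cross-checks can be automated. A minimal sketch, assuming a set-based event list and a hypothetical function list (all names invented for illustration): verify that every event is covered by at least one function.

```python
# Sketch of an automated consistency check between two requirement
# artefacts: every event in the event list should be handled by at
# least one function in the function list (data is hypothetical).
event_list = {"order placed", "payment received", "item shipped"}

function_list = {
    "register_order":   {"order placed"},
    "record_payment":   {"payment received"},
    "update_inventory": {"item shipped"},
}

# Union of all events that some function handles:
covered = set().union(*function_list.values())
uncovered = event_list - covered
print(sorted(uncovered))   # [] -> the two artefacts are consistent
```

The same pattern applies to the other pairs in the diagram, e.g. checking that every entity in the E/R model appears in some virtual window.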

14
Q

CRUD matrix

A

Create, Read, Update, Delete + Overview
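The overview aspect of a CRUD check can be sketched in code: for each data entity, verify that some user task Creates, Reads, Updates, and Deletes it, flagging gaps as potentially missing requirements. Entities and task names below are hypothetical.

```python
# Minimal CRUD-matrix completeness check (hypothetical data).
# Each entity maps each CRUD operation to the tasks that perform it;
# an operation with no tasks suggests a missing requirement.
crud_matrix = {
    "Customer": {"C": ["register"], "R": ["lookup"], "U": ["edit"],  "D": []},
    "Order":    {"C": ["place"],    "R": ["view"],   "U": ["amend"], "D": ["cancel"]},
}

for entity, ops in crud_matrix.items():
    missing = [op for op, tasks in ops.items() if not tasks]
    if missing:
        print(f"{entity}: no task performs {missing}")
# prints: Customer: no task performs ['D']
```

Here the check reveals that nothing ever deletes a Customer, which either points to a missing task or needs an explicit justification in the spec.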

15
Q

testing

A

Testing is the process in which a (possibly unfinished) program is executed with the goal of finding errors.

16
Q

purpose of testing

A

• Detect deviations from specifications
– Debugging
• Establish confidence in software
– Operational testing
• Evaluate properties of software
– Reliability, performance, security, usability, etc.

17
Q

bugs trace

A

Humans make ERRORS, which may lead to software containing FAULTS. When such software is executed, it may lead to FAILURES.
• FAULTS are also sometimes called “BUGS” or “DEFECTS”.
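The error → fault → failure chain can be made concrete with a tiny hypothetical example:

```python
# The human ERROR: the programmer meant to average the values but
# typed the wrong denominator. That mistake leaves a FAULT in the code:
def average(values):
    return sum(values) / (len(values) + 1)   # FAULT: should be len(values)

# Executing the faulty code produces a FAILURE, an observable
# deviation from the specified behaviour:
print(average([2, 4]))   # specification says 3.0, but this prints 2.0
```

Note that a fault does not always cause a failure: for the input `[2, 4, 6]` this function happens to return the correct 3.0, which is one reason testing alone cannot prove the absence of faults.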

18
Q

Software inspections

A

Concerned with analysis of the static system representation to discover problems (static verification and validation)
– May be supplemented by tool-based document and code analysis.

19
Q

Software testing:

A

Concerned with exercising and observing product behaviour (dynamic verification and validation)
– The system is executed with test data and its operational behaviour is observed.

20
Q

VERIFICATION AND VALIDATION TECHNIQUES

A

Static defect detection: reading techniques
• Inspections: rigorous evaluation using a checklist of items
• Walkthrough: examine source code/detailed design
• Reviews: often done by document owners
Dynamic defect detection: testing

21
Q

Installation test:

A

Hardware, software, external products. Test basic functionality.

22
Q

System test:

A

Test all requirements (except those needing daily operation).
Stress tests, special cases, etc.
Special database contents.

23
Q

Deployment test:

A

Production data (data conversion). Test real user tasks with real users.

24
Q

Acceptance test:

A

= System test + Deployment test.

25
Q

Operational test:

A

Test remaining requirements, e.g. response times, availability, hot-line, etc.