Model Based Testing Flashcards

1
Q

Model-based testing

A

automatic generation of software test procedures

using models of system requirements and behavior

in combination with automated test execution

2
Q

Manual testing

A

+ easy & cheap to start
+ flexible testing
- expensive every execution
- no auto regression testing
- ad-hoc coverage
- no coverage measurement

3
Q

Capture & Replay

A

+ auto regression testing
+ flexible testing
- expensive first execution
- fragile tests break easily
- ad-hoc coverage
- no coverage measurement

4
Q

testing with scripts

A

(e.g. JUnit)

+ auto regression testing
+ automatic execution
+/- test impl. = programming
- fragile tests break easily? (depends on abstraction)
- ad-hoc coverage
- no coverage measurement
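A minimal JUnit 5 sketch of a scripted test (class, method, and the tested behaviour are illustrative, not from the card): writing the test is ordinary programming, and re-running the script gives automatic regression testing.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Illustrative scripted test: implementing it is programming, and it can be
// re-executed automatically on every change (regression testing).
class StringBuilderReverseTest {

    @Test
    void reverseReversesCharacters() {
        // Exercise the unit under test and check the expected result.
        assertEquals("cba", new StringBuilder("abc").reverse().toString());
    }
}
```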

5
Q

Test Scenarios

A

+ abstract tests
+ automatic execution
+ auto regression testing
+ robust tests
- ad-hoc coverage
- no coverage measurement
Examples: UML Testing Profile (UTP) and Testing and Test Control Notation (TTCN-3)

6
Q

Model based testing

A

explore a model that mirrors the SUT to generate test cases

7
Q

Model-Based Testing Pros/Cons

A

+ abstract tests
+ automatic execution
+ auto regression testing
+ auto design of tests
+ systematic coverage
+ measure coverage of model and requirements
- modelling efforts

8
Q

MBT Workflow

A
**Manual tasks:**
  • (requirements analysis)
  • model creation
  • model validation
  • implementation of the concretion (mapping)
**Automated tasks:**
  • model verification
  • test-case generation
  • test-case concretion
  • test-case execution
  • assignment of verdicts
9
Q

Automata Learning Workflow

A

Automated tasks:

  • model creation
  • model verification
  • test-case generation
  • test-case concretion
  • test-case execution
  • assignment of verdicts

Manual tasks:

  • model validation
  • implementation of test driver (mapper), as sketched below
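A hedged Java sketch of such a test driver (mapper); the SutConnection interface, the abstract symbols, and the concrete messages are illustrative assumptions, not part of the card. The mapper concretises abstract test inputs into SUT stimuli and abstracts concrete responses back into model-level outputs.

```java
// Illustrative mapper sketch: translates abstract test-case symbols into
// concrete stimuli for the SUT and maps concrete responses back to
// abstract outputs. SutConnection is a placeholder assumption.
interface SutConnection {
    String send(String concreteMessage);   // assumed way of talking to the SUT
}

class TestDriverMapper {
    private final SutConnection sut;

    TestDriverMapper(SutConnection sut) {
        this.sut = sut;
    }

    // Concretion: abstract input symbol -> concrete SUT stimulus.
    String concretize(String abstractInput) {
        switch (abstractInput) {
            case "LOGIN":  return "CMD login user=alice";
            case "LOGOUT": return "CMD logout";
            default:       return "CMD noop";
        }
    }

    // Abstraction: concrete SUT response -> abstract output symbol.
    String abstractOutput(String concreteResponse) {
        return concreteResponse.startsWith("OK") ? "ACK" : "NAK";
    }

    // Execute one abstract step against the SUT and return the abstract observation.
    String step(String abstractInput) {
        return abstractOutput(sut.send(concretize(abstractInput)));
    }
}
```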
10
Q

reactive system

A

a software component which reacts to the stimuli of its environment
does not terminate
Test cases: sequences of events

event-based
controllable events
observable events

11
Q

mutation score

A

mutation score = #killed mutants / #mutants
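A worked example with illustrative numbers (not from the card): if 100 mutants are generated and the test suite kills 80 of them, mutation score = 80 / 100 = 0.8; the 20 surviving mutants indicate behaviour the tests do not check, or equivalent mutants.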

12
Q

mutation testing goals

A

find test cases that maximize mutation score

13
Q

mutation testing problems

A

equivalent mutants (see the example below)
equivalence checking is hard

Solutions:
review of surviving mutants
generate mutants from model: model-based mutation testing
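An illustrative Java example of an equivalent mutant (the code and the chosen mutation are assumptions, not from the card): mutating `<` to `!=` changes the syntax but not the behaviour, so no test case can kill this mutant.

```java
// Illustrative equivalent mutant: no test can distinguish the two methods.
class EquivalentMutantExample {

    // Original: loop runs while i < 10, returns 10.
    static int countOriginal() {
        int count = 0;
        for (int i = 0; i < 10; i++) count++;
        return count;
    }

    // Mutant: "<" replaced by "!=". Since i starts at 0 and increases by 1,
    // it hits 10 exactly and never jumps past it, so the mutant behaves
    // identically -- it survives every test suite.
    static int countMutant() {
        int count = 0;
        for (int i = 0; i != 10; i++) count++;
        return count;
    }
}
```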

14
Q

conformance

A

if an SUT always passes a test case, we have conformance

the test case conforms to the SUT

if we generate such a test case from a model → test case conforms to the model

if the SUT conforms to the model: test → model → SUT

Trace equivalence is the weakest notion of conformance

15
Q

mutation testing - non-conformance

A

If the model does not conform to the SUT

and the Mutant conforms to the SUT

→ the model does not conform to the mutant

16
Q

mutation testing - Fault-Detecting

A

Mutant generated from the model

A test case kills the mutant if it conforms to the model but not to the mutant

such a test case is a counterexample to conformance, hence the model does not conform to the mutant

17
Q

Transformational Systems

A

Model and Mutant interpreted as predicates describing state transformations (s → s’)

behaviour allowed by mutant but not by original model?
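A hedged way to phrase that question as a constraint-solving problem (notation assumed, not from the card): the mutant shows behaviour the model forbids iff

∃ s, s′ : Mutant(s, s′) ∧ ¬Model(s, s′)

is satisfiable; a solution (s, s′) yields a state from which a fault-detecting test case can be constructed.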

18
Q

Explicit Conformance Checking

A

Build synchronous product modulo ioco between model and mutant

If the mutant has an additional
  !output → fail sink state
  ?input → pass sink state

Extract a test case covering the fail state

19
Q

Action Systems

A

Non-deterministic choice of actions

Actions are guarded commands that represent events
Loop over Actions
Terminates if all guards disabled

Actions are labelled
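A minimal executable Java sketch of this execution model (state, labels, and guards are illustrative assumptions): guarded actions are evaluated in a loop, one enabled action is chosen non-deterministically (simulated here by random choice), and the loop terminates once all guards are disabled.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Illustrative action system: the state is a single counter and there are
// two labelled guarded actions; non-deterministic choice is simulated randomly.
public class ActionSystemSketch {

    interface Action {
        String label();
        boolean guard(int state);   // enabledness predicate
        int body(int state);        // state transformation
    }

    public static void main(String[] args) {
        List<Action> actions = new ArrayList<>();
        actions.add(new Action() {  // "inc": enabled while state < 3
            public String label() { return "inc"; }
            public boolean guard(int s) { return s < 3; }
            public int body(int s) { return s + 1; }
        });
        actions.add(new Action() {  // "reset": enabled only in state 2
            public String label() { return "reset"; }
            public boolean guard(int s) { return s == 2; }
            public int body(int s) { return 0; }
        });

        Random rnd = new Random(1);
        int state = 0;
        for (int step = 0; step < 10; step++) {
            List<Action> enabled = new ArrayList<>();
            for (Action a : actions) if (a.guard(state)) enabled.add(a);
            if (enabled.isEmpty()) break;            // all guards disabled: terminate
            Action chosen = enabled.get(rnd.nextInt(enabled.size()));
            state = chosen.body(state);
            System.out.println(chosen.label() + " -> state " + state);
        }
    }
}
```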

20
Q

Action Systems - Semantics

A
  1. labelled transition systems
  2. predicative semantics
21
Q

Predicative Semantics of Action Systems

A

The transition relation is translated to a constraint over state variables s and event-traces tr

22
Q

Soundness

A

conformance, therefore all tests pass

A correct program is deemed correct (⇒ no spurious errors)

23
Q

Exhaustiveness

A

All tests pass → therefore conformance

No incorrect programs are deemed correct

24
Q

IOLTS

A

Input/Output Labelled Transition System: a labelled transition system whose actions are partitioned into input actions and output actions
25
Q

trace equivalent

A

A trace is an observable sequence of actions.
Two states are trace equivalent iff they have the same traces.
Trace equivalence is the weakest notion of conformance.
26
Q

bisimilar

A

Two states are bisimilar iff they simulate each other and go to states which are bisimilar.
Bisimulation is not suited for testing!
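A standard illustrative example (not from the card): the processes a·(b + c) and a·b + a·c have the same traces {ε, a, ab, ac}, so they are trace equivalent, but they are not bisimilar: after performing a, the first process can still choose between b and c, while the second has already committed to one of them.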
27
Q

Suspension Automaton

A

obtained by adding quiescence (δ) self-loops for each quiescent state
28
Q

IOCO

A

IUT ioco S iff outputs (and quiescences) of the IUT are possible in S after an arbitrary suspension trace of S
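A hedged formal statement following Tretmans' standard definition (Straces(S) are the suspension traces of S, out(· after σ) the possible outputs including quiescence δ):

IUT ioco S ⇔ ∀ σ ∈ Straces(S): out(IUT after σ) ⊆ out(S after σ)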
29
Q

RAISE

A

**R**igorous **A**pproach to **I**ndustrial **S**oftware **E**ngineering
a method for software development
a formal specification language RSL
computer-based tools
30
Q

property oriented specification

A

no model; often represented as **pure functions** and **stateless** programs
PBT is a testing technique that tries to **falsify a given** property by generating **random input data** and checking the **expected behaviour**
you define the properties that should hold, e.g. the length of two concatenated lists should be list1.length + list2.length
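A minimal property-based-testing sketch in plain Java, without a framework such as QuickCheck or jqwik (class name, seed, and size bounds are illustrative assumptions): random input data is generated and the concatenation-length property from the card is checked on each sample.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Minimal PBT sketch: generate random input data and try to falsify the
// property "concat(list1, list2).size() == list1.size() + list2.size()".
public class ConcatLengthProperty {

    private static final Random RND = new Random(42);  // fixed seed, illustrative

    private static List<Integer> randomList() {
        List<Integer> list = new ArrayList<>();
        int size = RND.nextInt(20);                     // random length 0..19
        for (int i = 0; i < size; i++) {
            list.add(RND.nextInt(100));                 // random elements 0..99
        }
        return list;
    }

    public static void main(String[] args) {
        for (int trial = 0; trial < 1000; trial++) {
            List<Integer> list1 = randomList();
            List<Integer> list2 = randomList();
            List<Integer> concat = new ArrayList<>(list1);
            concat.addAll(list2);
            if (concat.size() != list1.size() + list2.size()) {
                System.out.println("Property falsified for " + list1 + " ++ " + list2);
                return;                                 // report the counterexample and stop
            }
        }
        System.out.println("Property held for 1000 random inputs.");
    }
}
```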
31
Q

model based testing

A

we use a **model** of our subject under test (SUT)
the model is _precise_ and tries to _mirror_ the SUT
this makes most sense when the system has more non-functional complexity
32
Q

timed automata

A

are automata with the notion of time (clocks)
can be used to test real-time systems
33
Q

black-box system

A

We can interact with it:
  • Perform **inputs**
  • **Observe** actions
  • **Reset** it to initial state
34
Q

active automata learning

A

automatically generate learned models (finite state machines) of black-box systems
with the learned model we can perform model-based testing and verification
35
Q

DNF

A

disjunctive normal form, can be used for automated theorem proving
each variable appears exactly once in every conjunction
36
Q

construction of Z

A

▪ X … input alphabet (collection of all stimuli)
▪ W … characterization set
Z = W ∪ X·W ∪ … ∪ X^(m−n)·W
(n … number of states of the specification model, m … assumed upper bound on the number of states of the implementation)
37
Q

QuickCheck

A

property based testing
random testing
38
Q

Spec Explorer

A

Tool for testing reactive, object-oriented software systems
State exploration produces a model automaton
traversing the automaton → generated test
ioco conformance relation then produces verdict
39
Q

UPPAAL-TRON

A

ONLINE testing
Testing Real-Time Embedded Software
Generates and Executes Tests event-by-event in real time
TRON: Replaces the environment of the IUT, simulates and monitors
40
Q

ALERGIA

A

can identify any stochastic deterministic regular language
determine the probabilities of the strings in the language
configurable confidence interval
41
Q

Model-based fuzz testing

A

black-box testing technique
the SUT is tested with a large number of randomized, invalid or unexpected inputs
Model-based fuzz testing is used for randomizing sequential or concurrent behaviour of systems.