Day 3 - RQA, CRQA, and Entropy Flashcards

1
Q

Recurrence Rate

A

% of the recurrence plot occupied by recurrent points

2
Q

% Determinism

A

% of recurrent points that fall on diagonal line structures

3
Q

Average line length

A

Average diagonal line length; the average length of time the system repeats itself

4
Q

Max line length

A

Longest diagonal line; strength of attractor

5
Q

Entropy (Shannon)

A

Shannon entropy of the diagonal line-length distribution, i.e., the probability that a line of a given length will be repeated

6
Q

To create a plot in RQA (3 steps)

A
  1. Embedding dimension (FNN)
  2. Time delay (AMI)
  3. Radius (define a space to determine which points are recurrent)
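
A minimal sketch of these three steps in Python (assuming NumPy; the defaults for m, tau, and radius are illustrative, where in practice m comes from FNN and tau from AMI):

```python
import numpy as np

def recurrence_plot(x, m=3, tau=2, radius=0.1):
    """Binary recurrence matrix from a 1-D time series.
    m, tau, and radius are illustrative defaults; in practice
    choose m via FNN, tau via AMI, and radius via % recurrence."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    # Steps 1-2: time-delay embedding; each row is a point in phase space
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    # Pairwise Euclidean distances between all embedded points
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    # Step 3: radius threshold; 1 = recurrent, 0 = not
    return (d <= radius).astype(int)
```

Thresholding the distances gives the unweighted (binary) plot of the next card; keeping the raw distances gives the weighted plot.
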
7
Q

Unweighted RQA plot

A

Binary matrix indicating that two points are recurrent once their distance falls below a threshold

8
Q

Weighted RQA plot

A

Matrix based on the actual distances between points in phase space, rather than a binary threshold

9
Q

Cross RQA (CRQA)

A

Coupling of two different time series within one time scale

10
Q

Neighbors of that region

A

When some occupied region of phase space is revisited within the same “radius” of that space

11
Q

Rescaling (What do you rescale to? 2 Answers)

A

Rescaling both time series to the mean or the maximum distance separating points in the reconstructed phase space

12
Q

Radius

A

Defines a space (a threshold distance) used to determine which points are recurrent

13
Q

% Recurrence

Select radius between what criteria?

A

How often do the two systems occupy the same space? Select the radius so that % recurrence falls between ~2-5%

14
Q

CRQA Plot characteristics

A
  1. Rescaling
  2. Radius
  3. % recurrence
  4. % determinism
  5. Max line length
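
A minimal CRQA sketch in Python that ties these characteristics together (assuming NumPy; m, tau, and radius are illustrative, with radius tuned toward the ~2-5% recurrence criterion):

```python
import numpy as np

def cross_recurrence(x, y, m=3, tau=2, radius=0.05):
    """Cross-recurrence matrix for two time series x and y."""
    def embed(s):
        s = np.asarray(s, dtype=float)
        n = len(s) - (m - 1) * tau
        return np.column_stack([s[i * tau : i * tau + n] for i in range(m)])
    ex, ey = embed(x), embed(y)
    # Distances between every point of x's trajectory and every point of y's
    d = np.linalg.norm(ex[:, None, :] - ey[None, :, :], axis=-1)
    d = d / d.mean()                  # rescale to the mean distance
    cr = (d <= radius).astype(int)    # radius threshold -> binary plot
    return cr, 100.0 * cr.mean()      # matrix and % recurrence
```

Adjust `radius` until the returned % recurrence lands in roughly the 2-5% range.
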
15
Q

Entropy (2 answers)

A
  1. Loss of information in a time series or signal
  2. Amount of uncertainty

16
Q

Put in order from most ordered to least ordered (low to high entropy):

  1. Gas
  2. Solid
  3. Liquid
A
  1. Solid (least entropy)
  2. Liquid (middle entropy)
  3. Gas (most entropy)
17
Q

Low probability = how much information?

A

Low probability -> Surprise -> More information

18
Q

High probability = how much information?

A

High probability -> No Surprise -> No new information

19
Q

The amount of information =

A

Some function of probability

20
Q

Shannon’s entropy equation

A

H = -Σ_{i=1}^{n} p_i log2(p_i)
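
A worked example of the equation as a small Python sketch (the function name is illustrative):

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum(p_i * log2(p_i)); zero-probability terms contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (the max, log2(2))
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (less uncertain)
```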

21
Q

Max H =

A

log2(P), reached when all P possible outcomes are equally likely

22
Q

0 ≤ H ≤ log2(P)

How does H affect the outcome? More or less certain?

A

Outcome is more uncertain if H is larger

23
Q

What are the single-scale input parameters (3)?

A
N = data length
m = pattern length (length of vector)
r = tolerance (criterion of similarity; acts as a filter)
24
Q

General procedure of single-scale entropy (7 steps; steps 4-6 repeat steps 1-3 at vector length m + 1; DCC = Divide, Count, Calculate).

A
  1. Divide data up into vectors of length m
  2. Count like matches
  3. Calculate conditional probability
  4. Divide data up into vectors of length m + 1
  5. Count like matches
  6. Calculate conditional probability
  7. Entropy is computed from the ratio of the two conditional probabilities
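
A simplified sketch of this procedure in Python, in the style of sample entropy (assuming NumPy; illustrative, not a reference implementation):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Single-scale entropy from the ratio of conditional probabilities
    for template matches of length m versus m + 1.
    r is the tolerance, here 0.2 * SD of the series (see card 29)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(length):
        # Steps 1/4: divide the data into overlapping vectors
        vecs = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        # Steps 2/5: count like matches (Chebyshev distance <= tol, i != j)
        d = np.max(np.abs(vecs[:, None, :] - vecs[None, :, :]), axis=-1)
        return np.sum(d <= tol) - len(vecs)  # exclude self-matches

    # Steps 3/6/7: entropy from the ratio of the conditional probabilities
    return -np.log(count_matches(m + 1) / count_matches(m))
```
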
25
Q

Parameters: r

A

Allowable difference when determining similarity between elements in a vector (tolerance)

26
Q

Too small r

A

Similar data points are not seen as similar

27
Q

Too large r

A

Details of the system dynamics are lost

28
Q

Selection of r: experimental error is known

A

Set r 3 times larger than the noise amplitude

29
Q

Selection of r: experimental error is unknown

A

0.1-0.25 times the SD (0.20 × SD is common in the literature)

30
Q

Definition of complexity (DIS)

A

Many definitions exist (probability? predictability? regularity?). Three criteria:
  1. Originates from a deterministic origin (1 to 1)
  2. Infinitely entangled
  3. Structural richness

31
Q

Permutation entropy

A

Temporal order of data within vectors

32
Q

What is the only parameter needed for permutation entropy?

A

m = pattern length

33
Q

Steps to permutation entropy

A
  1. Data
  2. Divide into length-m vectors
  3. Convert vectors to ordinal data
  4. Count like patterns
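
These steps as a short Python sketch (assuming NumPy; the function name is illustrative):

```python
import numpy as np
from collections import Counter

def permutation_entropy(x, m=3):
    """Shannon entropy of the ordinal patterns of length m."""
    x = np.asarray(x, dtype=float)
    # Steps 1-3: divide into length-m vectors, convert each to ordinal data
    patterns = [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]
    # Step 4: count like patterns, then take the Shannon entropy
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))
```
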
34
Q

Steps to symbolic entropy

A
  1. Convert the time series to binary symbols based on a threshold value
  2. Form words from the symbols (typically L = 3)
  3. Convert binary to decimal
  4. Calculate Shannon entropy from the word-symbol series
  5. Normalize by dividing by the maximum entropy for the possible number of words
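
The five steps as a Python sketch (assuming NumPy; the threshold is the mean, per the next card):

```python
import numpy as np
from collections import Counter

def symbolic_entropy(x, L=3):
    """Normalized Shannon entropy of a binary word series."""
    x = np.asarray(x, dtype=float)
    # Step 1: convert the series to binary symbols on a threshold (the mean)
    sym = (x > x.mean()).astype(int)
    # Steps 2-3: form length-L words and convert binary to decimal
    words = [int("".join(map(str, sym[i:i + L])), 2)
             for i in range(len(sym) - L + 1)]
    # Step 4: Shannon entropy of the word series
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    H = -np.sum(p * np.log2(p))
    # Step 5: normalize by the max entropy for 2**L possible words
    return H / np.log2(2 ** L)
```
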
35
Q

How to select the threshold?

A

The mean: half of the symbols become zeroes and half become ones