Day 3: RQA, CRQA, and Entropy Flashcards
Recurrence Rate
% of the recurrence plot occupied by recurrent points
% Determinism
% of recurrent points that fall on diagonal line structures
Average line length
Average diagonal line length; the average length of time the system repeats itself
Max line length
Longest diagonal line; strength of attractor
Entropy (Shannon)
Shannon entropy of the distribution of diagonal line lengths; reflects the probability that a line of a given length will be repeated
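All of the measures above can be read off a binary recurrence matrix. A minimal sketch in Python/NumPy, assuming a square 0/1 matrix `R` is already in hand (the function names here are illustrative, not from any particular RQA package):

```python
import numpy as np

def diagonal_line_lengths(R, lmin=2):
    """Collect lengths of diagonal line segments in a binary recurrence
    matrix, skipping the main diagonal (line of identity)."""
    n = R.shape[0]
    lengths = []
    for k in range(-(n - 1), n):
        if k == 0:
            continue  # exclude the line of identity
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
            else:
                if run >= lmin:
                    lengths.append(run)
                run = 0
        if run >= lmin:
            lengths.append(run)
    return np.array(lengths)

def rqa_measures(R, lmin=2):
    n = R.shape[0]
    off = ~np.eye(n, dtype=bool)            # ignore the line of identity
    rr = R[off].mean()                      # recurrence rate: % of plot occupied
    lengths = diagonal_line_lengths(R, lmin)
    det = lengths.sum() / R[off].sum()      # % determinism: points on diagonal lines
    _, counts = np.unique(lengths, return_counts=True)
    p = counts / counts.sum()
    ent = -(p * np.log2(p)).sum()           # Shannon entropy of line lengths
    return dict(RR=rr, DET=det, L=lengths.mean(), Lmax=lengths.max(), ENT=ent)
```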
To create a plot in RQA (3 steps)
- Embedding dimension (FNN)
- Time delay (AMI)
- Radius (defines a threshold distance that determines which points are recurrent); see the sketch below
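A minimal sketch of these steps in Python/NumPy, assuming the embedding dimension m and time delay tau have already been chosen via FNN and AMI (those algorithms are not implemented here; all names are illustrative):

```python
import numpy as np

def embed(x, m, tau):
    """Time-delay embedding: each row is one reconstructed m-dimensional state."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

def recurrence_matrix(x, m, tau, radius):
    """Unweighted (binary) recurrence plot: 1 where two reconstructed
    states lie within the radius of each other."""
    X = embed(np.asarray(x, float), m, tau)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    return (d <= radius).astype(int)

# m (from FNN) and tau (from AMI) would be plugged in here
R = recurrence_matrix(np.sin(np.linspace(0, 20, 300)), m=3, tau=5, radius=0.5)
```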
Unweighted RQA plot
Binary matrix indicating that two points are recurrent once their distance falls below a threshold (the radius)
Weighted RQA plot
Based on the distances between the points in phase space
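Following the same embedding idea as the sketch above, a weighted plot simply keeps the raw distance matrix instead of thresholding it (a sketch; `weighted_plot` is a made-up name):

```python
import numpy as np

def weighted_plot(x, m, tau):
    """Weighted recurrence plot: pairwise distances between reconstructed
    states, with no radius threshold applied."""
    n = len(x) - (m - 1) * tau
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    return np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
```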
Cross RQA (CRQA)
Coupling of two different TS on a common time scale
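Cross-recurrence replaces the self-comparison with a comparison between the two embedded series. A sketch under the assumption that both TS share the same sampling rate and embedding parameters:

```python
import numpy as np

def cross_recurrence_matrix(x, y, m, tau, radius):
    """Cross-recurrence plot: 1 where a state of x lies within the radius
    of a state of y (both series embedded the same way)."""
    def embed(s):
        n = len(s) - (m - 1) * tau
        return np.column_stack([s[i * tau : i * tau + n] for i in range(m)])
    X, Y = embed(np.asarray(x, float)), embed(np.asarray(y, float))
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return (d <= radius).astype(int)
```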
Neighbors of that region
When some occupied region of phase space is revisited within the same “radius” of that space
Rescaling (What do you rescale to? 2 Answers)
Rescale both TS by the mean or the maximum distance separating points in the reconstructed phase space
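In practice this rescaling is often applied to the cross-distance matrix itself; a sketch covering both options (the function name is hypothetical):

```python
import numpy as np

def rescale_distances(d, method="mean"):
    """Rescale a (cross-)distance matrix by its mean or max distance, so a
    radius expressed as a fraction is comparable across the two TS."""
    scale = d.mean() if method == "mean" else d.max()
    return d / scale
```

With method="max", a radius of 0.05 would then mean 5% of the maximum phase-space distance.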
Radius
Define a space to determine what points are recurrent
% Recurrence
How often do the two systems occupy the same region of phase space?
Select radius between what criteria?
A radius that yields ~2-5% recurrence
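One common way to honor the ~2-5% guideline is to back the radius out of the distance distribution. A sketch, assuming a rescaled distance matrix `d` (the 2.5% target below is just an example value inside the guideline range):

```python
import numpy as np

def radius_for_target_rr(d, target=0.025):
    """Choose the radius as the distance quantile that yields roughly the
    target recurrence rate (here 2.5%)."""
    off_diag = d[np.triu_indices_from(d, k=1)]  # pairwise distances, no self-matches
    return np.quantile(off_diag, target)
```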
CRQA Plot characteristics
- Rescaling
- Radius
- % recurrence
- % determinism
- Max line length
Entropy (2 answers)
- Loss of information in a TS or signal
- Amount of uncertainty
Put in order from most ordered to least ordered (low to high entropy):
- Gas
- Solid
- Liquid
- Solid (least entropy)
- Liquid (middle entropy)
- Gas (most entropy)
Low probability = how much information?
Low probability -> Surprise -> More information
High probability = how much information?
High probability -> No Surprise -> No new information
The amount of info =
Some function of probability; Shannon uses I(p) = -\log_2(p), so rarer outcomes carry more information
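A quick check of that relationship, making the two cards above concrete (rare outcomes carry many bits, near-certain ones carry almost none):

```python
import math

for p in (0.99, 0.5, 0.01):
    print(f"p = {p:5}: {-math.log2(p):.2f} bits")  # rarer -> more surprise -> more bits
```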
Shannon’s entropy equation
H = -\sum_{i=1}^{n} p_i \log_2(p_i)
Max H =
\log_2(P), where P is the number of possible (equally likely) outcomes
0 \le H \le \log_2(P)
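A small sketch verifying both the equation and its bounds: a uniform distribution hits the maximum \log_2(P), and a certain outcome gives H = 0.

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum(p_i * log2(p_i)); zero-probability terms contribute nothing."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits = log2(4), the max
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: no uncertainty
```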
How does H affect outcome? More or less certain?
Outcome is more uncertain if H is larger
What are the single scale input parameters (3)?
- N = data length
- m = pattern length (length of the vector)
- r = tolerance (criterion of similarity; acts as a filter)
General procedure of single-scale entropy (7 steps; steps 1-3 repeat as 4-6 with pattern length m+1; DCC = Divide, Count, Calculate).
- Divide data up into vectors length m
- Count like matches
- Calculate conditional probability
- Divide data up into vectors length m+1
- Count your like matches
- Calculate conditional probability
- Entropy is computed from the ratio of the two conditional probabilities (e.g., sample entropy is the negative natural log of this ratio); see the sketch below
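These steps match the usual sample-entropy recipe. A minimal sketch, assuming the tolerance r is given as a fraction of the signal's standard deviation and using the Chebyshev (max) distance for the "like match" test:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Follows the steps above: divide into vectors of length m and m+1,
    count like matches within tolerance, then take the negative log of the
    ratio of the two conditional probabilities."""
    x = np.asarray(x, float)
    tol = r * x.std()

    def match_count(mm):
        # Steps 1/4: divide the data into overlapping vectors of length mm
        v = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        # Steps 2/5: count like matches (Chebyshev distance within tolerance),
        # excluding self-matches
        d = np.max(np.abs(v[:, None, :] - v[None, :, :]), axis=-1)
        return ((d <= tol).sum() - len(v)) / 2

    B = match_count(m)       # matches for pattern length m
    A = match_count(m + 1)   # matches for pattern length m + 1
    return -np.log(A / B)    # Step 7: ratio of conditional probabilities

print(sample_entropy(np.sin(np.linspace(0, 50, 400))))  # low for a regular signal
```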