Day 3 - RQA, CRQA, and Entropy Flashcards
Recurrence Rate
% of the recurrence plot occupied by recurrent points
% Determinism
% of recurrent points that fall on diagonal line structures
Average line length
Average length of the diagonal lines; the average length of time the system repeats itself
Max line length
Longest diagonal line; strength of attractor
Entropy (Shannon)
Shannon entropy of the distribution of diagonal line lengths; the probability that a line of a given length will be repeated
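All five measures above can be read off a binary recurrence plot. A minimal Python sketch, assuming a precomputed binary matrix R; the function name, the minimum line length lmin = 2, and the exclusion of the main diagonal are conventions, not from the cards:

```python
import numpy as np

def rqa_measures(R, lmin=2):
    """Basic RQA measures from a binary recurrence matrix R (hypothetical helper)."""
    n = R.shape[0]
    total = R.sum()
    rr = total / n**2                              # recurrence rate: % of plot occupied

    # Collect diagonal line lengths, excluding the main diagonal (line of identity)
    lengths = []
    for k in range(1, n):
        for diag in (np.diag(R, k), np.diag(R, -k)):
            run = 0
            for v in diag:
                if v:
                    run += 1
                elif run:
                    lengths.append(run)
                    run = 0
            if run:
                lengths.append(run)

    lines = [l for l in lengths if l >= lmin]
    off_diag = total - np.trace(R)                 # recurrent points off the main diagonal
    det = sum(lines) / off_diag if off_diag else 0.0    # % determinism
    avg_line = float(np.mean(lines)) if lines else 0.0  # avg time the system repeats
    max_line = max(lines) if lines else 0          # longest diagonal line

    # Shannon entropy of the diagonal line-length distribution
    if lines:
        _, counts = np.unique(lines, return_counts=True)
        p = counts / counts.sum()
        ent = -np.sum(p * np.log2(p))
    else:
        ent = 0.0
    return dict(rr=rr, det=det, avg_line=avg_line, max_line=max_line, entropy=ent)
```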
To create a plot in RQA (3 steps)
- Embedding dimension (FNN)
- Time delay (AMI)
- Radius (defines a space to determine which points are recurrent)
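Once m (from FNN) and the delay tau (from AMI) are chosen, the reconstruction step itself is short. A minimal sketch; delay_embed is a hypothetical name, not from the cards:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Time-delay embedding: each row is one m-dimensional phase-space point."""
    x = np.asarray(x, dtype=float)
    n_points = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n_points] for i in range(m)])
```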
Unweighted RQA plot
Binary matrix indicating that two points are recurrent once their distance falls below a threshold
Weighted RQA plot
Based on the distances between the points in phase space
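Both plot types can come from the same pairwise-distance computation: the raw distances give the weighted plot, and thresholding at the radius gives the unweighted (binary) plot. A sketch; recurrence_matrices is a hypothetical name:

```python
import numpy as np

def recurrence_matrices(points, radius):
    """Pairwise distances between embedded points -> weighted and binary plots."""
    diff = points[:, None, :] - points[None, :, :]
    weighted = np.sqrt((diff ** 2).sum(axis=-1))   # weighted plot: raw distances
    unweighted = (weighted <= radius).astype(int)  # unweighted plot: 1 if within radius
    return weighted, unweighted
```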
Cross RQA (CRQA)
Coupling of two different time series (TS) on one time scale
Neighbors of that region
When some occupied region of phase space is revisited within the same “radius” of that space
Rescaling (What do you rescale to? 2 Answers)
Rescale both TS to the mean or the maximum distance separating points in the reconstructed phase space
Radius
Define a space to determine what points are recurrent
% Recurrence
How often do two systems occupy the same space?
Select radius between what criteria?
Select the radius so that % recurrence falls between ~2-5%
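A rough way to operationalize the 2-5% criterion, assuming a precomputed weighted distance matrix (as in the sketch above); the 200-step radius grid and the function name are assumptions:

```python
import numpy as np

def radius_for_recurrence(weighted, lo=0.02, hi=0.05):
    """Return the first radius whose %recurrence falls inside [lo, hi]."""
    n = weighted.shape[0]
    for radius in np.linspace(0, weighted.max(), 200):
        # %recurrence, excluding self-matches on the main diagonal
        rec = (np.sum(weighted <= radius) - n) / (n * n - n)
        if lo <= rec <= hi:
            return radius, rec
    return None, None
```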
CRQA Plot characteristics
- Rescaling
- Radius
- % recurrence
- % determinism
- Max line length
Entropy (2 answers)
- Loss of information in a TS or signal
- Amount of uncertainty
Put in order from most ordered to least ordered (low to high entropy):
- Gas
- Solid
- Liquid
- Solid (least entropy)
- Liquid (middle entropy)
- Gas (most entropy)
Low probability = how much information?
Low probability -> Surprise -> More information
High probability = how much information?
High probability -> No Surprise -> No new information
The amount of info =
Some function of probability
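In Shannon's framework that function is the negative log of the probability: I(x) = -log2(p(x)). For example, an outcome with p = 1/8 carries -log2(1/8) = 3 bits of information, while an outcome with p = 1 carries 0 bits (no surprise, no new information).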
Shannon’s entropy equation
H = -SUM(i = 1 to n) Pi * log2(Pi)
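A minimal sketch of this equation in Python (the function name is illustrative):

```python
import numpy as np

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)); zero-probability terms contribute nothing."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A fair coin gives H = 1 bit; a biased coin is more certain, so H is lower
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```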
Max H =
log2(P), where P is the number of possible outcomes
0 <= H <= log2(P)
How does H affect outcome? More or less certain?
Outcome is more uncertain if H is larger
What are the single scale input parameters (3)?
- N = data length
- m = pattern length (length of vector)
- r = tolerance (criterion of similarity; a filter)
General procedures of single scale entropy (7 steps; steps 1-3 repeat as 4-6 with m+1; DCC: Divide, Count, Calculate)
- Divide data up into vectors of length m
- Count like matches
- Calculate conditional probability
- Divide data up into vectors of length m+1
- Count like matches
- Calculate conditional probability
- Entropy is a ratio of the conditional probabilities
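These steps describe the sample-entropy family. A minimal sketch under conventions the cards do not state (overlapping vectors, Chebyshev distance for the "like match" test, r scaled by the SD):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln of the ratio of m+1 matches to m matches."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)                       # tolerance as a fraction of SD
    n = len(x)

    def matches(length):
        # Overlapping vectors of the given length (N - m vectors for both lengths)
        v = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(v)):
            dist = np.max(np.abs(v - v[i]), axis=1)   # Chebyshev distance
            count += np.sum(dist <= tol) - 1          # exclude the self-match
        return count

    b = matches(m)         # like matches for vectors of length m
    a = matches(m + 1)     # like matches for vectors of length m + 1
    return -np.log(a / b)  # ratio of the conditional probabilities
```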
Parameters: r
Allowable difference when determining similarity between elements in vector (tolerance)
Too small r
Similar data points are not seen as similar
Too large r
Details of the system dynamics are lost
Selection of r: Experimental error is known
Set r 3 times larger than noise amplitude
Selection of r: Experimental error is unknown
0.1-0.25 times the SD (0.20 x SD is common in the literature)
Definition of complexity (DIS)
Very many definitions: probability, predictability, regularity?
- Originates from a deterministic origin (one-to-one)
- Infinitely entangled
- Structural richness
Permutation entropy
Temporal order of data within vectors.
What is the only parameter needed for permutation entropy?
m = pattern length
Steps to permutation entropy
- Data
- Divide into length m vectors
- Convert vectors to ordinal data
- Count like patterns
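A minimal sketch of those steps, assuming overlapping windows and argsort to get each vector's ordinal pattern; the final normalization by log2(m!) is a common convention, not from the card:

```python
import math
import numpy as np

def permutation_entropy(x, m=3):
    """Shannon entropy of ordinal-pattern frequencies within length-m vectors."""
    x = np.asarray(x, dtype=float)
    counts = {}
    for i in range(len(x) - m + 1):
        pattern = tuple(np.argsort(x[i:i + m]))   # convert vector to ordinal data
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()                                  # pattern probabilities
    h = -np.sum(p * np.log2(p))                   # entropy of like-pattern counts
    return h / math.log2(math.factorial(m))       # normalize to [0, 1]
```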
Steps to symbolic entropy
- Convert the time series to binary symbols based on a threshold value
- Form words from the symbols (typically L = 3)
- Convert the binary words to decimal
- Calculate Shannon entropy of the word series
- Normalize the result by dividing by the max entropy for the possible number of words
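Those five steps in a minimal sketch, assuming the mean as the threshold (per the next card) and overlapping words of length L = 3:

```python
import numpy as np

def symbolic_entropy(x, L=3):
    """Binarize around the mean, form L-symbol words, normalized Shannon entropy."""
    x = np.asarray(x, dtype=float)
    symbols = (x > x.mean()).astype(int)                   # 1 above threshold, 0 below
    words = [int("".join(map(str, symbols[i:i + L])), 2)   # binary word -> decimal
             for i in range(len(symbols) - L + 1)]
    _, counts = np.unique(words, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log2(p))
    return h / np.log2(2 ** L)                             # divide by max entropy
```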
How to select threshold?
The mean: half of the symbols become zeroes and half become ones.