Random Lecture 1 Flashcards
What does deterministic mean in physics
Given the initial conditions and the underlying physical laws governing the evolution, one can predict the state of the system at all later times
What is the probability of an event occurring
P(E) is defined in terms of repeated trials as P(E) = lim as N(S) tends to infinity of N(E)/N(S), where N(S) is the number of times the situation occurs and N(E) is the number of times the event E follows
What does a P(E) = 1 and P(E) = 0 mean
1 that the event is certain
0 that the event is impossible
In reality, from a finite number of trials one cannot assert that an estimated probability of 0 indicates impossibility
What is the probability of throwing a 6 P(6)
For a true die P(6) = 1/6
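The relative-frequency definition of probability can be checked with a short simulation (a sketch, not part of the lecture; the trial count of 100,000 is arbitrary):

```python
import random

random.seed(0)  # fix the seed so the trials are reproducible

N_S = 100_000  # N(S): number of times the situation (a throw) occurs
# N(E): number of times the event "a 6 is thrown" follows
N_E = sum(1 for _ in range(N_S) if random.randint(1, 6) == 6)

# relative frequency N(E)/N(S) -> P(6) = 1/6 as N(S) grows
p_estimate = N_E / N_S
```

With 100,000 throws the estimate lands within about 0.01 of 1/6, illustrating P(E) as the limit of N(E)/N(S).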
If two events are mutually exclusive then
the occurrence of one precludes the occurrence of the other, i.e. both cannot happen at the same time, e.g. the probability of turning left or turning right
P(E1 U E2) = P(E1) + P(E2)
What does the U symbol represent in probability
the logical OR operation, so P(E1 U E2) is the probability that event E1 or event E2 occurs
If E1 and E2 are not mutually exclusive what is the probability of P(E1 U E2)
P(E1) + P(E2) - P(E1 n E2)
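A minimal sketch (my own example, not from the lecture) checking the inclusion-exclusion formula on a single die, with E1 = "even" and E2 = "greater than 3":

```python
from fractions import Fraction

outcomes = {1, 2, 3, 4, 5, 6}  # sample space of a true die
E1 = {2, 4, 6}                 # event: throw is even
E2 = {4, 5, 6}                 # event: throw is greater than 3

def P(event):
    # equally likely outcomes: probability = favourable / total
    return Fraction(len(event), len(outcomes))

lhs = P(E1 | E2)                  # direct probability of the union {2, 4, 5, 6}
rhs = P(E1) + P(E2) - P(E1 & E2)  # inclusion-exclusion formula
# both give 2/3; without the correction term, P(E1) + P(E2) = 1 double-counts {4, 6}
```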
What does the symbol n (upside down U) represent
the logical and operation or set theoretic intersection
Two events are statistically independent if
the occurrence of one in no way influences the probability of the other
For statistically independent events, what is the probability P(E1 n E2)
P(E1 n E2) = P(E1) * P(E2)
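A small sketch (my own example) verifying the product rule on two independent throws of a true die:

```python
from fractions import Fraction
from itertools import product

# sample space of two independent throws of a true die: 36 equally likely pairs
space = list(product(range(1, 7), repeat=2))

E1 = {s for s in space if s[0] == 6}  # first die shows 6
E2 = {s for s in space if s[1] == 6}  # second die shows 6

def P(event):
    return Fraction(len(event), len(space))

# independence: P(E1 n E2) = P(E1) * P(E2) = 1/6 * 1/6 = 1/36
both_sixes = P(E1 & E2)
```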
What are random variables (RVs)
A variable whose value is random but for which the probability of each value can be determined, like the throw of a die
What's the difference between discrete and continuous random variables
discrete RVs take specific values; continuous RVs take values from a continuous range
What is the issue with continuous random variables and how do we solve it
If continuous, there are an infinite number of possible outcomes, thus
P(A1) + P(A2) + … + P(Ai) = 1
means that if i is infinite the individual probabilities must all be zero
The solution is to specify a range of values rather than a single value
What is the probability density function (PDF)
p(x)dx is the probability that X takes a value between x and x+dx
What will a probability density function (PDF) look like
a curve with a peak at the most likely value, returning to 0 for impossible values
How can you find the most probable result from a PDF
The expected value lies at the centroid of the PDF; more generally, the probability that the result lies in a range is
P(a <= X <= b) = integral from a to b of p(x) dx
Geometrically, what is the probability of a result in a certain range for a PDF
the area under the graph between the range limits (which is why the probability of a single exact value is zero: the area above a single point is zero)
What must the total area underneath a PDF be equal to
integral from -inf to inf of p(x) dx = 1, as this is the probability of all possible outcomes
This means p(x) must tend to zero as x tends to +- infinity
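The normalisation condition can be checked numerically; a sketch (not from the lecture) using a standard Gaussian PDF and the trapezoidal rule, truncating the infinite limits at +-8 where p(x) is negligible:

```python
import math

def gaussian(x, mean=0.0, sigma=1.0):
    # standard normal PDF
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# trapezoidal approximation of the integral of p(x) over [-8, 8],
# standing in for the limits +-infinity since p(x) -> 0 in the tails
n, a, b = 10_000, -8.0, 8.0
h = (b - a) / n
area = h * (0.5 * gaussian(a) + sum(gaussian(a + k * h) for k in range(1, n)) + 0.5 * gaussian(b))
# area comes out at 1 to within numerical error
```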
How would you create a PDF
PDFs are for continuous variables, so repeat the test a number of times, count the results falling in set ranges to create a histogram, then connect the midpoints of each bar in the graph
General formula for expected values of discrete RVs
E(X) = sum over i of xi * P(X = xi)
The expected value is the sum, over each possible outcome, of its value multiplied by its probability of occurring
When all outcomes are equally likely, what is E(single cast of a true die)
E = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5
Could also do it the long way by the formula 1*1/6 + 2*1/6 + 3*1/6 + … etc, but the expected value is the arithmetic mean when all outcomes are equally likely
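The two routes to E(X) for a true die can be written out directly (a sketch using exact fractions):

```python
from fractions import Fraction

faces = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)  # each face of a true die is equally likely

# the long way: E(X) = sum of xi * P(X = xi)
expected = sum(x * p for x in faces)

# the short way: arithmetic mean, valid because all outcomes are equally likely
mean = Fraction(sum(faces), len(faces))
# both equal 7/2 = 3.5
```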
How is the expected value of RV X often referred to
as the mean, denoted by xbar
How is the expectation of a continuous RV formally written
xbar = E(X) = integral from -inf to inf of x*p(x)*dx, where p(x) is the PDF of the random variable X. The integral need only be taken over the range of possible values of x, but the limits are usually written as +-inf
What does the expected value of a RV not need to be the same as
the peak value; this can occur when the PDF is unbalanced to one side
Draw a diagram of a PDF where the peak value and expected value are not the same
a lopsided PDF
the form p(x) = x*e^-x (for x >= 0) often works
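A quick numerical sketch (my own check, truncating the infinite upper limit at 50) showing that for the lopsided PDF p(x) = x*e^-x the peak and the expected value differ:

```python
import math

def p(x):
    # lopsided PDF, valid for x >= 0 (integrates to 1)
    return x * math.exp(-x)

n, b = 200_000, 50.0  # grid resolution and truncated upper limit
h = b / n

# expected value: xbar = integral of x * p(x) dx, approximated by a Riemann sum
xbar = h * sum((k * h) * p(k * h) for k in range(1, n))

# peak (most likely value): where p(x) is largest on the grid
peak = max((k * h for k in range(1, n)), key=p)
# xbar comes out near 2 while the peak sits near 1
```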
Through drawings of two PDFs, show why the standard deviation or variance is as important as the mean
For a thin, sharp PDF the mean is a good guess; for a wide, fat PDF the mean is not necessarily a good guess
How can the standard deviation also be described
as a measure of the typical error epsilon = X - xbar, the deviation of the RV from its mean
What is the expected value E(epsilon)
E(epsilon) = E(X - xbar) = E(X) - E(xbar) = xbar - xbar = 0, since positive and negative errors are equally likely
How can the issue with E(epsilon) be avoided
Since E(epsilon) = 0 always, use E(epsilon^2) instead, which equals the variance sigma^2
What is E(epsilon^2) equal to
sigma^2 = E(epsilon^2) = E((X - xbar)^2) = E(X^2) - E(2*xbar*X) + E(xbar^2) = E(X^2) - 2*xbar^2 + xbar^2 = E(X^2) - xbar^2
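The shortcut sigma^2 = E(X^2) - xbar^2 can be checked against the direct definition E((X - xbar)^2) for a true die (a sketch with exact fractions):

```python
from fractions import Fraction

faces = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)  # each face equally likely

xbar = sum(x * p for x in faces)      # E(X) = 7/2
E_X2 = sum(x * x * p for x in faces)  # E(X^2) = 91/6

var_shortcut = E_X2 - xbar ** 2                       # E(X^2) - xbar^2
var_direct = sum((x - xbar) ** 2 * p for x in faces)  # E((X - xbar)^2)
# both give 35/12
```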
What is E(xbar) equal to
xbar; it is the expectation of a constant (the average), and taking the expectation of a constant leaves it unchanged
What is variance equal to for continous RVs
sigma^2 = integral from -inf to inf of (x - xbar)^2 p(x) dx
What is the variance comparable to
a Euclidean distance: a measure of how far points lie from the mean
What is standard deviation
sigma which is the square root of the variance
it measures the width of a distribution, in the same way as the mean says where the centre is
What is the gaussian or normal distribution
The most important of all distributions; it is completely fixed by knowledge of its mean and variance
What is the equation for the Gaussian distribution
p(x) = (1/(sigma*sqrt(2*pi))) * exp(-(x - xbar)^2 / (2*sigma^2)) (see powerpoint)
Why is the Gaussian distribution so important
the central limit theorem: when independent random variables are summed, their properly normalised sum tends toward a normal distribution even if the original variables themselves are not normally distributed
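The central limit theorem can be seen in a short simulation (a sketch; the choice of 12 uniform terms and 50,000 samples is arbitrary): sums of uniform variables, which are not Gaussian individually, give a normalised sum with mean near 0 and standard deviation near 1:

```python
import random
import statistics

random.seed(1)  # reproducible samples

def clt_sample(n_terms=12):
    # sum of i.i.d. uniform(0, 1) variables, each with mean 1/2 and variance 1/12
    s = sum(random.random() for _ in range(n_terms))
    # normalise: subtract the sum's mean n/2, divide by its std sqrt(n/12)
    return (s - n_terms / 2) / (n_terms / 12) ** 0.5

samples = [clt_sample() for _ in range(50_000)]
m = statistics.fmean(samples)   # close to 0, as for a standard normal
s = statistics.pstdev(samples)  # close to 1
```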
What is a stochastic process
A family of RVs Xt parameterised by t (usually time); the set of times can be continuous or discrete
Each RV takes values from a probability distribution pt(x)
In general the RV at time t could depend on values at previous times, but often Xt is assumed independent of previous values
What is an identically distributed stochastic process
One where the PDFs pt(x) are identical for all values of t
What are the most important stochastic processes
Those which are independent and identically distributed (i.i.d.), so only a single PDF p(x) is needed
When creating a classification tree for random signals, what creates the first branch
whether or not the process is independent and identically distributed (i.i.d.) with a single PDF p(x)
Draw a complete classification tree for random signals
See powerpoint
Random -> i.i.d and non i.i.d
i.i.d -> gaussian and non gaussian i.i.d
How are the underlying distributions pt(x) or p(x) calculated for a random signal
If not specified by the physics, this is difficult
in practice one mainly wants the lower-order moments, e.g. the mean and variance
How is ensemble averaging performed
In the general case, generate several example realisations of the process starting from the same initial conditions:
x^(i)(t), i = 1, …, Np, where the Np realisations come from a set of identical experiments
Mean of the RV X(t) is given by
E[Xt] = 1/Np * sum of i=1 to Np of x^(i)(t)
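A sketch of ensemble averaging on a made-up process (the drift-plus-noise model here is my own illustration, not from the lecture):

```python
import random

random.seed(2)  # reproducible realisations

Np, T = 200, 50  # number of identical experiments, number of time samples

# Np realisations x^(i)(t): a deterministic drift t plus unit Gaussian noise
realisations = [[t + random.gauss(0, 1) for t in range(T)] for _ in range(Np)]

# ensemble average: at each fixed t, average across the Np realisations
ensemble_mean = [sum(x[t] for x in realisations) / Np for t in range(T)]
# the noise averages out, so ensemble_mean[t] tracks the underlying drift t
```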
How does one get E[Xt] for a random signal
through the ensemble average
the individual x^(i)(t) are called realisations
What is the issue with ensemble averaging
You might only have one experimental rig, so only one realisation is available
What is an ergodic process
One for which time and ensemble averages are equivalent
If a process is ergodic
if the process is i.i.d. or suitably periodic (i.e. repeats in chunks of period T), then pt(x) is the same for all t (or over a period T), and one can estimate the statistics by averaging over time: E[Xt] = 1/T * integral from 0 to T of x(t) dt
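For an i.i.d. (hence ergodic) process, the time average over one long realisation stands in for the ensemble average; a sketch with made-up Gaussian samples of known mean 5:

```python
import random

random.seed(3)  # reproducible realisation

T = 100_000
# one long realisation of an i.i.d. process: every sample drawn from N(5, 2^2)
x = [random.gauss(5.0, 2.0) for _ in range(T)]

# time average (1/T) * sum of x(t); for an ergodic process this
# estimates the ensemble mean E[Xt], which here is 5
time_average = sum(x) / T
```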
What is a stationary signal
One in which the statistical moments do not change with time
If only the mean and standard deviation are constant, the signal is called weakly stationary
How can i.i.d. signals be automatically classified
They are automatically ergodic and thus stationary
Draw a diagram of a signal which is nonstationary in the mean
the mean value increases with time
see presentation
Draw a diagram of a signal which is nonstationary in the variance
the average amplitude of the signal gradually increases
see presentation