Random Lecture 1 Flashcards

1
Q

What does deterministic mean in physics

A

Given the initial conditions and the underlying physical laws governing the evolution, one can predict the state of the system at all later times

2
Q

What is the probability of an event occurring

A
P(E) is defined in terms of multiple trials as
P(E) = lim as N(S) → ∞ of N(E)/N(S)
where N(S) is the number of times the situation occurs and N(E) is the number of times the event E follows
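The limiting-frequency definition above can be sketched in code. This is a hypothetical simulation (not part of the lecture): the estimate N(E)/N(S) approaches the true probability as the number of trials grows.

```python
import random

# Hypothetical sketch: estimate P(E) = lim N(E)/N(S) by simulating die casts.
def estimate_probability(event, n_trials, seed=0):
    """Return N(E)/N(S): the fraction of n_trials casts where `event` occurs."""
    rng = random.Random(seed)
    n_event = sum(1 for _ in range(n_trials) if event(rng.randint(1, 6)))
    return n_event / n_trials

# The estimate approaches the true value 1/6 as N(S) grows.
p_small = estimate_probability(lambda x: x == 6, 100)
p_large = estimate_probability(lambda x: x == 6, 100_000)
```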
3
Q

What do P(E) = 1 and P(E) = 0 mean

A

P(E) = 1: the event is certain
P(E) = 0: the event is impossible
In reality, from a finite number of trials one cannot assert that an estimated probability of 0 indicates impossibility

4
Q

What is the probability of throwing a 6 P(6)

A

For a true die P(6) = 1/6

5
Q

If two events are mutually exclusive then

A

The occurrence of one precludes the occurrence of the other, i.e. both cannot happen at the same time (e.g. the probability of turning left or turning right: you cannot do both at once). In this case
P(E1 ∪ E2) = P(E1) + P(E2)

6
Q

What does the U symbol (∪) represent in probability

A

The logical or operation, so P(E1 ∪ E2) is the probability that event E1 or event E2 occurs

7
Q

If E1 and E2 are not mutually exclusive, what is P(E1 ∪ E2)

A

P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2)

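The inclusion-exclusion formula can be checked exactly by counting outcomes on a fair die. This is a hypothetical example (the events E1 = "even throw" and E2 = "at least 4" are my choices, not the lecture's).

```python
from fractions import Fraction

# Hypothetical check of P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2) on a fair die.
outcomes = range(1, 7)
E1 = {x for x in outcomes if x % 2 == 0}  # even throw: {2, 4, 6}
E2 = {x for x in outcomes if x >= 4}      # at least 4: {4, 5, 6}

def P(event):
    """Exact probability: favourable outcomes over total outcomes."""
    return Fraction(len(event), 6)

lhs = P(E1 | E2)                   # P(E1 ∪ E2) counted directly
rhs = P(E1) + P(E2) - P(E1 & E2)   # inclusion-exclusion
```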
8
Q

What does the symbol ∩ (upside-down U) represent

A

The logical and operation, or set-theoretic intersection

9
Q

Two events are statistically independent if

A

the occurrence of one in no way influences the probability of the other

10
Q

For statistically independent events, what is P(E1 ∩ E2)

A

P(E1 ∩ E2) = P(E1) × P(E2)

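The product rule for independent events can also be verified by exact enumeration, here with two independent dice (a hypothetical example; the events are my choices).

```python
from fractions import Fraction
from itertools import product

# Hypothetical check of P(E1 ∩ E2) = P(E1) · P(E2) for two independent dice.
sample_space = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs

E1 = {(a, b) for (a, b) in sample_space if a == 6}       # first die shows 6
E2 = {(a, b) for (a, b) in sample_space if b % 2 == 1}   # second die is odd

def P(event):
    return Fraction(len(event), len(sample_space))

product_rule = P(E1) * P(E2)   # 1/6 · 1/2 = 1/12
joint = P(E1 & E2)             # counted directly
```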
11
Q

What are random variables (RVs)

A

A variable whose value is random, but for which the probability of each value can be determined, like the throw of a die

12
Q

What's the difference between discrete and continuous random variables

A

Discrete RVs take specific values; continuous RVs take a continuous range of values

13
Q

What is the issue with continuous random variables and how do we solve it

A

If continuous, there are an infinite number of possible outcomes, yet
P(A1) + P(A2) + … + P(Ai) = 1
so if i is infinite the individual probabilities must all be zero.
The solution is to specify a range of values

14
Q

What is the probability density function (PDF)

A

p(x)dx is the probability that X takes a value between x and x + dx

15
Q

What will a probability density function (PDF) look like

A

A curve peaking at the most likely value and returning to 0 for impossible values

16
Q

How can you find the most probable result for a PDF

A

It will be at the peak (mode) of the PDF.

The probability of a result in a range is P(a ≤ X ≤ b) = ∫ from a to b of p(x) dx

17
Q

Geometrically, what is the probability of a result in a certain range for a PDF

A

The area under the graph (which is why the result for a single number is zero: the area under a point is zero)

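"Probability is area under the PDF" can be checked numerically. A hypothetical sketch using the density p(x) = x·exp(−x) for x ≥ 0 (which integrates to 1) and a simple trapezoidal rule:

```python
import math

# Hypothetical numeric check that probability is area under the PDF,
# using the density p(x) = x·exp(−x) for x ≥ 0, which integrates to 1.
def p(x):
    return x * math.exp(-x)

def area(a, b, n=100_000):
    """Trapezoidal estimate of the integral of p(x) from a to b."""
    h = (b - a) / n
    total = 0.5 * (p(a) + p(b)) + sum(p(a + k * h) for k in range(1, n))
    return total * h

total_area = area(0.0, 50.0)   # all possible outcomes: ≈ 1
p_range = area(1.0, 2.0)       # P(1 ≤ X ≤ 2): the area over that range
p_point = area(1.5, 1.5)       # a single value: zero width, zero probability
```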
18
Q

What must the total area underneath a PDF be equal to

A

∫ from −∞ to ∞ of p(x) dx = 1, as this is the probability of all possible outcomes.
This means p(x) must tend to zero as x tends to ±∞

19
Q

How would you create a PDF

A

PDFs are for continuous variables, so repeat the test a number of times, count the results falling in certain ranges to create a histogram, then connect the midpoints of each bar in the graph

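The histogram construction above can be sketched in code. A hypothetical example (Gaussian samples are my choice): normalising counts by N · bin_width makes the bar heights approximate p(x), so the total area is about 1.

```python
import random

# Hypothetical sketch of estimating a PDF: repeat the experiment many times,
# histogram the results, and normalise by N · bin_width so bar heights
# approximate p(x).
rng = random.Random(1)
samples = [rng.gauss(0.0, 1.0) for _ in range(50_000)]

lo, hi, n_bins = -4.0, 4.0, 40
width = (hi - lo) / n_bins
counts = [0] * n_bins
for x in samples:
    if lo <= x < hi:
        counts[int((x - lo) / width)] += 1

# p_hat[k] approximates p(x) at the midpoint of bin k
p_hat = [c / (len(samples) * width) for c in counts]
area = sum(height * width for height in p_hat)
```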
20
Q

General formula for expected values of discrete RVs

A

E(X) = Σ over i of xi · P(X = xi)

The expected value is the sum, over each possible outcome, of its value multiplied by its probability of occurring

21
Q

What is E(single cast of a true die) when all outcomes are equally likely

A

E = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5
One could also do it the long way by the formula: 1 × 1/6 + 2 × 1/6 + 3 × 1/6 + … etc., but the expected value is just the arithmetic mean when all outcomes are equally likely
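Both routes to the die's expected value can be checked exactly, a minimal sketch using exact fractions:

```python
from fractions import Fraction

# Hypothetical check of E(X) = Σ xi·P(X = xi) for a single cast of a true die.
values = range(1, 7)
p = Fraction(1, 6)                         # all outcomes equally likely

expected = sum(x * p for x in values)      # the "long way"
mean = Fraction(sum(values), len(values))  # arithmetic-mean shortcut
```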

22
Q

How is the expected value of RV X often referred to

A

The mean, denoted by x̄

23
Q

How is the expectation of a continuous RV formally written

A
x̄ = E(X) = ∫ from −∞ to ∞ of x · p(x) dx
where p(x) is the PDF of the random variable X.
The integral need only be taken over the range of possible values of x, but the limits are usually written as ±∞
24
Q

What does the expected value of a RV not need to be the same as

A

The peak value; this can occur when the PDF is unbalanced to one side

25
Q

Draw a diagram of a PDF where the peak value and expected value are not necessarily the same

A

A lopsided PDF

The form x·e^(−x) often works

26
Q

Through drawing two PDFs, show why standard deviation or variance is as important as the mean

A

For a thin, sharp PDF the mean is a good guess; for a wide, fat PDF the mean is not necessarily a good guess

27
Q

How can the standard deviation also be described

A

As the expected value of the error, E(ε), where the error ε = X − x̄

28
Q

What is E(ε) equal to

A
The error ε = X − x̄ is the difference between the RV and its mean, therefore
E(ε) = E(X − x̄) = E(X) − E(x̄) = x̄ − x̄ = 0, as positive and negative errors are equally likely
29
Q

How can the issue with E(ε) be avoided

A

Since E(ε) = 0, use E(ε²) instead, which is equal to the variance σ²

30
Q

What is E(ε²) equal to

A

σ² = E(ε²) = E((X − x̄)²) = E(X²) − 2x̄·E(X) + x̄² = E(X²) − x̄²
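The identity E((X − x̄)²) = E(X²) − x̄² can be verified exactly for a fair die, a minimal sketch with exact fractions:

```python
from fractions import Fraction

# Hypothetical check that E((X − x̄)²) equals E(X²) − x̄² for a fair die.
values = range(1, 7)
p = Fraction(1, 6)

xbar = sum(x * p for x in values)                       # mean = 7/2
var_direct = sum((x - xbar) ** 2 * p for x in values)   # E((X − x̄)²)
var_short = sum(x * x * p for x in values) - xbar ** 2  # E(X²) − x̄²
```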

31
Q

What is E(x̄) equal to

A

The expectation of the mean: since x̄ is just a number, taking the expectation does not change it, so it stays x̄

32
Q

What is the variance equal to for continuous RVs

A

σ² = ∫ from −∞ to ∞ of (x − x̄)² p(x) dx

33
Q

What is the variance comparable to

A

The squared Euclidean distance of a point from the mean

34
Q

What is standard deviation

A

σ, the square root of the variance.

It measures the width of a distribution, in the same way as the mean says where the centre is

35
Q

What is the Gaussian or normal distribution

A

The most important of all distributions; it is completely fixed by knowledge of its mean and variance

36
Q

What is the equation for the Gaussian distribution

A

See powerpoint. (The standard form is p(x) = 1/(σ√(2π)) · exp(−(x − x̄)²/(2σ²)).)

37
Q

Why is the Gaussian distribution so important

A

The central limit theorem: when independent random variables are summed, their properly normalised sum tends toward a normal distribution, even if the original variables themselves are not normally distributed
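The central limit theorem can be illustrated with a hypothetical simulation (the choice of uniform variables and sample sizes is mine): normalised sums of non-Gaussian variables behave like a standard normal.

```python
import random
import statistics

# Hypothetical illustration of the central limit theorem: properly normalised
# sums of i.i.d. uniform variables (which are not Gaussian) look normal.
rng = random.Random(2)
n_terms, n_sums = 30, 20_000

def normalised_sum():
    # A U(0,1) variable has mean 1/2 and variance 1/12, so the sum of
    # n_terms of them has mean n_terms/2 and variance n_terms/12.
    s = sum(rng.random() for _ in range(n_terms))
    return (s - n_terms / 2) / (n_terms / 12) ** 0.5

sums = [normalised_sum() for _ in range(n_sums)]
m = statistics.mean(sums)
sd = statistics.pstdev(sums)
within_1sd = sum(1 for s in sums if abs(s) <= 1) / n_sums  # ≈ 0.683 if normal
```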

38
Q

What is a stochastic process

A

A family of RVs Xt parameterised by t (usually time); the set of times can be continuous or discrete.
Each RV takes values from a probability distribution pt(x).
In general the RV at time t could depend on its values at previous times, but often Xt is assumed independent of previous values

39
Q

What is an identically distributed stochastic process

A

One where the PDFs pt(x) are identical for all values of t

40
Q

What are the most important stochastic processes

A

Those which are independent and identically distributed (i.i.d.); then you only need one PDF, p(x)

41
Q

When creating a classification tree for random signals, what creates the first branch

A

Whether the stochastic process is independent and identically distributed (with a single PDF p(x)) or not

42
Q

Draw a complete classification tree for random signals

A

See powerpoint
Random → i.i.d. and non-i.i.d.
i.i.d. → Gaussian and non-Gaussian i.i.d.

43
Q

How are the underlying distributions pt(x) or p(x) calculated for a random signal

A

If not specified by the physics, with difficulty.

In practice one mainly wants the lower-order moments, e.g. mean and variance

44
Q

How is ensemble averaging performed

A

In the general case, generate several example realisations of the process starting from the same initial conditions:
x^(i)(t), i = 1, …, Np, where the Np realisations come from a set of identical experiments.
The mean of the RV X(t) is then estimated as
E[Xt] ≈ 1/Np · Σ from i = 1 to Np of x^(i)(t)
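Ensemble averaging can be sketched with a hypothetical process (a ramp plus noise, my choice of example): average across the Np realisations at each fixed time t.

```python
import random

# Hypothetical sketch of ensemble averaging: run Np identical "experiments",
# record each realisation x^(i)(t), then average across realisations at each t.
n_realisations, n_steps = 500, 100

def realisation(seed):
    """One noisy signal: a deterministic ramp 0.1·t plus zero-mean noise."""
    rng = random.Random(seed)
    return [0.1 * t + rng.gauss(0.0, 1.0) for t in range(n_steps)]

ensemble = [realisation(i) for i in range(n_realisations)]

# E[Xt] ≈ (1/Np) · Σ x^(i)(t), estimated independently at every time t
ensemble_mean = [sum(x[t] for x in ensemble) / n_realisations
                 for t in range(n_steps)]
```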

45
Q

How does one get E[Xt] for a random signal

A

Through the ensemble average.

The individual x^(i)(t) are called realisations

46
Q

What is the issue with ensemble averaging

A

One might only have one experimental rig

47
Q

What is an ergodic process

A

One for which the time and ensemble averages are equivalent

48
Q

If a process is ergodic

A
If the process is i.i.d. or special (i.e. repeats in chunks of period T),
then pt(x) is the same for all t (or over a period T) and one can estimate the statistics by averaging over time:
E[Xt] ≈ 1/T · ∫ from 0 to T of x(t) dt
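For an i.i.d. process the time average of a single long realisation should agree with the ensemble average, a hypothetical check (Gaussian samples with mean 2 are my choice):

```python
import random

# Hypothetical check of ergodicity for an i.i.d. process: the time average of
# one long realisation agrees with the ensemble average at a single instant.
rng = random.Random(3)
true_mean = 2.0

# Time average over one realisation of length T
T = 50_000
time_avg = sum(rng.gauss(true_mean, 1.0) for _ in range(T)) / T

# Ensemble average over Np independent realisations at one fixed time
Np = 50_000
ensemble_avg = sum(rng.gauss(true_mean, 1.0) for _ in range(Np)) / Np
```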
49
Q

What is a stationary signal

A

One in which the statistical moments do not change with time.

If only the mean and standard deviation are constant, the signal is called weakly stationary

50
Q

How can an i.i.d. signal be automatically classified

A

It is automatically ergodic and thus stationary

51
Q

Draw a diagram of a signal in which it is nonstationary in the mean

A

mean value increases

see presentation

52
Q

Draw a diagram of a signal in which it is nonstationary in the variance

A

The average amplitude of the signal gradually increases

see presentation