Random Variables Flashcards

1
Q

Definition Random Variables

A

A random variable is an object, usually a number or a vector, whose value depends on the outcome of a random experiment.

Definition
Let
–> S be the sample space of a probability model
–> E be a set
A function:
X : S –> E
s –> X(s)
is called a random variable with values in E

2
Q

Distribution of X

A

To every subset B of E we associate

P(X ∈ B) = µx(B)

It can be shown that µx is a probability on the subsets of E, called the distribution of X

3
Q

Discrete Random Variable

A

A random variable X is said to be discrete if it can attain only finitely or countably many values.
More precisely, there exists a finite or countable subset N of E such that

P(X ∈ N) = 1

4
Q

Discrete Random Variable : countable elements

A

A set N is countable if its elements can be enumerated in a sequence

N = {x1, x2, x3, …}

–> countable sets:
- integer numbers
- rational numbers
–> not countable:
- real numbers

5
Q

Discrete density

A

If X is a discrete random variable, define
px : E –> [0,1]
x –> P(X=x)

px(x) = P(X=x)

px is called the probability mass function, or discrete density, of X

By σ-additivity of the probability we have, for every subset A of E,

µx(A) = P(X ∈ A) = Σ P(X=x) = Σ px(x)   (sums over x ∈ A)
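A minimal Python sketch of this formula, using a hypothetical fair-die density (not from the deck): the probability of an event is the sum of the density over it.

```python
# mu_X(A) = sum of p_X(x) over x in A (sigma-additivity)
p = {x: 1 / 6 for x in range(1, 7)}  # p_X(x) = P(X = x) for a fair die

def mu(A, p):
    # P(X in A) = sum of P(X = x) for x in A
    return sum(p.get(x, 0.0) for x in A)

even = {2, 4, 6}
prob_even = mu(even, p)  # probability the roll is even (≈ 0.5)
```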

6
Q

Expected value (or mean value)

A

Let X be a discrete random variable with values in R

Denote by
N = {x1, x2, …}

the finite or countable set of its values

The expected value of X is defined by

E(X) = Σ xi px(xi) = Σ x px(x)
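The definition translates directly into code. A sketch with a small hypothetical density (the values and probabilities are illustration only):

```python
# E(X) = sum over x of x * p_X(x)
p = {0: 0.2, 1: 0.5, 2: 0.3}  # hypothetical discrete density

def expectation(p):
    return sum(x * px for x, px in p.items())

e = expectation(p)  # 0*0.2 + 1*0.5 + 2*0.3 = 1.1
```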

7
Q

Properties of expectation

A

1) Constant random variables
2) Invariance by linear transformation
3) E[f(X)] = Σ f(xi) px(xi)
4) Linearity

8
Q

Property of expectation (1): constant random variable

A

Let c be a real number. A constant can be viewed as a random variable that attains the value c with probability 1. So:

E(c) = c * 1 = c

9
Q

Property of expectation (2): Invariance by linear transformation

A

Suppose
–> X is a:
- discrete
- real valued
- random variable

–> a,b ϵ R

Then

E(aX + b) = a * E(X) + b

10
Q

Property of expectation (3)

A

Suppose X is a:
- discrete random variable
- with values in the set E

Let f : E –> R

Remark: a random variable is itself a function from the sample space to E, X : S –> E

S –X–> E –f–> R
so f(X) is the composition of the two functions, applied one after the other

E[f(X)] = Σ f(xi) px(xi)

How to read this formula:
if X is a discrete random variable with density px, then the expectation of any transformation f(X) can be computed from px alone, without first finding the density of f(X)
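A sketch of this property with a hypothetical symmetric density and the transformation f(x) = x², computed entirely from the density of X:

```python
# E[f(X)] = sum f(x_i) p_X(x_i) -- no density of f(X) needed
p = {-1: 0.25, 0: 0.5, 1: 0.25}  # hypothetical density of X

def expectation_of(f, p):
    return sum(f(x) * px for x, px in p.items())

e_sq = expectation_of(lambda x: x ** 2, p)  # E[X^2] = 0.25 + 0 + 0.25 = 0.5
```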

11
Q

Property of expectation (4): Linearity

A

Suppose:
X and Y are two real valued random variables defined on the same sample space S

If a,b ϵ R

E(aX+bY) = aE(X) + bE(Y)

12
Q

Moments of X

A

Suppose
X is a discrete real valued random variable
For m>1 consider the random variable X^m

By property 3

E(X^m)= Σ xi^m px(xi)

13
Q

Variance of X

A

Let µ = E(X). The variance measures how spread out X is around its mean: high variance indicates that X attains values far from its expectation with significant probability

Var(X) = E[(X - µ)²] = Σ (xi - µ)² px(xi)
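A sketch comparing two hypothetical densities with the same mean (5) but very different spread, to see what the formula measures:

```python
# Var(X) = sum (x_i - mu)^2 * p_X(x_i), with mu = E(X)
def mean(p):
    return sum(x * px for x, px in p.items())

def variance(p):
    mu = mean(p)
    return sum((x - mu) ** 2 * px for x, px in p.items())

spread_out = {0: 0.5, 10: 0.5}  # values far from the mean 5
clustered = {4: 0.5, 6: 0.5}    # values close to the mean 5

v1, v2 = variance(spread_out), variance(clustered)  # 25.0 vs 1.0
```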

14
Q

Standard deviation

A

SD(X) = √Var(X)

15
Q

Properties of Variance

A

1) Var(X) = E(X²) - (E(X))²
2) Var(aX + b) = a² * Var(X)
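Both properties can be checked numerically on a hypothetical density (the values, a, and b are arbitrary illustration choices):

```python
p = {1: 0.2, 2: 0.3, 5: 0.5}  # hypothetical density

def mean(p):
    return sum(x * px for x, px in p.items())

def variance(p):
    m = mean(p)
    return sum((x - m) ** 2 * px for x, px in p.items())

# (1) Var(X) = E(X^2) - (E(X))^2
e_x2 = sum(x ** 2 * px for x, px in p.items())
gap1 = abs(variance(p) - (e_x2 - mean(p) ** 2))

# (2) Var(aX + b) = a^2 * Var(X): transform the values, keep the probabilities
a, b = 3, 7
p_ab = {a * x + b: px for x, px in p.items()}
gap2 = abs(variance(p_ab) - a ** 2 * variance(p))
```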

16
Q

Bernoulli random variables

A

A Bernoulli random variable only takes the values 0 and 1
Notation –> X~Be(p)
P(X=1) = p
P(X=0) = 1-p

E(X) = p
Var(X) = p(1-p)

17
Q

Binomial random variables

A

Consider a random experiment of repeated independent trials, each with probability of success p
Denote by X the number of successes in the first n trials

px(k) = P(X=k) = C(n,k) * p^k * (1-p)^(n-k)

where C(n,k) is the binomial coefficient "n choose k"

Notation –> X~Bin(n,p)

E(X) = np
Var(X) = np(1-p)

E(X^m) = np * E[(Y+1)^(m-1)]
(with Y~Bin(n-1,p))
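A sketch of the binomial density using the standard-library `math.comb`, checking that the density sums to 1 and that E(X) = np (n and p here are arbitrary illustration values):

```python
import math

def binom_pmf(k, n, p):
    # p_X(k) = C(n, k) * p^k * (1-p)^(n-k)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

n, p = 10, 0.3
total = sum(binom_pmf(k, n, p) for k in range(n + 1))       # ≈ 1
mean_x = sum(k * binom_pmf(k, n, p) for k in range(n + 1))  # ≈ n*p = 3
```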

18
Q

Poisson random variables

A

We say that X is a Poisson random variable with parameter λ>0, and we write

X~Pois(λ)

px(n) = P(X=n) = (e^-λ) * (λ^n)/(n!)

Note: this is a density since
Σ (λ^n)/(n!) = e^λ, so Σ px(n) = 1

Poisson random variables emerge as approximations of binomial random variables
This approximation is used when:
▪ p«1 and n»1
▪ np = λ is neither small nor large
▪ np²«1

E(X) = λ
Var(X) = λ

E(X^m) = λ E((X+1)^(m-1))
If m = 2 then E(X²) = λ²+λ
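A sketch of the approximation above: for small p and large n with λ = np moderate, the Bin(n, p) and Pois(λ) densities nearly agree (n = 1000, p = 0.003 are arbitrary values satisfying the three conditions):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def pois_pmf(k, lam):
    # p_X(k) = e^(-lambda) * lambda^k / k!
    return math.exp(-lam) * lam ** k / math.factorial(k)

n, p = 1000, 0.003      # p << 1, n >> 1, n*p^2 = 0.009 << 1
lam = n * p             # lambda = 3, neither small nor large
gaps = [abs(binom_pmf(k, n, p) - pois_pmf(k, lam)) for k in range(10)]
max_gap = max(gaps)     # small: the two densities are pointwise close
```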

19
Q

Geometric random variables

A

Consider repeated independent trials with probability of success p, and denote by X the number of the trial on which the first success occurs

X = n <–> the first n-1 trials are failures and the nth trial is a success

px(n) = P(X=n) = p * (1-p)^(n-1)

X~Geo(p)

E(X) = 1/p
Var(X) = (1-p)/p²
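The formula E(X) = 1/p can be checked by truncating the series Σ n · p(1-p)^(n-1), which converges quickly (p = 0.25 is an arbitrary choice):

```python
p = 0.25

def geo_pmf(n, p):
    # p_X(n) = p * (1-p)^(n-1)
    return p * (1 - p) ** (n - 1)

# truncated series for E(X); the tail beyond n = 500 is negligible
approx_mean = sum(n * geo_pmf(n, p) for n in range(1, 500))  # ≈ 1/p = 4
```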

20
Q

Loss of memory property

A

If X~Geo(p), then for all n, k ≥ 0

P( X > n+k | X > n ) = P( X > k )

knowing that the first n trials were failures does not change the distribution of the remaining waiting time
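A numerical check of this identity for a geometric variable, using P(X > n) = (1-p)^n (the first n trials are all failures); p, n, k are arbitrary illustration values:

```python
p = 0.25

def tail(n, p):
    # P(X > n) = (1-p)^n for X ~ Geo(p)
    return (1 - p) ** n

n, k = 3, 5
lhs = tail(n + k, p) / tail(n, p)  # P(X > n+k | X > n)
rhs = tail(k, p)                   # P(X > k)
# lhs == rhs up to rounding: the distribution forgets past failures
```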