Probability Theory Flashcards
What is a probability mass function (pmf) and how is it defined for a discrete random variable?
A probability mass function (pmf) is a function p from the sample space to the non-negative reals such that the sum of p over all points in the domain equals 1. It defines the probability distribution of a discrete random variable: p(x) is the probability P(X = x).
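A minimal sketch of this definition in Python (an illustrative example, not taken from the card itself), using a fair six-sided die as the discrete random variable:

    # pmf of a fair die: p(x) = 1/6 for x in {1, ..., 6}
    from fractions import Fraction

    pmf = {x: Fraction(1, 6) for x in range(1, 7)}

    # Defining properties: non-negative values that sum to 1 over the domain.
    assert all(p >= 0 for p in pmf.values())
    assert sum(pmf.values()) == 1

    # The pmf gives probabilities of individual points, e.g. P(X = 3):
    print(pmf[3])  # 1/6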
How is a continuous random variable characterized in terms of probability distribution?
A continuous random variable is characterized by its probability density function (pdf), a function from the sample space to the non-negative reals. Integrating the pdf over a set of outcomes gives the probability of that set, and its integral over the whole domain equals 1.
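To make the integral characterization concrete, here is a small numeric sketch (illustrative, with the standard normal density chosen as an example): the density integrates to approximately 1 over a wide interval, and its integral over [-1, 1] gives P(-1 <= X <= 1) of roughly 0.6827.

    import math

    def phi(x):
        # standard normal density
        return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

    def integrate(f, a, b, n=100_000):
        # simple trapezoidal rule
        h = (b - a) / n
        return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

    print(integrate(phi, -10, 10))  # ~ 1.0: total probability
    print(integrate(phi, -1, 1))    # ~ 0.6827: P(-1 <= X <= 1)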
What is the moment-generating function of a random variable and its significance?
The moment-generating function of a random variable X is defined as M(t) = E[e^(tX)], where t is a parameter. It encodes all the moments (k-th moments) of the variable, providing a single object from which its full moment structure can be recovered.
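As a quick check of the definition (a sketch assuming X ~ Bernoulli(p), an example not from the card): the closed-form mgf (1 - p) + p * e^t should match a Monte Carlo estimate of E[e^(tX)].

    import math, random

    random.seed(0)
    p, t = 0.3, 0.5
    samples = [1 if random.random() < p else 0 for _ in range(200_000)]
    monte_carlo = sum(math.exp(t * x) for x in samples) / len(samples)
    closed_form = (1 - p) + p * math.exp(t)
    print(monte_carlo, closed_form)  # agree to ~2-3 decimal places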
What does the Law of Large Numbers indicate about a large number of trials in an experiment?
The Law of Large Numbers indicates that as the number of trials in an experiment increases, the average of the results obtained from the trials will converge to the expected value, demonstrating the stability of long-term results in random experiments.
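A minimal simulation sketch (illustrative): the running average of fair coin flips, encoded as 0/1, drifts toward the expected value 0.5 as the number of trials grows.

    import random

    random.seed(1)
    flips = [random.randint(0, 1) for _ in range(1_000_000)]
    for n in (100, 10_000, 1_000_000):
        print(n, sum(flips[:n]) / n)  # successive averages approach 0.5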
Describe the Central Limit Theorem and its significance in probability theory.
The Central Limit Theorem states that the distribution of the suitably standardized sum (or average) of a large number of independent, identically distributed random variables, each with finite mean and variance, approaches a normal distribution, regardless of the underlying distribution.
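A simulation sketch (illustrative, using Uniform(0, 1) as the underlying distribution): averages of 30 uniform draws are approximately normal with mean 0.5 and standard deviation sqrt(1/12)/sqrt(30), so about 68% of them should fall within one standard deviation of the mean.

    import math, random

    random.seed(2)
    n, reps = 30, 50_000
    sd = math.sqrt(1 / 12) / math.sqrt(n)  # sd of the sample mean
    means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]
    inside = sum(abs(m - 0.5) <= sd for m in means) / reps
    print(inside)  # ~ 0.68, matching the normal one-sigma rule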
How is a log-normal distribution defined and what is its relationship to the normal distribution?
A log-normal distribution is defined for a random variable whose logarithm is normally distributed. It is derived from a normal distribution using a change of variable formula, and is used to model distributions where the values are positively skewed, such as financial asset prices.
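A short sketch of the change of variable (illustrative parameters mu = 0, sigma = 0.5): exponentiating normal draws yields positive, positively skewed values whose median is near e^mu, the median of the log-normal distribution.

    import math, random

    random.seed(3)
    mu, sigma = 0.0, 0.5
    ys = sorted(math.exp(random.gauss(mu, sigma)) for _ in range(100_001))
    print(min(ys) > 0)        # True: log-normal values are strictly positive
    print(ys[len(ys) // 2])   # sample median ~ e^mu = 1.0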
What is the difference between mutually independent and pairwise independent events in probability?
Mutual independence means that every finite subcollection of the events is independent, while pairwise independence only ensures that each pair of events is independent. Under pairwise independence, a larger collection of events need not be independent (see the counterexample sketched below).
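The classic counterexample, sketched in code (an assumed example, not from the card): flip two fair coins, and let A = first flip is heads, B = second flip is heads, C = the two flips match. Each pair is independent, but the three events together are not.

    from itertools import product

    outcomes = list(product([0, 1], repeat=2))  # 4 equally likely outcomes
    A = {o for o in outcomes if o[0] == 1}
    B = {o for o in outcomes if o[1] == 1}
    C = {o for o in outcomes if o[0] == o[1]}

    P = lambda E: len(E) / len(outcomes)
    for X, Y in [(A, B), (A, C), (B, C)]:
        print(P(X & Y) == P(X) * P(Y))         # True three times: pairwise holds
    print(P(A & B & C) == P(A) * P(B) * P(C))  # False: 1/4 != 1/8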
Explain the concept of exponential family in probability distributions.
A distribution belongs to the exponential family if its density (or mass) function can be written in the canonical form f(x | theta) = h(x) exp(eta(theta) * T(x) - A(theta)), where h(x) and the sufficient statistic T(x) depend only on x, while the natural parameter eta and the log-normalizer A depend only on the parameters. Distributions in this family have desirable statistical properties.
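A small check of the canonical form (an assumed example: Bernoulli(p), with T(x) = x, eta = log(p / (1 - p)), A(eta) = log(1 + e^eta), and h(x) = 1):

    import math

    p = 0.3
    eta = math.log(p / (1 - p))       # natural parameter
    A = math.log(1 + math.exp(eta))   # log-normalizer

    for x in (0, 1):
        direct = p**x * (1 - p)**(1 - x)     # standard Bernoulli pmf
        expfam = math.exp(eta * x - A)       # h(x) * exp(eta*T(x) - A), h = 1
        print(abs(direct - expfam) < 1e-12)  # True for both x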
What distinguishes a discrete random variable from a continuous random variable?
A discrete random variable is characterized by a probability mass function (pmf) and takes on countably many values, while a continuous random variable is characterized by a probability density function (pdf) and takes values in a continuous range.
What is the role of the sample space in defining a probability distribution?
The sample space is the set of all possible outcomes of a random experiment, and it forms the domain over which the probability mass function (for discrete variables) or probability density function (for continuous variables) is defined.
Define ‘expectation’ in the context of probability theory.
Expectation, or the mean of a random variable, is the weighted average of all possible values that the variable can take on, with each value weighted according to its probability of occurrence.
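A one-line illustration (a fair die, an example not from the card): E[X] = sum of x * P(X = x) = 3.5.

    pmf = {x: 1 / 6 for x in range(1, 7)}      # fair six-sided die
    print(sum(x * p for x, p in pmf.items()))  # 3.5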
How is the independence of two random variables defined?
Two random variables X and Y are independent if events determined by X do not affect the probabilities of events determined by Y; mathematically, P(X in A and Y in B) = P(X in A) * P(Y in B) for all events A and B.
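A sketch by direct enumeration (illustrative: two independent fair dice X and Y, with A = {X is even} and B = {Y >= 5}):

    import math
    from itertools import product

    grid = list(product(range(1, 7), repeat=2))  # 36 equally likely pairs
    p_joint = sum(1 for x, y in grid if x % 2 == 0 and y >= 5) / 36
    p_a = sum(1 for x in range(1, 7) if x % 2 == 0) / 6  # 1/2
    p_b = sum(1 for y in range(1, 7) if y >= 5) / 6      # 1/3
    print(math.isclose(p_joint, p_a * p_b))  # True: 6/36 = (1/2) * (1/3)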
What does it mean for random variables to be mutually independent?
Random variables are mutually independent if every finite subcollection of them is independent, implying that events determined by any one variable do not influence events determined by the other variables in the collection.
Describe pairwise independence in random variables.
Pairwise independence in random variables means that each pair of variables is independent of each other, but it does not necessarily imply independence among larger sets of these variables.
What is a normal distribution and its significance?
A normal distribution is a continuous probability distribution characterized by its bell-shaped density curve, symmetric about the mean. It is significant in probability and statistics because, by the Central Limit Theorem, it arises as the limiting distribution of sums of many independent effects, making it ubiquitous in modeling natural phenomena.
Explain the concept of a uniform random variable.
A uniform random variable has a distribution where all intervals of the same length within its range have an equal probability of occurrence, often represented by a constant probability density function over its interval.
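A simulation sketch (illustrative: U ~ Uniform(0, 1)): subintervals of equal length carry equal probability, so estimates of P(U in [0.0, 0.1]) and P(U in [0.7, 0.8]) should both be near 0.1.

    import random

    random.seed(4)
    us = [random.random() for _ in range(200_000)]
    print(sum(0.0 <= u < 0.1 for u in us) / len(us))  # ~ 0.1
    print(sum(0.7 <= u < 0.8 for u in us) / len(us))  # ~ 0.1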
What is a probability density function (pdf) in the context of continuous random variables?
For continuous random variables, the probability density function (pdf) describes the density of probability over the variable’s range. It is the function whose integral over an interval gives the probability of the variable falling within that interval.
How is the expectation of a continuous random variable calculated?
The expectation of a continuous random variable is calculated as the integral of the product of the variable’s value and its probability density function over the entire range of the variable.
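A numeric sketch (illustrative: X ~ Uniform(0, 1), so f(x) = 1 on [0, 1] and E[X] = 1/2), approximating the integral of x * f(x) with the trapezoidal rule:

    def integrand(x):
        return x * 1.0  # x * f(x), with f(x) = 1 on [0, 1]

    n = 100_000
    h = 1.0 / n
    total = 0.5 * (integrand(0.0) + integrand(1.0))
    total += sum(integrand(i * h) for i in range(1, n))
    print(total * h)  # ~ 0.5 = E[X]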
What is a moment-generating function and how is it related to moments?
A moment-generating function is a function that, when expanded as a Taylor series around t = 0, provides the moments of a probability distribution: the k-th derivative of the moment-generating function at 0 gives the k-th moment, E[X^k].
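A numeric check of the derivative property (an assumed example: X ~ Exponential(rate lam), whose mgf is M(t) = lam / (lam - t) for t < lam, with E[X^k] = k!/lam^k):

    lam = 2.0
    def M(t):
        return lam / (lam - t)

    h = 1e-4
    m1 = (M(h) - M(-h)) / (2 * h)               # ~ M'(0)  = E[X]   = 0.5
    m2 = (M(h) - 2 * M(0.0) + M(-h)) / (h * h)  # ~ M''(0) = E[X^2] = 0.5
    print(m1, m2)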
Explain the relationship between moment-generating functions and the distribution of random variables.
If two random variables have moment-generating functions that exist and agree on an open interval around 0, they have the same distribution; in this sense, the moment-generating function uniquely characterizes the distribution of a random variable whenever it exists.
Why might a moment-generating function not exist for a given distribution?
A moment-generating function might not exist if the expectation E[e^(tX)] fails to converge, as is the case with heavy-tailed distributions such as the log-normal distribution, whose moment-generating function diverges for every t > 0.
How does the Weak Law of Large Numbers relate to the mean of a distribution?
The Weak Law of Large Numbers states that as the sample size increases, the sample mean will converge in probability to the expected value (mean) of the distribution.