Counting and Probability Flashcards
What is the “Rule of sum” in the context of counting theory?
The rule of sum says that the number of ways to choose one element from one of two disjoint sets is the sum of the cardinalities of the sets. | A U B | = |A| + |B|.
What is the “Rule of product” in the context of counting theory?
The rule of product says that the number of ways to choose an ordered pair is the number of ways to choose the first element times the number of ways to choose the second element. | A x B | = |A| · |B|
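A minimal Python sketch of both rules, using two small hypothetical sets chosen only for illustration:

```python
from itertools import product

# Two small, disjoint example sets (hypothetical).
A = {"a", "b", "c"}
B = {1, 2}

# Rule of sum: A and B are disjoint, so |A U B| = |A| + |B|.
assert len(A | B) == len(A) + len(B)

# Rule of product: |A x B| = |A| * |B|.
assert len(list(product(A, B))) == len(A) * len(B)
```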
What is a string in the context of sets? What is a k-string?
A string over a finite set S is a sequence of elements of S. A string of length k is sometimes called a k-string.
What is a set permutation?
A permutation of a finite set S is an ordered sequence of all the elements of S, with each element appearing exactly once.
What is the number of set permutations?
A set that contains n elements has n! permutations.
What is a k-permutation and what is the number of k-permutations of n-set?
A k-permutation of S is an ordered sequence of k elements of S, with no element appearing more than once in the sequence. The number of k-permutations of an n-set is:
n! / (n - k)!
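A short Python check of the formula, using a hypothetical 5-element set and k = 3:

```python
from itertools import permutations
from math import factorial

# Hypothetical example: count the 3-permutations of a 5-element set.
S = {"a", "b", "c", "d", "e"}
n, k = len(S), 3

count = len(list(permutations(S, k)))                     # enumerate the k-permutations
assert count == factorial(n) // factorial(n - k) == 60    # n! / (n - k)!
```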
What is a k-combination of a set?
A k-combination of an n-set S is a k-subset of S. Example:
{a,b,c,d} has six 2-combinations:
ab, ac, ad, bc, bd, cd
What is the number of k-combinations of an n-set?
n! / (k! (n-k)!)
What are binomial coefficients?
(n choose k) denotes the number of k-combinations of an n-set. (n choose k) = n! / (k! (n - k)!)
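A quick Python check of the {a,b,c,d} example above; itertools.combinations enumerates the k-subsets and math.comb evaluates the binomial coefficient:

```python
from itertools import combinations
from math import comb

S = {"a", "b", "c", "d"}
k = 2

subsets = list(combinations(S, k))            # the 2-combinations of S
assert len(subsets) == comb(len(S), k) == 6   # 4! / (2! * 2!) = 6
```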
What is a probability sample space?
A sample space is a set whose elements are called elementary events. An elementary event is a possible outcome of an experiment, so the sample space is the set of all possible outcomes of the experiment.
What are probability axioms?
- Pr {A} >= 0 for any event A
- Pr {S} = 1
- Pr {A U B} = Pr {A} + Pr {B} for any two mutually exclusive events A and B
What is a discrete probability distribution?
A probability distribution is discrete if it is defined over a finite or countably infinite sample space.
What is a uniform probability distribution?
A probability distribution on a sample space S is uniform if S is finite and every elementary event s has probability Pr{s} = 1 / |S|.
What is continuous uniform probability distribution?
The continuous uniform probability distribution is an example of a probability distribution in which not all subsets of the sample space are considered to be events. It is defined over a closed interval [a,b], where a < b. Each point between a and b should be “equally likely”, but there are uncountably many points, so if every point were given the same finite, positive probability, the probability axioms could not be satisfied. That is why the probability is defined only on some of the subsets of S. For any closed interval [c,d], where a <= c <= d <= b, the continuous uniform probability distribution defines the probability of the event [c,d] to be:
Pr{[c,d]} = (d - c) / (b - a)
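A tiny Python sketch of this interval probability; the endpoints below are hypothetical:

```python
def uniform_interval_prob(a, b, c, d):
    """Pr{[c, d]} under the continuous uniform distribution on [a, b]."""
    assert a < b and a <= c <= d <= b
    return (d - c) / (b - a)

# Hypothetical example: the middle half of [0, 10] has probability 0.5.
assert uniform_interval_prob(0, 10, 2.5, 7.5) == 0.5
```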
How is the conditional probability of an event A defined, given that another event B has occurred?
Pr{A | B} = Pr{A ^ B} / Pr{B}, whenever Pr{B} != 0
When are two events independent?
Two events are independent if:
Pr { A ^ B } = Pr{A} Pr{B}
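An exact check in Python: with two fair dice, the hypothetical events “the first die is even” and “the sum is 7” happen to be independent:

```python
from fractions import Fraction

# Sample space: two fair dice, 36 equally likely outcomes.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def pr(event):
    """Probability of an event under the uniform distribution on S."""
    return Fraction(sum(1 for s in S if event(s)), len(S))

def first_is_even(s):   # event A: the first die shows an even number
    return s[0] % 2 == 0

def sum_is_seven(s):    # event B: the two dice sum to 7
    return s[0] + s[1] == 7

# Independent: Pr{A ^ B} = Pr{A} Pr{B}  (1/12 = 1/2 * 1/6)
assert pr(lambda s: first_is_even(s) and sum_is_seven(s)) == pr(first_is_even) * pr(sum_is_seven)
```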
What is Bayes’s theorem?
From the definition of conditional probability and the commutative law A^B = B^A, it follows:
Pr{B}Pr{A | B} = Pr{A}Pr{B | A}
Solving for Pr{A | B} gives Bayes’s theorem: Pr{A | B} = Pr{A}Pr{B | A} / Pr{B}
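A one-function sketch of the formula in Python; the numbers below are hypothetical and chosen only to exercise it:

```python
def bayes(pr_a, pr_b_given_a, pr_b):
    """Pr{A | B} = Pr{A} Pr{B | A} / Pr{B}."""
    return pr_a * pr_b_given_a / pr_b

# Hypothetical inputs: Pr{A} = 0.3, Pr{B | A} = 0.5, Pr{B} = 0.25  =>  Pr{A | B} = 0.6
assert abs(bayes(0.3, 0.5, 0.25) - 0.6) < 1e-12
```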
What is a (discrete) random variable?
A (discrete) random variable X is a function from a finite or countably infinite sample space S to the real numbers. It associates a real number with each possible outcome of the experiment.
What is the probability density function?
For a random variable X and a real number x, we define the event X = x to be {s belongs to S : X(s) = x}; thus,
Pr {X = x} = SUM[s belongs to S : X(s) = x] Pr{s}
The function f(x) = Pr{X = x} is the probability density function.
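A concrete sketch in Python: the density of the sum of two fair dice, computed as exact fractions:

```python
from collections import Counter
from fractions import Fraction

# Sample space: ordered results of rolling two fair dice (36 equally likely outcomes).
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

# Random variable X = sum of the two dice; f(x) = Pr{X = x}.
counts = Counter(i + j for i, j in S)
f = {x: Fraction(c, len(S)) for x, c in counts.items()}

assert f[7] == Fraction(6, 36)   # 7 is the most likely sum
assert sum(f.values()) == 1      # the probabilities sum to 1
```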
What is the expected value of a random variable?
Expected value is the simplest and most useful summary of the distribution of a random variable. It’s the “average” of the values it takes on. The expected value (or, synonymously, expectation or mean) of a discrete random variable X is:
E[X] = SUM[x] x Pr{X = x}
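For example, the expected value of one roll of a fair die (a minimal Python check):

```python
from fractions import Fraction

# Each value 1..6 has probability 1/6, so E[X] = (1 + 2 + ... + 6) / 6 = 7/2.
E_X = sum(x * Fraction(1, 6) for x in range(1, 7))
assert E_X == Fraction(7, 2)
```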
What is linearity of expectation?
The expectation of the sum of two random variables is the sum of their expectations, that is:
E[X + Y] = E[X] + E[Y]
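Linearity requires no independence assumption between X and Y. The sketch below verifies the identity exactly for two fair dice:

```python
from fractions import Fraction

# Two fair dice; each elementary event (i, j) has probability 1/36.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, 36)

E_X   = sum(i * p for i, j in S)         # expectation of the first die
E_Y   = sum(j * p for i, j in S)         # expectation of the second die
E_sum = sum((i + j) * p for i, j in S)   # expectation of their sum

assert E_sum == E_X + E_Y == Fraction(7, 1)
```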
What is variance of random variable X?
Variance mathematically expresses how far from the mean a random variable’s values are likely to be. The variance of a random variable X with mean E[X] is:
Var[X] = E[(X - E[X])^2]
= E[X^2 - 2XE[X] + E^2[X]]
= E[X^2] - 2E[X]E[X] + E^2[X]
= E[X^2] - E^2[X]
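A quick check of Var[X] = E[X^2] - E^2[X] for a fair die, using exact fractions:

```python
from fractions import Fraction

# Single fair die: values 1..6, each with probability 1/6.
p = Fraction(1, 6)
E_X  = sum(x * p for x in range(1, 7))       # E[X]   = 7/2
E_X2 = sum(x * x * p for x in range(1, 7))   # E[X^2] = 91/6

assert E_X2 - E_X ** 2 == Fraction(35, 12)   # Var[X] = 35/12
```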
What is the standard deviation of a random variable X?
The standard deviation of a random variable X is the nonnegative square root of the variance of X.
What are Bernoulli trials?
Bernoulli trials are mutually independent trials, each with exactly two possible outcomes (success and failure), and, unless stated otherwise, each trial has the same probability of success.
What is the geometric distribution?
The geometric distribution describes the number of Bernoulli trials needed to obtain the first success. It satisfies the equation:
Pr {X = k} = q^(k-1) p
Here p is the probability of success, q = 1 - p is the probability of failure, and k is the number of trials up to and including the first success.
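A minimal sketch of this probability in Python, with a hypothetical die-rolling example:

```python
from fractions import Fraction

def geometric_pmf(k, p):
    """Pr{X = k} = q^(k - 1) * p: the first success occurs on trial k."""
    q = 1 - p
    return q ** (k - 1) * p

# Hypothetical example: a fair die shows its first six on exactly the third roll.
assert geometric_pmf(3, Fraction(1, 6)) == Fraction(25, 216)
```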
What is the binomial distribution?
The binomial distribution describes the number of successes in n Bernoulli trials, where each trial succeeds with probability p and fails with probability q = 1 - p.
Pr{X = k} = (n choose k)(p^k)(q^(n-k))
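A short Python sketch, with hypothetical coin-flip numbers:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr{X = k} = (n choose k) * p^k * q^(n - k), with q = 1 - p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Hypothetical example: exactly 2 heads in 4 fair coin flips.
assert abs(binomial_pmf(2, 4, 0.5) - 6 / 16) < 1e-12

# The probabilities for k = 0..n sum to 1.
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
```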
What are two common ways of randomly permuting arrays?
- Permuting by sorting - Assign each element A[i] of the array A a random priority P[i], and then sort the elements of A according to these priorities.
- Permuting in place - Iterate over the array. At iteration i, swap A[i] with an element chosen uniformly at random from A[i..n]. After iteration i, the element in A[i] never changes again. Both methods are sketched below.
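A Python sketch of both procedures, assuming 0-based indexing; drawing priorities from 1..n^3 follows the common choice that makes duplicate priorities unlikely:

```python
import random

def permute_by_sorting(a):
    """Assign each element a random priority, then sort the elements by priority."""
    # Priorities in 1..n^3 make collisions unlikely (they are not strictly forbidden here).
    priorities = [random.randrange(1, len(a) ** 3 + 1) for _ in a]
    return [x for _, x in sorted(zip(priorities, a), key=lambda pair: pair[0])]

def permute_in_place(a):
    """At iteration i, swap a[i] with a uniformly random element of a[i..n-1]."""
    n = len(a)
    for i in range(n):
        j = random.randrange(i, n)   # pick j with i <= j < n
        a[i], a[j] = a[j], a[i]
    return a

print(permute_by_sorting(["a", "b", "c", "d", "e"]))
print(permute_in_place([1, 2, 3, 4, 5]))
```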