Stat Flashcards
Roll a die once. Payoff $1 for each “dot”. Fair die. What is the ticket price of this game?
Expectation of rolling a die: (1/6)(1) + (1/6)(2) + … + (1/6)(6) = 3.5
Roll a die no more than three times. You can stop me anytime; the payoff is the number of dots on the last roll (roll 1, 2, or 3). What is your strategy?
Work backwards, as in American option pricing. The expected value of the third roll, if taken, is 3.5, so ask for a third roll only if the second roll shows 1, 2, or 3; take profit on 4, 5, or 6. The value of reaching roll 2 is therefore (1/2)((4+5+6)/3) + (1/2)(3.5) = 4.25. So ask for a second roll only if roll 1 shows 1, 2, 3, or 4. Value of the game = (2/3)(4.25) + (1/3)((5+6)/2) = 14/3 ≈ 4.67. We are assuming risk neutrality; a risk-averse player would pay less than this for the game.
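The backward induction above can be sketched in a few lines (exact arithmetic via `fractions`, so the answers come out as 7/2, 17/4, and 14/3 rather than floats):

```python
from fractions import Fraction

# Backward induction for "roll up to three times, keep the last roll".
faces = [Fraction(f) for f in range(1, 7)]
v3 = sum(faces) / 6                      # value if forced to take roll 3: 7/2

# On roll 2, keep any face worth more than v3, otherwise roll again.
v2 = sum(max(f, v3) for f in faces) / 6  # 17/4 = 4.25

# On roll 1, keep any face worth more than v2, otherwise roll again.
v1 = sum(max(f, v2) for f in faces) / 6  # 14/3 ≈ 4.67
```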
Exchange Paradox: Two sealed envelopes. You know one envelope has m/2 dollars and another one has 2m dollars. Q1: Without peeking what is your expected benefit to switching? Q2. After peeking, what is your expected benefit to switching?
Ans 1: The naive computation says the other envelope is worth (.5)(2m) + (.5)(m/2) = 1.25m, an apparent benefit of .25m. But by the same logic both of us should benefit from switching, which is absurd. Ans 2: After peeking the decision becomes subjective: if I see an amount that looks too high I might not switch, and vice versa. Mathematically, the first answer is repaired by conditioning properly on the prior. Let $m be the smaller amount and $2m the larger, each held with probability 1/2, and let V be the gain from switching. Then E(V) = E(V | holding $m)·P($m) + E(V | holding $2m)·P($2m) = (+$m)(.5) + (−$m)(.5) = 0. The expected gain is zero, so you are indifferent, which resolves the paradox.
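A minimal simulation of the resolved version, using a hypothetical smaller amount m = 100: once the pair of amounts in the two envelopes is fixed, keeping and switching have the same average value, 1.5m.

```python
import random

random.seed(0)
m = 100                      # hypothetical smaller amount
trials = 100_000
keep = switch = 0.0
for _ in range(trials):
    envelopes = [m, 2 * m]
    random.shuffle(envelopes)        # you hold envelopes[0] at random
    keep += envelopes[0]
    switch += envelopes[1]
keep /= trials
switch /= trials
# Both averages are close to 1.5*m: switching confers no benefit.
```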
The correlation between X and Y is ρ. What is the correlation between X+5 and Y? And the correlation between 5X and Y?
Both are still ρ. Correlation is invariant to adding a constant and to multiplying by a positive constant, since both X+5 and 5X are positive linear transformations of X. (Multiplying by a negative constant would flip the sign of the correlation.)
I toss 4 coins and you toss 5 coins. You win if you get more heads than I do. What is the probability that you win?
Remember the tosses are independent, so the binomial joint density fxy(x,y) is the product of the marginals, but there is a cleaner symmetry argument. Since you toss one more coin than I do, exactly one of two events must occur: either you get more heads than I do, or you get more tails than I do (with exactly one extra coin you cannot achieve both, and you cannot fail at both). By the symmetry between heads and tails these two events are equally likely, so the probability that you win is 1/2.
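The symmetry argument can be confirmed by exact enumeration over the joint binomial distribution:

```python
from math import comb

# P(your 5 coins show strictly more heads than my 4 coins), exactly:
# sum C(5,j)*C(4,k) over outcomes with j > k, divided by 2^(5+4).
p = sum(comb(5, j) * comb(4, k)
        for j in range(6) for k in range(5) if j > k) / 2 ** 9
# p == 0.5
```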
Remember the World Series problem
Work the problem backward, as you would on a binomial tree.
You have three children but only one apple. You want to toss a fair coin to determine which child gets the apple, with each child equally likely to get it. What is your strategy?
Toss the coin twice. Child A gets the apple on HH, Child B on TT, and Child C on HT. If TH comes up, replay the game; this removes TH from the sample space, leaving the three remaining outcomes equally likely.
Follow-up question: What is the expected number of tosses needed to complete this strategy?
At minimum, we know we will require 2 tosses. We restart the game only if we see TH, so the probability of finishing within the first two tosses is 3/4, and with probability 1/4 more than two tosses are needed. Let N be the total number of tosses:
E(N) = (3/4)(2) + (1/4)(2 + E(N)); solving for E(N) gives 8/3.
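A short simulation of the replay scheme confirms E(N) = 8/3 ≈ 2.67:

```python
import random

random.seed(2)

def tosses_until_decided():
    # HH -> A, TT -> B, HT -> C, TH -> replay; each pair is equally likely.
    n = 0
    while True:
        pair = random.choice(["HH", "TT", "HT", "TH"])  # two fair tosses
        n += 2
        if pair != "TH":
            return n

trials = 100_000
avg = sum(tosses_until_decided() for _ in range(trials)) / trials
# avg is close to 8/3
```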
A dice game. You roll a die until a number other than one appears. When such a number appears for the first time, I pay you as many dollars as there are dots on the upturned face, and the game ends. What is the expected payoff of this game?
If a 1 comes up, you keep rolling until any other number appears, so conditional on stopping, the upturned face is equally likely to be 2, 3, 4, 5, or 6. The expected payoff is (2+3+4+5+6)/5 = 4, so this game should cost $4.
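Both the exact conditional average and a quick Monte Carlo check of the re-roll rule:

```python
import random
from fractions import Fraction

# Conditional on stopping, the face is uniform on {2, ..., 6}:
exact = Fraction(2 + 3 + 4 + 5 + 6, 5)           # = 4

# Monte Carlo: re-roll every 1, pay out the first non-1 face.
random.seed(5)
def play():
    while (roll := random.randint(1, 6)) == 1:
        pass
    return roll

avg = sum(play() for _ in range(100_000)) / 100_000
```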
Fun fact: opposite faces of a die sum to seven, hence all six faces sum to 21.
You are dealt exactly two playing cards from a well-shuffled standard 52-card deck. The standard deck contains exactly four kings. What is the probability that both of my cards are kings?
Probability = (4/52)(3/51) = (1/13)(1/17) = 1/221.
Know the conditional probabilities:
P(Both are kings) = P(2nd is king | 1st is king) * P(1st is king)
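The same answer two ways, via the conditional chain and via a hypergeometric count of 2-card hands:

```python
from math import comb

# Chain rule: P(1st is king) * P(2nd is king | 1st is king).
p_chain = (4 / 52) * (3 / 51)

# Counting: choose 2 of the 4 kings, over all 2-card hands.
p_count = comb(4, 2) / comb(52, 2)               # 6/1326 = 1/221
```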
Monty Hall - Let’s make a deal (Logical Explanation)
Intuition: the unconditional probability that your initial pick is a goat door is 2/3. In that case the host is forced to open the only other goat door, so switching wins. Hence you win with probability 2/3 by switching vs. 1/3 by staying.
Assumption 1: You will switch doors.
Assumption 2: You try your best to pick an empty door, which succeeds with probability 2/3.
If you succeed in picking an empty door, the host must show you the other empty door, so switching yields the prize. In this scenario you win by switching with probability 2/3 and lose with probability 1/3.
Monty Hall - Let’s make a deal (Formal Bayes’ Theorem Proof)
Assume you choose Door 3. Assume the prize is behind door 1 and host opens door 2.
Eq 1: P(B1 | H2) = P(B1 ∩ H2) / P(H2) = [P(H2 | B1) * P(B1)] / P(H2)
We know that P(B1) = 1/3. If you chose door 3 and the prize is behind door 1, the host has no choice but to open door 2, so P(H2 | B1) = 1. But how do we calculate P(H2)?
P(H2) = [ P(H2|B1)*P(B1) + P(H2|B2)*P(B2) + P(H2|B3)*P(B3) ] = [ (1)(1/3) + (0)(1/3) + (1/2)(1/3) ] = 1/2. (If the prize is behind your door 3, the host opens door 1 or door 2 with equal probability, hence P(H2|B3) = 1/2.)
Plug this into Eq 1: P(B1 | H2) = (1)(1/3) / (1/2) = 2/3. That is the answer: switching wins with probability 2/3.
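A simulation of the standard game (host knows the prize location and always opens a non-picked goat door) reproduces the 2/3 vs. 1/3 split:

```python
import random

random.seed(3)

def monty_trial(switch):
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # Host opens a door that is neither the pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

trials = 100_000
p_switch = sum(monty_trial(True) for _ in range(trials)) / trials
p_stay = sum(monty_trial(False) for _ in range(trials)) / trials
# p_switch ≈ 2/3, p_stay ≈ 1/3
```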
Monty Hall variant: the host asks someone from the audience to open a door instead, and it happens to be empty. Should I switch?
Assume the audience member does not know the location of the prize. If he just happens to show an empty door, he has blindly removed one door from the sample space: conditional on the revealed door being empty, the prize is equally likely to be behind either remaining door. We have a 50/50 chance of winning whether we switch or not, hence we should be indifferent.
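A simulation of the blind-host variant: trials where the audience member accidentally reveals the prize are discarded, and among the remaining trials switching wins only half the time.

```python
import random

random.seed(4)
wins_switch = shown_empty = 0
for _ in range(200_000):
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    opened = random.choice([d for d in doors if d != pick])  # audience is blind
    if opened == prize:
        continue                       # prize revealed: game void, discard trial
    shown_empty += 1
    final = next(d for d in doors if d != pick and d != opened)
    wins_switch += (final == prize)

p = wins_switch / shown_empty          # ≈ 1/2
```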
Two empty jars, 50 white marbles, and 50 black marbles. You may distribute all 100 marbles between the two jars any way you choose. You are then blindfolded, the jars are shaken and shuffled, you select one jar at random, and draw exactly ONE marble. No other chances. How should you distribute the marbles to maximize the probability that the blindfolded random draw obtains a white marble? Equivalent question: how would you minimize the probability of drawing a black marble?
Put a single white marble in (let's say) the right-hand jar and the other 99 marbles (49 white, 50 black) in the left-hand jar. Since you are blindfolded, each jar is drawn from with probability 1/2, so P(white) = (1/2)(1) + (1/2)(49/99) = 74/99 ≈ 74.7%, far better than the 50% from an even split. This problem is unique because you are blindfolded: you cannot pick the better jar, but you can make half of your probability mass a sure thing by isolating one guaranteed-white jar.
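The jar-splitting argument as exact arithmetic, comparing the even split with the lone-white-marble strategy:

```python
from fractions import Fraction

def p_white(white_a, black_a, white_b, black_b):
    # Blindfolded: each jar is chosen with probability 1/2.
    half = Fraction(1, 2)
    pa = Fraction(white_a, white_a + black_a) if white_a + black_a else Fraction(0)
    pb = Fraction(white_b, white_b + black_b) if white_b + black_b else Fraction(0)
    return half * pa + half * pb

even_split = p_white(50, 0, 0, 50)     # all white in one jar, all black in the other
lone_white = p_white(1, 0, 49, 50)     # one white alone; the other 99 mixed
# even_split = 1/2, lone_white = 74/99 ≈ 0.747
```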
Mr. 10, Mr. 30, and Mr. 60: each number is also the % chance that he kills his target with one shot. Shooting occurs in order: Mr. 10, then Mr. 30, then Mr. 60. The goal is to stay alive. As Mr. 10, shooting first, at whom should I shoot?
(1) Shoot in the air and let Mr. 30 take on Mr. 60. If Mr. 30 misses, Mr. 60 will try to take out Mr. 30, and on my next turn I take the first shot at whoever survived.
Pros/cons of this strategy: by shooting in the air I guarantee myself the first shot in the final shoot-out with either Mr. 30 or Mr. 60 (assuming they target each other first, since Mr. 10 is not the immediate threat). On the other hand, Mr. 10 could instead shoot at Mr. 60 to increase the likelihood of facing Mr. 30 rather than Mr. 60 in the final shoot-out, at the cost that killing Mr. 60 hands Mr. 30 the first shot against me.
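A sanity check on the two strategies, as a sketch under these assumptions: turns cycle 10 → 30 → 60 until one man stands, Mr. 30 and Mr. 60 always fire at the strongest living opponent, and a two-man duel alternates shots forever. Solving the recursions exactly shows shooting in the air edges out shooting at Mr. 60 (≈ 20.4% vs. ≈ 20.2% survival).

```python
def duel(p_first, p_second):
    # P(the first shooter wins an alternating duel): geometric series.
    return p_first / (1 - (1 - p_first) * (1 - p_second))

w10_vs_30 = duel(0.1, 0.3)   # Mr. 10 shoots first vs. Mr. 30
w10_vs_60 = duel(0.1, 0.6)   # Mr. 10 shoots first vs. Mr. 60

# With all three alive, Mr. 30 fires at Mr. 60, then Mr. 60 at Mr. 30;
# if both miss, play returns to Mr. 10.  Solve s = a + b*s for each plan.

# Plan A: Mr. 10 shoots in the air while both rivals live.
# s_A = .3*w10_vs_30 + .7*(.6*w10_vs_60 + .4*s_A)
s_A = (0.3 * w10_vs_30 + 0.7 * 0.6 * w10_vs_60) / (1 - 0.7 * 0.4)

# Plan B: Mr. 10 shoots at Mr. 60 whenever both rivals live.
# Killing Mr. 60 (prob .1) means duelling Mr. 30 with Mr. 30 shooting first.
w10_vs_30_second = (1 - 0.3) * w10_vs_30
# s_B = .1*w10_vs_30_second + .9*(.3*w10_vs_30 + .7*(.6*w10_vs_60 + .4*s_B))
s_B = (0.1 * w10_vs_30_second
       + 0.9 * (0.3 * w10_vs_30 + 0.7 * 0.6 * w10_vs_60)) / (1 - 0.9 * 0.7 * 0.4)
```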