Math and logic puzzle Flashcards

1
Q

You have two buckets, a 5/4 bucket and a 3/4 bucket, both weirdly shaped, so half of the height is not half of the volume. You have an infinite amount of water. How do you measure out 4/4 of water?

A

Fill the 5/4 bucket and pour it into the 3/4 bucket; you are left with 2/4 in the 5/4 bucket. Empty the 3/4 bucket and pour the 2/4 into it. Refill the 5/4 bucket and top off the 3/4 bucket, which only needs the missing 1/4. The 5/4 bucket now holds 5/4 - 1/4 = 4/4, and you are done.
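
A quick way to sanity-check the pouring sequence is to simulate it; a sketch in Python, tracking both buckets in quarter units (variable names are illustrative):

```python
big, small = 0, 0            # the 5/4 and 3/4 buckets, measured in quarters

big = 5                      # fill the 5/4 bucket
pour = min(big, 3 - small)   # pour into the 3/4 bucket
big, small = big - pour, small + pour    # -> (2, 3)

small = 0                    # empty the 3/4 bucket
big, small = 0, big          # move the 2/4 into it -> (0, 2)

big = 5                      # refill the 5/4 bucket
pour = min(big, 3 - small)   # top off the 3/4 bucket (takes 1/4)
big, small = big - pour, small + pour    # -> (4, 3)

assert big == 4              # 4/4 left in the big bucket
```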

2
Q

Pill bottles: you have 20 bottles of pills. 19 bottles contain 1-gram pills; one contains 1.1-gram pills. How do you identify the heavy bottle by measuring with a precise scale only once?

A

You have one try, so most of the bottles need to go on the scale at once, and each bottle must contribute a different number of pills; otherwise you cannot tell which bottle in the pile is the heavy one. So take a different number of pills from each bottle: 1 from the first, 2 from the second, and so on up to 20. If every pill weighed 1 gram, the pile would weigh 1 + 2 + ... + 20 = 210 grams; the excess over 210, divided by 0.1 grams, is the index of the heavy bottle.
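
The arithmetic, checked in a short Python sketch (the heavy bottle index is a hypothetical input, then recovered from the single weighing):

```python
heavy = 7   # hypothetical answer, unknown to the solver

# take i pills from bottle i (1-indexed) and weigh the whole pile once
weight = sum(i * (1.1 if i == heavy else 1.0) for i in range(1, 21))

# a pile of all 1-gram pills would weigh 1 + 2 + ... + 20 = 210 grams;
# every extra 0.1 g points at the bottle that contributed it
assert round((weight - 210) / 0.1) == heavy
```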

3
Q

Mutilated checkerboard: can you cover an 8x8 checkerboard with two diagonally opposite corners removed using 31 dominoes (each covering 2 squares)?

A

No. Even though 31 dominoes cover 31 x 2 = 62 squares and the mutilated board has exactly 62 squares, a domino always covers one black and one white square. The two removed corners have the same color, so the board is left with 30 squares of one color and 32 of the other, and no set of dominoes can cover it.
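
The color-counting argument can be checked directly; a sketch, coloring square (r, c) by the parity of r + c and removing the corners (0, 0) and (7, 7):

```python
squares = {(r, c) for r in range(8) for c in range(8)}
squares -= {(0, 0), (7, 7)}   # remove two diagonally opposite corners

black = sum((r + c) % 2 == 0 for r, c in squares)
white = len(squares) - black

# 62 squares remain, but the colors are unbalanced, while every domino
# must cover exactly one black and one white square
assert (black, white) == (30, 32)
```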

4
Q

Combinations: n objects, choose k

A

Create permutations of length k from the n objects, but now order does not matter, so divide by k!, the number of rearrangements of each k-subset: n!/(k!(n-k)!)
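
A quick check of the formula against Python's built-in (a sketch for one n and k):

```python
from math import comb, factorial

n, k = 10, 4
assert comb(n, k) == factorial(n) // (factorial(k) * factorial(n - k))
```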

5
Q

Blue-eyed islanders: on an island, all blue-eyed people have to leave. There is one flight per night. You can see other people's eye color, but you do not know your own and nobody can tell you. You do not know how many people have blue eyes, but you know there is at least one.

A

If you are the only blue-eyed person, you see no other blue eyes, so you know it is you and leave the first night. If there are two, you are not sure, because you see another person with blue eyes; but if she does not leave the first night, it means you also have blue eyes, and on the second night you both leave. By induction, if c people have blue eyes, they all leave on night c.

6
Q

Permutations: how many strings of n distinct characters are there?

A

n!, assuming all characters are distinct: n choices for the first position, n-1 for the second, and so on down to 1.

7
Q

Relation between log_10(P) and log_2(P)

A

Remember P = 10^log_10(P) = 2^log_2(P). Taking log_2 of both sides gives log_10(P) * log_2(10) = log_2(P), so log_10(P) = log_2(P)/log_2(10). They differ only by a constant factor.
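
Checked numerically in Python (a sketch):

```python
import math

P = 1234.5
# the two logs differ only by the constant factor log_2(10)
assert math.isclose(math.log10(P), math.log2(P) / math.log2(10))
```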

8
Q

9 balls, one is heavier. Identify it using only 2 uses of a balance scale that tells you whether the left or right side is heavier.

A

Weigh 3 against 3. If one side is heavier, the heavy ball is among those 3; if they balance, it is among the 3 you did not weigh. Then weigh two balls from that group against each other: the heavier one is it, and if they balance, it is the third ball.
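
The strategy as a function; a sketch where the scale is simulated from a hypothetical list of ball weights:

```python
def find_heavy(weights):
    # weights: 9 ball weights, exactly one of them heavier
    def weigh(left, right):
        # balance scale: returns the heavier side's balls, or None if equal
        l = sum(weights[i] for i in left)
        r = sum(weights[i] for i in right)
        if l == r:
            return None
        return left if l > r else right

    balls = list(range(9))
    # weighing 1: 3 vs 3; if equal, the heavy ball is in the unweighed 3
    group = weigh(balls[0:3], balls[3:6]) or balls[6:9]
    # weighing 2: 1 vs 1 within that group; if equal, it is the third ball
    found = weigh([group[0]], [group[1]])
    return found[0] if found else group[2]

assert find_heavy([1, 1, 1, 1, 1.1, 1, 1, 1, 1]) == 4
```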

9
Q

Sum of powers of 2, say 2^0 + 2^1 + ... + 2^n

A
You can see it in binary: summing the terms gives the number written as n+1 ones, which is one less than a one followed by n+1 zeros. So the sum is 2^(n+1) - 1.
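
A one-line check (sketch):

```python
n = 10
assert sum(2**i for i in range(n + 1)) == 2**(n + 1) - 1
```
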
10
Q

What is the sum of the integers 1...N?

A

You can see this by writing the numbers out and pairing them up (1 with N, 2 with N-1, ...) so that each pair sums to N+1; there are N/2 such pairs, giving (N+1)*N/2.
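
A one-line check (sketch):

```python
N = 100
assert sum(range(1, N + 1)) == (N + 1) * N // 2
```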

11
Q

Permutations of length k

A

From n characters you want a k-character string: n!/(n-k)!, because you can choose any of n characters for the first position, n-1 for the second, and so on, stopping at n-k+1 for the k-th.
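
A quick check against Python's built-in (a sketch):

```python
from math import factorial, perm

n, k = 10, 4
assert perm(n, k) == factorial(n) // factorial(n - k)
```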

12
Q

Given a generator of unbiased Bernoulli trials (0 or 1 with p = 0.5), create a biased Bernoulli trial generator (generate 0 or 1 with a specified 0 < p < 1).

A
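
One standard approach: use the fair bits as the binary expansion of a uniform number U in (0, 1) and output 1 exactly when U < p. Compare U's bits with p's bits one at a time and stop at the first disagreement. A minimal sketch, assuming a fair_bit() stand-in for the given generator (names are illustrative):

```python
import random

def fair_bit():
    # stand-in for the given unbiased generator
    return random.randint(0, 1)

def biased_bit(p):
    # Reveal a uniform U one fair bit at a time and compare it with the
    # binary expansion of p; the first differing bit decides U < p or U > p.
    while True:
        p *= 2
        p_bit = 1 if p >= 1.0 else 0
        if p_bit:
            p -= 1.0
        u_bit = fair_bit()
        if u_bit != p_bit:
            # U's bit is 0 while p's bit is 1 exactly when U < p
            return 1 if u_bit < p_bit else 0
```

On average this consumes only 2 fair bits per biased bit, regardless of p.
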
13
Q

How would you write a function to make a biased coin from a fair coin, and vice versa?

A
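
For the biased-to-fair direction, the classic von Neumann trick applies: flip the biased coin twice; the sequences 10 and 01 are equally likely (each with probability p(1-p)), so map them to the two fair outcomes and retry on 00/11. The fair-to-biased direction is card 12. A minimal sketch, with biased_flip() as an assumed stand-in:

```python
import random

def biased_flip(p=0.7):
    # stand-in for the given biased coin: returns 1 with probability p
    return 1 if random.random() < p else 0

def fair_flip():
    # Von Neumann extractor: keep only the unequal pairs, which occur
    # with equal probability, and discard 00 and 11
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a
```
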
14
Q

How do we handle categorical variables in decision trees?

A
15
Q

Random forest:

What happens when we have correlated features in our data?

A
16
Q

Random Forest

What if, instead of finding the best split, we randomly select a few splits and just pick the best of those. Will it work?

A
17
Q

Random forest

Is it easy to parallelize training of a random forest model? How can we do it?

A
18
Q

What are gradient boosting trees?

A
19
Q

What’s the difference between random forest and gradient boosting?

A
20
Q

What are the main parameters in the gradient boosting model?

A
21
Q

How do you select the number of trees in the gradient boosting model?

A
22
Q

Feature importance in gradient boosting trees — what are possible options?

A
23
Q

Is it possible to parallelize training of a gradient boosting model? How to do it?

A
24
Q

What is Adam? What’s the main difference between Adam and SGD?

A
25
Q

Do we want to have a constant learning rate, or is it better to change it throughout training?

A
26
Q

Why do we actually need convolutions? Can’t we use fully-connected layers for that?

A
27
Q

Are CNNs resistant to rotations? What happens to the predictions of a CNN if an image is rotated?

A
28
Q

How does max pooling work? Are there other pooling techniques?

A
29
Q

What kind of CNN architectures for classification do you know?

A
30
Q

What is object detection? Do you know any architectures for that?

A
31
Q

What is object segmentation? Do you know any architectures for that?

A
32
Q

What is bag of words? How can we use it for text classification?

A
33
Q

What are N-grams? How can we use them?

A
34
Q

How large should N be for our bag of words when using N-grams?

A
35
Q

What is TF-IDF? How is it useful for text classification?

A
36
Q

Which model would you use for text classification with bag of words features?

A
37
Q

Would you prefer gradient boosting trees model or logistic regression when doing text classification with bag of words?

A
39
Q

What are word embeddings? Why are they useful? Do you know Word2Vec?

A
40
Q

If you have a sentence with multiple words, you may need to combine multiple word embeddings into one. How would you do it?

A
41
Q

How can we use CNN for text classification?

A
42
Q

What is the ranking problem? Which models can you use to solve it?

A
43
Q

What are good unsupervised baselines for text information retrieval?

A
44
Q

Can we formulate the search problem as a classification problem? How?

A
45
Q

What are good baselines when building a recommender system?

A
46
Q

What is collaborative filtering?

A
47
Q

How can we incorporate implicit feedback (clicks, etc.) into our recommender systems?

A
48
Q

What is the cold start problem?

A
49
Q

Possible approaches to solving the cold start problem?

A
50
Q

How would you select a representative sample of search queries from 5 million queries?

Not algorithmically; more a bird's-eye view of the features you need.

A

Some key features need to be kept in mind while selecting a representative sample.

Diversity: the sample must be as diverse as the 5 million search queries. It should be sensitive to all the local differences between search queries and should preserve those features.

Consistency: we need to make sure that any change we see in the sample data is also reflected in the true population, which is the 5 million queries.

Transparency: it is extremely important to decide on an appropriate sample size and structure so that the sample is truly representative. These properties of the sample should be discussed to ensure the results are accurate.

51
Q

There are 6 marbles in a bag, 1 is white. You reach in the bag 100 times. After drawing a marble, it is placed back in the bag. What is the probability of drawing the white marble at least once?

A

The probability of drawing the white marble at least once is the complement of the probability of never drawing it. Therefore, we calculate the probability of drawing a non-white marble a hundred times in a row and subtract it from 1:

P(white at least once) = 1 - [P(non-white draw)]^100 = 1 - (5/6)^100
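
In Python (a sketch of the same computation):

```python
p = 1 - (5 / 6) ** 100   # complement of never drawing white in 100 draws
print(p)                 # ~0.99999999
```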

52
Q

You call 3 random friends who live in Seattle and ask each independently if it’s raining. Each of your friends has a 2/3 chance of telling you the truth and a 1/3 chance of lying. All three say “yes”. What’s the probability it’s actually raining?

A

We have to find the probability of raining in Seattle given that all three friends said ‘Yes’.

Therefore, we are trying to find: P(rain | yes, yes, yes)

Using Bayes Theorem, our equation will now be:

P(rain | yes, yes, yes) =

P(yes, yes, yes | rain) * P(rain) / [P(yes, yes, yes | rain) * P(rain) + P(yes, yes, yes | not rain) * P(not rain)]

We have the following values:

P(yes, yes, yes | rain) = (2/3)^3 = 8/27

P(yes, yes, yes | not rain) = (1/3)^3 = 1/27

P(rain) = R (it is not given in question, so we’ll assume R)

P(not rain) = 1 - R

Substituting these values into the equation, we get:

P(rain | yes, yes, yes) = (8/27)R / [(8/27)R + (1/27)(1 - R)] = 8R/(7R + 1)
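
A numeric check of the algebra for one assumed prior (R = 0.5 is arbitrary):

```python
R = 0.5                      # assumed prior P(rain)
yes3_rain = (2 / 3) ** 3     # all three tell the truth
yes3_dry = (1 / 3) ** 3      # all three lie
posterior = yes3_rain * R / (yes3_rain * R + yes3_dry * (1 - R))
assert abs(posterior - 8 * R / (7 * R + 1)) < 1e-12
```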

53
Q

You have 2 dice. What is the probability of getting at least one 4? Also find the probability of getting at least one 4 if you have n dice.

A

For 2 dice, the probability of getting at least one four is: P(at least one 4) = 1 - P(no 4) = 1 - (5/6)(5/6) = 1 - (5/6)^2 = 11/36

Following the pattern above, the probability with n dice is: P(at least one 4) = 1 - P(no 4) = 1 - (5/6)^n
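
A brute-force check for n = 2 (sketch):

```python
import math
from itertools import product

n = 2
hits = sum(4 in roll for roll in product(range(1, 7), repeat=n))
assert math.isclose(hits / 6**n, 1 - (5 / 6) ** n)   # 11/36 for n = 2
```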