Final Local Search Flashcards

Q1. What is the state space of local search for a CSP?

Answer: It is the space of complete assignments to the variables of the CSP.

Q2. What are two limitations of local search?

Answer: First, it is not guaranteed to find a solution even if one exists (it is not complete). Second, it cannot show that no solution exists.

Q3. Plateaus are sets of connected states {n1, …, nk} with h(n1) = h(n2) = … = h(nk). Why are plateaus a problem for local search?

Answer: Because local search relies on h(n) when deciding which neighbour to choose. If all neighbours have the same h-value, the heuristic gives no direction to follow, so the search wanders on the plateau and will likely fail to escape it.

Q4. Why does randomization help in local search?

Answer: Randomness helps the search avoid getting trapped in local minima, for example through random steps or random restarts.

Q5. Explain how the next state is chosen in simulated annealing.

Answer: First we pick a variable at random and assign it a value at random. Then we check whether the new state is an improvement over the current state. If it is, we adopt it; otherwise we adopt it probabilistically with probability e^((h(n) − h(n′))/T), where n is the current state, n′ is the candidate next state, and T is the temperature parameter (which decreases over time).
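The acceptance rule above can be sketched in Python. This is a minimal sketch, assuming assignments are dicts and h counts conflicts (lower is better); the function and parameter names are illustrative, not from any particular library.

```python
import math
import random

def anneal_step(assignment, h, domains, T):
    """One simulated-annealing step: pick a random variable, try a random
    value, and accept a non-improving move with probability
    exp((h(n) - h(n')) / T).  Sketch; h counts conflicts, lower is better."""
    var = random.choice(list(assignment))
    candidate = dict(assignment)
    candidate[var] = random.choice(domains[var])
    delta = h(assignment) - h(candidate)   # > 0 means the candidate is better
    if delta > 0 or random.random() < math.exp(delta / T):
        return candidate                   # adopt n'
    return assignment                      # keep n
```

Note that improving moves are always accepted, while the probability of accepting a worsening move shrinks as T is lowered over the run.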

Q6. How would you change a local search algorithm to better find global maxima that are surrounded by local maxima? Assume that the neighbor relation is fixed.

Answer: We can add randomization: random restarts and random walks to escape local maxima.

Q7. How would you change a local search algorithm to better find global maxima that are surrounded by plateaus? Assume that the neighbor relation is fixed.

Answer: Allow sideways moves to equally good neighbours, combined with random steps, so the search can traverse the plateau; bound the number of consecutive sideways moves so the search does not wander indefinitely.
Q8. How would you design a local search algorithm for a problem where there are no plateaus and no local minima that are not global minima? Assume that the neighbor relation is fixed.

Answer: Use pure greedy descent with no randomization: since there are no plateaus and every local minimum is global, always moving to a better neighbour is guaranteed to reach a global minimum.
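Under those assumptions, the algorithm reduces to plain greedy descent. A sketch, where `h` and `neighbours` are assumed problem-specific functions:

```python
def greedy_descent(state, h, neighbours):
    """Pure greedy descent: repeatedly move to a strictly better neighbour;
    stop when no neighbour improves.  With no plateaus and no non-global
    local minima, this halts at a global minimum (sketch)."""
    current = state
    while True:
        best = min(neighbours(current), key=h, default=current)
        if h(best) >= h(current):
            return current   # no strict improvement available: done
        current = best
```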

Q9. What is the difference between a random step and a random restart?

Answer: In a random step (random walk), you move to a randomly chosen neighbour of the current state. In a random restart, you start over from a complete random assignment of values to all variables.
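The distinction can be sketched as follows (a sketch, assuming dict-based assignments; names are illustrative):

```python
import random

def random_step(assignment, domains):
    """Random step / random walk: re-assign one randomly chosen variable,
    leaving the rest of the assignment intact (sketch)."""
    nxt = dict(assignment)
    var = random.choice(list(nxt))
    nxt[var] = random.choice(domains[var])
    return nxt

def random_restart(domains):
    """Random restart: throw the current assignment away and re-assign
    every variable at random (sketch)."""
    return {var: random.choice(values) for var, values in domains.items()}
```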

Q10. Consider two local search algorithms, A and B. A solves x% of the problems it is given within (.25x)^2 minutes. B solves y% of the problems it is given within y minutes. Is one algorithm always better than the other? If not, which algorithm would you use when?

Answer: For A, the time to solve x% of problems is (0.25x)² = x²/16 minutes; for B it is y minutes. Setting x²/16 = x gives a crossover at 16%: below 16%, A solves its share faster; above it, B does better. So neither algorithm always dominates: use A when solving a small fraction of problems quickly is enough, and B when you need to solve a large fraction.
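The arithmetic can be checked directly (hypothetical helper names, just restating the stated running times):

```python
def time_A(x):
    """Minutes for A to solve x% of its problems: (0.25 * x)**2 = x**2 / 16."""
    return (0.25 * x) ** 2

def time_B(y):
    """Minutes for B to solve y% of its problems."""
    return y

# Crossover where the two curves meet: x**2 / 16 = x  =>  x = 16.
```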

Q11. What is the key weakness of stochastic local search?

Answer: With stochastic local search (SLS), we cannot show that no solution exists; like other local search methods, it is incomplete.

Q12. In local search, how do we determine neighbours?

Answer: A neighbour is usually a small incremental change to the current assignment. For example, a neighbour might be an assignment in which exactly one variable’s value has changed.
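With assignments represented as dicts, the one-variable-change neighbour relation from the example can be sketched as:

```python
def neighbours(assignment, domains):
    """Yield every assignment that differs from `assignment` in exactly
    one variable's value (a common neighbour relation; sketch)."""
    for var, value in assignment.items():
        for alt in domains[var]:
            if alt != value:
                changed = dict(assignment)
                changed[var] = alt
                yield changed
```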
