Optimization training Flashcards

1
Q

Hill climbing

A
  1. Marginally more complex than greedy random training.
  2. Instead of throwing out the random vector for a better one, it refines the current one.
  3. Which is why the name is actually very descriptive.
  4. The hill-climbing process terminates at the nearest high point, which is called a local maximum.
  5. The more dimensions there are to search, the less likely you are to ever find the global maximum or minimum.
  6. Sometimes you have to randomize and start over if you find your training is "stuck".
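The points above can be sketched in Python. This is a minimal illustration, not a definitive implementation; the function and parameter names (`score`, `step`, `hill_climb_restarts`, etc.) are assumptions, not from the cards.

```python
import random

def hill_climb(score, start, step=0.1, max_iters=1000):
    """Refine the current vector instead of drawing a fresh random one."""
    current = list(start)
    best = score(current)
    for _ in range(max_iters):
        # Perturb one dimension of the current solution.
        candidate = list(current)
        i = random.randrange(len(candidate))
        candidate[i] += random.uniform(-step, step)
        s = score(candidate)
        if s > best:  # keep only uphill moves
            current, best = candidate, s
    return current, best

def hill_climb_restarts(score, dims, restarts=5):
    """Random restarts help when a climb gets "stuck" at a local maximum."""
    results = [hill_climb(score, [random.uniform(-1, 1) for _ in range(dims)])
               for _ in range(restarts)]
    return max(results, key=lambda r: r[1])
```

Because only uphill moves are kept, a single run stops at the nearest peak; the restarts give it several chances to land near a better one.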
2
Q

Hill climbing implementation

A
  • The hill climbing algorithm is implemented as two separate functions.
  • The first function performs the initialization; the second one performs the iteration.
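A minimal sketch of that two-function split, assuming a list-of-floats solution; the names `hc_init` and `hc_iterate` are illustrative, not from the source.

```python
import random

def hc_init(dims):
    """Initialization: produce a random starting vector."""
    return [random.uniform(-1, 1) for _ in range(dims)]

def hc_iterate(score, current, step=0.1):
    """One refinement step: try a neighbor, keep it only if it scores better."""
    candidate = [x + random.uniform(-step, step) for x in current]
    return candidate if score(candidate) > score(current) else current
```

The caller then loops: `current = hc_init(dims)` once, followed by repeated `current = hc_iterate(score, current)` calls until some stopping condition.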
3
Q

Simulated Annealing

A

Simulated annealing is a method for finding a good (not necessarily perfect) solution to an optimization problem. If you’re in a situation where you want to maximize or minimize something, your problem can likely be tackled with simulated annealing.

The traveling salesman problem is a good example: the salesman is looking to visit a set of cities in the order that minimizes the total number of miles he travels. As the number of cities gets large, it becomes too computationally intensive to check every possible itinerary. At that point, you need an algorithm.

4
Q

Why choose simulated annealing?

A

There are many optimization algorithms, including hill climbing, genetic algorithms, gradient descent, and more. Simulated annealing's strength is that it avoids getting caught at local maxima: solutions that are better than any others nearby but aren't the very best.

5
Q

Simulated annealing basic structure?

A

Basic algorithm:

  1. Generate a random solution.
  2. Calculate its cost using some cost function you've defined.
  3. Generate a random neighboring solution.
  4. Calculate the new solution's cost.
  5. Compare them:
     • If c_new < c_old: move to the new solution.
     • If c_new > c_old: maybe move to the new solution.
  6. Repeat steps 3-5 until an acceptable solution is found or you reach some maximum number of iterations.
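The steps above can be sketched as follows. This assumes the standard Metropolis acceptance rule for the "maybe move" step (accept a worse solution with probability exp(-(c_new - c_old) / t)) and a geometric cooling schedule; those details, and all parameter names, are assumptions rather than something the card specifies.

```python
import math
import random

def simulated_annealing(cost, start, neighbor, t0=1.0, cooling=0.995,
                        max_iters=10000):
    current, c_old = start, cost(start)   # steps 1-2: random solution + cost
    t = t0                                # "temperature" controls risk-taking
    for _ in range(max_iters):
        cand = neighbor(current)          # step 3: random neighboring solution
        c_new = cost(cand)                # step 4: new solution's cost
        # Step 5: always accept improvements; sometimes accept worse
        # solutions, with probability exp(-(c_new - c_old) / t).
        if c_new < c_old or random.random() < math.exp(-(c_new - c_old) / t):
            current, c_old = cand, c_new
        t *= cooling                      # cool down: accept worse moves less often
    return current, c_old
```

Early on (high t) the search accepts many uphill cost moves, which is what lets it escape local optima; as t shrinks it behaves more and more like plain hill climbing.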
