Search Algorithms Flashcards
Definition of Search algorithms:
Given a set of possible solutions, find the best (or at least a reasonable) one
What purpose does Search serve?
An algorithm for finding a solution with specified properties among a collection of candidate solutions
A Search solution:
- may be stored individually as records in a database
- may be elements of a search space defined by a mathematical formula or procedure
- may be a combination of the two, such as the Hamiltonian circuits of a graph
Solution Spaces:
- Ordered space (e.g., dictionary)
- Unordered space (e.g., Where’s Wally?)
- Higher-dimensional space
Exhaustive (brute force)
Try all combinations (usually infeasible)
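The brute-force idea fits in a few lines of Python; the 4-bit weighted scoring function below is an assumed toy objective, not from the source:

```python
from itertools import product

# Brute-force sketch: enumerate every combination and keep the best.
# The 4-bit weighted score is an assumed toy objective.
score = lambda bits: sum(b * w for b, w in zip(bits, (3, -1, 2, 5)))
best = max(product((0, 1), repeat=4), key=score)
print(best)  # (1, 0, 1, 1): take every positive-weight bit
```

With n bits there are 2^n combinations, which is why exhaustive search quickly becomes infeasible.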
Branch and bound
- Divide the problem; find upper and lower bounds for error/quality (e.g., linear programming)
- Time-consuming
Randomized
- Generate random combinations, pick the best among them
- Random walk: generate random changes to the last solution seen and move if the result is better than the current best
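A minimal randomized-search sketch; the objective f(x) = (x - 2.5)^2 and the sampling range are assumptions for illustration:

```python
import random

def random_search(cost, sample, n=10000):
    # Draw n random candidates and keep the cheapest one seen.
    return min((sample() for _ in range(n)), key=cost)

random.seed(0)
f = lambda x: (x - 2.5) ** 2           # assumed toy objective, minimum at 2.5
best = random_search(f, lambda: random.uniform(0, 10))
print(round(best, 1))                  # lands near 2.5
```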
Iterative improvement methods
- Tabu Search
- Simulated Annealing
- Evolutionary Algorithm
Gradient Descent Approach refers to:
- Refers to learning methods that seek to minimize an objective function by which system performance is measured
- At each point in time, the policy is to choose the next step that yields the minimal objective function value
- The learning rate parameters refer to the step size taken at each point in time
- Each step is computed only on the basis of local information, which is extremely efficient but introduces the possibility of getting trapped in local minima
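The idea above can be sketched in Python; the objective f(x) = (x - 3)^2 and the learning rate are assumptions for illustration:

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step against the gradient; the learning rate is the step size.
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)
    return x

# Assumed toy objective f(x) = (x - 3)^2, so f'(x) = 2 * (x - 3); minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 3))  # converges toward 3
```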
Hill-Climbing
The analogous process in which the objective function is maximized instead of minimized
Hill-Climbing Algorithm
loop do
neighbor ← a highest-valued successor of current
if neighbor.VALUE ≤ current.VALUE then return current.STATE
current ← neighbor
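The pseudocode above translates directly to Python; the integer objective and the ±1 neighborhood below are assumptions for illustration:

```python
def hill_climb(initial, neighbors, value):
    # Steepest-ascent hill climbing: move to the highest-valued neighbor
    # until no neighbor improves on the current state (a local maximum).
    current = initial
    while True:
        neighbor = max(neighbors(current), key=value)
        if value(neighbor) <= value(current):
            return current
        current = neighbor

# Assumed toy objective: maximize f(x) = -(x - 5)^2 over the integers.
f = lambda x: -(x - 5) ** 2
result = hill_climb(0, lambda x: [x - 1, x + 1], f)
print(result)  # climbs to the maximum at 5
```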
Tabu Search: move selection
- Each move is chosen from a (generated) neighborhood
- Tabu list of forbidden moves (for a given duration)
- Tabu restrictions may be overridden when a tabu move leads to a solution better than any visited so far: the aspiration criterion
Tabu Search
- Searches the neighborhood of the last solution
- Tries to overcome local minima by exploring neighbors and deciding moves based on “memory”
- Semi-random exploration based on maintaining a selective history of the states encountered during the search
- New states (neighbors) are feasible moves if they are not in the tabu list (tabu rule) → memory (short & long)
- The neighborhood of a state is not a static set, but rather a set that can change according to the history of the search
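A short tabu-search sketch combining the short-term memory and aspiration criterion described above; the toy objective and the fixed tabu tenure are assumptions:

```python
from collections import deque

def tabu_search(initial, neighbors, cost, tenure=5, iterations=100):
    # Move to the best admissible neighbor; recently visited states are
    # forbidden (tabu) unless they beat the best solution found so far
    # (the aspiration criterion).
    current = best = initial
    tabu = deque([initial], maxlen=tenure)   # short-term memory
    for _ in range(iterations):
        candidates = [n for n in neighbors(current)
                      if n not in tabu or cost(n) < cost(best)]
        if not candidates:
            break
        current = min(candidates, key=cost)
        tabu.append(current)
        if cost(current) < cost(best):
            best = current
    return best

# Assumed toy objective: minimize f(x) = (x - 7)^2 over the integers.
f = lambda x: (x - 7) ** 2
best = tabu_search(0, lambda x: [x - 1, x + 1], f)
print(best)  # reaches the minimum at 7
```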
Simulated Annealing combines:
Gradient descent with random walk
Simulation of cooling of a heated material
- A solid material is heated past its melting point and then cooled back into a solid state (annealing)
- The final structure depends on how the cooling is performed
- slow cooling → large crystal (low energy)
- fast cooling → imperfections (high energy)
- Metropolis’ algorithm simulates the change in energy of the system when subjected to the cooling process, converging to a final state (of a certain energy)
Simulated Annealing new solution is accepted if
- It has lower cost than the previous one
- When the cost is higher: accepted with a probability based on the cost difference and the current “temperature”
Simulated Annealing Algorithm
current ← MAKE-NODE(problem.INITIAL-STATE)
for t = 1 to ∞ do
T ← schedule(t)
if T = 0 then return current
next ← a randomly selected successor (similar) of current
∆E ← next.VALUE – current.VALUE
if ∆E > 0 then current ← next
else current ← next only with probability e^(∆E/T)
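The same algorithm in Python; the objective, the neighbor move, and the geometric cooling schedule are assumptions for illustration:

```python
import math
import random

def simulated_annealing(initial, neighbor, value, schedule):
    # Always accept improving moves; accept worsening moves with
    # probability e^(dE/T), where T decreases according to the schedule.
    current = initial
    t = 1
    while True:
        T = schedule(t)
        if T <= 0:
            return current
        nxt = neighbor(current)
        dE = value(nxt) - value(current)
        if dE > 0 or random.random() < math.exp(dE / T):
            current = nxt
        t += 1

# Assumed setup: maximize f(x) = -(x - 4)^2 with geometric cooling,
# cut off after 2000 iterations.
random.seed(0)
f = lambda x: -(x - 4) ** 2
schedule = lambda t: 0 if t > 2000 else 2.0 * 0.99 ** t
best = simulated_annealing(0, lambda x: x + random.choice([-1, 1]), f, schedule)
print(best)
```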
Simulated Annealing Issues
- Minimum and maximum temperature
- Scheme for decreasing temperature
- Temperature length (number of iterations at each temperature)
- Evaluation (cost) function
- Initialization
Genetic Algorithm
Tries to find the best-fitted solutions by:
- Generating a population of solutions
- Evaluating all of them (fitness function)
- Recombining and/or mutating the best individuals in the evolutionary population to create a new generation
Genetic Algorithm implementation
repeat
new population ← empty set
for i = 1 to SIZE(population) do
x ← RANDOM-SELECTION(population,FITNESS-FN)
y ← RANDOM-SELECTION(population,FITNESS-FN)
child ← REPRODUCE(x , y)
if (small random probability) then child ← MUTATE(child)
add child to new population
population ← new population
until some individual is fit enough, or enough time has elapsed
return the best individual in population, according to FITNESS-FN
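A runnable version of the pseudocode; the OneMax bit-string problem, one-point crossover, and the parameter values are assumptions for illustration:

```python
import random

def genetic_algorithm(population, fitness, reproduce, mutate,
                      mutation_rate=0.05, generations=100, target=None):
    # Fitness-proportional selection, crossover, and occasional mutation,
    # following the pseudocode above.
    for _ in range(generations):
        weights = [fitness(ind) for ind in population]
        new_population = []
        for _ in range(len(population)):
            x, y = random.choices(population, weights=weights, k=2)
            child = reproduce(x, y)
            if random.random() < mutation_rate:
                child = mutate(child)
            new_population.append(child)
        population = new_population
        if target is not None and any(fitness(i) >= target for i in population):
            break
    return max(population, key=fitness)

# Assumed toy problem (OneMax): evolve a bit string of all 1s.
random.seed(1)
LENGTH = 12

def reproduce(x, y):        # one-point crossover
    cut = random.randrange(1, LENGTH)
    return x[:cut] + y[cut:]

def mutate(child):          # flip one random bit
    i = random.randrange(LENGTH)
    return child[:i] + (1 - child[i],) + child[i + 1:]

population = [tuple(random.randint(0, 1) for _ in range(LENGTH))
              for _ in range(30)]
best = genetic_algorithm(population, sum, reproduce, mutate, target=LENGTH)
print(sum(best))
```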
Important Aspects of Genetic Algorithm Implementation
- choose a good representation
- decide on the population size, mutation rate, . . .
- select/delete individuals
- choose the best crossover(s) and kind(s) of mutation
- Termination criteria
- Performance, scalability
- Adequate evaluation function
Genetic Algorithms advantages
- Are easily understandable
- Support multi-objective optimization
- Show good performance in noisy environments
- Answers get better over time
- Inherently parallel and easy to distribute