4 | Reasoning with constraints Flashcards

1
Q

What is a possible world?

A

A possible world is a possible way the world (the real world or some imaginary world) could be.

2
Q

What types of variables do we have?

A

An algebraic variable is a symbol used to denote features of possible worlds.

A discrete variable is one whose domain is finite or countably infinite.

A binary variable is a discrete variable with two values in its domain.
One particular case of a binary variable is a Boolean variable, which is a variable with domain {true, false}.

We can also have variables that are not discrete; for example, a variable whose domain corresponds to the real line or an interval of the real line is a continuous variable.

3
Q

What is an assignment?

A

Given a set of variables, an assignment on the set of variables is a function from the variables into the domains of the variables. We write an assignment on {X1,X2,…,Xk} as {X1=v1,X2=v2,…,Xk=vk}, where vi is in dom(Xi). This assignment specifies that, for each i, variable Xi is assigned value vi. A variable can only be assigned one value in an assignment.
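As a concrete sketch, an assignment can be represented as a Python dict mapping each variable to one value from its domain; the variable names and domains below are made up for illustration.

```python
# An assignment as a dict: each variable maps to exactly one value.
# Hypothetical variables A and B with small finite domains.
domains = {"A": {1, 2, 3}, "B": {1, 2}}

def is_valid_assignment(assignment, domains):
    """True if every assigned variable gets a single value from its domain."""
    return all(var in domains and val in domains[var]
               for var, val in assignment.items())

assignment = {"A": 2, "B": 1}   # the assignment {A=2, B=1}
print(is_valid_assignment(assignment, domains))  # True
```

A dict enforces the "only one value per variable" condition by construction.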

4
Q

What is a total assignment?

A

A total assignment assigns all of the variables.

5
Q

What is a possible world?

A

A possible world is a possible way the world could be. A possible world is defined to be a total assignment.

6
Q

What is a hard constraint?

A

A hard constraint, or simply constraint, specifies legal combinations of assignments of values to some of the variables.

7
Q

What is a scope and a relation?

A

A scope is a set of variables. A relation on a scope S is a function from assignments on S to {true,false}.

8
Q

What is a constraint?

A

A constraint c is a scope S and a relation on S. A constraint is said to involve each of the variables in its scope.

9
Q

What does a constraint satisfaction problem consist of?

A

a set of variables,

a domain for each variable, and

a set of constraints.

10
Q

Given a CSP, what tasks are useful?

A

Determine whether or not there is a model.

Find a model.

Count the number of models.

Enumerate all of the models.

Find a best model, given a measure of how good models are.

Determine whether some statement holds in all the models.

11
Q

What are generate-and-test algorithms?

A

The generate-and-test algorithm to find one model is as follows: check each total assignment in turn; if an assignment is found that satisfies all of the constraints, return that assignment. A generate-and-test algorithm to find all models is the same except, instead of returning the first model found, it saves all of the models found.
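The algorithm above can be sketched in Python, assuming a toy CSP where constraints are predicates over a total-assignment dict; the variable names and the A &lt; B constraint are made up for illustration.

```python
from itertools import product

def generate_and_test(variables, domains, constraints, find_all=False):
    """Check each total assignment in turn; return the first model found,
    or every model when find_all is True."""
    models = []
    for values in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        if all(c(assignment) for c in constraints):
            if not find_all:
                return assignment
            models.append(assignment)
    return models if find_all else None

# Toy CSP: A < B with A, B in {1, 2, 3}.
variables = ["A", "B"]
domains = {"A": [1, 2, 3], "B": [1, 2, 3]}
constraints = [lambda a: a["A"] < a["B"]]
print(generate_and_test(variables, domains, constraints))                 # {'A': 1, 'B': 2}
print(generate_and_test(variables, domains, constraints, find_all=True))  # all 3 models
```

Generate-and-test is exponential in the number of variables, which is what motivates the search and consistency methods on the later cards.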

12
Q

How do you convert a CSP to a graph?

A

The nodes are assignments of values to some subset of the variables.

The neighbors of a node n are obtained by selecting a variable Y that is not assigned in node n and by having a neighbor for each assignment of a value to Y that does not violate any constraint.

Suppose node n is the assignment {X1=v1,…,Xk=vk}. To find the neighbors of n, select a variable Y that is not in the set {X1,…,Xk}. For each value yi∈dom(Y), where X1=v1,…,Xk=vk,Y=yi is consistent with each of the constraints, the node {X1=v1,…,Xk=vk,Y=yi} is a neighbor of n.

The start node is the empty assignment that does not assign a value to any variables.

A goal node is a node that assigns a value to every variable. Note that this only exists if the assignment is consistent with all of the constraints.
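This graph can be searched directly. The depth-first sketch below expands neighbors exactly as described, assuming constraints are (scope, predicate) pairs; that representation and the toy constraints are choices made here for illustration.

```python
def consistent(assignment, constraints):
    """A constraint is checked only once every variable in its scope is assigned."""
    return all(pred(assignment) for scope, pred in constraints
               if all(v in assignment for v in scope))

def solve_by_search(variables, domains, constraints, assignment=None):
    assignment = assignment or {}          # the start node: the empty assignment
    if len(assignment) == len(variables):
        return assignment                  # a goal node assigns every variable
    Y = next(v for v in variables if v not in assignment)
    for y in domains[Y]:                   # neighbors: consistent extensions by Y
        candidate = {**assignment, Y: y}
        if consistent(candidate, constraints):
            result = solve_by_search(variables, domains, constraints, candidate)
            if result is not None:
                return result
    return None

# Toy CSP: A < B and B != C, all with domain {1, 2, 3}.
variables = ["A", "B", "C"]
domains = {v: [1, 2, 3] for v in variables}
constraints = [(("A", "B"), lambda a: a["A"] < a["B"]),
               (("B", "C"), lambda a: a["B"] != a["C"])]
print(solve_by_search(variables, domains, constraints))  # {'A': 1, 'B': 2, 'C': 1}
```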

13
Q

What does a constraint network consist of?

A

The consistency algorithms are best thought of as operating over the network of constraints formed by the CSP:

There is a node for each variable. These nodes are drawn as circles or ovals.

There is a node for each constraint. These nodes are drawn as rectangles.

Associated with each variable, X, is a set DX of possible values. This set of values is initially the domain of the variable.

For every constraint c, and for every variable X in the scope of c, there is an arc ⟨X,c⟩.

14
Q

What is the generalized arc consistency (GAC) algorithm?

A

It makes the entire network arc consistent by considering a set to_do of potentially inconsistent arcs, the to-do arcs. The set to_do initially consists of all the arcs in the graph. While the set is not empty, an arc ⟨X,c⟩ is removed from the set and considered. If the arc is not arc consistent, it is made arc consistent by pruning the domain of variable X. All of the previously consistent arcs that could, as a result of pruning X, have become inconsistent are placed back into the set to_do. These are the arcs ⟨Z,c′⟩, where c′ is a constraint different from c that involves X, and Z is a variable involved in c′ other than X.
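A runnable sketch of GAC, again assuming (scope, predicate) constraints; the to_do handling follows the description above, though practical implementations choose arcs more carefully.

```python
from itertools import product

def gac(domains, constraints):
    """Prune domains until every arc <X, c> is arc consistent."""
    domains = {v: set(d) for v, d in domains.items()}
    to_do = {(X, c) for c in constraints for X in c[0]}   # all arcs initially
    while to_do:
        X, c = to_do.pop()
        scope, pred = c
        others = [v for v in scope if v != X]
        # keep x only if some assignment to the other variables satisfies c
        new_dx = {x for x in domains[X]
                  if any(pred({**dict(zip(others, vals)), X: x})
                         for vals in product(*(domains[v] for v in others)))}
        if new_dx != domains[X]:
            domains[X] = new_dx
            # arcs <Z, c'> with c' different from c that involve X may now be inconsistent
            to_do |= {(Z, c2) for c2 in constraints
                      if c2 is not c and X in c2[0]
                      for Z in c2[0] if Z != X}
    return domains

# Toy example: A < B with A, B in {1, 2, 3}.
doms = gac({"A": {1, 2, 3}, "B": {1, 2, 3}},
           [(("A", "B"), lambda a: a["A"] < a["B"])])
print(doms)  # A loses 3, B loses 1
```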

15
Q

What is domain splitting?

A

The idea is to split a problem into a number of disjoint cases and solve each case separately. The set of all solutions to the initial problem is the union of the solutions to each case.

In the simplest case, suppose there is a binary variable X with domain {t,f}. All of the solutions either have X=t or X=f. One way to find all of the solutions is to set X=t, find all of the solutions with this assignment, and then assign X=f and find all solutions with this assignment. Assigning a value to a variable gives a smaller reduced problem to solve. If we only want to find one solution, we can look for the solutions with X=t, and if we do not find any, we can look for the solutions with X=f.
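A sketch that generalizes the binary case: split the chosen variable's domain into two halves and return the union of the cases' solutions. Constraints are (scope, predicate) pairs over total assignments, a toy representation chosen for illustration.

```python
def split_solve(domains, constraints):
    """Find all solutions by recursive domain splitting."""
    # base case: every domain is a singleton, so check the total assignment
    if all(len(d) == 1 for d in domains.values()):
        total = {v: next(iter(d)) for v, d in domains.items()}
        return [total] if all(pred(total) for _, pred in constraints) else []
    X = next(v for v, d in domains.items() if len(d) > 1)
    vals = sorted(domains[X])
    half = len(vals) // 2
    solutions = []
    for case in (vals[:half], vals[half:]):      # two disjoint cases
        solutions += split_solve({**domains, X: set(case)}, constraints)
    return solutions                             # union over the cases

sols = split_solve({"A": {1, 2}, "B": {1, 2}},
                   [(("A", "B"), lambda a: a["A"] != a["B"])])
print(sols)  # [{'A': 1, 'B': 2}, {'A': 2, 'B': 1}]
```

In practice, domain splitting is interleaved with constraint propagation such as GAC, so each case is simplified before being split again.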

16
Q

What is variable elimination (VE)?

A

Variable elimination (VE) simplifies the network by removing variables.

The idea of VE is to remove the variables one by one. When removing a variable X, VE constructs a new constraint on some of the remaining variables reflecting the effects of X on all of the other variables. This new constraint replaces all of the constraints that involve X, forming a reduced network that does not involve X. The new constraint is constructed so that any solution to the reduced CSP can be extended to a solution of the CSP that contains X. In addition to creating the new constraint, VE provides a way to construct a solution to the CSP that contains X from a solution to the reduced CSP.
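A sketch of eliminating a single variable, assuming (scope, predicate) constraints: the new constraint on X's neighbors keeps exactly those tuples that can be extended to some value of X satisfying every constraint involving X.

```python
from itertools import product

def eliminate(X, domains, constraints):
    """Replace all constraints involving X with one constraint on X's neighbors."""
    involving = [c for c in constraints if X in c[0]]
    rest = [c for c in constraints if X not in c[0]]
    scope = tuple(sorted({v for s, _ in involving for v in s} - {X}))
    allowed = set()
    for vals in product(*(domains[v] for v in scope)):
        a = dict(zip(scope, vals))
        # keep the tuple if some value of X satisfies every constraint on X
        if any(all(pred({**a, X: x}) for _, pred in involving)
               for x in domains[X]):
            allowed.add(vals)
    new_constraint = (scope,
                      lambda a, s=scope, ok=allowed: tuple(a[v] for v in s) in ok)
    return rest + [new_constraint]

# Toy example: A < B and B < C; eliminating B leaves a constraint on (A, C).
doms = {"A": [1, 2, 3], "B": [1, 2, 3], "C": [1, 2, 3]}
reduced = eliminate("B", doms, [(("A", "B"), lambda a: a["A"] < a["B"]),
                                (("B", "C"), lambda a: a["B"] < a["C"])])
scope, pred = reduced[0]
print(scope, pred({"A": 1, "C": 3}), pred({"A": 1, "C": 2}))  # ('A', 'C') True False
```

Recording a witnessing value of X for each allowed tuple would give the promised way to extend a solution of the reduced CSP back to one containing X.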

17
Q

What is local search?

A

Local search methods start with a total assignment of a value to each variable and try to improve this assignment iteratively by taking improving steps, by taking random steps, or by restarting with another total assignment. Many different local search techniques have been proposed. Understanding when these techniques work for different problems forms the focus of a number of research communities, in both operations research and AI.

18
Q

What is iterative best improvement?

A

Iterative best improvement is a local search algorithm that selects a successor of the current assignment that most improves some evaluation function. If there are several possible successors that most improve the evaluation function, one is chosen at random. When the aim is to minimize, this algorithm is called greedy descent. When the aim is to maximize a function, this is called hill climbing or greedy ascent. We only consider minimization; to maximize a quantity, you can minimize its negation.
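A greedy-descent sketch where the evaluation function is the number of violated constraints and a successor changes one variable's value; the toy constraints are made up for illustration.

```python
import random

def conflicts(assignment, constraints):
    """Evaluation function: the number of violated constraints."""
    return sum(not pred(assignment) for _, pred in constraints)

def greedy_descent(variables, domains, constraints, assignment, max_steps=100):
    """Repeatedly move to a best successor; ties are broken at random."""
    for _ in range(max_steps):
        current = conflicts(assignment, constraints)
        if current == 0:
            return assignment
        successors = [{**assignment, X: v} for X in variables
                      for v in domains[X] if v != assignment[X]]
        best = min(conflicts(s, constraints) for s in successors)
        if best >= current:
            return assignment            # a local optimum: no successor improves
        assignment = random.choice([s for s in successors
                                    if conflicts(s, constraints) == best])
    return assignment

variables = ["A", "B"]
domains = {v: [1, 2, 3] for v in variables}
constraints = [(("A", "B"), lambda a: a["A"] != a["B"]),
               (("A", "B"), lambda a: a["A"] < a["B"])]
result = greedy_descent(variables, domains, constraints, {"A": 1, "B": 1})
print(result, "conflicts:", conflicts(result, constraints))  # conflicts: 0
```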

19
Q

What is a local optimum?

A

A local optimum is an assignment such that no possible successor improves the evaluation function. This is also called a local minimum in greedy descent, or a local maximum in greedy ascent.

20
Q

What is a global optimum?

A

A global optimum is an assignment that has the best value out of all assignments. All global optima are local optima, but there can be many local optima that are not a global optimum.

21
Q

What are two random ways to escape local minima that are not global minima?

A

random restart, in which values for all variables are chosen at random. This lets the search start from a completely different part of the search space.

random walk, in which some random steps are taken interleaved with the optimizing steps. With greedy descent, this process allows for upward steps that may enable random walk to escape a local minimum that is not a global minimum.

A random walk is a local random move, whereas a random restart is a global random move. For problems involving a large number of variables, a random restart can be quite expensive.

22
Q

What is a stochastic local search?

A

A mix of iterative best improvement with random moves is an instance of a class of algorithms called stochastic local search algorithms.

23
Q

What is the most improving step method?

A

The most improving step method always selects a variable–value pair that makes the best improvement. If there are many such pairs, one is chosen at random.

24
Q

What is the two-stage choice method?

A

The two-stage choice algorithm maintains a priority queue of variables, where the weight of a variable is the number of conflicts in which it participates. At each time, the algorithm selects a variable that participates in the maximum number of conflicts. Once a variable has been chosen, it can be assigned either a value that minimizes the number of conflicts or a value at random. For each constraint that becomes true or false as a result of this new assignment, the other variables participating in the constraint must have their weight reevaluated.

25
Q

What is the any-conflict method?

A

Instead of choosing the best variable, an even simpler alternative is to select any variable participating in a conflict. A variable that is involved in a conflict is a conflicting variable. In the any-conflict algorithm, at each step, one of the conflicting variables is selected at random. The algorithm assigns to the chosen variable one of the values that minimizes the number of violated constraints or a value at random.

There are two variants of this algorithm, which differ in how the variable to be modified is selected:

In the first variant, a conflict is chosen at random, and then a variable that is involved in the conflict is chosen at random.

In the second variant, a variable that is involved in a conflict is chosen at random.
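The second variant can be sketched as follows, assigning the chosen conflicting variable a value that minimizes the number of violated constraints; the (scope, predicate) representation and toy constraint are made up for illustration.

```python
import random

def any_conflict(variables, domains, constraints, assignment, max_steps=1000):
    """Pick a random conflicting variable; give it a least-conflict value."""
    def violated(a):
        return [c for c in constraints if not c[1](a)]
    for _ in range(max_steps):
        if not violated(assignment):
            return assignment
        # variables involved in at least one violated constraint
        conflicting = {v for scope, _ in violated(assignment) for v in scope}
        X = random.choice(sorted(conflicting))
        assignment = min(({**assignment, X: v} for v in domains[X]),
                         key=lambda a: len(violated(a)))
    return assignment

random.seed(1)
result = any_conflict(["A", "B"], {"A": [1, 2], "B": [1, 2]},
                      [(("A", "B"), lambda a: a["A"] != a["B"])],
                      {"A": 1, "B": 1})
print(result)  # a conflict-free assignment with A != B
```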

26
Q

What is simulated annealing?

A

Simulated annealing is a stochastic local search algorithm where the temperature is reduced slowly, starting from a random walk at high temperature eventually becoming pure greedy descent as it approaches zero temperature. The randomness should allow the search to jump out of local minima and find regions that have a low heuristic value; whereas the greedy descent will lead to local minima. At high temperatures, worsening steps are more likely than at lower temperatures.

Like the other methods, simulated annealing maintains a current total assignment. At each step, it picks a variable at random, then picks a value for that variable at random. If assigning that value to the variable does not increase the number of conflicts, the algorithm accepts the assignment of that value to the variable, resulting in a new current assignment. Otherwise, it accepts the assignment with some probability, depending on the temperature and how much worse the new assignment is than the current assignment. If the change is not accepted, the current assignment is unchanged.
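The acceptance rule can be sketched with the standard exp(-delta/temperature) probability; the cooling schedule and toy constraints here are illustrative choices, not part of the card.

```python
import math
import random

def simulated_annealing(variables, domains, constraints, assignment,
                        temp=10.0, cooling=0.99, steps=2000):
    """Accept non-worsening moves always; accept worsening moves with
    probability exp(-increase / temperature), reducing the temperature slowly."""
    def conflicts(a):
        return sum(not pred(a) for _, pred in constraints)
    for _ in range(steps):
        if conflicts(assignment) == 0:
            return assignment
        X = random.choice(variables)                              # random variable
        candidate = {**assignment, X: random.choice(domains[X])}  # random value
        delta = conflicts(candidate) - conflicts(assignment)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            assignment = candidate
        temp *= cooling
    return assignment

random.seed(0)
constraints = [(("A", "B"), lambda a: a["A"] != a["B"]),
               (("B", "C"), lambda a: a["B"] != a["C"])]
domains = {v: [1, 2] for v in "ABC"}
result = simulated_annealing(list("ABC"), domains, constraints,
                             {"A": 1, "B": 1, "C": 1})
print(result)
```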

27
Q

What is beam search?

A

Beam search is a method similar to iterative best improvement, but it maintains up to k assignments instead of just one. It reports success when it finds a satisfying assignment. At each stage, the algorithm selects the k best possible successors of the current individuals (or all of them if there are fewer than k), breaking ties at random. It repeats with this new set of k total assignments.

Beam search considers multiple assignments at the same time. Beam search is useful for memory-bounded cases, where k can be selected depending on the memory available. The variants of stochastic local search presented earlier are also applicable to beam search; you can spend more time finding the best k, or spend less time and only approximate the best k.
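A sketch with conflict count as the evaluation function; a stable sort stands in for the random tie-breaking described above, and the toy constraint is made up.

```python
def beam_search(variables, domains, constraints, population, k=3, steps=50):
    """Keep up to k assignments; each stage moves to the k best successors."""
    def conflicts(a):
        return sum(not pred(a) for _, pred in constraints)
    for _ in range(steps):
        for a in population:
            if conflicts(a) == 0:
                return a                # success: a satisfying assignment
        successors = [{**a, X: v} for a in population
                      for X in variables for v in domains[X] if v != a[X]]
        successors.sort(key=conflicts)  # stable sort instead of random ties
        population = successors[:k]     # all of them if there are fewer than k
    return None

domains = {v: [1, 2, 3] for v in "AB"}
constraints = [(("A", "B"), lambda a: a["A"] < a["B"])]
result = beam_search(list("AB"), domains, constraints, [{"A": 3, "B": 1}])
print(result)  # {'A': 1, 'B': 2}
```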

28
Q

What is stochastic beam search?

A

Stochastic beam search is an alternative to beam search, which, instead of choosing the best k individuals, selects k of the individuals at random; the individuals with a better evaluation are more likely to be chosen. This is done by making the probability of being chosen a function of the evaluation function. A standard way to do this is to use a Gibbs distribution or Boltzmann distribution and to select an assignment A with probability proportional to

e^(−h(A)/T),

where h(A) is an evaluation function and T is a temperature.

Stochastic beam search tends to allow more diversity in the k individuals than does plain beam search. As an analogy to evolution in biology, the evaluation function reflects the fitness of the individual; the fitter the individual, the more likely it is to pass its genetic material – here its variable assignment – onto the next generation. Stochastic beam search is like asexual reproduction; each individual gives slightly mutated children and then stochastic beam search proceeds with survival of the fittest. Note that under stochastic beam search it is possible for an individual to be selected multiple times at random.
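The selection step can be sketched with the Gibbs distribution: `random.choices` draws with replacement, matching the note that an individual can be selected multiple times. The population and evaluation function below are illustrative.

```python
import math
import random

def gibbs_select(population, h, k, T=1.0):
    """Draw k individuals with probability proportional to exp(-h(A)/T)."""
    weights = [math.exp(-h(a) / T) for a in population]
    return random.choices(population, weights=weights, k=k)  # with replacement

random.seed(0)
population = [{"X": 1}, {"X": 2}, {"X": 3}]
h = lambda a: a["X"]          # lower evaluation = fitter individual
chosen = gibbs_select(population, h, k=5, T=0.5)
print(chosen)                 # fitter individuals appear more often
```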

29
Q

What are genetic algorithms?

A

Genetic algorithms further pursue the evolution analogy. Genetic algorithms are like stochastic beam searches, but each new element of the population is a combination of a pair of individuals – its parents. In particular, genetic algorithms use an operation known as crossover that selects pairs of individuals and then creates new offspring by taking some of the values for the offspring’s variables from one of the parents and the rest of the values from the other parent, loosely analogous to how DNA is spliced in sexual reproduction.

Randomly select pairs of individuals where the fitter individuals are more likely to be chosen. How much more likely a fit individual is to be chosen than a less fit individual depends on the difference in fitness levels and a temperature parameter.

For each pair, perform a crossover.

Randomly mutate some (very few) values by choosing other values for some randomly chosen variables. This is a random walk step.

30
Q

What is an optimization problem?

A

a set of variables, each with an associated domain

an objective function that maps total assignments to real numbers, and

an optimality criterion, which is typically to find a total assignment that minimizes or maximizes the objective function.

31
Q

What is a constrained optimization problem?

A

A constrained optimization problem is an optimization problem that also has hard constraints specifying which variable assignments are possible. The aim is to find an optimal assignment that satisfies the hard constraints.

32
Q

What is a soft constraint?

A

A soft constraint has a scope that is a set of variables. The soft constraint is a function from the domains of the variables in its scope into a real number, a cost.
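For example, a soft constraint can be written as a (scope, cost function) pair, with the cost of a total assignment being the sum of the individual costs; the preferences below are made up for illustration.

```python
# Soft constraints as (scope, cost function) pairs; lower cost is better.
soft_constraints = [
    (("A",), lambda a: 0 if a["A"] == 1 else 2),       # prefer A = 1
    (("A", "B"), lambda a: abs(a["A"] - a["B"])),      # prefer A close to B
]

def total_cost(assignment, soft_constraints):
    """The cost of an assignment: the sum over all soft constraints."""
    return sum(cost(assignment) for scope, cost in soft_constraints)

print(total_cost({"A": 1, "B": 3}, soft_constraints))  # 0 + 2 = 2
print(total_cost({"A": 2, "B": 3}, soft_constraints))  # 2 + 1 = 3
```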

33
Q

What is a one-point crossover?

A

A common method is one-point crossover, which assumes a total ordering of the variables. An index i is selected at random. One of the children is constructed by selecting the values for the variables up to i from one of the parents, and the values for variables after i from the other parent. The other child gets the other values. The effectiveness of the crossover depends on the total ordering of the variables. The ordering of the variables is part of the design of the genetic algorithm.
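A sketch, assuming individuals are dicts over a fixed variable ordering; the parents below are made up.

```python
import random

def one_point_crossover(parent1, parent2, order):
    """Split at a random index i; each child takes one side from each parent."""
    i = random.randrange(1, len(order))          # the crossover point
    child1 = {v: (parent1 if j < i else parent2)[v] for j, v in enumerate(order)}
    child2 = {v: (parent2 if j < i else parent1)[v] for j, v in enumerate(order)}
    return child1, child2

random.seed(0)
order = ["A", "B", "C", "D"]
p1 = {v: 1 for v in order}
p2 = {v: 2 for v in order}
c1, c2 = one_point_crossover(p1, p2, order)
print(c1, c2)   # complementary children: values swap after the split point
```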

34
Q

What is a conflict?

A

A violated constraint is called a conflict.

35
Q

What is maximizing or minimizing a function?

A

Maximizing a function means finding the maximum possible value that the function can take; minimizing a function means finding its minimum possible value.