Algebraic analysis of changes (sensitivity) Flashcards
Recall (matrix notation) when a basic solution is feasible
x_b = B^(-1)b >= 0
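A minimal numpy sketch of this check; the basis matrix B and RHS b below are assumed values from a small illustrative LP, not data taken from these cards:

    import numpy as np

    # Assumed basis matrix B and RHS vector b of a small illustrative LP
    B = np.array([[1.0, 0.0, 1.0],
                  [0.0, 2.0, 0.0],
                  [3.0, 2.0, 0.0]])
    b = np.array([4.0, 12.0, 18.0])

    x_b = np.linalg.solve(B, b)            # x_b = B^(-1) b
    print(x_b)                             # approximately [2. 6. 2.]
    print("feasible:", np.all(x_b >= 0))   # True: every basic value is nonnegative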
Recall (matrix notation) when a basic solution is optimal
Optimal if
c_n - c_b B^(-1) N >= 0 (minimization problem)
The inequality is reversed (<= 0) if we have a max problem.
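A hedged numpy sketch of the reduced-cost check for a minimization problem; B, N, c_b, and c_n are assumed values for a small illustrative LP:

    import numpy as np

    # Assumed data for a small min problem: basic columns B, nonbasic columns N,
    # and the matching objective coefficients c_b and c_n
    B   = np.array([[1.0, 0.0, 1.0],
                    [0.0, 2.0, 0.0],
                    [3.0, 2.0, 0.0]])
    N   = np.array([[0.0, 0.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])
    c_b = np.array([-3.0, -5.0, 0.0])
    c_n = np.array([0.0, 0.0])

    y  = np.linalg.solve(B.T, c_b)        # y = c_b B^(-1), found by solving y B = c_b
    rc = c_n - y @ N                      # reduced costs c_n - c_b B^(-1) N
    print(rc)                             # approximately [1.5 1. ]
    print("optimal:", np.all(rc >= 0))    # min problem: optimal when all >= 0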
elaborate on c_b B^(-1)
This is the slack-variable portion of the objective function row, i.e. the objective-row entries under the slack columns. They correspond to the dual values, also called shadow prices. They appear with reversed signs when they correspond to the surplus variables of >= constraints.
NOTE: for >= constraints the tableau does not correct this sign for us; we have to identify those columns and flip the signs ourselves to recover the true B^(-1) matrix (and thus the true dual values).
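A small sketch of computing the dual values y = c_b B^(-1) directly; the data is the same assumed example as above:

    import numpy as np

    # Assumed optimal basis B and basic costs c_b of a small min problem
    B   = np.array([[1.0, 0.0, 1.0],
                    [0.0, 2.0, 0.0],
                    [3.0, 2.0, 0.0]])
    c_b = np.array([-3.0, -5.0, 0.0])

    y = c_b @ np.linalg.inv(B)    # shadow prices / dual values, one per constraint
    print(y)                      # approximately [ 0.  -1.5 -1. ]
    # In a final tableau these values are read under the slack columns. For the
    # surplus variable of a >= constraint the tableau entry shows -y_i, so the
    # sign must be flipped by hand to recover the true dual value.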
Elaborate on changing a RHS coefficient
We change something in b. The reduced costs are therefore not affected; only the RHS column (the basic variable values) changes.
The question is: within which interval can we change b while keeping the current basic solution feasible? The current basic solution refers to the specific set of variables in the basis, not their values (we change b, so we expect the variable values to change).
Changing b essentially means shifting the constraint up or down. If the constraint is binding, we see an immediate impact on the solution. If it is not binding, the interval is typically infinite on one side.
By changing the constraint's RHS value, we expect that at some point the solution becomes determined by another set of constraints. These are the points/limits we want to find.
We find the interval as follows:
x_b = B^(-1)(b + ∆b) = B^(-1)b + B^(-1)∆b
This gives the RHS values expressed in terms of the ∆ values. We then apply the feasibility requirement x_b >= 0 to obtain inequalities that we can solve for ∆b.
Note that this interval is also the range over which the shadow price of the constraint whose RHS we change stays constant: as long as we stay within the interval, the current basis remains feasible and optimal.
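A sketch of this RHS ranging computation, assuming the same illustrative basis as above and perturbing the second RHS entry:

    import numpy as np

    # Assumed optimal basis B and RHS b of a small illustrative LP
    B = np.array([[1.0, 0.0, 1.0],
                  [0.0, 2.0, 0.0],
                  [3.0, 2.0, 0.0]])
    b = np.array([4.0, 12.0, 18.0])

    Binv = np.linalg.inv(B)
    x_b  = Binv @ b           # current basic variable values
    i    = 1                  # perturb b_2 (0-indexed constraint)
    d    = Binv[:, i]         # x_b(delta) = x_b + delta * d must stay >= 0

    # Each basic row gives a one-sided bound on delta, depending on the sign of d[j]
    delta_min = max((-x_b[j] / d[j] for j in range(len(d)) if d[j] > 0), default=-np.inf)
    delta_max = min((-x_b[j] / d[j] for j in range(len(d)) if d[j] < 0), default=np.inf)
    print(delta_min, delta_max)   # approximately -6.0 6.0: b_2 may vary within [12-6, 12+6]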
elaborate on changing an objective function coefficient
The question is: within which interval can an objective function coefficient vary such that the current optimal solution remains optimal?
We use the optimality criterion on the reduced costs here, with c_n^(bar) >= 0 for minimization problems and c_n^(bar) <= 0 for max problems.
c_n^(bar) = (c_n + ∆c_n) - (c_b + ∆c_b)B^(-1)N >= 0 (min)
Note that if we change a nonbasic coefficient, the change is relatively simple: only that variable's own reduced cost is affected. However, if we change a coefficient in c_b, all nonbasic columns in the objective row change, because c_b is multiplied by B^(-1)N.
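A sketch of ranging a basic variable's cost coefficient under the same assumed data; we perturb the first basic cost by delta and require all reduced costs to stay nonnegative (min problem):

    import numpy as np

    # Assumed optimal basis data of a small min problem
    B   = np.array([[1.0, 0.0, 1.0],
                    [0.0, 2.0, 0.0],
                    [3.0, 2.0, 0.0]])
    N   = np.array([[0.0, 0.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])
    c_b = np.array([-3.0, -5.0, 0.0])
    c_n = np.array([0.0, 0.0])

    Binv = np.linalg.inv(B)
    rc   = c_n - c_b @ Binv @ N     # current reduced costs (all >= 0 at the optimum)
    j    = 0                        # perturb the cost of the first basic variable
    w    = (Binv @ N)[j, :]         # new reduced costs are rc - delta * w, all must stay >= 0

    delta_min = max((rc[k] / w[k] for k in range(len(w)) if w[k] < 0), default=-np.inf)
    delta_max = min((rc[k] / w[k] for k in range(len(w)) if w[k] > 0), default=np.inf)
    print(delta_min, delta_max)     # approximately -4.5 3.0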
elaborate on adding a new variable
If we want to add a new variable x_k with objective function coefficient c_k and constraint column a_k, the reduced cost of this variable, c_k - c_b B^(-1) a_k, is calculated in order to verify whether the optimal solution is affected or not. The new variable DOES NOT affect the optimal solution if its reduced cost is nonnegative (min problem). If it does affect it, we can obtain a better objective function value by moving in its direction, so we have to perform new simplex iterations with x_k entering the basis.
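A short sketch of pricing out a new variable; c_k and a_k are hypothetical values added to the same assumed problem as above:

    import numpy as np

    # Assumed optimal basis B and basic costs c_b of a small min problem,
    # plus a hypothetical new variable x_k with cost c_k and column a_k
    B   = np.array([[1.0, 0.0, 1.0],
                    [0.0, 2.0, 0.0],
                    [3.0, 2.0, 0.0]])
    c_b = np.array([-3.0, -5.0, 0.0])
    c_k = -4.0
    a_k = np.array([1.0, 1.0, 1.0])

    y  = c_b @ np.linalg.inv(B)   # dual values
    rc = c_k - y @ a_k            # reduced cost of the new variable
    print(rc)                     # approximately -1.5 for this data
    # rc >= 0 (min problem): the current optimum is unaffected
    # rc <  0: x_k would improve the objective, so new simplex iterations are
    #          needed with x_k entering the basis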