Reduced Cost

In linear programming, the reduced cost, or opportunity cost, is the amount by which an objective function coefficient would have to improve (increase for a maximisation problem, decrease for a minimisation problem) before it would be possible for the corresponding variable to assume a positive value in the optimal solution. It is the cost of increasing a variable by a small amount, i.e., the first derivative of the objective from a certain point on the polyhedron that constrains the problem. When the point is a vertex of the polyhedron, the variable with the most extreme reduced cost, the most negative for a minimisation problem and the most positive for a maximisation problem, is sometimes referred to as the steepest edge.

Given a system minimise cᵀx subject to Ax ≤ b, x ≥ 0, the reduced cost vector can be computed as d = c − Aᵀy, where y is the dual cost vector.
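
As a concrete illustration, the following sketch computes the reduced cost vector d = c − Aᵀy with NumPy; the objective coefficients, constraint matrix, and dual vector are hypothetical example data rather than part of the original text.

    import numpy as np

    # Hypothetical data for: minimise c^T x  subject to  A x <= b, x >= 0
    c = np.array([3.0, 4.0, 5.0])      # objective coefficients
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0]])    # constraint matrix
    y = np.array([2.0, 1.0])           # dual cost vector (assumed already known)

    # Reduced cost vector: d = c - A^T y
    d = c - A.T @ y
    print(d)                           # [1. 1. 3.]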

It follows directly that for a minimisation problem, any non-basic variable at its lower bound with a strictly negative reduced cost is eligible to enter the basis, while every basic variable must have a reduced cost of exactly 0. For a maximisation problem, the non-basic variables at their lower bounds that are eligible to enter the basis have strictly positive reduced costs.
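
The entering-variable test described above can be written down directly. The sketch below assumes a minimisation problem, assumes the reduced costs of the non-basic variables at their lower bounds are already available, and uses an illustrative numerical tolerance; picking the most negative value corresponds to the "most extreme" rule mentioned earlier.

    import numpy as np

    reduced_costs = np.array([1.0, -2.5, 0.0, -0.3])   # hypothetical values
    TOL = 1e-9                                          # illustrative tolerance

    # For minimisation, non-basic variables with strictly negative
    # reduced cost are eligible to enter the basis.
    eligible = np.flatnonzero(reduced_costs < -TOL)

    if eligible.size:
        # Pick the most negative reduced cost among the eligible variables.
        entering = eligible[np.argmin(reduced_costs[eligible])]
        print("entering variable index:", entering)     # -> 1
    else:
        print("no eligible variable: the current basis is optimal")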

