Stochastic Programming - Statistical Inference

Statistical Inference

Consider the following stochastic programming problem


\min\limits_{x\in X}\{ g(x) = f(x)+E[Q(x,\xi)] \}

Here X is a nonempty closed subset of ℝ^n, ξ is a random vector whose probability distribution P is supported on a set Ξ ⊂ ℝ^d, and Q : X × Ξ → ℝ. In the framework of two-stage stochastic programming, Q(x,ξ) is given by the optimal value of the corresponding second-stage problem.

Assume that g(x) is well defined and finite valued for all x ∈ X. This implies that for every x ∈ X the value Q(x,ξ) is finite almost surely.
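
As a concrete illustration of the two-stage setting, the sketch below takes Q(x,ξ) to be the optimal value of a small second-stage linear program, solved with scipy.optimize.linprog. The newsvendor-style recourse and the shortage/holding costs B and H are assumptions made for this example, not part of the formulation above.

import numpy as np
from scipy.optimize import linprog

# Illustrative second-stage data (assumptions for this sketch):
# B = per-unit shortage cost, H = per-unit holding cost.
B, H = 4.0, 1.0

def Q(x, xi):
    """Optimal value of the second-stage LP
         min  B*u + H*v
         s.t. u >= xi - x,  v >= x - xi,  u, v >= 0,
    i.e. the cheapest recourse for the shortage or surplus left by the
    first-stage decision x once the demand xi is observed."""
    c = [B, H]
    A_ub = [[-1.0, 0.0],          # -u <= -(xi - x)  <=>  u >= xi - x
            [0.0, -1.0]]          # -v <= -(x - xi)  <=>  v >= x - xi
    b_ub = [-(xi - x), -(x - xi)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    return res.fun

# For this toy recourse the LP value equals B*max(xi-x,0) + H*max(x-xi,0):
assert np.isclose(Q(3.0, 5.0), B * 2.0)   # shortage of 2 units
assert np.isclose(Q(5.0, 3.0), H * 2.0)   # surplus of 2 units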

Suppose that we have a sample ξ^1, …, ξ^N of N realizations of the random vector ξ. This random sample can be viewed as historical data of N observations of ξ, or it can be generated by Monte Carlo sampling techniques. Then we can formulate the corresponding sample average approximation (SAA) problem


\min\limits_{x\in X}\{ \hat{g}_N(x) = f(x)+\frac{1}{N} \sum_{j=1}^N Q(x,\xi^j) \}
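
Once the sample ξ^1, …, ξ^N is fixed, the SAA problem is a deterministic optimization problem that can be solved by standard methods. The sketch below assembles and minimizes ĝ_N for the toy recourse above (written in its equivalent closed form), with an exponential demand distribution, first-stage cost f(x) = Cx and feasible set X = [0, 100]; all of these choices are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Illustrative problem data (assumptions for this sketch).
C, B, H = 1.0, 4.0, 1.0
f = lambda x: C * x                           # first-stage cost f(x)
Q = lambda x, xi: B * np.maximum(xi - x, 0) + H * np.maximum(x - xi, 0)

# Sample xi^1, ..., xi^N by Monte Carlo (exponential demand with mean 10).
N = 1000
xi = rng.exponential(scale=10.0, size=N)

# Sample average approximation of g(x) = f(x) + E[Q(x, xi)].
def g_hat_N(x):
    return f(x) + np.mean(Q(x, xi))

# Minimize over the feasible set X = [0, 100].
res = minimize_scalar(g_hat_N, bounds=(0.0, 100.0), method="bounded")
print("SAA optimal solution:", res.x)
print("SAA optimal value:   ", res.fun)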

By the Law of Large Numbers we have that, under some regularity conditions, ĝ_N(x) converges pointwise with probability 1 to g(x) as N → ∞. Moreover, under mild additional conditions the convergence is uniform. We also have E[ĝ_N(x)] = g(x), i.e., ĝ_N(x) is an unbiased estimator of g(x). Therefore it is natural to expect that the optimal value and optimal solutions of the SAA problem converge to their counterparts of the true problem as N → ∞.
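
This convergence can be observed numerically. For the toy problem above, E[Q(x,ξ)] is available in closed form for exponential demand, so the true optimal value can be compared with SAA optimal values computed from samples of increasing size N; the constants and the demand distribution remain the illustrative assumptions introduced earlier.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

# Same toy problem as above (all constants are illustrative assumptions).
C, B, H, MU = 1.0, 4.0, 1.0, 10.0     # costs and mean of the exponential demand
f = lambda x: C * x
Q = lambda x, xi: B * np.maximum(xi - x, 0) + H * np.maximum(x - xi, 0)

# For exponential demand, E[(xi - x)^+] = MU * exp(-x / MU), which gives g(x)
# in closed form and hence the true optimal value to compare against.
g_true = lambda x: (f(x) + B * MU * np.exp(-x / MU)
                    + H * (x - MU + MU * np.exp(-x / MU)))
v_true = minimize_scalar(g_true, bounds=(0.0, 100.0), method="bounded").fun

# Re-solve the SAA problem for growing N; the SAA optimal value approaches
# the true optimal value, as the Law of Large Numbers suggests.
for N in (10, 100, 1_000, 10_000, 100_000):
    xi = rng.exponential(scale=MU, size=N)
    g_hat = lambda x, xi=xi: f(x) + np.mean(Q(x, xi))
    v_N = minimize_scalar(g_hat, bounds=(0.0, 100.0), method="bounded").fun
    print(f"N = {N:>6d}   SAA optimal value = {v_N:8.4f}   true = {v_true:8.4f}")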
