Overview
We illustrate some important concepts with a very simple example, the computation of

$$\int_0^1 f(x)\,dx.$$

For most integrands we can't use the fundamental theorem of calculus to compute the integral analytically; we have to approximate it numerically. We compute the values of $f$ at $n$ points:

$$f(x_1), \dots, f(x_n).$$

These $n$ numbers are the partial information about the true integrand $f$. We combine these $n$ numbers by a combinatory algorithm to compute an approximation to the integral. See the monograph Complexity and Information for particulars.
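As a concrete illustration of such a combinatory algorithm, here is a minimal Python sketch of one classical choice, the composite midpoint rule; the rule, the function name midpoint_rule, and the test integrand are illustrative assumptions, not prescribed by the text.

```python
import math

def midpoint_rule(f, n):
    """Approximate the integral of f over [0, 1] from n function values.

    The n samples f(x_1), ..., f(x_n) are the partial information;
    the weighted sum below is the combinatory algorithm that combines them.
    """
    h = 1.0 / n
    # Evaluate f at the midpoints of n equal subintervals of [0, 1].
    values = [f((i + 0.5) * h) for i in range(n)]
    return h * sum(values)

if __name__ == "__main__":
    # Illustrative integrand: the integral of sin(pi * x) over [0, 1] is 2/pi.
    approx = midpoint_rule(lambda x: math.sin(math.pi * x), n=100)
    print(approx, 2 / math.pi)
```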
Because we have only partial information we can use an adversary argument to tell us how large $n$ has to be to compute an $\varepsilon$-approximation. Because of these information-based arguments we can often obtain tight bounds on the complexity of continuous problems. For discrete problems such as integer factorization or the travelling salesman problem we have to settle for conjectures about the complexity hierarchy. The reason is that for a discrete problem the input is a number or a vector of numbers and can thus be entered into the computer exactly; the information is complete. Thus there is typically no adversary argument at the information level, and the complexity of a discrete problem is rarely known.
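To make the adversary argument concrete, here is a sketch of a standard lower bound; the Lipschitz class used is an illustrative assumption, not taken from the text above. Suppose the integrands are the functions on $[0,1]$ with $|f(x)-f(y)| \le |x-y|$, and suppose an algorithm evaluates $f$ at points $x_1 < \dots < x_n$ that split $[0,1]$ into gaps of lengths $h_0,\dots,h_n$. The "spike" function $g$ that vanishes at every $x_i$ and has slope $\pm 1$ in between lies in the class, and so does $-g$; both give the same information (all sampled values are $0$), yet

$$\int_0^1 g(x)\,dx \;\ge\; \sum_i \frac{h_i^2}{4} \;\ge\; \frac{1}{4(n+1)}$$

by Cauchy-Schwarz, since the $h_i$ sum to $1$. Whatever the algorithm outputs, it is off by at least $1/(4(n+1))$ on $g$ or on $-g$, so an $\varepsilon$-approximation for this class needs $n$ of order $1/\varepsilon$ evaluations.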
The univariate integration problem was for illustration only. Significant for many applications is multivariate integration, where the number of variables is in the hundreds or thousands. The number of variables may even be infinite; we then speak of path integration. The reason that integrals are important in many disciplines is that they occur when we want to know the expected behavior of a continuous process. See, for example, the application to mathematical finance below.
Assume we want to compute an integral in $d$ dimensions (dimensions and variables are used interchangeably) and that we want to guarantee an error of at most $\varepsilon$ for any integrand in some class. The computational complexity of the problem is known to be of order $\varepsilon^{-d}$. (Here we are counting the number of function evaluations and the number of arithmetic operations, so this is the time complexity.) This would take many years for even modest values of $d$. The exponential dependence on $d$ is called the curse of dimensionality. We say the problem is intractable.
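To get a feel for $\varepsilon^{-d}$, take illustrative values (not given in the source): $\varepsilon = 10^{-1}$ and $d = 100$ give $\varepsilon^{-d} = 10^{100}$ function evaluations, while even $d = 20$ at the same accuracy already requires $10^{20}$ evaluations, which at $10^{9}$ evaluations per second is roughly three thousand years.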
We stated the curse of dimensionality for integration. But exponential dependence on d occurs for almost every continuous problem that has been investigated. How can we try to vanquish the curse? There are two possibilities:
- We can weaken the guarantee that the error must be less than $\varepsilon$ (worst case setting) and settle for a stochastic assurance. For example, we might only require that the expected error be less than $\varepsilon$ (average case setting). Another possibility is the randomized setting; see the Monte Carlo sketch after this list. For some problems we can break the curse of dimensionality by weakening the assurance; for others, we cannot. There is a large IBC literature on results in various settings; see Where to Learn More below.
- We can incorporate domain knowledge. See An Example: Mathematical Finance below.
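As one hedged illustration of the randomized setting mentioned in the first item above: plain Monte Carlo integration over the unit cube has expected error of order $n^{-1/2}$, with a rate that does not depend on $d$, which is how randomization can break the curse for integration. The Python sketch below, including the function name and the test integrand, is our own illustrative choice and is not drawn from the source.

```python
import random

def monte_carlo_integral(f, d, n, seed=0):
    """Estimate the integral of f over the unit cube [0, 1]^d.

    This is the randomized setting: the n sample points are drawn at
    random, and the guarantee is stochastic (the *expected* error is of
    order sigma(f) / sqrt(n), independent of the dimension d).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]
        total += f(x)
    return total / n

if __name__ == "__main__":
    d = 500
    # Illustrative integrand (our choice): the average of the coordinates.
    # Its exact integral over [0, 1]^d is 0.5 in every dimension d.
    f = lambda x: sum(x) / len(x)
    print(monte_carlo_integral(f, d=d, n=10_000))  # close to 0.5
```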