Mean Value Theorem For Vector-valued Functions

There is no exact analog of the mean value theorem for vector-valued functions. Jean Dieudonné, in his classic treatise "Foundations of Modern Analysis", discards the mean value theorem and replaces it by the mean inequality, since the proof is not constructive and one cannot in any way find the mean value; in applications one only needs the mean inequality. Serge Lang, in "Analysis I", uses the mean value theorem in integral form as an instant reflex, but this requires the continuity of the derivative. If one uses the Henstock–Kurzweil integral, one can have the mean value theorem in integral form without the additional assumption that the derivative be continuous, since every derivative is Henstock–Kurzweil integrable. The problem is, roughly speaking, the following: If f : U → R^m is a differentiable function (where U ⊂ R^n is open) and if x + th, with x, h ∈ R^n and t ∈ [0, 1], is the line segment in question (lying inside U), then one can apply the above parametrization procedure to each of the component functions f_i (i = 1, …, m) of f (in the above notation, set y = x + h). In doing so one finds points x + t_i h on the line segment satisfying

f_i(x+h) - f_i(x)\, =\, \nabla f_i(x + t_i h)\cdot h.

But generally there will not be a single point x + t^*h on the line segment satisfying

f_i(x+h) - f_i(x)\, =\, \nabla f_i(x + t^*h)\cdot h

for all i simultaneously. (As a counterexample one could take f : [0, 2π] → R^2 defined via the component functions f_1(x) = cos(x) and f_2(x) = sin(x). Then f(2π) − f(0) = 0 ∈ R^2, but f_1'(x) = −sin(x) and f_2'(x) = cos(x) are never simultaneously zero as x ranges over [0, 2π].)
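The counterexample can be checked numerically. The sketch below (the function names `f` and `df` are this sketch's own choices) confirms that f(2π) − f(0) vanishes while the derivative vector never does:

```python
import math

# Counterexample sketch: f(t) = (cos t, sin t) on [0, 2*pi].
# f(2*pi) - f(0) = (0, 0), so a single mean-value point t* would need
# f_1'(t*) = -sin(t*) and f_2'(t*) = cos(t*) to vanish simultaneously,
# which is impossible since sin^2 + cos^2 = 1.

def f(t):
    return (math.cos(t), math.sin(t))

def df(t):
    return (-math.sin(t), math.cos(t))

two_pi = 2 * math.pi
diff = tuple(a - b for a, b in zip(f(two_pi), f(0)))
print(diff)  # (0, 0) up to floating-point rounding

# The derivative vector has norm 1 everywhere, so it is never zero:
min_norm = min(math.hypot(*df(k * two_pi / 1000)) for k in range(1001))
print(min_norm)
```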

However, a certain type of generalization of the mean value theorem to vector-valued functions is obtained as follows: Let f be a continuously differentiable real-valued function defined on an open interval I, and let x as well as x + h be points of I. The mean value theorem in one variable tells us that there exists some t* between 0 and 1 such that

f(x+h) - f(x)\, =\, f'(x + t^*h)\cdot h.

On the other hand we have, by the fundamental theorem of calculus followed by a change of variables,

f(x+h) - f(x)\, =\, \int_x^{x+h} f'(u)\,du\, =\, \left(\int_0^1 f'(x + th)\,dt\right)\cdot h.

Thus, the value f'(x + t*h) at the particular point t* has been replaced by the mean value ∫_0^1 f'(x + th) dt. This last version can be generalized to vector-valued functions:
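The scalar identity above is easy to test numerically. In the sketch below, the concrete choice f(u) = u³ and the midpoint quadrature rule are this example's own assumptions:

```python
# Numerical check of the identity
#   f(x + h) - f(x) = (integral_0^1 f'(x + t*h) dt) * h
# for the arbitrary smooth choice f(u) = u**3.

def f(u):
    return u ** 3

def fprime(u):
    return 3 * u ** 2

def mean_derivative(x, h, n=100_000):
    # midpoint rule for integral_0^1 f'(x + t*h) dt
    return sum(fprime(x + (k + 0.5) / n * h) for k in range(n)) / n

x, h = 1.0, 0.5
lhs = f(x + h) - f(x)          # 1.5**3 - 1**3 = 2.375
rhs = mean_derivative(x, h) * h
print(lhs, rhs)  # the two sides agree up to quadrature error
```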

Let U ⊂ R^n be open, let f : U → R^m be continuously differentiable, and let x ∈ U and h ∈ R^n be vectors such that the whole line segment x + th, 0 ≤ t ≤ 1, remains in U. Then we have:

(*)\qquad f(x+h) - f(x)\, =\, \left(\int_0^1 Df(x+th)\,dt\right)\cdot h,

where the integral of a matrix is to be understood componentwise. (Df denotes the Jacobian matrix of ƒ.)

From this one can further deduce that if ||Df(x + th)|| is bounded for t between 0 and 1 by some constant M, then

(**)\qquad ||f(x+h) - f(x)|| \leq M||h||.
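Identity (*) can also be illustrated numerically. The map f(x, y) = (xy, sin x + y) and its Jacobian below are this sketch's own assumed example; the integral of the Jacobian is approximated componentwise with the midpoint rule:

```python
import math

# Check (*): f(x+h) - f(x) = (integral_0^1 Df(x+th) dt) . h
# for the assumed example f(x, y) = (x*y, sin x + y),
# whose Jacobian is Df = [[y, x], [cos x, 1]].

def f(p):
    x, y = p
    return (x * y, math.sin(x) + y)

def jacobian(p):
    x, y = p
    return ((y, x), (math.cos(x), 1.0))

def mat_vec(m, v):
    return tuple(sum(row[j] * v[j] for j in range(len(v))) for row in m)

x = (0.3, -1.2)
h = (0.7, 0.4)
n = 100_000

# componentwise midpoint-rule integral of Df(x + t h) over t in [0, 1]
avg = [[0.0, 0.0], [0.0, 0.0]]
for k in range(n):
    t = (k + 0.5) / n
    J = jacobian((x[0] + t * h[0], x[1] + t * h[1]))
    for i in range(2):
        for j in range(2):
            avg[i][j] += J[i][j] / n

lhs = tuple(a - b for a, b in zip(f((x[0] + h[0], x[1] + h[1])), f(x)))
rhs = mat_vec(avg, h)
print(lhs, rhs)  # the two sides agree up to quadrature error
```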

Proof of (*). Write f_i (i = 1, …, m) for the real-valued components of f. Define the functions g_i : [0, 1] → R by

g_i(t)\, :=\, f_i(x + th).
Then we have

f_i(x+h)-f_i(x)\, =\, g_i(1)-g_i(0) =\int_0^1 g_i'(t)dt = \int_0^1 \left(\sum_{j=1}^n \frac{\partial f_i}{\partial x_j} (x+th)h_j\right)\,dt =\sum_{j=1}^n \left(\int_0^1 \frac{\partial f_i}{\partial x_j}(x+th)\,dt\right)h_j.

The claim follows since Df is the matrix consisting of the components ∂f_i/∂x_j, q.e.d.

Proof of (**). From (*) it follows that

||f(x+h)-f(x)||=\left\|\int_0^1 (Df(x+th)\cdot h)\,dt\right\| \leq \int_0^1 ||Df(x+th)|| \cdot ||h||\, dt \leq M||h||.

Here we have used the following

Lemma. Let v : [a, b] → R^m be a continuous function defined on the interval [a, b] ⊂ R. Then we have

(***)\qquad \left\|\int_a^b v(t)\,dt\right\| \leq \int_a^b ||v(t)||\,dt.
Proof of (***). Let u ∈ R^m denote the value of the integral

u\, :=\, \int_a^b v(t)\,dt.

Now

||u||^2 = \langle u,u \rangle = \left\langle \int_a^b v(t) dt,u \right\rangle = \int_a^b \langle v(t),u \rangle \,dt \leq \int_a^b ||v(t)||\cdot ||u||\,dt = ||u|| \int_a^b ||v(t)||\,dt,

thus ||u|| ≤ ∫_a^b ||v(t)|| dt as desired. (If u = 0 the inequality is trivial; otherwise one divides both sides by ||u||. Note the use of the Cauchy–Schwarz inequality in the middle step.) This shows (***) and thereby finishes the proof of (**).
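The lemma can be made concrete with a quick numerical example. Below, the choice v(t) = (cos t, sin t) on [0, π] is this sketch's own assumption; the inequality is strict here because the direction of v(t) changes over the interval:

```python
import math

# Illustration of (***): ||integral v|| <= integral ||v||
# for v(t) = (cos t, sin t) on [0, pi], using the midpoint rule.
# Exactly: integral v = (0, 2), so ||integral v|| = 2,
# while ||v(t)|| = 1 for all t, so integral ||v|| = pi.

n = 100_000
a, b = 0.0, math.pi
integral = [0.0, 0.0]
integral_of_norm = 0.0
for k in range(n):
    t = a + (k + 0.5) / n * (b - a)
    v = (math.cos(t), math.sin(t))
    integral[0] += v[0] * (b - a) / n
    integral[1] += v[1] * (b - a) / n
    integral_of_norm += math.hypot(*v) * (b - a) / n

norm_of_integral = math.hypot(*integral)
print(norm_of_integral, integral_of_norm)  # about 2.0 <= about 3.1416
```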
