Experimental Uncertainty Analysis - Discussion

Discussion

Systematic errors in the measurement of experimental quantities lead to bias in the derived quantity, the magnitude of which is calculated using Eq(6) or Eq(7). However, there is also a more subtle form of bias that can occur even if the input (measured) quantities are unbiased; all terms after the first in Eq(14) represent this bias. It arises from the nonlinear transformations of random variables that are often applied in obtaining the derived quantity. This transformation bias is influenced by the relative size of the variance of the measured quantity compared to its mean: the larger this ratio, the more skewed the PDF of the derived quantity may be, and the larger the bias may be.
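To make this concrete, the following Monte Carlo sketch (not from the source; the mean of 2.0 and the three standard deviations are arbitrary illustrative values) draws unbiased measurements T and forms the nonlinear transform 1/T²; the mean of the transformed values drifts away from 1/T̄² as the ratio of standard deviation to mean grows.

```python
# Illustrative Monte Carlo sketch of transformation bias (values are assumptions,
# not from the source).  The measured quantity T is unbiased, but the derived
# quantity 1/T**2 is a nonlinear transform, so its mean is biased away from
# 1/mu_T**2, and the bias grows with the ratio sigma_T / mu_T.
import numpy as np

rng = np.random.default_rng(0)
mu_T = 2.0                                   # true mean of the measured quantity (arbitrary units)

for sigma_T in (0.02, 0.1, 0.3):             # increasing variance-to-mean ratio
    T = rng.normal(mu_T, sigma_T, size=1_000_000)    # unbiased measurements
    derived = 1.0 / T**2                              # nonlinear transformation
    bias = derived.mean() - 1.0 / mu_T**2             # transformation bias
    print(f"sigma/mu = {sigma_T / mu_T:.3f}   relative bias = {bias * mu_T**2:.4%}")
```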

The Taylor-series approximations provide a very useful way to estimate both bias and variability in cases where the PDF of the derived quantity is unknown or intractable. The mean can be estimated using Eq(14) and the variance using Eq(13) or Eq(15). There are situations, however, in which this first-order Taylor-series approach is not appropriate, notably if any of the component variables can vanish. Then a second-order expansion would be useful; see Meyer for the relevant expressions.
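As a rough sketch of how such a linearized estimate might be computed and checked, the code below assumes the small-angle pendulum relation g = 4π²L/T² purely for illustration (the angle correction is ignored, and all means and standard deviations are invented values, not data from the source) and compares the first-order mean and standard deviation with a Monte Carlo simulation.

```python
# Sketch of first-order (linearized) propagation of mean and variance for an
# assumed derived quantity g = 4*pi^2*L / T^2.  All numbers are illustrative.
import numpy as np

def g_func(L, T):
    return 4 * np.pi**2 * L / T**2

# assumed means and standard deviations of the measured quantities
L_bar, s_L = 0.50, 0.002      # metres
T_bar, s_T = 1.42, 0.010      # seconds

# linearized estimates: evaluate g at the means, propagate the variances
g_lin = g_func(L_bar, T_bar)
dg_dL = 4 * np.pi**2 / T_bar**2               # partial derivative w.r.t. L
dg_dT = -8 * np.pi**2 * L_bar / T_bar**3      # partial derivative w.r.t. T
var_lin = (dg_dL * s_L)**2 + (dg_dT * s_T)**2

# Monte Carlo check of the linearized mean and standard deviation
rng = np.random.default_rng(1)
g_mc = g_func(rng.normal(L_bar, s_L, 500_000), rng.normal(T_bar, s_T, 500_000))
print(f"linearized:  mean = {g_lin:.4f}, sd = {np.sqrt(var_lin):.4f}")
print(f"Monte Carlo: mean = {g_mc.mean():.4f}, sd = {g_mc.std():.4f}")
```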

The sample size is an important consideration in experimental design. To illustrate the effect of the sample size, Eq(18) can be re-written as


RE_{\hat g} = \frac{\hat\sigma_g}{\hat g} \approx \sqrt{\left(\frac{s_L}{\sqrt{n_L}\,\bar L}\right)^2 + 4\left(\frac{s_T}{\sqrt{n_T}\,\bar T}\right)^2 + \left(\frac{\bar\theta}{2}\right)^4 \left(\frac{s_\theta}{\sqrt{n_\theta}\,\bar\theta}\right)^2}

where the sample means (overbars), the estimated standard deviations s, and the respective sample sizes n all appear explicitly. In principle, by using very large values of n the RE of the estimated g could be driven down to an arbitrarily small value. However, there are often practical constraints that limit the number of measurements.
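The following sketch shows how the expression above shrinks as the sample sizes grow; every numerical value in it (the means, standard deviations, and mean swing angle) is an assumption chosen only for illustration.

```python
# Sketch of the sample-size behaviour of the rewritten relative-error expression.
# All means, standard deviations, and the mean angle are illustrative assumptions.
import numpy as np

L_bar, s_L = 0.50, 0.005        # metres
T_bar, s_T = 1.42, 0.020        # seconds
th_bar, s_th = 0.35, 0.03       # radians

def relative_error(n_L, n_T, n_th):
    """Relative error of g-hat, using standard errors of the means (s / sqrt(n))."""
    term_L  = (s_L / (np.sqrt(n_L) * L_bar))**2
    term_T  = 4 * (s_T / (np.sqrt(n_T) * T_bar))**2
    term_th = (th_bar / 2)**4 * (s_th / (np.sqrt(n_th) * th_bar))**2
    return np.sqrt(term_L + term_T + term_th)

for n in (5, 10, 50, 200):
    print(f"n = {n:4d}   RE = {relative_error(n, n, n):.4%}")
```

Doubling all three sample sizes reduces the relative error by roughly a factor of the square root of two, the usual diminishing return that must be weighed against the cost of additional measurements.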

Details concerning the difference between the variance and the mean-squared error (MSe) have been skipped. Essentially, the MSe measures the variability about the true (but unknown) mean of a distribution. This variability is composed of (1) the variability about the actual, observed mean, and (2) a term that accounts for how far that observed mean is from the true mean. Thus

{\rm MSe} \,=\, \sigma^2 \,+\, \beta^2

where β is the bias (the distance between the observed mean and the true mean). This is a statistical application of the parallel-axis theorem from mechanics.
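A minimal numerical check of this decomposition, using arbitrary illustrative values for the true mean and the bias, might look like:

```python
# Numerical check of the decomposition MSe = variance + bias^2 (the
# "parallel-axis" identity mentioned above).  Values are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
true_mean = 10.0
beta = 0.5                                   # bias: offset of the observed mean from the true mean
x = rng.normal(true_mean + beta, 2.0, size=1_000_000)   # biased measurements

mse = np.mean((x - true_mean)**2)                        # variability about the true mean
decomposed = x.var() + (x.mean() - true_mean)**2         # variance + bias^2
print(f"MSe = {mse:.4f}   variance + bias^2 = {decomposed:.4f}")
```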

In summary, the linearized approximation for the expected value (mean) and variance of a nonlinearly-transformed random variable is very useful, and much simpler to apply than the more complicated process of finding its PDF and then its first two moments. In many cases, the latter approach is not feasible at all. The mathematics of the linearized approximation is not trivial, and it can be avoided by using results that are collected for often-encountered functions of random variables.
