Location Estimation in Sensor Networks - Known Noise PDF

Known Noise PDF

We begin with the example of Gaussian noise, for which one suggested system design is as follows:


m_n(x_n)=I(x_n-\tau)=
\begin{cases} 1 & x_n > \tau \\ 0 & x_n\leq \tau
\end{cases}

\hat{\theta}=\tau-F^{-1}\left(\frac{1}{N}\sum\limits_{n=1}^{N}m_n(x_n)\right),\quad
F(x)=\frac{1}{\sqrt{2\pi}\sigma} \int\limits_{x}^{\infty}
e^{-w^2/2\sigma^2} \, dw
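Why the inversion step recovers the location can be seen in one line. Assuming the measurement model x_n = \theta + w_n with zero-mean Gaussian noise w_n (the model used in the preceding discussion), each transmitted bit is Bernoulli with mean

\mathbb{E}\left[m_n(x_n)\right]=\Pr(x_n>\tau)=\Pr(w_n>\tau-\theta)=F(\tau-\theta)

so the average of the received bits is a consistent estimate of F(\tau-\theta), and applying F^{-1} and subtracting the result from \tau yields \hat{\theta}\approx\theta.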

Here \tau is a parameter that leverages our prior knowledge of the approximate location of \theta. In this design, the random value of m_n(x_n) is distributed Bernoulli with parameter F(\tau-\theta). The processing center averages the received bits to form an estimate of F(\tau-\theta), which is then used to find an estimate of \theta. It can be verified that for the optimal (and infeasible) choice \tau=\theta, the variance of this estimator is \frac{\pi\sigma^2}{2N}, which is only \frac{\pi}{2} times the variance of the MLE without a bandwidth constraint. The variance increases as \tau deviates from the real value of \theta, but it can be shown that as long as |\tau-\theta| is on the order of \sigma, the factor in the MSE remains approximately 2. Choosing a suitable value for \tau is a major disadvantage of this method, since our model does not assume prior knowledge about the approximate location of \theta. A coarse estimation can be used to overcome this limitation; however, it requires additional hardware in each of the sensors.
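As a sanity check, the quantize-average-invert pipeline above can be simulated in a few lines. This is an illustrative sketch: the values of theta_true, tau, sigma, and N are arbitrary choices (with tau near theta, reflecting the assumed prior knowledge), not taken from the text.

```python
# Monte Carlo sketch of the one-bit location estimator described above.
import random
from statistics import NormalDist

def one_bit_estimate(samples, tau, sigma):
    """Each sensor sends m_n(x_n) = I(x_n - tau); the center inverts F."""
    bits = [1 if x > tau else 0 for x in samples]   # one-bit messages
    q_hat = sum(bits) / len(bits)                   # average of received bits
    q_hat = min(max(q_hat, 1e-9), 1 - 1e-9)         # keep the inverse finite
    # F(x) = P(w > x) for w ~ N(0, sigma^2), so F^{-1}(p) = inv_cdf(1 - p)
    return tau - NormalDist(0.0, sigma).inv_cdf(1.0 - q_hat)

random.seed(0)
theta_true, tau, sigma, N = 1.0, 1.2, 0.5, 10_000   # illustrative values
samples = [theta_true + random.gauss(0.0, sigma) for _ in range(N)]
print(one_bit_estimate(samples, tau, sigma))        # close to theta_true
```

Note that the clipping of q_hat away from 0 and 1 is a practical guard: with finite N the bit average can hit the boundary, where F^{-1} diverges.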

A system design with an arbitrary (but known) noise PDF can be found in the literature. In this setting it is assumed that both \theta and the noise are confined to some known interval. This estimator also reaches an MSE which is a constant factor times \frac{\sigma^2}{N}. In this method, the prior knowledge of the interval replaces the parameter \tau of the previous approach.
