Multiple-try Metropolis
In Markov chain Monte Carlo, the Metropolis–Hastings algorithm (MH) can be used to sample from a probability distribution which is difficult to sample from directly. However, the MH algorithm requires the user to supply a proposal distribution, which can be relatively arbitrary. In many cases, one uses a Gaussian distribution centered on the current point in the probability space, of the form $Q(x'; x_t) = \mathcal{N}(x_t, \sigma^2 I)$. This proposal distribution is convenient to sample from and may be the best choice if one has little knowledge about the target distribution, $\pi(x)$. If desired, one can use the more general multivariate normal distribution $Q(x'; x_t) = \mathcal{N}(x_t, \Sigma)$, where $\Sigma$ is a covariance matrix which the user believes is similar to the covariance of the target distribution.
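As a rough illustration (not part of the original article), the two proposal choices above can be written as a short sketch in Python; the function names `propose_isotropic` and `propose_multivariate` and the parameter values are hypothetical choices for this example.

```python
# Minimal sketch of the two Gaussian proposal distributions described above.
import numpy as np

rng = np.random.default_rng(0)

def propose_isotropic(x_t, sigma):
    """Draw x' ~ N(x_t, sigma^2 I): an isotropic Gaussian centered on the current point."""
    return x_t + sigma * rng.standard_normal(x_t.shape)

def propose_multivariate(x_t, cov):
    """Draw x' ~ N(x_t, Sigma) for a user-supplied covariance matrix Sigma."""
    return rng.multivariate_normal(mean=x_t, cov=cov)

x_t = np.zeros(3)                                        # current point (illustrative)
print(propose_isotropic(x_t, sigma=0.5))
print(propose_multivariate(x_t, cov=np.diag([1.0, 0.25, 4.0])))
```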
Although this method must converge to the stationary distribution in the limit of infinite sample size, in practice the progress can be exceedingly slow. If $\sigma^2$ is too large, almost all steps under the MH algorithm will be rejected. On the other hand, if $\sigma^2$ is too small, almost all steps will be accepted, and the Markov chain will resemble a random walk through the probability space. In the simpler case of $Q(x'; x_t) = \mathcal{N}(x_t, \sigma^2 I)$, we see that $N$ steps only take us a distance of about $\sigma\sqrt{N}$. In this event, the Markov chain will not fully explore the probability space in any reasonable amount of time. Thus the MH algorithm requires reasonable tuning of the scale parameter ($\sigma^2$ or $\Sigma$).
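The tuning trade-off can be seen in a small sketch of random-walk Metropolis against an illustrative standard-normal target; the target, the function names, and the values of sigma below are assumptions chosen for the example, not part of the article.

```python
# Sketch: acceptance rate of random-walk Metropolis as a function of the scale sigma.
# A very small sigma accepts nearly every step but only moves ~ sigma*sqrt(N) from the
# start; a very large sigma rejects nearly every step.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Standard normal target in 3 dimensions (illustrative choice).
    return -0.5 * np.dot(x, x)

def metropolis(sigma, n_steps=5000, dim=3):
    x = np.zeros(dim)
    accepted = 0
    for _ in range(n_steps):
        x_prop = x + sigma * rng.standard_normal(dim)    # Q(x'; x_t) = N(x_t, sigma^2 I)
        # Accept with probability min(1, pi(x') / pi(x)), computed on the log scale.
        if np.log(rng.uniform()) < log_target(x_prop) - log_target(x):
            x, accepted = x_prop, accepted + 1
    return accepted / n_steps

for sigma in (0.01, 1.0, 50.0):
    print(f"sigma={sigma:6.2f}  acceptance rate={metropolis(sigma):.3f}")
```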