Approximate Bayesian Computation - Example

Example

An illustrative example is a bistable system that can be characterized by a hidden Markov model (HMM) subject to measurement noise (Figure 2). Such models are employed for many biological systems: they have, for example, been used in development, cell signaling, activation/deactivation, logical processing and non-equilibrium thermodynamics. For instance, the behavior of the Sonic hedgehog (Shh) transcription factor in Drosophila melanogaster can be modeled with an HMM. The (biological) dynamical model consists of two states: A and B. If the probability of a transition from one state to the other is defined as θ in both directions, the probability to remain in the same state at each time step is 1 − θ. The probability to measure the state correctly is γ (and conversely, the probability of an incorrect measurement is 1 − γ).
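The two-state model above can be sketched directly. The function below is an illustrative simulator, with `theta` the switching probability and `gamma` the measurement accuracy as defined above; the uniform choice of initial hidden state is an assumption made for illustration, not stated in the text.

```python
import random

def simulate_hmm(theta, gamma, length, seed=None):
    """Simulate noisy measurements of the two-state (A/B) hidden Markov model.

    theta: probability of switching hidden state at each time step.
    gamma: probability that a state is measured correctly.
    Returns the sequence of measurements as a string.
    """
    rng = random.Random(seed)
    state = rng.choice("AB")  # initial hidden state (assumed uniform)
    measurements = []
    for _ in range(length):
        # Record a measurement: correct with probability gamma.
        if rng.random() < gamma:
            measurements.append(state)
        else:
            measurements.append("B" if state == "A" else "A")
        # Hidden state switches with probability theta.
        if rng.random() < theta:
            state = "B" if state == "A" else "A"
    return "".join(measurements)
```

With theta = 0 and gamma = 1 the simulator produces a constant, noise-free sequence, which is a convenient sanity check.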

Due to the conditional dependencies between states at different time points, calculation of the likelihood of time series data is somewhat tedious, which illustrates the motivation to use ABC. A computational issue for basic ABC is the large dimensionality of the data in an application like this. The dimensionality can be reduced using the summary statistic S, the frequency of switches between the two states. As a distance measure, the absolute difference between observed and simulated summary statistics is used, combined with a tolerance ε = 2. The posterior inference about the parameter θ can be done following the five steps presented in Figure 1:

Step 1: Assume that the observed data are the state sequence AAAABAABBAAAAAABAAAA, which was generated using θ = 0.25 and γ = 0.8. The associated summary statistic, the number of switches between the states in the experimental data, is S_obs = 6.
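The summary statistic is simple to compute: it counts adjacent positions at which the measured state changes. Applying it to the observed sequence above confirms the value 6.

```python
def num_switches(seq):
    """Summary statistic S: number of switches between states A and B."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a != b)

observed = "AAAABAABBAAAAAABAAAA"
print(num_switches(observed))  # prints 6
```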

Step 2: Assuming nothing is known about θ, a uniform prior on the interval [0, 1] is employed. A number n of parameter points are drawn from the prior, and the model is simulated for each of them, which results in n sequences of simulated data. In this example n = 5, with each drawn parameter and simulated dataset recorded in Table 1, columns 2 and 3. In practice, n would need to be much larger to obtain an appropriate approximation.

Table 1: Example of the ABC rejection algorithm

i   θ      Simulated dataset (step 2)   Summary statistic (step 3)   Distance (step 4)   Outcome (step 4)
1   0.08   AABAAAABAABAAABAAAAA         8                            2                   accepted
2   0.68   AABBABABAAABBABABBAB         13                           7                   rejected
3   0.87   BBBABBABBBBABABBBBBA         9                            3                   rejected
4   0.43   AABAAAAABBABBBBBBBBA         6                            0                   accepted
5   0.53   ABBBBBAABBABBABAABBB         9                            3                   rejected

Step 3: The summary statistic is computed for each sequence of simulated data (Table 1, column 4).

Step 4: The distance between the observed and simulated switch frequencies is computed for all parameter points (Table 1, column 5). Parameter points for which the distance is smaller than or equal to ε are accepted as approximate samples from the posterior (Table 1, column 6).

Step 5: The posterior distribution is approximated with the accepted parameter points. The posterior distribution should have non-negligible probability for parameter values in a region around the true value of θ in the system, if the data are sufficiently informative. In this example, the posterior probability mass is evenly split between the values θ = 0.08 and θ = 0.43.

Figure 3 shows the posterior probabilities obtained by ABC and a large n using either the summary statistic combined with a tolerance (ε = 0 and ε = 2) or the full data sequence. These are compared with the true posterior, which can be computed exactly and efficiently using the Viterbi algorithm. The summary statistic used is not sufficient, and it is seen that even with ε = 0, the deviation from the theoretical posterior is considerable. Of note, a much longer observed data sequence would be required to obtain a posterior concentrated around the true value of θ.
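For comparison, the exact posterior over θ can be sketched on a grid. The forward recursion below computes the exact likelihood by summing over hidden state paths (the Viterbi algorithm mentioned above is its maximizing counterpart); the uniform initial-state distribution and the value γ = 0.8 are illustrative assumptions.

```python
def hmm_likelihood(measurements, theta, gamma):
    """Exact P(measurements | theta) via the forward recursion.

    Assumes a uniform (1/2, 1/2) distribution over the initial hidden state.
    """
    # alpha[s] = joint probability of the measurements so far and hidden state s.
    alpha = {"A": 0.5, "B": 0.5}
    for y in measurements:
        # Emission: a measurement is correct with probability gamma.
        alpha = {s: alpha[s] * (gamma if s == y else 1.0 - gamma) for s in "AB"}
        # Transition: the hidden state switches with probability theta.
        # (The extra transition after the last measurement leaves the sum unchanged.)
        alpha = {
            "A": alpha["A"] * (1.0 - theta) + alpha["B"] * theta,
            "B": alpha["B"] * (1.0 - theta) + alpha["A"] * theta,
        }
    return alpha["A"] + alpha["B"]

observed = "AAAABAABBAAAAAABAAAA"
thetas = [i / 100 for i in range(101)]          # grid over [0, 1]
likes = [hmm_likelihood(observed, t, 0.8) for t in thetas]
total = sum(likes)
posterior = [l / total for l in likes]          # uniform prior: normalize likelihood
```

Plotting `posterior` against `thetas` gives the grid approximation of the exact posterior that the ABC results can be compared with.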

This example application of ABC used simplifications for illustrative purposes. A number of review articles provide pointers to more realistic applications of ABC.
