Search Theory - Search From An Unknown Distribution

When the searcher does not even know the distribution of offers, there is an additional motive for search: by searching longer, the searcher also learns about the range of offers available. Search from one or more unknown distributions is called a multi-armed bandit problem. The name comes from the slang term 'one-armed bandit' for a casino slot machine, and refers to the case in which the only way to learn about the distribution of rewards from a given machine is to play it. Optimal search strategies for an unknown distribution have been analyzed using allocation indices such as the Gittins index.
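
To make the bandit framing concrete, the following is a minimal Python sketch, not taken from the source: each "arm" is an offer source with an unknown reward distribution, and the searcher decides where to sample next using an allocation index. The UCB1 rule used here is a simple stand-in for the Gittins index, which requires a discounted dynamic-programming calculation; the function name ucb1_search and the Gaussian offer distributions in the example are illustrative assumptions.

    import math
    import random

    def ucb1_search(arms, horizon):
        """arms: list of zero-argument callables, each drawing one offer."""
        counts = [0] * len(arms)   # times each source has been sampled
        means = [0.0] * len(arms)  # running mean offer from each source

        # Sample every source once so each index is defined.
        for i, arm in enumerate(arms):
            means[i] = arm()
            counts[i] = 1

        for t in range(len(arms), horizon):
            # Allocation index: empirical mean plus an exploration bonus
            # that shrinks as a source is sampled more often.
            index = [means[i] + math.sqrt(2.0 * math.log(t) / counts[i])
                     for i in range(len(arms))]
            best = max(range(len(arms)), key=lambda i: index[i])
            offer = arms[best]()
            counts[best] += 1
            means[best] += (offer - means[best]) / counts[best]  # update mean

        return means, counts

    # Hypothetical example: three offer sources whose Gaussian parameters
    # are unknown to the searcher.
    rng = random.Random(1)
    arms = [lambda: rng.gauss(1.0, 0.5),
            lambda: rng.gauss(1.2, 0.5),
            lambda: rng.gauss(0.8, 0.5)]
    means, counts = ucb1_search(arms, horizon=1000)
    print("estimated mean offers:", [round(m, 2) for m in means])
    print("samples per source:  ", counts)

The exact Gittins index would instead assign each source a value computed by a discounted dynamic program over its posterior state; the selection step, sampling whichever source currently has the highest index, is the same.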
