Search From An Unknown Distribution

When the searcher does not know even the distribution of offers, there is an additional motive for search: by searching longer, more is learned about the range of offers available. Search from one or more unknown distributions is called a multi-armed bandit problem. The name comes from the slang term 'one-armed bandit' for a casino slot machine, and refers to the fact that the only way to learn about the distribution of rewards from a given slot machine is to actually play that machine. Optimal search strategies for an unknown distribution have been analyzed using allocation indices such as the Gittins index.
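As a rough illustration of this learning-while-searching motive, the sketch below simulates a two-armed Bernoulli bandit played with Thompson sampling, a common heuristic that, like an allocation index, balances exploiting the arm that currently looks best against exploring arms whose payoff distributions are still uncertain. It is not a computation of the Gittins index itself, and the payoff rates and horizon are purely hypothetical.

```python
import random

# Minimal sketch (not the Gittins index): Thompson sampling on two
# "slot machines" with unknown Bernoulli payoff rates. The searcher
# learns about each unknown distribution only by playing it, which is
# the extra motive for search described above.

true_rates = [0.3, 0.6]   # unknown to the searcher; hypothetical values
alpha = [1, 1]            # Beta-posterior "successes" for each arm
beta = [1, 1]             # Beta-posterior "failures" for each arm
total_reward = 0

for t in range(1000):
    # Draw a plausible payoff rate for each arm from its posterior,
    # then play the arm whose draw is highest.
    samples = [random.betavariate(alpha[i], beta[i]) for i in range(2)]
    arm = samples.index(max(samples))

    reward = 1 if random.random() < true_rates[arm] else 0
    total_reward += reward

    # Update the posterior only for the arm actually played.
    alpha[arm] += reward
    beta[arm] += 1 - reward

print("total reward over 1000 plays:", total_reward)
print("posterior mean payoff rates:",
      [alpha[i] / (alpha[i] + beta[i]) for i in range(2)])
```

Early on the searcher plays both arms, since either could plausibly be the better one; as evidence accumulates, play concentrates on the arm with the higher estimated payoff, mirroring how longer search narrows uncertainty about the distribution of offers.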
