Search Theory - Search From An Unknown Distribution

When the searcher does not know even the distribution of offers, there is an additional motive for search: by searching longer, the searcher learns more about the range of offers available. Search from one or more unknown distributions is called a multi-armed bandit problem. The name comes from the slang term 'one-armed bandit' for a casino slot machine, and refers to the case in which the only way to learn about the distribution of rewards from a given machine is to play it. Optimal search strategies for an unknown distribution have been analyzed using allocation indices such as the Gittins index.
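Computing exact Gittins indices is involved, but the flavor of an index policy can be illustrated with a simpler and widely used alternative, the UCB1 rule: each arm is scored by its empirical mean reward plus an exploration bonus that shrinks as the arm is sampled, and the arm with the highest index is played next. The sketch below simulates this on hypothetical Bernoulli slot machines; the success probabilities, horizon, and function name are illustrative assumptions, not part of the original text.

```python
import math
import random

def ucb1(success_probs, horizon, seed=0):
    """Run the UCB1 index policy on simulated Bernoulli arms.

    success_probs: assumed (unknown to the searcher) payoff
    probabilities of each arm. Returns the number of times each
    arm was pulled over the horizon.
    """
    rng = random.Random(seed)
    n = len(success_probs)
    counts = [0] * n    # pulls per arm
    sums = [0.0] * n    # total reward per arm

    for t in range(1, horizon + 1):
        if t <= n:
            arm = t - 1  # play each arm once to initialize its estimate
        else:
            # index = empirical mean + exploration bonus
            arm = max(
                range(n),
                key=lambda i: sums[i] / counts[i]
                + math.sqrt(2.0 * math.log(t) / counts[i]),
            )
        reward = 1.0 if rng.random() < success_probs[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts

counts = ucb1([0.2, 0.5, 0.8], horizon=5000)
```

Over a long horizon the policy concentrates its pulls on the best arm while still sampling the others often enough to keep learning about them, which is exactly the exploration motive described above.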
