Optimization - History

Fermat and Lagrange found calculus-based formulas for identifying optima, while Newton and Gauss proposed iterative methods for moving towards an optimum. Historically, the first term for optimization was "linear programming", coined by George B. Dantzig, although much of the underlying theory had been introduced by Leonid Kantorovich in 1939. Dantzig published the simplex algorithm in 1947, and John von Neumann developed the theory of duality in the same year.

The term "programming" in this context does not refer to computer programming. Rather, it derives from the United States military's use of "program" to refer to proposed training and logistics schedules, which were the problems Dantzig was studying at the time.

Later important researchers in mathematical optimization include the following:

  • Richard Bellman
  • Ronald A. Howard
  • Narendra Karmarkar
  • William Karush
  • Leonid Khachiyan
  • Bernard Koopman
  • Harold Kuhn
  • Joseph Louis Lagrange
  • László Lovász
  • Arkadi Nemirovski
  • Yurii Nesterov
  • Boris Polyak
  • Lev Pontryagin
  • James Renegar
  • R. Tyrrell Rockafellar
  • Cornelis Roos
  • Naum Z. Shor
  • Michael J. Todd
  • Albert Tucker
