History

Pierre de Fermat and Joseph-Louis Lagrange found calculus-based formulas for identifying optima, while Isaac Newton and Carl Friedrich Gauss proposed iterative methods for moving toward an optimum. Historically, the first term for optimization was "linear programming", which was due to George B. Dantzig, although much of the theory had been introduced by Leonid Kantorovich in 1939. Dantzig published the simplex algorithm in 1947, and John von Neumann developed the theory of duality in the same year.

The term "programming" in this context does not refer to computer programming. Rather, the term comes from the use of "program" by the United States military to refer to proposed training and logistics schedules, which were the problems Dantzig studied at that time.

Later important researchers in mathematical optimization include the following:

  • Richard Bellman
  • Ronald A. Howard
  • Narendra Karmarkar
  • William Karush
  • Leonid Khachiyan
  • Bernard Koopman
  • Harold Kuhn
  • Joseph Louis Lagrange
  • László Lovász
  • Arkadi Nemirovski
  • Yurii Nesterov
  • Boris Polyak
  • Lev Pontryagin
  • James Renegar
  • R. Tyrrell Rockafellar
  • Cornelis Roos
  • Naum Z. Shor
  • Michael J. Todd
  • Albert Tucker
