Mathematical Optimization - History

Fermat and Lagrange found calculus-based formulas for identifying optima, while Newton and Gauss proposed iterative methods for moving toward an optimum. Historically, the first term for the field was "linear programming", coined by George B. Dantzig, although much of the underlying theory had been introduced by Leonid Kantorovich in 1939. Dantzig published the simplex algorithm in 1947, and John von Neumann developed the theory of duality in the same year.

The term "programming" in this context does not refer to computer programming. Rather, it comes from the use of "program" by the United States military to refer to proposed training and logistics schedules, which were the problems Dantzig studied at the time.

Later important researchers in mathematical optimization include the following:

  • Richard Bellman
  • Ronald A. Howard
  • Narendra Karmarkar
  • William Karush
  • Leonid Khachiyan
  • Bernard Koopman
  • Harold Kuhn
  • Joseph Louis Lagrange
  • László Lovász
  • Arkadi Nemirovski
  • Yurii Nesterov
  • Boris Polyak
  • Lev Pontryagin
  • James Renegar
  • R. Tyrrell Rockafellar
  • Cornelis Roos
  • Naum Z. Shor
  • Michael J. Todd
  • Albert Tucker
