Convex Optimization - Quasiconvex Minimization

Quasiconvex Minimization

In theory, problems with convex level sets can be minimized efficiently. Yurii Nesterov proved that quasiconvex minimization problems can be solved efficiently, and his results were extended by Kiwiel. However, such theoretically "efficient" methods use "divergent-series" stepsize rules, which were first developed for classical subgradient methods. Classical subgradient methods using divergent-series rules are much slower than modern methods of convex minimization, such as subgradient projection methods, bundle methods of descent, and nonsmooth filter methods.
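The divergent-series stepsize rule mentioned above chooses steps that shrink to zero while their sum diverges (e.g. 1/(k+1)), which guarantees convergence but only slowly. A minimal sketch, with the objective f(x) = √|x − 3| chosen here purely for illustration (it is quasiconvex but not convex, and a subgradient-like direction is sign(x − 3)):

```python
import math

def subgradient_method(subgrad, x0, steps=5000, a=1.0):
    """Classical subgradient method with a divergent-series stepsize
    rule: alpha_k = a / (k + 1), so alpha_k -> 0 while sum(alpha_k)
    diverges. Uses a normalized subgradient step, as is common for
    quasiconvex minimization."""
    x = x0
    for k in range(steps):
        g = subgrad(x)
        if g == 0:                 # already at a minimizer
            return x
        alpha = a / (k + 1)        # divergent-series stepsize
        x = x - alpha * g / abs(g) # normalized subgradient step
    return x

# Illustrative quasiconvex objective f(x) = sqrt(|x - 3|);
# sign(x - 3) serves as a subgradient-like direction.
direction = lambda x: math.copysign(1.0, x - 3) if x != 3 else 0.0
x_star = subgradient_method(direction, x0=10.0)
print(round(x_star, 2))  # slowly approaches the minimizer at x = 3
```

The slow progress is visible directly: the iterate needs hundreds of steps just to cross the minimizer, which is the practical drawback the text contrasts with modern bundle and projection methods.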

Solving even close-to-convex but non-convex problems can be computationally intractable. Even minimizing a unimodal function can be intractable, regardless of the smoothness of the function, according to results of Ivanov.
