Model-based Derivative-free Methods for Optimization

Derivative-free methods can tackle complex optimization problems in real-world domains, including non-convex, non-differentiable, and discontinuous problems with many local optima.

Papers:

  • General analysis: In the CEC'14 (PDF) paper, we proposed the sampling-and-learning (SAL) framework to capture the essence of model-based optimization algorithms, and analyzed its performance in terms of the query complexity of reaching an approximate solution with high probability. We derived a general query complexity bound for SAL algorithms in which the learning model is a classifier; the first sketch after this list gives a minimal example of such a loop.

  • Classification-based optimization: In the AAAI'16 (PDF) (Appendix) paper, we identified the key factors of classification-based optimization methods and designed the RACOS algorithm accordingly. RACOS has been shown to be superior to several state-of-the-art derivative-free optimization algorithms; the second sketch below shows its core loop.

  • Scaling to high dimensions by random embedding: In the AAAI'16 (PDF) paper, we considered high-dimensional optimization problems with a low effective dimension. We proved that the random embedding technique reduces the regret bound of the simultaneous optimistic optimization (SOO) algorithm, a theoretically grounded derivative-free method, from depending on the high dimension to depending only on the low effective dimension; the third sketch below illustrates the reduction.
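
To make the SAL framework concrete, here is a minimal Python sketch of a sampling-and-learning loop. Only the sample-then-learn-then-resample structure is taken from the SAL idea; the function name `sal_minimize`, the defaults, and the deliberately trivial "model" (the bounding box of the best samples, standing in for a learned classifier) are illustrative assumptions, not the algorithm analyzed in the CEC'14 paper.

```python
import numpy as np

def sal_minimize(f, dim, lower, upper, budget, sample_size=20,
                 good_ratio=0.2, balance=0.9, rng=None):
    """Minimal sampling-and-learning (SAL) style loop (a sketch).

    Minimizes f over the box [lower, upper]^dim using at most
    `budget` function evaluations (queries).
    """
    rng = rng or np.random.default_rng(0)
    X = rng.uniform(lower, upper, size=(sample_size, dim))  # initial uniform sample
    y = np.array([f(x) for x in X])
    evals = sample_size
    best_x, best_y = X[y.argmin()], y.min()

    while evals + sample_size <= budget:
        # Learning step: a stand-in model, the bounding box of the
        # best `good_ratio` fraction of all samples so far.
        k = max(1, int(good_ratio * len(X)))
        good = X[np.argsort(y)[:k]]
        lo, hi = good.min(axis=0), good.max(axis=0)

        # Sampling step: draw from the learned region with probability
        # `balance`, otherwise uniformly, keeping global exploration.
        from_model = rng.random((sample_size, 1)) < balance
        new_X = np.where(from_model,
                         rng.uniform(lo, hi, size=(sample_size, dim)),
                         rng.uniform(lower, upper, size=(sample_size, dim)))
        new_y = np.array([f(x) for x in new_X])
        evals += sample_size
        X, y = np.vstack([X, new_X]), np.concatenate([y, new_y])
        if new_y.min() < best_y:
            best_x, best_y = new_X[new_y.argmin()], new_y.min()
    return best_x, best_y
```

The uniform-sampling branch is the part the query-complexity analysis leans on: it preserves the loop's ability to escape a misleading model.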
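
The next sketch shows the core of a RACOS-style loop under simplifying assumptions: continuous solutions in a box, the best sample treated as positive and the rest as negative, and a region learned by randomized coordinate shrinking. The names (`shrink_box`, `racos_minimize`) and constants are hypothetical, and the discrete-domain variant and further bookkeeping of the published algorithm are omitted.

```python
import numpy as np

def shrink_box(pos, negs, lower, upper, rng):
    """Randomized coordinate shrinking (continuous case, a sketch):
    shrink an axis-aligned box around the positive sample `pos` on
    randomly chosen coordinates until no negative sample lies inside."""
    dim = len(pos)
    lo = np.full(dim, lower, dtype=float)
    hi = np.full(dim, upper, dtype=float)
    inside = [n for n in negs if np.all((n >= lo) & (n <= hi))]
    while inside:
        neg = inside[rng.integers(len(inside))]
        k = rng.integers(dim)                    # random coordinate to cut
        if neg[k] < pos[k]:
            lo[k] = rng.uniform(neg[k], pos[k])  # cut away the region below pos
        elif neg[k] > pos[k]:
            hi[k] = rng.uniform(pos[k], neg[k])  # cut away the region above pos
        # (if neg[k] == pos[k], another coordinate is tried next round)
        inside = [n for n in inside if np.all((n >= lo) & (n <= hi))]
    return lo, hi

def racos_minimize(f, dim, lower, upper, budget, pop=20,
                   explore=0.05, rng=None):
    """A compact RACOS-style loop: keep a small population, treat the
    best solution as positive and the rest as negative, learn a box,
    and sample the next query mostly from that box."""
    rng = rng or np.random.default_rng(0)
    X = rng.uniform(lower, upper, size=(pop, dim))
    y = np.array([f(x) for x in X])
    for _ in range(budget - pop):
        order = np.argsort(y)
        pos, negs = X[order[0]], X[order[1:]]
        lo, hi = shrink_box(pos, negs, lower, upper, rng)
        if rng.random() < explore:               # occasional uniform exploration
            x_new = rng.uniform(lower, upper, size=dim)
        else:                                    # sample the learned region
            x_new = rng.uniform(lo, hi)
        y_new = f(x_new)
        worst = order[-1]
        if y_new < y[worst]:                     # replace the worst sample
            X[worst], y[worst] = x_new, y_new
    best = y.argmin()
    return X[best], y[best]
```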
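
Finally, the random-embedding reduction is short enough to state directly in code. The sketch below wraps a high-dimensional objective with a random Gaussian embedding so that any low-dimensional derivative-free optimizer can be run in the embedded space; plain random search stands in for SOO in the toy usage, and the matrix shape, clipping, and constants are illustrative assumptions.

```python
import numpy as np

def random_embedding(f_high, D, d, box=None, rng=None):
    """Wrap a D-dimensional objective as a d-dimensional one via
    x = A z with a random Gaussian D x d matrix A (a sketch).

    When f_high has effective dimension at most d, optimizing the
    wrapped objective over z can recover near-optimal solutions.
    """
    rng = rng or np.random.default_rng(0)
    A = rng.normal(size=(D, d))

    def f_low(z):
        x = A @ np.asarray(z, dtype=float)
        if box is not None:              # clip back into the feasible box
            x = np.clip(x, box[0], box[1])
        return f_high(x)

    return f_low, A

# Toy usage: a 1000-dimensional objective that depends on only two
# coordinates (effective dimension 2), embedded into d = 4 dimensions.
D, d = 1000, 4
f_high = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2
f_low, A = random_embedding(f_high, D, d, box=(-5.0, 5.0))

# Uniform random search over the small d-dimensional box stands in for
# SOO here; any derivative-free optimizer could be plugged in instead.
rng = np.random.default_rng(1)
Z = rng.uniform(-2.0, 2.0, size=(2000, d))
values = np.array([f_low(z) for z in Z])
z_best = Z[values.argmin()]
```

The point of the reduction is that the optimizer now searches a d-dimensional box rather than a D-dimensional one, matching, in spirit, the regret-bound improvement stated above.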

Code:

  • The classification-based derivative-free optimization algorithm: RACOS
