Derivative-free optimization by classification

Modified: 2016/12/09 14:48 by admin - Uncategorized

For optimizing non-convex and complex functions, derivative-based methods may be ineffective because a point-wise derivative does not reflect the global landscape of the function. Instead, sampling in the solution space can reveal global information about the function, and thus sampling-based methods, such as evolutionary algorithms, can be more suitable for complex optimization problems.
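As a minimal illustration of this point, pure uniform sampling can locate the global optimum of a highly multimodal function, whereas a point-wise derivative inside a poor local minimum gives no hint of where the global optimum lies. The test function and parameters below are illustrative choices, not taken from the paper:

```python
import math
import random

# Illustrative multimodal test function (not from the paper): many local
# minima, with the global minimum 0 at x = 0.
def f(x):
    return x * x + 10 * (1 - math.cos(2 * math.pi * x))

# Pure uniform sampling: evaluate randomly drawn points and keep the best.
# Gradient descent started in a bad local minimum would stall, but
# sampling across [-5, 5] still reveals the global basin.
def random_search(f, lower, upper, budget, seed=0):
    rng = random.Random(seed)
    best_x, best_y = None, float("inf")
    for _ in range(budget):
        x = rng.uniform(lower, upper)
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

best_x, best_y = random_search(f, -5.0, 5.0, budget=1000)
```

With a budget of 1000 samples, the best solution found lands in the global basin near x = 0; this is the baseline that classification-based methods improve upon by sampling non-uniformly.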


RACOS

RACOS is designed according to the general complexity upper bound of a sampling-and-learning framework. It can be used to optimize functions over bounded continuous, discrete, and mixed solution spaces. For details, please see:
Yang Yu, Hong Qian, and Yi-Qi Hu. Derivative-free optimization via classification. In: Proceedings of the 30th AAAI Conference on Artificial Intelligence (AAAI'16), Phoenix, AZ, 2016. (PDF) (Appendix)
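The core idea can be sketched in a few lines. The following is a simplified sketch in the spirit of the classification-based framework, not the authors' implementation: in each iteration, the simplest axis-parallel "classifier" — the bounding box of the current best ("positive") solutions — is learned, and a new batch is resampled mostly inside that box, with occasional uniform samples for exploration. All parameter values are illustrative assumptions:

```python
import random

def racos_sketch(f, lower, upper, iterations, batch=20, positive_num=2,
                 region_prob=0.95, seed=0):
    # Simplified sketch (not the authors' code): learn an axis-parallel
    # box separating good solutions from bad ones, then sample a fresh
    # batch mostly inside the box.
    rng = random.Random(seed)
    dim = len(lower)

    def uniform():
        return [rng.uniform(lower[i], upper[i]) for i in range(dim)]

    pop = [uniform() for _ in range(batch)]
    vals = [f(x) for x in pop]
    for _ in range(iterations):
        order = sorted(range(batch), key=lambda i: vals[i])
        positives = [pop[i] for i in order[:positive_num]]
        # "Learn" the classifier: the bounding box of the positives.
        box_lo = [min(p[i] for p in positives) for i in range(dim)]
        box_hi = [max(p[i] for p in positives) for i in range(dim)]
        # Keep the positives and resample the rest of the batch.
        new_pop = positives[:]
        new_vals = [vals[i] for i in order[:positive_num]]
        while len(new_pop) < batch:
            if rng.random() < region_prob:
                x = [rng.uniform(box_lo[i], box_hi[i]) for i in range(dim)]
            else:
                x = uniform()  # occasional uniform sampling keeps exploring
            new_pop.append(x)
            new_vals.append(f(x))
        pop, vals = new_pop, new_vals
    best = min(range(batch), key=lambda i: vals[i])
    return pop[best], vals[best]

# Usage: minimize the sphere function over [-1, 1]^5 (illustrative setup).
best_x, best_y = racos_sketch(lambda x: sum(v * v for v in x),
                              [-1.0] * 5, [1.0] * 5, iterations=25)
```

Because the positives are always retained, the best value found never worsens across iterations, and the sampling region shrinks toward the good solutions.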



Sequential RACOS

Sequential RACOS (SRACOS) is the online version of RACOS, which updates the model after every sampled solution. In online scenarios, where solutions must be evaluated one after another, SRACOS can converge much faster than RACOS. For details, please see:
Yi-Qi Hu, Hong Qian, and Yang Yu. Sequential classification-based optimization for direct policy search. In: Proceedings of the 31st AAAI Conference on Artificial Intelligence (AAAI’17), San Francisco, CA, 2017. (PDF with Appendix)
The code of SRACOS is currently available in Python on GitHub (please see the instructions there). A major rewrite of RACOS is coming soon.
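The sequential difference can be sketched as follows — again a simplified illustration under assumed parameters, not the released implementation: instead of resampling a whole batch per iteration, a single new solution is drawn and evaluated per step, and the maintained set is updated immediately (here by replacing the worst kept solution when the new one improves on it):

```python
import random

def sracos_sketch(f, lower, upper, budget, keep=20, positive_num=2,
                  region_prob=0.95, seed=0):
    # Simplified sketch of the sequential variant (assumptions, not the
    # authors' code): one evaluation per step, one immediate model update.
    rng = random.Random(seed)
    dim = len(lower)

    def uniform():
        return [rng.uniform(lower[i], upper[i]) for i in range(dim)]

    pop = [uniform() for _ in range(keep)]
    vals = [f(x) for x in pop]
    for _ in range(budget - keep):
        order = sorted(range(keep), key=lambda i: vals[i])
        positives = [pop[i] for i in order[:positive_num]]
        # Bounding box of the current positives, refreshed every step.
        box_lo = [min(p[i] for p in positives) for i in range(dim)]
        box_hi = [max(p[i] for p in positives) for i in range(dim)]
        if rng.random() < region_prob:
            x = [rng.uniform(box_lo[i], box_hi[i]) for i in range(dim)]
        else:
            x = uniform()
        y = f(x)
        worst = order[-1]
        if y < vals[worst]:  # sequential update after a single evaluation
            pop[worst], vals[worst] = x, y
    best = min(range(keep), key=lambda i: vals[i])
    return pop[best], vals[best]

# Usage: same illustrative sphere problem, one evaluation at a time.
best_x, best_y = sracos_sketch(lambda x: sum(v * v for v in x),
                               [-1.0] * 5, [1.0] * 5, budget=500)
```

Because the model reacts to every single evaluation rather than waiting for a full batch, this variant suits online settings where evaluations arrive one at a time.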



The latest version is on GitHub (available in Java, MATLAB, C++, and Python): https://github.com/eyounx/RACOS
Local download:
Java version (used in the experiments): (Code Download in Zip, 32KB)
MATLAB version: (Code Download in Zip, 10KB)
  • The code is released under the GNU GPL 2.0 license. For commercial purposes, please contact the authors.
