: This package contains the Python code of the SGES algorithm, which reduces the high variance of the gradient estimator of evolution strategies in high-dimensional optimization. SGES utilizes historical estimated gradients to construct a low-dimensional subspace for sampling search directions, and adaptively adjusts the importance of this subspace. Experiments on two classes of high-dimensional tasks, black-box functions from the recently open-sourced Nevergrad library and continuous MuJoCo locomotion tasks from the OpenAI Gym library, show the excellent performance of SGES. README files included in the package explain how to use the code.
:  Fei-Yu Liu, Zi-Niu Li and Chao Qian. Self-Guided Evolution Strategies with Historical Estimated Gradient. In: Proceedings of the 29th International Joint Conference on Artificial Intelligence (IJCAI'20), Yokohama, Japan, 2020.
 Jeremy Rapin and Olivier Teytaud. Nevergrad - A gradient-free optimization platform. http://GitHub.com/FacebookResearch/Nevergrad, 2018.
 Greg Brockman, Vicki Cheung, Ludwig Pettersson, Jonas Schneider, John Schulman, Jie Tang, and Wojciech Zaremba. OpenAI Gym. CoRR abs/1606.01540, 2016.
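The subspace sampling described above can be sketched as follows. This is a minimal illustration only, assuming NumPy; the function name `sges_sample` and its parameters are hypothetical and do not correspond to the package's actual API. Please refer to the code and README files for the real implementation.

```python
import numpy as np

def sges_sample(grad_history, dim, alpha, n_samples, sigma=0.1, rng=None):
    """Illustrative sketch of SGES-style sampling: with probability alpha,
    draw a search direction from the low-dimensional subspace spanned by
    historical estimated gradients; otherwise draw from the full space.
    All names and defaults here are assumptions, not the authors' API."""
    rng = rng or np.random.default_rng()
    # Orthonormal basis of the gradient subspace (k = number of stored gradients)
    U, _ = np.linalg.qr(np.array(grad_history).T)  # shape: (dim, k)
    k = U.shape[1]
    directions = np.empty((n_samples, dim))
    from_subspace = rng.random(n_samples) < alpha
    for i in range(n_samples):
        if from_subspace[i]:
            # Gaussian sample expressed in the k-dimensional subspace
            z = rng.standard_normal(k)
            directions[i] = sigma * (U @ z)
        else:
            # Isotropic Gaussian sample in the full search space
            directions[i] = sigma * rng.standard_normal(dim)
    return directions, from_subspace
```

In the full algorithm, the subspace probability (alpha above) is adjusted adaptively according to how well each sampling source performs; that update rule is omitted from this sketch.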
: This package is free for academic use. Use it at your own risk. For other purposes, please contact Dr. Chao Qian (firstname.lastname@example.org).
: The package was developed in Python.
: This package was developed by Mr. Fei-Yu Liu (email@example.com) and Mr. Zi-Niu Li (firstname.lastname@example.org). For any problems concerning the code, please feel free to contact Mr. Liu or Mr. Li.