Description: This package includes the Python code of the SGES algorithm [1] for reducing the high variance of the gradient estimator of evolution strategies in high-dimensional optimization. SGES utilizes historical estimated gradients to construct a low-dimensional subspace for sampling search directions, and adaptively adjusts the importance of this subspace. Experiments on two classes of high-dimensional tasks, black-box functions from the recently open-sourced Nevergrad library [2] and continuous MuJoCo locomotion tasks from the OpenAI Gym library [3], demonstrate the strong performance of SGES. README files included in the package show how to use the code.
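To illustrate the idea, the following is a minimal, simplified sketch (not the package's actual implementation; all function names, hyperparameters, and the concrete adaptation rule here are illustrative assumptions): search directions are drawn either from the subspace spanned by the last few estimated gradients or from the full space, the gradient is estimated with antithetic sampling, and the subspace sampling probability is nudged toward whichever source produced better fitness.

```python
import numpy as np

def sges_minimize(f, x0, iterations=200, pop=10, k=5, sigma=0.1, lr=0.05,
                  alpha=0.5, seed=0):
    """Simplified self-guided ES sketch (illustrative, not the paper's exact
    update rules): sample directions from a gradient-history subspace with
    probability alpha, otherwise from the full space, and adapt alpha."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    grads = []  # history of estimated gradients
    for _ in range(iterations):
        g = np.zeros(n)
        sub_fit, full_fit = [], []
        for _ in range(pop):
            use_sub = len(grads) >= k and rng.random() < alpha
            if use_sub:
                # orthonormal basis of the subspace spanned by recent gradients
                Q, _ = np.linalg.qr(np.stack(grads[-k:], axis=1))
                d = Q @ rng.standard_normal(Q.shape[1])
            else:
                d = rng.standard_normal(n)  # full-space Gaussian direction
            d /= np.linalg.norm(d) + 1e-12
            # antithetic evaluation for a lower-variance gradient estimate
            fp, fm = f(x + sigma * d), f(x - sigma * d)
            g += (fp - fm) / (2 * sigma) * d
            (sub_fit if use_sub else full_fit).append(min(fp, fm))
        g /= pop
        grads.append(g.copy())
        x -= lr * g  # gradient-descent step on the black-box objective
        # crude adaptation: shift alpha toward the better-performing source
        if sub_fit and full_fit:
            if np.mean(sub_fit) < np.mean(full_fit):
                alpha = min(0.9, alpha + 0.05)
            else:
                alpha = max(0.1, alpha - 0.05)
    return x

# Usage: minimize a 50-dimensional sphere function.
x_opt = sges_minimize(lambda x: float(np.dot(x, x)), np.ones(50))
```

The sketch only shows the structure of the method; the paper's actual probability adaptation and subspace construction differ in detail.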

References: [1] Fei-Yu Liu, Zi-Niu Li, and Chao Qian. Self-Guided Evolution Strategies with Historical Estimated Gradients. In: Proceedings of the 29th International Joint Conference on Artificial Intelligence (IJCAI'20), Yokohama, Japan, 2020. [2] Jeremy Rapin and Olivier Teytaud. Nevergrad - A Gradient-Free Optimization Platform. FacebookResearch/Nevergrad, 2018. [3] Greg Brockman, Vicki Cheung, Ludwig Pettersson, Jonas Schneider, John Schulman, Jie Tang, and Wojciech Zaremba. OpenAI Gym. CoRR abs/1606.01540, 2016.

ATTN: This package is free for academic use; run it at your own risk. For other purposes, please contact Dr. Chao Qian (

Requirement: The package was developed in Python.

ATTN2: This package was developed by Mr. Fei-Yu Liu ( and Mr. Zi-Niu Li ( For any problems concerning the code, please feel free to contact Mr. Liu or Mr. Li.

Download: code (1MB)