Evolutionary Multi-Objective Optimization Algorithm with Machine Learning Surrogate Models
Authors: Gorbunov S.M., Stanovov V.V. | Published: 01.07.2025
Published in issue: #2(151)/2025
DOI:
Category: Informatics, Computer Engineering and Control | Chapter: System Analysis, Control, and Information Processing
Keywords: expensive multicriteria optimization, evolutionary algorithms, surrogate models, kriging, support vector machines
Abstract
The paper improves the efficiency of an evolutionary algorithm in solving expensive multicriteria optimization problems by introducing machine learning regression models that approximate the objective functions, accelerating convergence to the true Pareto front under a limited budget of criterion evaluations (the so-called surrogate approach). The developed surrogate-assisted multicriteria optimization method is based on the MOEA/D evolutionary algorithm. Two surrogate models are considered: a regression method based on Gaussian processes, i.e., kriging (KRG), and the support vector machine (SVM), designed to solve classification and regression problems. The paper shows that the modified MOEA/D algorithm demonstrates comparable or better results on all test problems of the international competition of multicriteria optimization algorithms. It notes that a direct comparison with the other competition participants was not carried out because of differing conditions, in particular the reduced computation resources. The kriging-based method shows the best average results on the IGD metric, which is designed to assess the quality of multicriteria optimization algorithms. It also produces more diverse solutions than the SVM-based method and the baseline MOEA/D.
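The two building blocks named in the abstract, a kriging (Gaussian-process) regression surrogate and the IGD quality indicator, can be sketched in a few lines of numpy. This is an illustrative sketch, not the authors' implementation: the function names, the squared-exponential kernel choice, and the `length_scale` and `noise` parameters are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential (RBF) covariance, a common kriging kernel choice.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

def kriging_fit_predict(X_train, y_train, X_query, length_scale=1.0, noise=1e-8):
    # Zero-mean GP regression: the surrogate's mean prediction at X_query,
    # used in place of the expensive objective function.
    K = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_query, X_train, length_scale) @ alpha

def igd(reference_front, obtained_front):
    # Inverted Generational Distance: mean Euclidean distance from each
    # reference Pareto-front point to its nearest obtained solution.
    # Lower is better; it rewards both convergence and diversity.
    d = np.linalg.norm(reference_front[:, None, :] - obtained_front[None, :, :],
                       axis=-1)
    return d.min(axis=1).mean()

# Toy usage: surrogate for f(x) = x^2, and IGD of a front against itself.
X = np.array([[0.0], [0.5], [1.0]])
y = (X ** 2).ravel()
pred = kriging_fit_predict(X, y, np.array([[0.25]]))
front = np.array([[0.0, 1.0], [1.0, 0.0]])
print(pred, igd(front, front))
```

In a surrogate-assisted MOEA/D, one such regressor would be fitted per objective, and the cheap predictions would screen candidate offspring before any true (expensive) evaluation is spent.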
The work was performed with support from the Ministry of Science and Higher Education of the Russian Federation within the framework of the State Task (project no. FEFE-2023-0004).
Please cite this article in English as:
Gorbunov S.M., Stanovov V.V. Evolutionary multi-objective optimization algorithm with the machine learning surrogate models. Herald of the Bauman Moscow State Technical University, Series Instrument Engineering, 2025, no. 2 (151), pp. 48--62 (in Russ.). EDN: TCDOWZ
References
[1] Zitzler E. Evolutionary algorithms for multiobjective optimization: methods and applications. Zurich, Swiss Federal Institute of Technology, 1999.
[2] Li C., Han S., Zeng S., et al. Expensive optimization. In: Intelligent optimization. Singapore, Springer, 2024, pp. 265--281. DOI: https://doi.org/10.1007/978-981-97-3286-9_14
[3] Rahat A.A., Everson R.M., Fieldsend J.E. Alternative infill strategies for expensive multi-objective optimization. In: GECCO’17, 2017, pp. 873--880. DOI: https://doi.org/10.1145/3071178.3071276
[4] Wang H., Jin Y., Doherty J. Committee-based active learning for surrogate-assisted particle swarm optimization of expensive problems. IEEE Trans. Cybern., 2017, vol. 47, no. 9, pp. 2664--2677. DOI: https://doi.org/10.1109/TCYB.2017.2710978
[5] Biswas P.P., Suganthan P.N. Multiobjective evolutionary optimization. In: Wiley encyclopedia of electrical and electronics engineering. New York, John Wiley, 2018. DOI: https://doi.org/10.1002/047134608x.w8380
[6] Li H., Zhang Q. MOEA/D: a multiobjective evolutionary algorithm based on decomposition. IEEE Trans. Evol. Comput., 2007, vol. 11, no. 6, pp. 712--731. DOI: https://doi.org/10.1109/TEVC.2007.892759
[7] Chiles J., Desassis N. Fifty years of kriging. In: Handbook of mathematical geosciences. Cham, Springer, 2018, pp. 589--612. DOI: https://doi.org/10.1007/978-3-319-78999-6_29
[8] Cortes C., Vapnik V. Support-vector networks. Mach. Learn., 1995, vol. 20, no. 3, pp. 273--297. DOI: https://doi.org/10.1007/bf00994018
[9] Rasmussen C.E., Williams C.K.I. Gaussian processes for machine learning. Cambridge, MIT Press, 2006.
[10] Vapnik V. The nature of statistical learning theory. Berlin, Heidelberg, Springer, 2000.
[11] Santner T., Williams B., Notz W. The design and analysis of computer experiments. Berlin, Springer, 2003.
[12] Sacks J., Schiller S.B., Welch W.J. Designs for computer experiments. Technometrics, 1989, vol. 31, no. 1, pp. 41--47. DOI: https://doi.org/10.1080/00401706.1989.10488474
[13] Press W.H., Teukolsky S.A. Numerical recipes in C. The art of scientific computing. Cambridge, Cambridge University Press, 1992.
[14] Horn R.A., Johnson C.R. Matrix analysis. Cambridge, Cambridge University Press, 1990.
[15] Bennett K., Mangasarian O. Bilinear separation of two sets in n-space. Comput. Optim. Applic., 1993, vol. 2, pp. 207--227. DOI: https://doi.org/10.1007/BF01299449
[16] Bishop C.M. Pattern recognition and machine learning. New York, Springer Science & Business Media, 2006.
[17] Smith K. Precalculus. Burlington, Jones & Bartlett Learning, 2011.
[18] Zitzler E., Thiele L. Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach. IEEE Trans. Evol. Comput., 1999, vol. 3, no. 4, pp. 257--271. DOI: https://doi.org/10.1109/4235.797969
[19] Zhang Q., Liu W., Tsang E., et al. Expensive multiobjective optimization by MOEA/D with Gaussian process model. IEEE Trans. Evol. Comput., 2010, vol. 14, no. 3, pp. 456--474. DOI: https://doi.org/10.1109/TEVC.2009.2033671
[20] Zhang Q., Zhou A., Zhao S., et al. Multiobjective optimization test instances for the CEC 2009 special session and competition. Technical Report CES-487. Colchester, University of Essex, 2008.
[21] Sun Y., Yen G., Yi Z. IGD indicator-based evolutionary algorithm for many-objective optimization problems. IEEE Trans. Evol. Comput., 2019, vol. 23, no. 2, pp. 173--187. DOI: https://doi.org/10.1109/TEVC.2018.2791283
[22] Hamdan M. On the disruption-level of polynomial mutation for evolutionary multi-objective optimisation algorithms. Computing and Informatics, 2010, vol. 29, no. 5, pp. 783--800.
[23] Deb K., Agrawal R.B. Simulated binary crossover for continuous search space. Complex Systems, 1995, vol. 9, no. 2. URL: https://www.complex-systems.com/abstracts/v09_i02_a02
[24] Mann H.B., Whitney D.R. On a test of whether one of two random variables is stochastically larger than the other. Ann. Math. Statist., 1947, vol. 18, no. 1, pp. 50--60. DOI: https://doi.org/10.1214/aoms/1177730491