A hybrid steady-state evolutionary algorithm using random swaps for Gaussian model-based clustering

Authors:

Highlights:

Abstract

A new hybrid evolutionary algorithm (EA) for Gaussian mixture model-based clustering is proposed. The EA is a steady-state method that, in each generation, selects two individuals from a population, creates two offspring using either mutation or crossover, and fine-tunes the offspring with the expectation-maximization (EM) algorithm. The offspring then compete with their parents for survival into the next generation. The proposed approach uses a random swap, which replaces a component mean with a randomly chosen feature vector, as its mutation operator. In the crossover operator, a random component is copied from a source mixture into a destination mixture; the copying favors components of the source mixture located far from the components of the destination mixture. In computational experiments, the approach was compared to a multiple-restarts EM algorithm, a random swap EM method, and a state-of-the-art hybrid evolutionary algorithm for Gaussian mixture model learning on one real and 29 synthetic datasets. The results indicate that, given the same computational budget, the proposed method usually learns mixtures with higher log-likelihoods than the competing benchmark algorithms, and the data partitions it produces correspond most closely to the original class divisions of the datasets.
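The two variation operators described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and signatures are hypothetical, and the crossover realizes the "favor distant components" bias with a simple farthest-component choice, which is only one plausible reading of the abstract.

```python
import numpy as np

def random_swap_mutation(means, X, rng):
    """Random swap: replace one randomly chosen component mean
    with a randomly chosen feature vector from the data."""
    out = means.copy()
    out[rng.integers(len(out))] = X[rng.integers(len(X))]
    return out

def distance_biased_crossover(dest_means, src_means, rng):
    """Copy into the destination mixture the source component farthest
    from every destination component (argmax is an assumed, simple way
    to favor distant components; the paper may bias stochastically)."""
    # distance from each source component to its nearest destination component
    d = np.min(np.linalg.norm(src_means[:, None] - dest_means[None], axis=2),
               axis=1)
    out = dest_means.copy()
    out[rng.integers(len(out))] = src_means[int(np.argmax(d))]
    return out

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))            # toy feature vectors
means = np.zeros((3, 2))                 # destination mixture means
src = rng.normal(loc=5.0, size=(3, 2))   # source mixture means

mutated = random_swap_mutation(means, X, rng)
child = distance_biased_crossover(means, src, rng)
```

In the full method, each offspring produced this way would be fine-tuned by EM before competing with its parents for survival.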

Keywords: Clustering, Gaussian mixture model, Evolutionary algorithm, Expectation-maximization algorithm, Hybrid optimization, Random swap

Article history: Received 24 December 2021, Revised 22 April 2022, Accepted 8 July 2022, Available online 16 July 2022, Version of Record 25 July 2022.

DOI: https://doi.org/10.1016/j.eswa.2022.118159