Abstract

Wavelet transformation is widely applied in the field of image processing, and the optimization of wavelet parameters has always been a central topic in improving its performance. In this paper, an adaptive self-organizing migration algorithm (ASOMA) is proposed to optimize the wavelet parameters and thereby elevate the performance of wavelet denoising. Firstly, based on the original SOMA, an adaptive step size adjustment method is proposed that records the step information of successfully migrated individuals, which improves the search ability of SOMA. Secondly, an exploratory leader selection method is proposed to effectively balance the exploration and exploitation of SOMA. Finally, ASOMA is compared with the original SOMA and its variants on wavelet general-threshold denoising of classical test images, with denoising performance evaluated by the peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) indicators. The experimental results demonstrate that ASOMA achieves better denoising performance than the wavelet general threshold, the original SOMA, and the related SOMA variants.

1. Introduction

The generation of noise degrades image quality to a certain extent and greatly reduces the visual effect presented by the image [14]. To achieve a higher level of image processing, image denoising is particularly important [5]. In traditional image denoising, there are mainly three kinds of methods: spatial-domain filtering, frequency-domain filtering, and optimal linear filtering [6]. Spatial-domain denoising [7] operates directly on the pixels of the image itself. Frequency-domain denoising [8] transforms the image into the frequency domain through the Fourier transform, in terms of infinite trigonometric series, and processes it there. Optimal linear filtering [9] designs an optimal denoising filter under a specified criterion.

Nowadays, with the continuous in-depth study of wavelets, image denoising based on wavelet transformation has been widely used [10, 11]. Among the available technologies, denoising based on the wavelet transform (WT) has been proven to have excellent performance [12, 13]. Because the wavelet has localization characteristics in both the time domain and the frequency domain, it can retain not only the frequency-domain information of the image but also its time-domain information, so the application of wavelet technology has become more and more widespread [14]. When dealing with different problems, different wavelet functions have their own advantages and disadvantages.

In traditional threshold selection, a fixed value usually cannot yield satisfactory results in all cases. Moreover, there is no universally best method for choosing the wavelet type, the number of decomposition scales, and the thresholds at different scales in the multiscale wavelet transform. For the optimization of the wavelet function, the most important issue is the handling of its threshold, and finding the optimal threshold more accurately is the key to improving the denoising effect. Because noise is widely distributed, searching for parameter values suitable for different kinds of noise becomes a complicated task. Modeling this parameter search as a nondeterministic polynomial (NP) hard problem and solving it with metaheuristics has become an efficient and feasible approach.

Some scholars have combined evolutionary algorithm (EA) models with wavelet denoising and quickly searched for the optimal threshold through the idea of metaheuristics. Among these methods are the wavelet thresholding developed by Donoho and Johnstone [15] and its variants (e.g., VisuShrink [16], SureShrink [17], and BayesShrink [18]), which use different rules to determine the wavelet threshold parameters. Matti and Al-Sulaifanie combined the genetic algorithm (GA) with the wavelet transform (WT) for signal denoising by using the parameters of the WT as the input of the GA and the output MSE as the fitness value [19]. Reference [20] used differential evolution (DE) to estimate the best parameter set for wavelet shrinkage denoising. Bhutada et al. proposed a method in which the adjustment parameters of the threshold function are optimized by particle swarm optimization (PSO), a relatively new stochastic global optimization algorithm [21]. Wang et al. proposed an improved chaotic particle swarm optimization algorithm, which uses chaotic PSO to optimize the wavelet threshold [20]. Liu et al. proposed the PSO shrinking method, which explores the complete solution space suitable for the threshold [22]. Liu et al. also proposed a fast particle swarm optimization variant of PSO to obtain the best wavelet threshold [23].

Besides, many scholars have also optimized the selection of the wavelet function and the decomposition level. El-Dahshan used GA to select the best wavelet denoising parameters to maximize the filtering performance; the verification showed that the denoising scheme with the genetic algorithm performs better than other reported wavelet threshold algorithms and that the denoising quality of ECG signals is more suitable for clinical diagnosis [24]. Bhutada et al. proposed a satellite image denoising method based on evolutionary algorithms, in which stochastic global optimization techniques such as the cuckoo search (CS) algorithm, artificial bee colony (ABC), and particle swarm optimization (PSO) and their different variants are used to optimize the parameters of the required adaptive threshold function [25]. Anupriya and Tayal obtained the best parameter set (wavelet type, decomposition level, and threshold) for wavelet shrinkage denoising by using the self-organizing migration algorithm (SOMA) [26]. Poornachandra and Kumaravel proposed a wavelet denoising method combined with a genetic algorithm that, for different decomposition levels, determines the best denoising parameters (wavelet, threshold selection rule, and thresholding method) [27]. Gupta et al. proposed an adaptive threshold technique based on multilevel wavelet decomposition, which uses an improved particle swarm optimization (IPSO) algorithm to find the optimal values of the threshold and the decomposition level for a given objective function [28].

In the above methods, the parameters involved in wavelet transformation, namely the wavelet threshold, the wavelet function, and the decomposition level, are selected by metaheuristic algorithms, and only a few scholars have adjusted the other parameters involved in wavelet processing. Reference [26] proved that using SOMA can effectively improve the performance of wavelet denoising. This paper proposes a SOMA with an adaptive step size on the basis of SOMA and compares its effect on wavelet noise reduction with the SOMA method used in [26]. Finally, experimental tests are carried out on the classic digital image "Lena," and the performance of the algorithms on the wavelet denoising problem is compared. The experimental results show that ASOMA can obtain a more accurate image denoising effect and restore the original image more effectively.

The rest of this paper is organized as follows. Section 2 briefly introduces the basic principles of wavelet denoising. Section 3 briefly introduces the basic principle of SOMA. Section 4 introduces the proposed ASOMA. The experimental data are compared and analyzed in Section 5. Finally, some conclusions are given in Section 6.

2. The Principle of Wavelet Denoising

2.1. Wavelet Transformation

Wavelet transformation was first proposed in 1974 by J. Morlet, a French engineer engaged in petroleum signal processing [29]. Its purpose is to better solve the problem that the Fourier transform cannot analyze the time and frequency of a signal at the same time [30, 31]. The core of wavelet analysis is to analyze the signal through scale and to better analyze the correlation between the signal and the wavelet by shifting and stretching the wavelet basis. The expression of the wavelet transform is as follows:

WT_f(α, τ) = (1/√α) ∫_{−∞}^{+∞} f(t) ψ((t − τ)/α) dt, (1)

where α represents the scale factor, τ represents the translation factor, and ψ(t) represents the wavelet function. The wavelet function is affected by the two values α and τ, and the resulting expression ψ_{α,τ}(t) = (1/√α) ψ((t − τ)/α) is called a continuous wavelet basis. It can be observed from (1) that the wavelet transform can also be seen, to a certain extent, as an integral transform, which is similar to the Fourier transform. However, the difference between the two transforms is also obvious: the Fourier transform has only one variable, the frequency ω, while the wavelet transform has two variable parameters α and τ, where α corresponds to frequency and τ corresponds to time. Hence, the final f(t) is projected onto the time-scale plane [29].
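To make equation (1) concrete, the following sketch numerically approximates the wavelet transform of a sampled test signal at a single scale α and translation τ; the real-valued Morlet-like wavelet, the test signal, and the trapezoidal integration are illustrative assumptions rather than anything prescribed by this paper.

import numpy as np

def morlet(t, w0=5.0):
    """Real-valued Morlet-like mother wavelet (illustrative choice)."""
    return np.exp(-t**2 / 2) * np.cos(w0 * t)

def cwt_point(f, t, alpha, tau):
    """Approximate WT_f(alpha, tau) = (1/sqrt(alpha)) * integral of f(t) * psi((t - tau)/alpha) dt
    by the trapezoidal rule on the sampled signal f(t)."""
    psi = morlet((t - tau) / alpha)
    return np.trapz(f * psi, t) / np.sqrt(alpha)

# Toy signal: two sinusoids sampled on a uniform grid.
t = np.linspace(0, 10, 2001)
f = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)

# Scan a few scales at a fixed translation; larger magnitudes indicate stronger
# correlation between the signal and the scaled, shifted wavelet.
for alpha in (0.2, 0.5, 1.0):
    print(alpha, cwt_point(f, t, alpha, tau=5.0))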

2.2. Principle of Wavelet Denoising

The image denoising based on wavelet threshold was first proposed by Weaver et al. [32]. The wavelet denoising problem is a function approximation problem from a mathematical point of view. The main purpose is to find the best mapping of the actual signal through the wavelet function. Wavelet transform decomposes the signal into two parts, which are high frequency and low frequency. The high-frequency part reflects the details of the image, and the low-frequency part is the approximate coefficient. The low-frequency signal obtained by the first decomposition is then decomposed to obtain the low-frequency and high-frequency signals of the second layer, and the same operation is repeated until the specified number of decomposition layers is reached. The above operations are called decomposition tree operations, as shown in Figure 1.

In Figure 1, S is the original signal. Dj (j∈{1,2, …, n}) represents the high-frequency signal, and Aj(j∈{1,2, …, n}) is the low-frequency signal. In general, the more layers of decomposition, the smaller the wavelet coefficient of noise.
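As an illustration of the decomposition tree in Figure 1, the sketch below performs a multilevel 2-D wavelet decomposition with the PyWavelets package; the package, the db4 wavelet, and the three-level setting are assumed here for illustration and are not the MATLAB configuration used later in this paper.

import numpy as np
import pywt  # PyWavelets, assumed to be installed

# Toy 8-bit grayscale "image".
img = np.random.default_rng(0).integers(0, 256, size=(512, 512)).astype(float)

# Three-level decomposition: coeffs = [A3, (H3, V3, D3), (H2, V2, D2), (H1, V1, D1)],
# where A is the low-frequency approximation and H/V/D are high-frequency details.
coeffs = pywt.wavedec2(img, wavelet='db4', level=3)

print('A3 shape:', coeffs[0].shape)
for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    print(f'detail level {4 - lvl}: H{cH.shape} V{cV.shape} D{cD.shape}')

# Perfect reconstruction (up to numerical error) from the full coefficient set.
rec = pywt.waverec2(coeffs, wavelet='db4')
print('max reconstruction error:', np.abs(rec - img).max())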

The basic idea of image denoising with wavelet transformation is to choose an appropriate threshold for the wavelet coefficients after the image is converted into a signal. Wavelet coefficients larger than the set threshold are kept, and those smaller than the threshold are set to zero, so as to restore the image information. However, the threshold must be set by some method: if the threshold is too large, it causes excessive shrinkage of the effective coefficients; conversely, if the threshold is too small, the filtering effect is weak and the noise is not removed well. In this paper, the general threshold method proposed by Donoho is utilized to calculate the threshold of the noisy image signal, which is described as

λ = σ √(2 ln N), (2)

where σ is the standard deviation of the noise and N represents the signal length.
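A minimal sketch of the general threshold in equation (2) follows: the noise standard deviation σ is estimated from the finest-scale detail coefficients by the common median-absolute-deviation rule, and soft thresholding is shown as one typical way to clear the coefficients below λ; the toy signal and the soft-thresholding choice are illustrative assumptions.

import numpy as np

def universal_threshold(detail_coeffs, n_signal):
    """Donoho's general threshold: lambda = sigma * sqrt(2 * ln N)."""
    # Robust noise estimate from the finest detail coefficients (MAD / 0.6745).
    sigma = np.median(np.abs(detail_coeffs)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(n_signal))

def soft_threshold(w, lam):
    """Shrink coefficients toward zero; values with magnitude below lam are cleared."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

# Toy example: noisy coefficients of a mostly sparse signal.
rng = np.random.default_rng(1)
clean = np.zeros(1024)
clean[::64] = 5.0
noisy = clean + rng.normal(0.0, 1.0, size=clean.size)

lam = universal_threshold(noisy, noisy.size)
denoised = soft_threshold(noisy, lam)
print('threshold:', lam, ' nonzero coefficients kept:', np.count_nonzero(denoised))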

3. Self-Organizing Migration Algorithm

This section briefly introduces the self-organizing migration algorithm. As a new type of swarm intelligence optimization algorithm, SOMA has been highly regarded by researchers since it was proposed by Zelinka [33] in 2000. The specific implementation process of SOMA is described as follows.

Like the ant colony optimization (ACO) and particle swarm optimization (PSO) algorithms, SOMA belongs to the category of swarm intelligence. Firstly, NP individuals are initialized to form the population, and the fitness value of each individual is obtained according to the specific problem to be solved. All individuals in the population except the leader are migrated and updated within a migration loop (ML). The leader is the individual with the best fitness value in the current migration loop, and the other individuals make small jumps of a certain step size toward the leader in each migration loop. A perturbation vector PRTVector is randomly generated to perturb the direction in which individuals move toward the leader: each component of PRTVector is set to 1 if a uniform random number is smaller than the coefficient prt, which usually takes a value in [0, 1], and to 0 otherwise. The migration of an individual is carried out according to the following equation:

x_{i,j}^{new} = x_{i,j}^{ML} + (x_{L,j}^{ML} − x_{i,j}^{ML}) · t · PRTVector_j, (3)

where x_{i,j}^{ML} represents the individual to be migrated, x_{L,j}^{ML} represents the leader in the ML-th migration loop, and x_{i,j}^{new} represents the new position obtained by the individual's migration. The variable t is accumulated from 0 to PathLength in increments of the step length Step.

The migration of individuals in the population will not continue indefinitely. When the accumulated step reaches PathLength (the maximum migration length), the migration loop is stopped. Then, the newest leader is selected for the next migration loop, and the algorithm finally terminates when the stop condition is met. The basic SOMA is described as follows.
(1) Initialize NP individuals as the first-generation migration population.
(2) Calculate the fitness value of each individual, and select the optimal individual as the leader to guide the other individuals' migration.
(3) Migrate the individuals other than the leader according to equation (3), and update the value of PRTVector.
(4) When the stop condition is reached, stop the migration and return the value of the optimal individual.
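The basic SOMA described above can be sketched as follows (AllToOne strategy, minimization); the test function, the bounds, and parameter values such as step = 0.11 and PathLength = 3 are placeholder assumptions rather than the settings used later in this paper.

import numpy as np

def soma(fitness, bounds, np_pop=20, prt=0.1, step=0.11, path_length=3.0, migrations=50, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(np_pop, dim))
    fit = np.array([fitness(x) for x in pop])

    for _ in range(migrations):                      # migration loops (ML)
        leader = pop[np.argmin(fit)].copy()          # best individual guides the others
        for i in range(np_pop):
            best_x, best_f = pop[i].copy(), fit[i]
            t = step
            while t <= path_length:                  # jump toward (and beyond) the leader
                prt_vector = (rng.random(dim) < prt).astype(float)
                cand = pop[i] + (leader - pop[i]) * t * prt_vector
                cand = np.clip(cand, lo, hi)
                f = fitness(cand)
                if f < best_f:                       # keep the best position found on the path
                    best_x, best_f = cand, f
                t += step
            pop[i], fit[i] = best_x, best_f
    return pop[np.argmin(fit)], fit.min()

# Usage: minimize the sphere function in 5 dimensions.
best_x, best_f = soma(lambda x: np.sum(x**2), (np.full(5, -5.0), np.full(5, 5.0)))
print(best_f)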

4. SOMA Algorithm with Adaptive Step Size (ASOMA)

4.1. Defects of SOMA

As an effective swarm intelligence search algorithm, SOMA can effectively guide individuals to migrate in a better direction. However, because the value of the step is fixed, the individuals cannot explore and exploit effectively based on the current population information in the early and late stages of migration. In the early stage of the algorithm, a larger step value causes the algorithm to migrate too fast, reducing the initial exploration ability of SOMA and driving individuals toward a locally optimal direction, while a smaller step value may reduce the execution efficiency of the algorithm. Similarly, in the later stage of execution, a large step value prevents the algorithm from accurately converging to the optimal solution. When solving simple problems, the step can be appropriately increased to improve the execution efficiency of the algorithm, but for complex problems, a small step value better guides the migration of individuals and reduces the probability of falling into local optima.

A reasonable step value in SOMA promotes the convergence of the algorithm: an effective step setting can not only improve the convergence ability and accuracy of the algorithm but also make efficient use of time resources. In addition, in each migration loop the leader is usually the best individual in the population. This selection method seems to guide the population to migrate in a better direction, but it ignores the diversity of guidance. The leader usually only represents the optimal solution in the current migration loop, and it is impossible to determine whether it is a local optimum; such misleading information may result in wrong guidance in the early stage of algorithm execution.

4.2. ASOMA

To address the fixed step value described above, this paper proposes an adaptive step size adjustment method. ASOMA creates a success archive Sstep to store the step sizes used by successfully migrated individuals, takes the accumulated archive information as the mean of a normal distribution, and randomly generates the step from this normal distribution to guide the migration of the next individual.

In each migration loop, the step is generated from a normal distribution with mean μstep and standard deviation 0.05:

step ∼ N(μstep, 0.05), (4)

where, according to the proposal of [33], μstep is initialized to 0.21 to achieve better results and is updated at the end of each migration loop as

μstep = (1 − c) · μstep + c · mean(Sstep), (5)

where c is a constant between 0 and 1 and Sstep represents the step archive of successfully migrated individuals.

Through the method of adaptive step, ASOMA uses the success information in the historical migration to adjust the step change more effectively. It solves the problem of poor algorithm convergence caused by the fixed value of step and can balance the exploration and exploitation of the SOMA.
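The adaptive step mechanism of equations (4) and (5) can be sketched as follows: steps of successful migrations are archived, a new step is drawn from N(μstep, 0.05), and μstep is updated from the archive with weight c at the end of each migration loop; the class structure, the clipping range, and clearing the archive after each loop are illustrative assumptions.

import numpy as np

class AdaptiveStep:
    """Adaptive step size: archive successful steps and adapt the sampling mean."""
    def __init__(self, mu_step=0.21, c=0.5, sigma=0.05, seed=0):
        self.mu_step, self.c, self.sigma = mu_step, c, sigma
        self.archive = []                          # S_step: steps of successful migrations
        self.rng = np.random.default_rng(seed)

    def sample(self):
        # step ~ N(mu_step, sigma), clipped to stay positive (illustrative safeguard)
        return float(np.clip(self.rng.normal(self.mu_step, self.sigma), 0.01, 1.0))

    def record_success(self, step):
        self.archive.append(step)                  # remember steps that improved fitness

    def end_migration_loop(self):
        # mu_step = (1 - c) * mu_step + c * mean(S_step), then clear the archive
        if self.archive:
            self.mu_step = (1 - self.c) * self.mu_step + self.c * float(np.mean(self.archive))
        self.archive.clear()

# Usage inside one migration loop (schematic):
adapt = AdaptiveStep()
step = adapt.sample()
# ... migrate an individual with this step; if its fitness improved:
adapt.record_success(step)
adapt.end_migration_loop()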

Secondly, to address the excessive exploitation caused by always adopting the optimal individual as the leader, this paper proposes a new leader selection scheme that can more effectively balance the exploration and exploitation of the algorithm. The leader is selected randomly from the top 100p% of all individuals, p ∈ (0, 1]. The guiding method is shown in Figure 2.
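The exploratory leader selection can be sketched as follows: instead of always taking the single best individual, the leader is drawn uniformly at random from the top 100p% of the population ranked by fitness; a minimization ranking and the value p = 0.2 in the usage example are illustrative assumptions.

import numpy as np

def select_leader(population, fitness_values, p, rng):
    """Pick the leader uniformly at random from the best 100p% individuals (0 < p <= 1)."""
    order = np.argsort(fitness_values)                 # ascending: best first (minimization)
    top_k = max(1, int(np.ceil(p * len(population))))  # at least one candidate
    return population[rng.choice(order[:top_k])]

# Usage: with p = 0.2 the leader is one of the best 20% of individuals.
rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(30, 6))
fit = np.sum(pop**2, axis=1)
leader = select_leader(pop, fit, p=0.2, rng=rng)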

In Figure 2, one of the top 100p% optimal individuals is selected randomly as the leader. This selection method can effectively improve the exploration performance of the algorithm and more effectively balance its exploration and exploitation. Finally, Algorithm 1 shows the pseudocode of ASOMA.

Initialization: step, PathLength, Nd (problem dimension), ML (number of migration loops), archive of successful steps Sstep = ∅, c = 0.5, μstep = 0.1
t = 0;
While t < PathLength
  Generate PRTVector;                           % PRTVector[j] = 1 if rand < prt, otherwise 0
  x_new = x;
  For j = 1 : Nd
    If PRTVector[j] = 1
      x_new[j] = x[j] + (xL[j] − x[j]) · t;     % jump toward the leader xL, equation (3)
    End
  End
  If fitness(x_new) is better than fitness(x)   % successful migration
    x = x_new;
    Sstep(t) = step;                            % archive the successful step size
  End
  t = t + step;
End
step ∼ N(μstep, 0.05);                          % sample the step for the next migration loop, equation (4)
μstep = (1 − c) · μstep + c · mean(Sstep);      % update the mean from the archive, equation (5)

5. Experimental Analysis

Since the sources of noise are very wide, there are many different types of noise, such as uniform noise, exponential noise, impulse noise, speckle noise, and salt-and-pepper noise (also called black-and-white noise because it appears as black and white dots). Different wavelet functions behave differently in different bands, and the results obtained have different advantages. How to effectively select the wavelet parameters is therefore a key problem in improving the denoising effect.

This section uses the idea of metaheuristics to handle the selection of the wavelet parameters and applies the proposed ASOMA to the image denoising problem to further improve the denoising performance obtained with SOMA.

The number of layers, the threshold, and the type of wavelet are the parameters to be defined. A 6 × 2 array is created to store the parameters used in wavelet processing: the first four rows, respectively, limit the threshold of each layer of wavelet decomposition, the fifth row is defined as the choice of wavelet type, and the sixth row is defined as the number of layers of wavelet decomposition. To create the initial population, each individual in the population consists of the 6 parameters defined above. The fitness is a weighted combination of the peak signal-to-noise ratio (PSNR) and the structural similarity (SSIM), and its expression is defined as

fitness = w1 · PSNR + w2 · SSIM, (6)

where w1 and w2 are the weights of the two indicators; PSNR and SSIM are also used as the indicators for evaluating the denoising effect and are described in detail in Section 5.3. The individuals with higher fitness values are reserved for the migration of the next generation. Because the problem to be solved is a maximization problem, it is transformed into a minimization problem to better describe the convergence process of the fitness value.

Firstly, the population is initialized randomly, and the last two parameters of each individual are rounded because the wavelet-type and decomposition-level values must be integers. Secondly, the initialized population is searched iteratively by ASOMA, and after continuous migration and updating, the best individual (leader) is selected as the output. Finally, this output is used as the control parameters for wavelet denoising.
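As an illustration of how an individual could be decoded and evaluated, the sketch below maps the six parameters (four per-level thresholds, a wavelet-type index, and a decomposition level) to one wavelet denoising run and scores it with a weighted PSNR/SSIM fitness as in equation (6); the candidate wavelet list, the weights w1 and w2, soft thresholding, and the use of PyWavelets and scikit-image are illustrative assumptions, not the exact configuration of this paper.

import numpy as np
import pywt
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

WAVELETS = ['db2', 'db4', 'sym4', 'coif2']      # assumed candidate wavelet functions

def denoise(noisy, individual):
    """individual = [t1, t2, t3, t4, wavelet_index, level]; the last two are rounded to integers."""
    thresholds = individual[:4]
    wavelet = WAVELETS[int(round(individual[4])) % len(WAVELETS)]
    level = int(np.clip(round(individual[5]), 1, 4))
    coeffs = pywt.wavedec2(noisy, wavelet, level=level)
    new_coeffs = [coeffs[0]]                     # keep the low-frequency approximation
    for k, details in enumerate(coeffs[1:]):     # threshold each detail level separately
        lam = thresholds[min(k, 3)]
        new_coeffs.append(tuple(pywt.threshold(d, lam, mode='soft') for d in details))
    return pywt.waverec2(new_coeffs, wavelet)

def fitness(clean, noisy, individual, w1=0.5, w2=0.5):
    """Weighted PSNR/SSIM score; higher is better (negate it to minimize)."""
    rec = np.clip(denoise(noisy, individual), 0, 255)
    rec = rec[:clean.shape[0], :clean.shape[1]]  # guard against 1-pixel padding differences
    psnr = peak_signal_noise_ratio(clean, rec, data_range=255)
    ssim = structural_similarity(clean, rec, data_range=255)
    return w1 * psnr + w2 * ssim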

5.1. Experimental Environment and Parameter Setting
5.1.1. Experimental Environment

To verify the effectiveness of ASOMA in wavelet denoising, the experiments in this paper are conducted in MATLAB 2018b. The equipment used in the experiments is as follows: 16 GB RAM; 512 GB SSD; 2.30 GHz 64-bit processor; Windows 10 operating system.

5.1.2. The Comparison Algorithm Used in This Paper

Firstly, ASOMA is compared with “symlet wavelet system,” “dbN wavelet system,” and “Coiflet wavelet system,” respectively. The choice of wavelet is based on the research of Luisier et al. [34]. The types of wavelets and some performance parameters are shown in Table 1.

Secondly, ASOMA is compared with the basic SOMA and its variants (Modified SOMA [35], HBSOMA [36], and OSOMA [37]).
(a) Modified SOMA: Ke et al. introduced an individual migration method with a random mutation step size on the basis of SOMA, which enriched the individual optimization methods and effectively improved the performance of SOMA [35].
(b) HBSOMA: Lin introduced the mutation strategy of DE into SOMA and proposed a SOMA based on hybrid migrating behavior, which increased the diversity of the population and accelerated the optimization process of SOMA [36].
(c) OSOMA: Lin added the idea of opposition-based learning to SOMA, expanding the learning direction of individuals to find better individuals as far as possible, so that the algorithm improves its execution efficiency while maintaining the diversity of the population [37].

5.1.3. Test Image

This paper uses the aerials standard test image in the classic digital image processing libraries as experimental material [38]. The specific image material used is “Lena” with a size of 512 × 512 pixels. The specific image is shown in Figure 3.

This paper tests the performance of ASOMA in optimizing wavelet denoising by adding three different types of noise. The noise settings are as follows:
(1) salt-and-pepper noise with a density of 0.3;
(2) Gaussian noise with a mean of 0 and a variance of 0.3;
(3) additive noise with a mean of 0 and a variance of 0.3.

Taking a variance of 0.3 to generate noise can ensure that the results of different denoising methods can be compared without losing the integrity of the image.
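For reference, the three noisy inputs can be generated along the following lines; since the distribution of the "additive noise" is not specified above, zero-mean uniform noise with the given variance is assumed here purely for illustration, and pixel intensities are assumed to lie in [0, 1].

import numpy as np

rng = np.random.default_rng(42)

def salt_and_pepper(img, density=0.3):
    out = img.copy()
    mask = rng.random(img.shape)
    out[mask < density / 2] = 0.0                          # pepper pixels
    out[(mask >= density / 2) & (mask < density)] = 1.0    # salt pixels
    return out

def gaussian_noise(img, mean=0.0, var=0.3):
    return np.clip(img + rng.normal(mean, np.sqrt(var), img.shape), 0.0, 1.0)

def additive_uniform_noise(img, var=0.3):
    half_width = np.sqrt(3.0 * var)                        # zero-mean uniform with the given variance
    return np.clip(img + rng.uniform(-half_width, half_width, img.shape), 0.0, 1.0)

# img = ...  # Lena, 512 x 512, scaled to [0, 1]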

5.1.4. Experimental Parameter Setting

The parameters involved in the experimental comparison are shown in Table 2; the values of the parameters are all set according to the literature [33].

5.2. Experimental Results

Salt-and-pepper noise, Gaussian noise, and additive noise with a density or variance of 0.3 are added to the original Lena grayscale image to simulate the collected noisy images. To eliminate randomness, the average of 25 experiments is taken as the final output. The comparative results are shown in Figures 4, 5, and 6.
(1) Denoising results of ASOMA versus the three wavelet systems. The experimental results in Figures 4, 5, and 6 show that optimizing the wavelet parameters with a metaheuristic achieves a better denoising effect than the traditional wavelet systems.
(2) Denoising results of ASOMA versus Modified SOMA, HBSOMA, OSOMA, and SOMA. Figures 7, 8, and 9, respectively, show the denoising results of ASOMA, Modified SOMA, HBSOMA, OSOMA, and SOMA after salt-and-pepper noise, Gaussian noise, and additive noise with a density or variance of 0.3 are added to the original Lena grayscale image. The resulting images are labelled with the names of the corresponding algorithms. These results show that ASOMA achieves a better denoising effect on the three noisy images than Modified SOMA, HBSOMA, OSOMA, and SOMA.
(3) Fitness values of ASOMA versus Modified SOMA, HBSOMA, OSOMA, and SOMA.

This paper uses the fitness function defined in equation (6). Figures 10, 11, and 12 compare the convergence of the fitness values of ASOMA, Modified SOMA, HBSOMA, OSOMA, and SOMA when the population size NP is 10, 30, and 50, where the horizontal axis represents the number of migration loops and the vertical axis represents the fitness value. It can be seen that ASOMA has better convergence ability than the original SOMA and its variants.

The experimental results in Figures 10 and 11 show that when NP = 10, ASOMA is superior to the other comparison algorithms in removing the three different types of noise. The experimental results in Figure 12 show that, as NP increases, both ASOMA and the comparison algorithms can find the optimal solution, but ASOMA has the advantage in convergence speed.

5.3. Evaluating Indicator

In this paper, PSNR (peak signal-to-noise ratio) and SSIM (structure similarity) are used to analyze and compare the denoising results.

5.3.1. PSNR

PSNR is usually used to evaluate the quality of a compressed (or otherwise processed) image relative to the original image; the larger the PSNR value, the better the quality of the processed image [39]. MSE stands for the mean square error between the original image I and the processed image K of size M × N, and the PSNR expression is as follows:

MSE = (1/(MN)) ∑_{i=1}^{M} ∑_{j=1}^{N} [I(i, j) − K(i, j)]²,
PSNR = 10 · log10(MAX² / MSE),

where MAX is the maximum possible pixel value of the image (255 for an 8-bit grayscale image).

5.3.2. SSIM

SSIM is used to measure the similarity of two images and is often used as a reference index in the field of image processing [32]. Similarly, the larger the SSIM value, the higher the image quality. The SSIM expression is as follows:

SSIM(x, y) = l(x, y) · c(x, y) · s(x, y),
l(x, y) = (2 μ_x μ_y + C1) / (μ_x² + μ_y² + C1),
c(x, y) = (2 σ_x σ_y + C2) / (σ_x² + σ_y² + C2),
s(x, y) = (σ_{xy} + C3) / (σ_x σ_y + C3),

where μ_x and μ_y are the mean values, σ_x and σ_y are the standard deviations, σ_{xy} is the covariance of the two images, and C1, C2, and C3 are positive real constants. These constants are chosen to avoid instability when the mean values and standard deviations are close to zero.
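The two indicators can be computed, for example, as follows: PSNR follows the MSE-based definition above with MAX = 255, while SSIM is delegated to scikit-image so that the windowed statistics and stability constants are handled consistently; the use of scikit-image is an assumed tooling choice.

import numpy as np
from skimage.metrics import structural_similarity

def psnr(original, processed, max_val=255.0):
    """PSNR = 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((original.astype(float) - processed.astype(float)) ** 2)
    return 10.0 * np.log10(max_val**2 / mse)

def ssim(original, processed, max_val=255.0):
    """Windowed SSIM between two grayscale images with the given dynamic range."""
    return structural_similarity(original, processed, data_range=max_val)

# Usage: score a denoised image against the clean reference.
# print(psnr(clean, denoised), ssim(clean, denoised))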

Tables 3, 4, and 5 record the results of processing the Lena images corrupted by the three different noises, obtained with the traditional wavelets, ASOMA, Modified SOMA, HBSOMA, OSOMA, and SOMA for NP = 10, 30, and 50, respectively. The PSNR and SSIM of the corresponding denoised images are calculated, and the average of 25 independent runs is taken as the result.

The experimental results show that, compared with the traditional wavelet denoising methods, ASOMA, Modified SOMA, HBSOMA, OSOMA, and SOMA show significant advantages in PSNR and SSIM. Moreover, ASOMA has a better denoising effect than Modified SOMA, HBSOMA, OSOMA, and SOMA, which proves the effectiveness of ASOMA.

ASOMA archives the step size information of successfully migrated individuals and uses this archive to generate more adaptive step size parameters, which effectively improves the exploitation ability of SOMA. At the same time, the parameter p is set to control the selection of the leader, which reduces the chance of the algorithm falling into local optima and effectively improves the exploration ability of SOMA. In this way, ASOMA balances the exploration and exploitation of SOMA from two aspects, makes full use of the migration history of the population, and effectively improves the convergence ability of SOMA. The above experimental results confirm the effectiveness of ASOMA.

6. Conclusions

In this paper, an adaptive self-organizing migration algorithm (ASOMA) with an adaptive step size and a new leader selection method is proposed, which further improves the convergence ability of SOMA. ASOMA effectively balances the exploration and exploitation of SOMA, which is verified by experiments on wavelet denoising. The experimental results show that ASOMA is more competitive in denoising performance than the traditional wavelet denoising methods, SOMA, and the SOMA variants, which proves the effectiveness of the algorithm. Further improvement of the performance of ASOMA will be studied in the future. Moreover, swarm intelligence is only one branch of metaheuristic algorithms, and the exploration of metaheuristic algorithms that are more suitable for denoising problems will also be carried out in subsequent research [40–43].

Data Availability

All data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest regarding the publication of this paper.

Authors’ Contributions

Haowen Jia and Zijian Cao conceptualized the study; Zijian Cao took part in methodology; Zhenyu Wang investigated the study; Zhenyu Wang and Chen Liu prepared the original draft; Zijian Cao reviewed and edited the manuscript; and Zijian Cao, Tao Zhao, and Yanfang Fu acquired funding. All authors have read and agreed to the published version of the manuscript. Particularly, Tao Zhao worked on polishing the whole paper.

Acknowledgments

This research was partially funded by the Shaanxi Natural Science Basic Research Project (Grant no. 2020JM-565) and the President’s Fund Project of Xi’an University of Technology (project cultivation fund of the National Natural Science Foundation of China) (Grant no. XGPY200207).