Abstract
This paper proposes a stochastic search algorithm, the improved hypercube optimisation search (HOS+), for finding better solutions to optimisation problems. The algorithm improves on the hypercube optimisation algorithm, which consists of initialization, displacement-shrink and searching area modules. The proposed algorithm adds a new random parameter (RP) module that uses two control parameters to prevent premature convergence and slow finishing and to improve search accuracy considerably. Many optimisation problems can cause a search to get stuck in an interior local optimum. The HOS+ algorithm, which uses the random module, can overcome this problem and find the global optimal solution. A set of experiments was carried out to test the performance of the algorithm. First, the performance of the proposed algorithm is tested using low- and high-dimensional benchmark functions. The simulation results indicated good convergence and much better performance within a small number of iterations. The HOS+ algorithm is then compared with other metaheuristic algorithms using the same benchmark functions on different dimensions. The comparative results indicated the superiority of the HOS+ algorithm in terms of obtaining the best optimal value and accelerating convergence.
1. Introduction
Optimisation involves finding the best solutions in a solution space, that is, the solutions for which the objective function attains its smallest (or largest) value. Real-world optimisation problems are often nonlinear and can have multiple local optima (minima and maxima). The basic aim is to find the best of these local optima. Generally, global optimisation involves finding the best available solution among all feasible solutions in a defined domain, for which the objective function attains its smallest (or largest) value.
Traditional numerical optimisation algorithms that rely on the derivative of the objective function cannot find global optima for functions with multiple local optima. In such cases, one efficient approach is the use of heuristic search algorithms. Metaheuristic search algorithms, which are based on directed random search, can provide sufficiently good solutions, overcome the local-optimum problem, and find global solutions to optimisation problems [1, 2]. A set of metaheuristic optimisation algorithms has been developed to find the best solutions. These algorithms include: bat algorithm (BAT) [3], cuckoo search (CS) [4], ant lion optimizer (ALO) [5, 6], elephant herding optimisation (EHO) [7, 8], moth-flame optimisation (MFO) [9], krill herd (KH) [10], moth search algorithm (MSA) [11], monarch butterfly optimisation (MBO) [12, 13], mussels wandering optimisation (MWO) [14], and whale optimisation algorithm (WOA) [15]. Other metaheuristic optimisation algorithms, such as differential evolution (DE) [16], biogeography-based optimisation (BBO) [17, 18], harmony search (HS) [19], evolution strategies (ES) [20], sine cosine algorithm (SCA) [21], gravitational search algorithm (GSA) [22], monkey algorithm [23, 24], dragonfly algorithm (DA), and hybrid ABC/DA (HAD) [25, 26], are also used efficiently for solving many optimisation problems.
Paper [27] proposed an intelligent swarm-based MBO algorithm inspired by the migration behaviour of monarch butterflies in nature. In this algorithm, the whole population is partitioned into two subpopulations of equal size. Each individual in subpopulation 1 changes its position according to the migration operator, and each individual in subpopulation 2 changes its position according to the butterfly adjusting operator. The algorithm combines exploration and exploitation, has a simple structure and strong robustness, and is designed for global optimisation. Paper [28] proposed the slime mould algorithm, which is based on the oscillation mode of slime mould in nature. The algorithm uses adaptive weights to simulate the positive and negative feedback produced by the propagation wave of slime mould, based on a bio-oscillator, and forms the optimal path for connecting to food with excellent exploratory ability and exploitation capability. In the moth search algorithm [29], the best moth is viewed as a light source. Neighbouring moths that are close to the fittest one tend to fly around their own positions in the form of Levy flights, whereas moths far from the fittest one fly towards the best one in big steps. These two operations, corresponding to exploration and exploitation, are the basis of the MSA. Paper [30] proposed a population-based hunger games search algorithm based on the hunger-driven activities and behavioural choices of animals; the authors applied it, with high optimisation capacity, in areas such as artificial intelligence and machine learning. Paper [31] proposed a colony predation algorithm based on the predation of animals; it utilises a mathematical model of animal hunting and was used for solving engineering problems. Paper [32] proposed a mathematical optimisation model that simulates the hunting behaviour of Harris hawks; inspired by their cooperative behaviour and chasing style, the authors designed the algorithm and evaluated its performance on a number of benchmark examples.
As mentioned, many metaheuristic optimisation algorithms have been designed to find the best solution to global optimisation problems and to increase optimisation accuracy. However, optimisation algorithms can sometimes get stuck in an interior local optimum and cannot escape from that state. Such search algorithms suffer from premature convergence and low search accuracy when solving optimisation problems. This happens due to the loss of diversity among individuals. The original HOS algorithm may suffer from the same problems, which can prevent it from finding a near-optimal solution in the search area. In this paper, we propose a new version of the HOS algorithm. The novelties of this paper are: a novel structure of the HOS+ algorithm is proposed; a new random perturbation module of the HOS+ algorithm is introduced; and the proposed algorithm is tested on benchmark problems. The proposed HOS+ algorithm helps to prevent premature convergence, finds the best solutions, and improves search accuracy within a small number of iterations. The designed HOS+ algorithm passes over possible local optima and converges to the optimum in few iterations.
The remainder of the paper is organised as follows: Section 2 presents the improved HOS+ algorithm, including its design stages and operation modules. Section 3 presents the benchmark functions. Section 4 presents the experimental results and discussion, where a set of benchmark functions of different dimensions is used for testing the proposed HOS+ algorithm. In Section 5, the performance of the HOS+ algorithm is evaluated and compared with the performances of other metaheuristic algorithms on the same test functions of different dimensions. Finally, the conclusions are presented in Section 6.
2. HOS+ Algorithm
The improved HOS+ algorithm is a new stochastic search method inspired by the evolution of a hypercube. The algorithm is a derivative-free unconstrained optimisation method and is based on a set of points randomly distributed inside an m-dimensional hypercube. The proposed algorithm moves the population (the set of points inside the hypercube) so that it reaches the minimum (or maximum) of the objective function rapidly by reducing the area of the hypercube and updating and searching solutions at each iteration. The original HOS algorithm consists of three processes: initialization, displacement-shrink, and searching areas. The proposed algorithm extends the original search processes by adding a random parameter module.
Stochastic processes are mathematical models of systems that change randomly. They are characterised by random variables described by a probability distribution and have applications in fields such as physics, industry, economics, information technology, and computer science. Randomness enters an optimisation problem in two ways: through the cost function or through the set of constraints. At the same time, stochastic optimisation also refers to any optimisation technique that uses randomness in some of its components. We consider the case where the parameters of the objective function or constraints are random. The improved HOS+ algorithm is inspired by a random process and uses the random parameters p1 and p2 in order to alleviate premature convergence and slow finishing and to improve search accuracy considerably. The proposed algorithm is an improvement of the stochastic hypercube optimisation search (HOS) algorithm presented in [1, 2]. The algorithm is presented in Figure 1 and explained in detail in the following steps.

2.1. Step A: Initialization Process
The initialization process starts by generating initial points and forms the initial matrices for evaluating solutions in a given hypercube. The initial points are generated using the following operations:
(1) Initialize the solutions using the dimension of the hypercube, where m is the dimension of the hypercube and n is the population size.
(2) Use the lower bound (LB) and upper bound (UB) to scale the solutions xij.
(3) Find the radii (R) of the hypercube.
(4) Find the centre of the hypercube, Xc.
Formula (1) initialises the solutions X inside the hypercube, which is the search area, where m is the dimension of the hypercube and n is the population size. Each position is evaluated using the objective function. The best point Xbest is determined according to the values of the test (or objective) function F.
In the initialization stage, the initial solutions are generated using initial conditions such as the dimension of the hypercube (m), the radii of the hypercube (R), the lower and upper boundaries (LB, UB), and the number of points (population size, n) inside the hypercube (Figure 2). The lower and upper boundaries are used to generate the hypercube. The basic parameters of the hypercube are the centre Xc and the radii R, given by formulas (3) and (4). In the given search interval, using the generated data points xij (i = 1,…,n; j = 1,…,m) inside the hypercube, the values of the objective function fij are calculated (here fij are elements of F). After each iteration, the points change their positions (movement). The initial points are evaluated according to the objective function, so the initialization process creates the matrices Xbest and Fbest (n × 1) after evaluating the initial points. The determined Xbest point is improved (updated) using local search, such as hill climbing or derivative-based local search. If a derivative-based local search is used, the update is controlled by a step-size parameter p with 0 ≤ p ≤ 1, where F is the objective function. The details of the initialization process in the HOS+ algorithm are shown in Figure 2. In the next iteration, Xbest is used to determine the hypercube centre. This operation is realised by computing the mean of the last centre point (Xc) and the last best point Xbest. This process is called the "displacement" process.
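As a rough illustration of this step, the Python sketch below scatters the population inside the hypercube, computes the centre and radii, and selects Xbest. It assumes that formulas (1)-(4) amount to uniform sampling inside the box and the usual box centre and half-width; the function name and signature are ours, not the paper's.

```python
import numpy as np

def initialize_hypercube(n, m, lb, ub, objective, seed=None):
    """Sketch of the initialization process: scatter n points in an m-dimensional
    hypercube bounded by lb and ub, evaluate them, and return the search state.
    The exact formulas (1)-(4) are given in the paper; uniform sampling and the
    box centre/half-width used here are assumptions."""
    rng = np.random.default_rng(seed)
    lb = np.full(m, lb, dtype=float)
    ub = np.full(m, ub, dtype=float)
    X = lb + rng.random((n, m)) * (ub - lb)   # points x_ij inside the hypercube
    R = (ub - lb) / 2.0                       # assumed radii of the hypercube
    Xc = (lb + ub) / 2.0                      # assumed centre of the hypercube
    F = np.apply_along_axis(objective, 1, X)  # evaluate every point
    best = int(np.argmin(F))
    return X, F, X[best].copy(), float(F[best]), Xc, R

# Example: initialise 50 points for a 30-dimensional sphere problem.
# X, F, x_best, f_best, Xc, R = initialize_hypercube(
#     50, 30, -5.12, 5.12, lambda x: float(np.sum(x ** 2)), seed=0)
```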

2.2. Step B: Displacement–Shrink Process
The displacement-shrink process determines the hypercube's centre and evaluates the test (or objective) function. The centre of the next hypercube is evaluated as the average of the previous hypercube's centre and the current best point (Xbest). Thus the centre of the next (new) hypercube is determined as
Here R and Rnew are the old and new radii and S is the convergence factor calculated in the searching areas process (Section 2.3). The hypercube parameters are updated using (5) and (6). As a result of this process, the hypercube size, and correspondingly the search space, is reduced. This process is called "shrink." The decrease in hypercube size leads to an increase in the density of the search points (population). The contraction is governed by the movement of the best value: the contraction is stronger for smaller movements. This guarantees fast convergence while protecting against getting stuck at an undesired (local) minimum.
As shown, new data points are generated at each iteration and the objective function is evaluated. According to the evaluation results, the hypercube size is changed: the hypercube size is decreased and the search space shrinks correspondingly. The decrease of the hypercube size causes an increase in the density of test points. This process leads to rapidly finding the optimum value of the objective function.
The algorithm passes through a series of points from the current position, which determines the maximal distance. The displacement ranges are computed as follows:
(1) the normalised value of the current point (equation (7));
(2) the normalised value of Xbest (equation (8));
(3) the normalised distance dn (equation (9));
(4) the re-normalised distance dnn (equation (10)).
The displacement of x is calculated and normalized twice at each iteration: first, each element of x is divided by its corresponding initial interval so that the displacement is converted into unit-sided coordinates (equations (7) and (8)), and then this value is normalized again by dividing it by the diagonal of the unit hypercube (equations (9) and (10)). Thus, the contraction of the hypercube becomes stronger as the movement of the points shrinks, which accelerates convergence.
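A minimal sketch of this double normalization is given below, assuming the "diagonal" is that of the m-dimensional unit hypercube (sqrt(m)); the function name and signature are ours.

```python
import numpy as np

def renormalized_distance(x_best_new, x_best_old, lb, ub):
    """Sketch of equations (7)-(10): normalize the displacement of the best point
    per coordinate by the initial interval, then re-normalize by the diagonal of
    the resulting unit hypercube (sqrt(m), an assumption)."""
    x_new = np.asarray(x_best_new, dtype=float)
    x_old = np.asarray(x_best_old, dtype=float)
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    d = (x_new - x_old) / (ub - lb)                     # per-coordinate normalization
    return float(np.linalg.norm(d) / np.sqrt(d.size))   # re-normalized distance dnn
```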
2.3. Step C: Searching Areas Process
In this process, the distances between the new and old optimum values are calculated using equations (7)-(10). In addition, the searching areas process uses the interval defined for the re-normalized distance, given in (11), to control the movements of x.
If the movement of x satisfies this condition, the convergence factor S is computed and updated at each iteration. In this update, dnn is the normalized distance calculated by (10), based on the average of the last two best values of x. Thus, the proposed algorithm ensures that the population moves towards the minimum point rapidly by reducing the area of the hypercube after each iteration. A flowchart of the searching areas process in the proposed algorithm is shown in Figure 3.
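The following sketch illustrates how such a searching-areas update could look. The interval bounds of (11) and the exact mapping from dnn to the convergence factor S are not reproduced in the text, so the numeric bounds and the linear mapping below are placeholders, not the authors' formula.

```python
import numpy as np

def searching_areas_update(R, dnn, d_low=1e-6, d_high=0.5, s_min=0.3, s_max=0.95):
    """Illustrative searching-areas step: if the re-normalized distance dnn lies in
    the allowed interval (equation (11); bounds assumed), derive a convergence
    factor S from dnn and contract the radii. Smaller movements give stronger
    contraction, as described in the text; the mapping itself is a placeholder."""
    if d_low <= dnn <= d_high:
        # Map dnn in [d_low, d_high] to S in [s_min, s_max]: small movement -> small S.
        S = s_min + (s_max - s_min) * (dnn - d_low) / (d_high - d_low)
    else:
        S = s_max  # movement outside the interval: contract only slightly
    R_new = np.asarray(R, dtype=float) * S  # shrink the hypercube (cf. equations (5)-(6))
    return S, R_new
```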

2.4. Step D: Random Parameter (RP) Module
The HOS+ algorithm includes a new RP module characterised by two control parameters, p1 and p2. This module improves the points (current positions) inside the hypercube that might get stuck in local solutions. First, p1 improves the points having the local-optimum problem. The process continues according to a tolerance fixed by tolX; in addition, the upper bound of the dimension d can be determined according to the value of tolX. The value of the solution at the local point is updated by multiplying the parameter p1 by a random scalar drawn from the standard normal distribution, that is, Xnew ← X∗(1 + p1∗randn), where randn draws a random number from the standard normal distribution (mean 0, standard deviation 1). The new point is accepted according to the values of the test function: if the value of the optimisation function at the new point is smaller (or larger, for maximisation) than at the previous one, the new point is included in the solution. By this operation, the random module prevents a point from getting stuck in a local optimum while controlling the positions of the points inside the hypercube. After these operations, a second random parameter is introduced in order to control the directed movements of all points inside the hypercube. The second parameter (p2) improves the solutions along the direction pointing to their current position with different perturbations, moving them away from a possible local minimum and searching for another minimum point. The points are updated by multiplying the parameter p2 by uniformly distributed random numbers (rand [1 × D]), where D is the search dimension. Thus, the positions are improved along the pointing direction with different perturbations in order to escape from a possible local minimum or to search for another minimum point. The pseudocode of the random parameter module is given in Figure 4.
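A hedged Python sketch of the RP module follows. The first perturbation, X∗(1 + p1∗randn), and the greedy acceptance rule follow the description above; the second update is only described as using p2 and rand[1 × D], so the multiplicative form used below is an assumption, and the default values of p1 and p2 are placeholders.

```python
import numpy as np

def random_parameter_module(X, F, objective, p1=0.1, p2=0.05, seed=None):
    """Sketch of the RP module with two control parameters. For each point, a
    scalar perturbation X*(1 + p1*randn) is tried first (as in the text); then a
    coordinate-wise perturbation driven by p2 and uniform numbers rand(1, D) is
    tried (multiplicative form assumed). Candidates are accepted only if they
    improve the objective (minimisation)."""
    rng = np.random.default_rng(seed)
    X = np.array(X, dtype=float)
    F = np.array(F, dtype=float)
    n, D = X.shape
    for i in range(n):
        # First perturbation: scalar factor drawn from the standard normal distribution.
        cand = X[i] * (1.0 + p1 * rng.standard_normal())
        f_cand = objective(cand)
        if f_cand < F[i]:
            X[i], F[i] = cand, f_cand
        # Second perturbation: per-dimension factors drawn from uniform [0, 1).
        cand = X[i] * (1.0 + p2 * rng.random(D))
        f_cand = objective(cand)
        if f_cand < F[i]:
            X[i], F[i] = cand, f_cand
    return X, F
```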

The computational complexity of the HOS+ algorithm was analysed. HOS+ includes initialization, fitness evaluation, displacement-shrink and normalization, searching areas, and random parameter modules. The hypercube dimension m, the population size n, and the maximum number of iterations T are the main parameters affecting the running time of the HOS+ algorithm in these modules. The computational complexity of the initialization is O(n⋅m), of the displacement-shrink and normalization module O(n⋅m), of the searching areas process O(n), and of the random parameter module O(n⋅m). The displacement-shrink and normalization module, the searching areas process, and the random parameter module run in each iteration t; taking into account the maximum number of iterations T, the overall computational complexity of HOS+ is O(n⋅m + n⋅T + n⋅m⋅T).
3. Benchmark Functions
The benchmark functions used are the Sphere function (F1), Schwefel 2.22 function (F2), Rotated Hyper-Ellipsoid function (F3), Ackley function (F4), Griewank function (F5), and Hyper-Ellipsoid function (F6). Details of these test functions are given below. The performance of HOS+ was evaluated using low-, medium- and high-dimensional optimisation functions. In this paper, the low dimension is taken as 30D, the medium dimension as 60D, and the high dimension as 90D. For more information about these benchmark functions, we refer the reader to https://www.sfu.ca/~ssurjano/optimisation.html.
3.1. Sphere Function (F1)
The function is convex, continuous, differentiable, separable and uni-modal. It is evaluated for xi ∈ [−5.12, 5.12] for all i = 1,…,n, and the global minimum is f(x) = 0.
3.2. Schwefel 2.22 Function (F2)
The function is convex, continuous, non-differentiable, separable and uni-modal. It is evaluated for xi ∈ [−10, 10] for all i = 1,…,n, and the global minimum is f(x) = 0.
3.3. Rotated Hyper Ellipsoid Function (F3)
This function is convex, continuous and uni-modal. It is evaluated for xi ∈ [−65, 65] for all i = 1,…,n, and the global minimum is f(x) = 0.
3.4. Ackley Function (F4)
This function is continuous and multi-modal. It is evaluated for xi ∈ [−32, 32] for all i = 1,…,n, and the global minimum is f(x) = 0.
3.5. Griewank Function (F5)
This function is continuous and multi-modal, with many widespread local minima. It is evaluated for xi ∈ [−600, 600] for all i = 1,…,n, and the global minimum is f(x) = 0.
3.6. Hyper Ellipsoid Function (F6)
This function is convex, continuous, differentiable, separable and uni-modal. It is evaluated for xi ∈ [−5.12, 5.12] for all i = 1,…,n, and the global minimum is f(x) = 0.
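For reference, the sketch below gives NumPy implementations of the six benchmark functions, following the standard definitions on the benchmark page cited above; the Ackley constants a = 20, b = 0.2, c = 2π are the usual recommended values.

```python
import numpy as np

def sphere(x):                      # F1, x_i in [-5.12, 5.12], f(0) = 0
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def schwefel_2_22(x):               # F2, x_i in [-10, 10], f(0) = 0
    ax = np.abs(np.asarray(x, dtype=float))
    return float(np.sum(ax) + np.prod(ax))

def rotated_hyper_ellipsoid(x):     # F3, x_i in [-65, 65], f(0) = 0
    x = np.asarray(x, dtype=float)
    return float(np.sum(np.cumsum(x ** 2)))   # sum_i sum_{j<=i} x_j^2

def ackley(x, a=20.0, b=0.2, c=2.0 * np.pi):   # F4, x_i in [-32, 32], f(0) = 0
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(-a * np.exp(-b * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(c * x)) / n) + a + np.e)

def griewank(x):                    # F5, x_i in [-600, 600], f(0) = 0
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

def hyper_ellipsoid(x):             # F6 (axis-parallel), x_i in [-5.12, 5.12], f(0) = 0
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return float(np.sum(i * x ** 2))
```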
4. The Performance of HOS+ Algorithm on Benchmark Functions
The HOS+ algorithm is simulated in Matlab R2017a for finding optimal solutions to a set of benchmark functions. The computer used for the simulations has the following characteristics:
(i) CPU: i5-8250U
(ii) CPU speed: 1.60 GHz–1.80 GHz
(iii) RAM: 8.00 GB
(iv) OS: Windows 10
The HOS+ algorithm has been tested using the above-mentioned benchmark functions on the 30D, 60D, and 90D dimensions. Evaluations are carried out using the same population size of 50, the same number of iterations of 50, and the same maximum number of function evaluations. For all cases, the results are averaged over 100 independent runs of the algorithm. The best value, mean, and standard deviation are taken as performance measures.
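A minimal sketch of this evaluation protocol is given below, assuming `run_optimizer` is a placeholder for one independent run of any of the compared algorithms that returns the best objective value found.

```python
import numpy as np

def summarize_runs(run_optimizer, objective, runs=100, seed=0):
    """Sketch of the reporting protocol: perform `runs` independent runs of a
    (placeholder) optimiser on `objective` and return the best, mean, and
    standard deviation of the final objective values."""
    rng = np.random.default_rng(seed)
    finals = np.array([run_optimizer(objective, rng) for _ in range(runs)])
    return float(finals.min()), float(finals.mean()), float(finals.std())

# Example with a trivial random-search stand-in (not the HOS+ algorithm):
# random_search = lambda f, rng: min(
#     f(rng.uniform(-5.12, 5.12, 30)) for _ in range(2500))
# best, mean, std = summarize_runs(
#     random_search, lambda x: float(np.sum(x ** 2)), runs=10)
```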
First, the performance of the HOS+ algorithm is compared with the original HOS algorithm given in [1, 2]. Using both algorithms, experiments were conducted for all benchmark functions on 30, 60, and 90 dimensions. Table 1 depicts the results obtained for the six optimisation functions F1, F2, F3, F4, F5, and F6 with the HOS and HOS+ algorithms. The simulations were performed under the same initial conditions. The best values and the averaged mean and standard deviation are reported in the table. The convergence plots and running times of the HOS+ algorithm obtained from the simulations on the 90-dimensional optimisation functions F1, F2, F3, F4, F5, and F6 are depicted in Figures 5 to 10, respectively. The experimental results are presented using the convergence plots and the global search ability of the proposed algorithm. For comparison, the convergence plots of the original HOS algorithm are presented in Figure 11. The comparative performance results given in Table 1 and the convergence plots given in Figures 5 to 11 demonstrate the superiority of the HOS+ algorithm over the original HOS algorithm.







5. Comparison of HOS+ with Other Metaheuristic Algorithms
The performance of the HOS+ algorithm was compared with that of other metaheuristic optimisation algorithms using the 6 test functions on different dimensions, namely 30D, 60D, and 90D. The comparisons were carried out under the same initial conditions: all algorithms were simulated using the same number of iterations, the same dimensions, and the same maximum number of function evaluations [26, 33].
The comparative results for each function are presented in Tables 2–5, with the best results marked in bold. Tables 2–4 depict the comparative results obtained for the optimisation functions (F1–F6) on dimensions 30, 60, and 90; 100 independent runs were performed for each optimisation function using the HOS+ algorithm. Table 5 presents the comparative experimental results for the F1, F2, and F4 functions on dimensions 20, 50, and 100; these results are averaged over 30 independent runs of each algorithm.
The initial values of the parameters for the HOS+ algorithm were set as follows: the population size is set to 50 and the number of iterations is set to 50.
First, the proposed algorithm was compared with a selected collection of other metaheuristic algorithms: the DA, ABC, and HAD algorithms were taken for comparison. Table 2 illustrates the best, mean, and standard deviation values obtained from the experiments.
In the second stage, the HOS+ algorithm was compared with the meta-heuristic optimisation algorithms ACO, GA, DE, and PSO. Table 3 depicts the experimental results for each function.
In the third stage, the HOS+ algorithm was compared with the EHO, MSA, and WOA meta-heuristic optimisation algorithms. Table 4 depicts the experimental results obtained for each function.
In the fourth stage, the HOS+ algorithm was compared with the monarch butterfly optimisation algorithm (MBO), MBO with opposition-based learning and random local perturbation (OPMBO), and MBO with greedy strategy and self-adaptive crossover operator (GCMBO) using three benchmark functions. The comparative results are presented in Table 5. The best results are marked in bold.
In the fifth stage, the simulation results of the HOS+ algorithm are compared with those of the sine-cosine algorithm (SCA), m-SCA [34], and the improved crow search algorithm (ICSA) [35]. Table 6 shows the comparative experimental results for the F1, F2, F3, F4, and F5 functions on 30 dimensions.
The comparative experimental results of the ACO, ABC, DA, DE, HAD, GA, PSO, EHO, MSA, WOA, MBO, GCMBO, OPMBO, ICSA, SCA, M-SCA, and HOS+ algorithms showed that the proposed HOS+ algorithm obtained better results and the best convergence, owing to its ability to escape local optima, in the majority of the evaluations. The obtained simulation results indicate the effectiveness of the HOS+ algorithm for optimisation problems.
6. Conclusions
This paper proposes a novel stochastic search algorithm based on the evolution of a hypercube. The design stages of the algorithm were explained. A new random perturbation module is introduced in order to overcome local-optimum problems and to find a global solution. The HOS+ algorithm has been tested using various low- and high-dimensional optimisation functions, and the solution of the stated local-optimum problem has been confirmed by experimental results. The obtained results demonstrated that the algorithm can successfully avoid getting stuck in local optima and find a global solution within a small number of iterations. Comparative performance results, including the best value, mean, standard deviation, and convergence plots, demonstrate the advantages of the proposed HOS+ algorithm over the other metaheuristic algorithms considered. The obtained simulation results indicate the efficiency of the HOS+ algorithm for solving optimisation problems. Future research includes the application of the HOS+ algorithm to practical optimisation problems.
Data Availability
No data were used to support this study.
Conflicts of Interest
The authors declare that they have no conflicts of interest.