Abstract
This paper presents a nonlinear projection neural network for solving interval quadratic programs subject to box-set constraints in engineering applications. Based on the saddle point theorem, the equilibrium point of the proposed neural network is proved to be equivalent to the optimal solution of the interval quadratic optimization problem. By employing the Lyapunov function approach, the global exponential stability of the proposed neural network is analyzed. Two illustrative examples are provided to show the feasibility and efficiency of the proposed method.
1. Introduction
In many engineering applications, including regression analysis, image and signal processing, parameter estimation, filter design, and robust control [1], it is necessary to solve the following quadratic programming problem:

$\min f(x)=\frac{1}{2}x^{T}Qx+c^{T}x$, subject to $x\in\Omega$,     (1.1)

where $Q\in\mathbb{R}^{n\times n}$, $c\in\mathbb{R}^{n}$, and $\Omega\subseteq\mathbb{R}^{n}$ is a convex set. When $Q$ is a positive definite matrix, problem (1.1) is called a convex quadratic program. When $Q$ is a positive semidefinite matrix, problem (1.1) is called a degenerate convex quadratic program. In general, the matrix $Q$ is not precisely known but can only be enclosed in an interval, that is, $Q\in[\underline{Q},\overline{Q}]$. Such a quadratic program with interval data is usually called an interval quadratic program. In recent years, a number of projection neural network approaches have been proposed for solving problem (1.1); see, for example, [2–15] and the references therein. In [2], Kennedy and Chua presented a primal network for solving the convex quadratic program. This network contains a finite penalty parameter, so it converges only to an approximate solution. To avoid the penalty parameter, Xia [3, 4] proposed several primal projection neural networks for solving the convex quadratic program and its dual, and analyzed the global asymptotic stability of the proposed neural networks when the constraint set is a box set. In [5, 6], Xia et al. presented a recurrent projection neural network for solving the convex quadratic program and a related piecewise linear equation, and gave some conditions for exponential convergence. In [7, 8], Yang and Cao presented a delayed projection neural network for solving problem (1.1), and analyzed the global asymptotic stability and exponential stability of the proposed neural networks when the constraint set is an unbounded box set. In order to solve the degenerate convex quadratic program, Tao et al.
[9] and Xue and Bian [10, 11] proposed two projection neural networks, and proved that the equilibrium point of the proposed neural networks is equivalent to the Karush-Kuhn-Tucker (KKT) point of the quadratic programming problem. In particular, the neural network proposed in [10] was shown to have complete convergence and finite-time convergence, and the nonsingular part of the output trajectory converges at an exponential rate. In [12, 13], Hu and Wang designed a general projection neural network for solving monotone linear variational inequalities and extended linear-quadratic programming problems, and proved that the proposed network is exponentially convergent when the constraint set is a polyhedral set.
In order to solve the interval quadratic program, Ding and Huang [14] presented a new class of interval projection neural networks, and proved that the equilibrium point of these neural networks is equivalent to the KKT point of a class of interval quadratic programs. Furthermore, some sufficient conditions ensuring the existence and global exponential stability of the unique equilibrium point of interval projection neural networks were given. To the best of the authors' knowledge, the work in [14] was the first to solve the interval quadratic program by a projection neural network. However, the interval quadratic program discussed in [14] is an unconstrained quadratic program, which limits its use in practice, since constrained quadratic programs are far more common.
Motivated by the above discussion, in the present paper a new projection neural network for solving the interval quadratic programming problem with box-set constraints is presented. Based on the saddle point theorem, the equilibrium point of the proposed neural network is proved to be equivalent to the KKT point of the interval quadratic program. By using the fixed point theorem, the existence and uniqueness of an equilibrium point of the proposed neural network are analyzed. By constructing a suitable Lyapunov function, a sufficient condition ensuring the existence and global exponential stability of the unique equilibrium point of the interval projection neural network is obtained.
This paper is organized as follows. Section 2 describes the system model and gives some necessary preliminaries; Section 3 gives the proof of the existence of equilibrium point of the proposed neural network, and discusses the global exponential stability of the proposed neural network; Section 4 provides two numerical examples to demonstrate the validity of the obtained results. Some conclusions are drawn in Section 5.
2. A Projection Neural Network Model
Consider the following interval quadratic programming problem:

$\min f(x)=\frac{1}{2}x^{T}Qx+c^{T}x$, subject to $x\in\Omega=\{x\in\mathbb{R}^{n}: l\le x\le u\}$,     (2.1)

where $Q\in Q_{I}=[\underline{Q},\overline{Q}]$, $c\in\mathbb{R}^{n}$, and $l,u\in\mathbb{R}^{n}$ with $l<u$. For matrices (or vectors) $A$ and $B$, $A\le B$ means $a_{ij}\le b_{ij}$ for all $i,j$. The Lagrangian function of problem (2.1) is

$L(x,y,z)=\frac{1}{2}x^{T}Qx+c^{T}x+y^{T}(l-x)+z^{T}(x-u)$,     (2.2)

where $y\ge 0$ and $z\ge 0$ are referred to as the Lagrange multipliers. Based on the well-known saddle point theorem [1], $x^{*}$ is an optimal solution of (2.1) if and only if there exist $y^{*}\ge 0$ and $z^{*}\ge 0$ such that $(x^{*},y^{*},z^{*})$ is a saddle point of $L$, that is,

$L(x^{*},y,z)\le L(x^{*},y^{*},z^{*})\le L(x,y^{*},z^{*})$ for all $x\in\mathbb{R}^{n}$, $y\ge 0$, $z\ge 0$.     (2.3)

By the first inequality in (2.3), $(y-y^{*})^{T}(l-x^{*})+(z-z^{*})^{T}(x^{*}-u)\le 0$ for all $y,z\ge 0$; hence $l\le x^{*}\le u$, $(y^{*})^{T}(l-x^{*})=0$, and $(z^{*})^{T}(x^{*}-u)=0$. By the second inequality in (2.3), $x^{*}$ minimizes $L(x,y^{*},z^{*})$ over $x\in\mathbb{R}^{n}$, so

$Qx^{*}+c-y^{*}+z^{*}=0$.     (2.4)

Condition (2.4), together with the complementarity conditions above, states precisely that, for any $\alpha>0$, each component of $x^{*}-\alpha(Qx^{*}+c)$ is clipped back to $[l_{i},u_{i}]$ at $x_{i}^{*}$. By using the projection formulation [16], these conditions can therefore be equivalently represented as the projection equation

$x^{*}=P_{\Omega}\bigl(x^{*}-\alpha(Qx^{*}+c)\bigr)$,     (2.10)

where $P_{\Omega}$ is the projection onto $\Omega$, defined componentwise for $x\in\mathbb{R}^{n}$ by $(P_{\Omega}(x))_{i}=\min\{\max\{x_{i},l_{i}\},u_{i}\}$, $i=1,\dots,n$. Thus, $x^{*}$ is an optimal solution of (2.1) if and only if $x^{*}$ satisfies (2.10). By the above discussion, we can obtain the following proposition.
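As a concrete illustration of the projection formulation above, the box projection can be sketched in a few lines of NumPy. This is a sketch under the assumption that the constraint set is the box $\{x : l\le x\le u\}$; the function name is illustrative, not from the paper.

```python
import numpy as np

def project_box(x, l, u):
    """Componentwise projection P_Omega onto the box {x : l <= x <= u}:
    (P_Omega(x))_i = min(max(x_i, l_i), u_i)."""
    return np.minimum(np.maximum(x, l), u)
```

Each component is clipped independently of the others, which is what makes this projection cheap enough to realize as a simple piecewise linear activation in a neural network.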
Proposition 2.1. Let $x^{*}$ be a solution of the projection equation (2.10); then $x^{*}$ is an optimal solution of problem (2.1).
In the following, we propose a neural network, which is said to be the interval projection neural network, for solving (2.1) and (2.10), whose dynamical equation is defined as follows:

$\dot{x}(t)=\Lambda\bigl\{-x(t)+P_{\Omega}\bigl(x(t)-\alpha(Qx(t)+c)\bigr)\bigr\}$,     (2.11)

where $\Lambda=\mathrm{diag}(\lambda_{1},\dots,\lambda_{n})$ is a positive definite diagonal matrix. The neural network (2.11) can be equivalently written componentwise as $\dot{x}_{i}(t)=\lambda_{i}\bigl\{-x_{i}(t)+P_{[l_{i},u_{i}]}\bigl(x_{i}(t)-\alpha\bigl(\sum_{j=1}^{n}q_{ij}x_{j}(t)+c_{i}\bigr)\bigr)\bigr\}$, $i=1,\dots,n$. Figure 1 shows the architecture of the neural network (2.11).
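The network dynamics can be probed numerically by forward-Euler integration. The sketch below assumes a scalar gain (i.e. $\Lambda=\lambda I$) and the projection-network form $\dot{x}=\lambda\{-x+P_{\Omega}(x-\alpha(Qx+c))\}$; all names, gains, and step sizes are illustrative choices, not values from the paper.

```python
import numpy as np

def project_box(x, l, u):
    """Projection onto the box {x : l <= x <= u}."""
    return np.minimum(np.maximum(x, l), u)

def simulate_network(Q, c, l, u, x0, lam=1.0, alpha=0.1, dt=0.01, steps=20000):
    """Forward-Euler integration of dx/dt = lam*(-x + P_box(x - alpha*(Q x + c)))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * lam * (-x + project_box(x - alpha * (Q @ x + c), l, u))
    return x
```

For instance, with $Q=2I$, $c=(-2,-2)^{T}$ and the box $[0,0.5]^{2}$, the unconstrained minimizer $(1,1)^{T}$ lies outside the box, and the trajectory settles at the constrained optimum $(0.5,0.5)^{T}$.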

Definition 2.2. The point $x^{*}\in\Omega$ is said to be an equilibrium point of the interval projection neural network (2.11) if $x^{*}$ satisfies $x^{*}=P_{\Omega}\bigl(x^{*}-\alpha(Qx^{*}+c)\bigr)$.
By Proposition 2.1 and Definition 2.2, we have the following theorem.
Theorem 2.3. The point $x^{*}$ is an equilibrium point of the interval projection neural network (2.11) if and only if it is an optimal solution of the interval quadratic program (2.1).
Definition 2.4. The equilibrium point $x^{*}$ of the neural network (2.11) is said to be globally exponentially stable if the trajectory $x(t)$ of the neural network (2.11) with the initial value $x(t_{0})=x_{0}$ satisfies $\|x(t)-x^{*}\|_{1}\le M e^{-\eta(t-t_{0})}$ for all $t\ge t_{0}$, where $\eta>0$ is a constant independent of the initial value and $M>0$ is a constant dependent on the initial value $x_{0}$. $\|x\|_{1}$ denotes the 1-norm of $x$, that is, $\|x\|_{1}=\sum_{i=1}^{n}|x_{i}|$.
Lemma 2.5 (see [17]). Let $\Omega\subseteq\mathbb{R}^{n}$ be a closed convex set. Then $\|P_{\Omega}(u)-P_{\Omega}(v)\|\le\|u-v\|$ for all $u,v\in\mathbb{R}^{n}$, where $P_{\Omega}$ is the projection onto $\Omega$, given by $P_{\Omega}(u)=\arg\min_{x\in\Omega}\|u-x\|$. For the box set $\Omega=\{x: l\le x\le u\}$, the projection acts componentwise, so the nonexpansiveness also holds in the 1-norm.
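The nonexpansiveness in Lemma 2.5 is easy to probe numerically for a box set. The following sketch (illustrative names and random test points, not from the paper) checks that componentwise clipping is nonexpansive in both the Euclidean norm and the 1-norm used later in the stability analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
l, u = -np.ones(4), np.ones(4)
project = lambda v: np.minimum(np.maximum(v, l), u)

# |clip(a) - clip(b)| <= |a - b| holds componentwise, so the box projection
# is nonexpansive in every p-norm, in particular the 1-norm.
for _ in range(1000):
    a, b = 3 * rng.normal(size=4), 3 * rng.normal(size=4)
    assert np.linalg.norm(project(a) - project(b), 1) <= np.linalg.norm(a - b, 1) + 1e-12
    assert np.linalg.norm(project(a) - project(b), 2) <= np.linalg.norm(a - b, 2) + 1e-12
```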
3. Stability Analysis
In order to obtain the results in this paper, we make the following assumption for the neural network (2.11):

(A) there exists $\alpha>0$ such that $h:=\max_{1\le j\le n}\bigl(\max\{|1-\alpha\underline{q}_{jj}|,|1-\alpha\overline{q}_{jj}|\}+\alpha\sum_{i\ne j}q_{ij}^{*}\bigr)<1$,

where $q_{ij}^{*}=\max\{|\underline{q}_{ij}|,|\overline{q}_{ij}|\}$. Note that assumption (A) guarantees $\|I-\alpha Q\|_{1}\le h<1$ for every $Q\in Q_{I}$, where $\|\cdot\|_{1}$ also denotes the induced matrix 1-norm (maximum absolute column sum).
Theorem 3.1. If assumption (A) is satisfied, then there exists a unique equilibrium point for the neural network (2.11).
Proof. Let $F(x)=P_{\Omega}\bigl(x-\alpha(Qx+c)\bigr)$. By Definition 2.2, the neural network (2.11) has a unique equilibrium point if and only if $F$ has a unique fixed point in $\Omega$. In the following, by using the fixed point theorem, we prove that $F$ has a unique fixed point in $\Omega$. For any $u,v\in\mathbb{R}^{n}$, by Lemma 2.5 and assumption (A), we can obtain

$\|F(u)-F(v)\|_{1}\le\|(I-\alpha Q)(u-v)\|_{1}\le\|I-\alpha Q\|_{1}\|u-v\|_{1}\le h\|u-v\|_{1}$,     (3.2)

where $h$ is given in assumption (A). By assumption (A), $h<1$. Inequality (3.2) shows that $F$ is a contraction mapping, and hence $F$ has a unique fixed point. This completes the proof.
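The contraction argument can be mirrored numerically: whenever the induced 1-norm of $I-\alpha Q$ is below one, the Picard iteration of the projection mapping converges geometrically to its unique fixed point. A sketch under these assumptions (the matrix data below is illustrative, not from the paper):

```python
import numpy as np

def contraction_ratio(Q, alpha):
    """Induced 1-norm (maximum absolute column sum) of I - alpha*Q;
    a value < 1 certifies that F is a contraction in the 1-norm."""
    return np.abs(np.eye(Q.shape[0]) - alpha * Q).sum(axis=0).max()

def picard_solve(Q, c, l, u, alpha, iters=500):
    """Iterate F(x) = P_box(x - alpha*(Q x + c)) from x = 0 (Banach iteration)."""
    x = np.zeros_like(c)
    for _ in range(iters):
        x = np.minimum(np.maximum(x - alpha * (Q @ x + c), l), u)
    return x
```

For example, with $Q=\begin{pmatrix}2&0.5\\0.5&2\end{pmatrix}$ and $\alpha=0.2$, the contraction ratio is $0.7<1$, and with $c=(-3,-3)^{T}$ and the box $[0,1]^{2}$ the iteration converges to the fixed point $(1,1)^{T}$, where both upper-bound constraints are active.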
Proposition 3.2. If assumption (A) holds, then for any $x_{0}\in\mathbb{R}^{n}$, there exists a solution $x(t)$ with the initial value $x(t_{0})=x_{0}$ for the neural network (2.11).
Proof. Let $G(x)=\Lambda\bigl(-x+F(x)\bigr)$, where $F$ is defined in the proof of Theorem 3.1; then the neural network (2.11) reads $\dot{x}=G(x)$. By (3.2), we have $\|G(u)-G(v)\|_{1}\le\lambda_{\max}(1+h)\|u-v\|_{1}$, where $\lambda_{\max}=\max_{i}\lambda_{i}$. This means that the mapping $G$ is globally Lipschitz. Hence, for any $x_{0}\in\mathbb{R}^{n}$, there exists a solution $x(t)$ with the initial value $x(t_{0})=x_{0}$ for the neural network (2.11). This completes the proof.
Proposition 3.2 shows the existence of the solution for the neural network (2.11).
Theorem 3.3. If assumption (A) is satisfied, then the equilibrium point of the neural network (2.11) is globally exponentially stable.
Proof. By Theorem 3.1, the neural network (2.11) has a unique equilibrium point. We denote the equilibrium point of the neural network (2.11) by $x^{*}$.
Consider the Lyapunov function $V(t)=\sum_{i=1}^{n}\lambda_{i}^{-1}|x_{i}(t)-x_{i}^{*}|$. Calculate the upper right Dini derivative of $V$ along the solution of the neural network (2.11). Writing $F(x)=P_{\Omega}\bigl(x-\alpha(Qx+c)\bigr)$, when $x(t)\ne x^{*}$ we have

$D^{+}V(t)=\sum_{i=1}^{n}\mathrm{sgn}\bigl(x_{i}-x_{i}^{*}\bigr)\bigl(-(x_{i}-x_{i}^{*})+(F(x)-F(x^{*}))_{i}\bigr)\le-\|x-x^{*}\|_{1}+\|F(x)-F(x^{*})\|_{1}$.

Noting $x^{*}=F(x^{*})$, by Lemma 2.5 and assumption (A), we have

$\|F(x)-F(x^{*})\|_{1}\le\|(I-\alpha Q)(x-x^{*})\|_{1}\le h\|x-x^{*}\|_{1}$.

Hence,

$D^{+}V(t)\le-(1-h)\|x-x^{*}\|_{1}\le-\lambda_{\min}(1-h)V(t)$,     (3.6)

where $\lambda_{\min}=\min_{i}\lambda_{i}$ and $h$ is given in assumption (A). By assumption (A), $h<1$; hence $\lambda_{\min}(1-h)>0$. Let $\eta=\lambda_{\min}(1-h)$; then $\eta>0$ and (3.6) can be rewritten as $D^{+}V(t)\le-\eta V(t)$. It follows easily that $\|x(t)-x^{*}\|_{1}\le(\lambda_{\max}/\lambda_{\min})\|x(t_{0})-x^{*}\|_{1}e^{-\eta(t-t_{0})}$, for all $t\ge t_{0}$, where $\lambda_{\max}=\max_{i}\lambda_{i}$. This shows that the equilibrium point of the neural network (2.11) is globally exponentially stable. This completes the proof.
4. Illustrative Examples
Example 4.1. Consider the interval quadratic program (2.1) defined by given interval bounds $\underline{Q}$ and $\overline{Q}$, the vector $c$, and the box bounds $l$ and $u$.
The quadratic program has the same optimal solution under $Q=\underline{Q}$ and $Q=\overline{Q}$. It is easy to check that assumption (A) holds. By Theorems 3.1 and 3.3, the neural network (2.11) has a unique equilibrium point which is globally exponentially stable, and this unique equilibrium point is the optimal solution of the quadratic programming problem.
In the case $Q=\underline{Q}$, Figure 2 reveals that the projection neural network (2.11) with random initial values has a unique equilibrium point which is globally exponentially stable. In the case $Q=\overline{Q}$, Figure 3 reveals that the network has the same unique equilibrium point, which is also globally exponentially stable. These observations accord with the conclusions of Theorems 3.1 and 3.3.
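Since the numerical data of Example 4.1 is not reproduced above, the qualitative claim — the same equilibrium under both interval endpoints $\underline{Q}$ and $\overline{Q}$ — can be illustrated with stand-in data. The matrices and vectors below are purely hypothetical, not the paper's example.

```python
import numpy as np

def project_box(x, l, u):
    return np.minimum(np.maximum(x, l), u)

def equilibrium(Q, c, l, u, alpha=0.1, iters=2000):
    """Solve x = P_box(x - alpha*(Q x + c)) by fixed-point iteration."""
    x = np.zeros_like(c)
    for _ in range(iters):
        x = project_box(x - alpha * (Q @ x + c), l, u)
    return x

# Hypothetical interval bounds: both endpoint problems push the minimizer
# against the same face of the box, so the equilibrium coincides.
Q_lo = np.array([[2.0, 0.0], [0.0, 2.0]])
Q_hi = np.array([[3.0, 0.2], [0.2, 3.0]])
c = np.array([-4.0, -4.0])
l, u = np.zeros(2), np.ones(2)
```

With this data, both endpoint problems have the equilibrium $(1,1)^{T}$, mirroring the behavior reported for Figures 2 and 3; for interval data whose endpoint problems have interior minimizers, the equilibria would in general differ.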


Example 4.2. Consider the interval quadratic program (2.1) defined by another set of given interval bounds $\underline{Q}$ and $\overline{Q}$, vector $c$, and box bounds $l$ and $u$.
The quadratic program has the same optimal solution under $Q=\underline{Q}$ and $Q=\overline{Q}$. It is easy to check that assumption (A) holds. By Theorems 3.1 and 3.3, the neural network (2.11) has a unique equilibrium point which is globally exponentially stable, and this unique equilibrium point is the optimal solution of the quadratic programming problem.
In the case $Q=\underline{Q}$, Figure 4 reveals that the projection neural network (2.11) with random initial values has a unique equilibrium point which is globally exponentially stable. In the case $Q=\overline{Q}$, Figure 5 reveals that the network has the same unique equilibrium point, which is also globally exponentially stable. These observations accord with the conclusions of Theorems 3.1 and 3.3.


5. Conclusion
In this paper, we have developed a new projection neural network for solving interval quadratic programs with box-set constraints. The equilibrium point of the proposed neural network is equivalent to the optimal solution of the interval quadratic program. A condition has been derived which ensures the existence, uniqueness, and global exponential stability of the equilibrium point. The obtained results are valuable, in both theory and practice, for solving interval quadratic programs arising in engineering.
Acknowledgment
This work was supported by the Hebei Province Education Foundation of China (2009157).