A One-Layer Recurrent Neural Network With a Discontinuous Hard-Limiting Activation Function for Quadratic Programming

Published in: IEEE Transactions on Neural Networks, Vol. 19, No. 4, pp. 558-570
Main authors: Liu, Q.; Wang, J.
Format: Journal Article
Language: English
Published: New York, NY: IEEE (Institute of Electrical and Electronics Engineers), 01.04.2008
ISSN: 1045-9227, 1941-0093
Description
Summary: In this paper, a one-layer recurrent neural network with a discontinuous hard-limiting activation function is proposed for quadratic programming. This neural network is capable of solving a large class of quadratic programming problems. The state variables of the neural network are proven to be globally stable, and the output variables are proven to converge to optimal solutions as long as the objective function is strictly convex on the set defined by the equality constraints. In addition, a sequential quadratic programming approach based on the proposed recurrent neural network is developed for general nonlinear programming. Simulation results on numerical examples and support vector machine (SVM) learning show the effectiveness and performance of the neural network.
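As a rough illustration of the class of problems the paper targets, the sketch below integrates a simple projected gradient-flow dynamic for an equality-constrained quadratic program and reads off its equilibrium as the minimizer. This is not the paper's discontinuous hard-limiting network; the problem data (Q, c, A, b), the Euler step size, and the null-space projection dynamics are assumptions chosen only to illustrate a generic continuous-time neurodynamic approach to quadratic programming.

```python
import numpy as np

# Hypothetical problem: minimize 0.5*x'Qx + c'x  subject to  A x = b,
# with Q strictly convex on the feasible set {x : A x = b}.
Q = np.array([[2.0, 0.0], [0.0, 4.0]])
c = np.array([-4.0, -6.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

# Projection onto the null space of A (assumes A has full row rank),
# so the flow below stays on the constraint set.
P = np.eye(2) - A.T @ np.linalg.solve(A @ A.T, A)

# Feasible starting point (minimum-norm solution of A x = b).
x = np.linalg.lstsq(A, b, rcond=None)[0]

dt = 1e-3
for _ in range(20000):
    # Projected gradient flow: a generic continuous-time "neural" dynamic
    # whose equilibrium is the constrained minimizer (illustrative only).
    x = x + dt * (-P @ (Q @ x + c))

print("approximate minimizer:", x)  # expected near (1/3, 2/3) for this data
```

For this data, the KKT conditions give the exact minimizer (1/3, 2/3), so the printed state should settle near that point; the paper's contribution is a one-layer network with a discontinuous activation and a global stability/convergence proof rather than this elementary flow.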
DOI: 10.1109/TNN.2007.910736