The Modified Subgradient Algorithm Based on Feasible Values
Date
2009
Publisher
Taylor & Francis Ltd
Green Open Access
No
Publicly Funded
No
Abstract
In this article, we continue to study the modified subgradient (MSG) algorithm previously suggested by Gasimov for solving sharp augmented Lagrangian dual problems. The most important features of this algorithm are that it guarantees a global optimum for a wide class of non-convex optimization problems, generates a strictly increasing sequence of dual values (a property not shared by other subgradient methods), and guarantees convergence. The main drawbacks of the MSG algorithm, typical of many subgradient algorithms, are that it uses an unconstrained global minimum of the augmented Lagrangian function and requires knowing an approximate upper bound for the optimal value of the initial problem in order to update the stepsize parameters. In this study, we introduce a new algorithm based on so-called feasible values and give convergence theorems. The new algorithm does not require the optimal value to be known initially; it seeks this value iteratively, beginning from an arbitrary number. Moreover, it does not need a global minimum of the augmented Lagrangian to update the stepsize parameters. A collection of test problems is used to demonstrate the performance of the new algorithm.
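For intuition only, the dual-ascent scheme described in the abstract can be sketched on a toy problem. The following is a minimal illustrative sketch of a generic MSG-style update on the sharp augmented Lagrangian, not the authors' exact F-MSG procedure: the target dual value `H_target`, the grid-search inner minimization, and all stepsize constants are assumptions chosen for illustration. (F-MSG's point is precisely that this target is sought iteratively from feasible values rather than supplied in advance.)

```python
import numpy as np

# Toy problem: minimize f(x) = x^2 subject to h(x) = x - 1 = 0
# (optimum: x* = 1, f* = 1).
# Sharp augmented Lagrangian: L(x, u, c) = f(x) + c*|h(x)| - u*h(x).

def sharp_lagrangian(x, u, c):
    h = x - 1.0
    return x**2 + c * abs(h) - u * h

def msg_sketch(H_target=1.0, iters=50):
    """MSG-style dual ascent; H_target stands in for the upper-bound
    estimate that the original MSG algorithm assumes is known."""
    grid = np.linspace(-3.0, 3.0, 60001)   # crude inner minimization by grid search
    u, c = 0.0, 0.0
    for _ in range(iters):
        vals = sharp_lagrangian(grid, u, c)
        k = int(np.argmin(vals))
        x, Lk = grid[k], vals[k]            # dual value at the current multipliers
        h = x - 1.0
        if abs(h) < 1e-8:                   # feasible minimizer: dual value attained
            break
        s = (H_target - Lk) / (5.0 * h**2)  # Polyak-style step toward the target
        u = u - s * h                       # subgradient step in the multiplier
        c = c + 1.5 * s * abs(h)            # penalty grows faster, forcing feasibility
    return x, Lk

x_opt, dual_val = msg_sketch()              # converges near x* = 1, dual value near 1
```

The dual values `Lk` produced this way increase strictly toward the target, mirroring the monotonicity property the abstract attributes to MSG; replacing the fixed `H_target` with an iteratively adjusted trial value is the feasible-values idea behind F-MSG.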
Keywords
non-convex optimization, sharp augmented Lagrangian, modified subgradient algorithm, F-MSG algorithm, global optimization, Optimization, Constraint
Fields of Science
0211 other engineering and technologies, 02 engineering and technology
WoS Q
Q1
Scopus Q
Q2
OpenCitations Citation Count
28
Source
Optimization
Volume
58
Issue
5
Start Page
535
End Page
560
PlumX Metrics
Citations
CrossRef : 17
Scopus : 42
Captures
Mendeley Readers : 4