Universal gradient methods for convex optimization problems

In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results by encouraging numerical experiments, which demonstrate that the fast rate of convergence, typical for the smooth optimization problems, sometimes can be achieved even on nonsmooth problem instances.

Journal Title: Mathematical Programming, 2014, Vol. 152 (1-2), p. 381-404
Main Author: Nesterov, Yu
Format: Electronic Article
Language: English
Subjects: Accuracy; Analysis; Calculus of Variations and Optimal Control; Optimization; Combinatorics; Complexity theory; Convergence; Convex analysis; Convexity; Full Length Paper; Management science; Mathematical analysis; Mathematical and Computational Physics; Mathematical Methods in Physics; Mathematical models; Mathematical programming; Mathematics; Mathematics and Statistics; Mathematics of Computing; Methods; Minimization; Numerical Analysis; Smoothness; Studies; Theoretical
Publisher: Berlin/Heidelberg: Springer Berlin Heidelberg
ISSN: 0025-5610
EISSN: 1436-4646
DOI: 10.1007/s10107-014-0790-0
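
Note on the method described in the abstract: the only essential input is the target accuracy eps; the (unknown) smoothness level is discovered on the fly by a backtracking search on the "Lipschitz" estimate, with eps entering the acceptance test. Below is a minimal, illustrative Python sketch of that idea for the unconstrained Euclidean case. The doubling/halving rule, the eps/2 slack term, and all constants are assumptions modelled on the standard universal primal gradient scheme, not code taken from the paper itself.

import numpy as np


def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=1000):
    """Sketch of a universal (primal) gradient method with backtracking on L.

    Only the target accuracy `eps` is supplied; no Lipschitz or Hoelder
    constant is needed.  The eps/2 slack in the acceptance test and the
    doubling/halving rule are assumptions based on the standard universal
    gradient scheme.
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(max_iter):
        g = grad(x)
        while True:
            x_new = x - g / L          # gradient step with the current guess for L
            d = x_new - x
            # Accept L once the quadratic upper model holds up to eps/2 slack.
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0                   # model violated: the guess for L was too small
        x = x_new
        L = max(L / 2.0, 1e-12)        # be optimistic again at the next iteration
    return x


# Example: minimize a simple smooth convex quadratic.
if __name__ == "__main__":
    A = np.array([[2.0, 0.0], [0.0, 10.0]])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(universal_gradient_method(f, grad, x0=[5.0, -3.0], eps=1e-8, max_iter=200))

Because the acceptance test only requires the quadratic model to hold up to an eps/2 slack, the same loop can be applied to smooth, weakly smooth, and nonsmooth convex objectives, at the price of resolving the solution no finer than the requested accuracy eps.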