Sub-sampled Newton methods

For large-scale finite-sum minimization problems, we study non-asymptotic and high-probability global as well as local convergence properties of variants of Newton’s method where the Hessian and/or gradients are randomly sub-sampled. For Hessian sub-sampling, using random matrix concentration inequalities, one can sub-sample in a way that second-order information, i.e., curvature, is suitably preserved. For gradient sub-sampling, approximate matrix multiplication results from randomized numerical linear algebra provide a way to construct the sub-sampled gradient which contains as much of the first-order information as possible. While sample sizes all depend on problem-specific constants, e.g., the condition number, we demonstrate that local convergence rates are problem-independent.
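The scheme described in the abstract can be sketched in a few lines: at each iteration, draw index sets S_g and S_H, average the per-sample gradients and Hessians over them, and take a Newton step with the sub-sampled quantities. The least-squares objective, the sample sizes, and the small ridge term below are illustrative assumptions for a runnable sketch, not the paper's experimental setup.

```python
import numpy as np

# Sketch of sub-sampled Newton for a finite-sum problem
# f(x) = (1/n) * sum_i f_i(x), with f_i(x) = 0.5 * (a_i^T x - b_i)^2.
rng = np.random.default_rng(0)
n, d = 1000, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
for _ in range(10):
    # Sub-sampled gradient: average of per-sample gradients a_i (a_i^T x - b_i)
    # over a uniformly drawn index set S_g.
    Sg = rng.choice(n, size=200, replace=False)
    g = A[Sg].T @ (A[Sg] @ x - b[Sg]) / len(Sg)

    # Sub-sampled Hessian: average of per-sample Hessians a_i a_i^T over S_H,
    # drawn independently of S_g.
    SH = rng.choice(n, size=100, replace=False)
    H = A[SH].T @ A[SH] / len(SH)

    # Newton step with the sub-sampled curvature (tiny ridge for stability).
    x -= np.linalg.solve(H + 1e-8 * np.eye(d), g)
```

Because |S_H| and |S_g| are fixed rather than growing with n, each iteration costs a fraction of a full Newton step, which is the regime the paper's concentration arguments are designed for.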

Journal Title: Mathematical programming 2018-11-16, Vol.174 (1-2), p.293-326
Main Author: Roosta-Khorasani, Farbod
Other Authors: Mahoney, Michael W
Format: Electronic Article
Language: English
Subjects: Algebra; Analysis; Asymptotic methods; Calculus of Variations and Optimal Control; Optimization; Combinatorics; Concentration gradient; Convergence; Curvature; Full Length Paper; Linear algebra; Mathematical analysis; Mathematical and Computational Physics; Mathematical Methods in Physics; Mathematics; Mathematics and Statistics; Mathematics of Computing; Matrix methods; Methods; Newton methods; Numerical Analysis; Sampling methods; Theoretical
Publisher: Berlin/Heidelberg: Springer Berlin Heidelberg
ISSN: 0025-5610
EISSN: 1436-4646
DOI: 10.1007/s10107-018-1346-5
Rights: Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2018