BOOKPRICE.co.kr: a book price comparison site

Accelerated Optimization for Machine Learning: First-Order Algorithms (Hardcover, 2020)

Huan Li, Zhouchen Lin, Cong Fang (authors)
Springer
List price: 345,050 KRW

New books

  Store        Sale price    Discount  Shipping  Benefits     Effective lowest price
  (not shown)  282,940 KRW   -18%      0 KRW     14,150 KRW   268,790 KRW

Note: the search results may include other books.

Book information

· Title: Accelerated Optimization for Machine Learning: First-Order Algorithms (Hardcover, 2020)
· Category: Foreign books > Computers > Artificial Intelligence (AI)
· ISBN: 9789811529092
· Pages: 275
· Publication date: 2020-05-30

Table of contents

CHAPTER 1 Introduction


CHAPTER 2 Accelerated Algorithms for Unconstrained Convex Optimization

1. Preliminaries

2. Accelerated Gradient Method for Smooth Optimization

3. Extension to the Composite Optimization

3.1. Nesterov's First Scheme

3.2. Nesterov's Second Scheme

3.2.1. A Primal Dual Perspective

3.3. Nesterov's Third Scheme

4. Inexact Proximal and Gradient Computing

4.1. Inexact Accelerated Gradient Descent

4.2. Inexact Accelerated Proximal Point Method

5. Restart

6. Smoothing for Nonsmooth Optimization

7. Higher Order Accelerated Method

8. Explanation: A Variational Perspective

8.1. Discretization

 

CHAPTER 3 Accelerated Algorithms for Constrained Convex Optimization

1. Preliminaries

1.1. Case Study: Linear Equality Constraint

2. Accelerated Penalty Method

2.1. Non-strongly Convex Objectives

2.2. Strongly Convex Objectives

3. Accelerated Lagrange Multiplier Method

3.1. Recovering the Primal Solution

3.2. Accelerated Augmented Lagrange Multiplier Method

4. Accelerated Alternating Direction Method of Multipliers

4.1. Non-strongly Convex and Non-smooth

4.2. Strongly Convex and Non-smooth

4.3. Non-strongly Convex and Smooth

4.4. Strongly Convex and Smooth

4.5. Non-ergodic Convergence Rate

4.5.1. Original ADMM

4.5.2. ADMM with Extrapolation and Increasing Penalty Parameter

5. Accelerated Primal Dual Method

5.1. Case 1

5.2. Case 2

5.3. Case 3

5.4. Case 4

 

CHAPTER 4 Accelerated Algorithms for Nonconvex Optimization

1. Proximal Gradient with Momentum

1.1. Basic Assumptions

1.2. Convergence Theorem

1.3. Another Method: Monotone APG

2. AGD Achieves the Critical Points Quickly

2.1. AGD as a Convexity Monitor

2.2. Negative Curvature

2.3. Accelerating Nonconvex Optimization

3. AGD Escapes the Saddle Points Quickly

3.1. Almost Convex

3.2. Negative Curvature Descent

3.3. AGD for Non-Convex Problem

3.3.1. Locally Almost Convex ⇒ Globally Almost Convex

3.3.2. Outer Iterations

3.3.3. Inner Iterations

 

CHAPTER 5 Accelerated Stochastic Algorithms

1. The Individual Convexity Case

1.1. Accelerated Stochastic Coordinate Descent

1.2. Background for Variance Reduction Methods

1.3. Accelerated Stochastic Variance Reduction Method

1.4. Black-Box Acceleration

2. The Individual Non-convexity Case

2.1. Individual Non-convex but Integrally Convex

3. The Non-Convexity Case

3.1. SPIDER

3.2. Momentum Acceleration

4. Constrained Problem

5. The Infinite Case

 

CHAPTER 6 Parallel Algorithms

1. Accelerated Asynchronous Algorithms

1.1. Asynchronous Accelerated Gradient Descent

1.2. Asynchronous Accelerated Stochastic Coordinate Descent

2. Accelerated Distributed Algorithms

2.1. Centralized Topology

2.1.1. Large Mini-batch Algorithms

2.1.2. Dual Communication-Efficient Methods

2.2. Decentralized Topology

 

CHAPTER 7 Conclusions

 

APPENDIX Mathematical Preliminaries
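
The book's recurring building block is the accelerated (momentum) gradient method of Chapter 2, Section 2. As a minimal sketch of that idea, not taken from the book itself, the following applies Nesterov's accelerated gradient method to an illustrative smooth convex quadratic; the matrix `A`, vector `b`, step size `1/L`, and iteration count are all assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative problem: minimize f(x) = 0.5 * x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of the gradient
x = np.zeros(2)                  # current iterate x_k
y = x.copy()                     # extrapolated point y_k
t = 1.0                          # momentum sequence t_k

for _ in range(100):
    x_next = y - grad(y) / L                      # gradient step at y_k
    t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2     # t_{k+1}
    y = x_next + (t - 1) / t_next * (x_next - x)  # extrapolation (momentum)
    x, t = x_next, t_next

print(x)  # should approach the minimizer solve(A, b)
```

The extrapolation step with the `t_k` sequence is what improves the worst-case rate on smooth convex problems from the O(1/k) of plain gradient descent to O(1/k²); the book's chapters then extend this template to composite, constrained, nonconvex, stochastic, and parallel settings.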

Book DB provided by: Aladin (www.aladin.co.kr)