RISS (Academic Research Information Service)


      Convex analysis and minimization algorithms


      https://www.riss.kr/link?id=M264972

      • Author

        Hiriart-Urruty, Jean-Baptiste; Lemaréchal, Claude

      • Publication

        Berlin ; New York : Springer-Verlag, c1993

      • Year of Publication

        1993

      • Language

        English

      • Subject Headings
      • DDC

        515/.88 (20th ed.)

      • ISBN

        3540568506 (Berlin : v. 1)
        0387568506 (New York : v. 1)
        3540568522 (Berlin : v. 2)
        0387568522 (New York : v. 2)

      • Material Type

        Monograph (multi-volume)

      • Country (City) of Publication

        Germany

      • Title / Statement of Responsibility

        Convex analysis and minimization algorithms / Jean-Baptiste Hiriart-Urruty, Claude Lemaréchal.

      • Physical Description

        2 v. : ill. ; 25 cm.

      • Series

        Grundlehren der mathematischen Wissenschaften ; 305-306

      • Notes

        Includes bibliographical references and indexes.
        1. Fundamentals -- 2. Advanced theory and bundle methods.

      • Holding Institutions
        • Kyungnam University Central Library
        • Korea University Science Library
        • National Library of Korea
        • Changwon National University Library
        • Dongseo University Minseok Library
        • Sogang University Library
        • Seoul National University of Science and Technology Library
        • Seoul National University Central Library
        • Sungkyunkwan University Samsung Academic Information Center
        • Hankuk University of Foreign Studies Global Campus Library
        • Hankuk University of Foreign Studies Seoul Campus Library
        • Hongik University Sejong Campus Munjeong Library
        • Hongik University Central Library
      Additional Information

      Table of Contents

      • [Volume. 1]----------
      • CONTENTS
      • Introduction = XV
      • Ⅰ. Convex Functions of One Real Variable = 1
      • 1 Basic Definitions and Examples = 1
      • 1.1 First Definitions of a Convex Function = 2
      • 1.2 Inequalities with More Than Two Points = 6
      • 1.3 Modern Definition of Convexity = 8
      • 2 First Properties = 9
      • 2.1 Stability Under Functional Operations = 9
      • 2.2 Limits of Convex Functions = 11
      • 2.3 Behaviour at Infinity = 14
      • 3 Continuity Properties = 16
      • 3.1 Continuity on the Interior of the Domain = 16
      • 3.2 Lower Semi-Continuity: Closed Convex Functions = 17
      • 3.3 Properties of Closed Convex Functions = 19
      • 4 First-Order Differentiation = 20
      • 4.1 One-Sided Differentiability of Convex Functions = 21
      • 4.2 Basic Properties of Subderivatives = 24
      • 4.3 Calculus Rules = 27
      • 5 Second-Order Differentiation = 29
      • 5.1 The Second Derivative of a Convex Function = 30
      • 5.2 One-Sided Second Derivatives = 32
      • 5.3 How to Recognize a Convex Function = 33
      • 6 First Steps into the Theory of Conjugate Functions = 36
      • 6.1 Basic Properties of the Conjugate = 38
      • 6.2 Differentiation of the Conjugate = 40
      • 6.3 Calculus Rules with Conjugacy = 43
      • Ⅱ. Introduction to Optimization Algorithms = 47
      • 1 Generalities = 47
      • 1.1 The Problem = 47
      • 1.2 General Structure of Optimization Schemes = 50
      • 1.3 General Structure of Optimization Algorithms = 52
      • 2 Defining the Direction = 54
      • 2.1 Descent and Steepest-Descent Directions = 54
      • 2.2 First-Order Methods = 56
      • 2.3 Newtonian Methods = 61
      • 2.4 Conjugate-Gradient Methods = 65
      • 3 Line-Searches = 70
      • 3.1 General Structure of a Line-Search = 71
      • 3.2 Designing the Test (0), (R), (L) = 74
      • 3.3 The Wolfe Line-Search = 77
      • 3.4 Updating the Trial Stepsize = 91
      • Ⅲ. Convex Sets = 87
      • 1 Generalities = 87
      • 1.1 Definition and First Examples = 87
      • 1.2 Convexity-Preserving Operations on Sets = 90
      • 1.3 Convex Combinations and Convex Hulls = 94
      • 1.4 Closed Convex Sets and Hulls = 99
      • 2 Convex Sets Attached to a Convex Set = 102
      • 2.1 The Relative Interior = 102
      • 2.2 The Asymptotic Cone = 108
      • 2.3 Extreme Points = 110
      • 2.4 Exposed Faces = 113
      • 3 Projection onto Closed Convex Sets = 116
      • 3.1 The Projection Operator = 116
      • 3.2 Projection onto a Closed Convex Cone = 118
      • 4 Separation and Applications = 121
      • 4.1 Separation Between Convex Sets = 121
      • 4.2 First Consequences of the Separation Properties = 124
      • 4.3 The Lemma of Minkowski-Farkas = 129
      • 5 Conical Approximations of Convex Sets = 132
      • 5.1 Convenient Definitions of Tangent Cones = 133
      • 5.2 The Tangent and Normal Cones to a Convex Set = 136
      • 5.3 Some Properties of Tangent and Normal Cones = 139
      • Ⅳ. Convex Functions of Several Variables = 143
      • 1 Basic Definitions and Examples = 143
      • 1.1 The Definitions of a Convex Function = 143
      • 1.2 Special Convex Functions: Affinity and Closedness = 147
      • 1.3 First Examples = 152
      • 2 Functional Operations Preserving Convexity = 157
      • 2.1 Operations Preserving Closedness = 158
      • 2.2 Dilations and Perspectives of a Function = 160
      • 2.3 Infimal Convolution = 162
      • 2.4 Image of a Function Under a Linear Mapping = 166
      • 2.5 Convex Hull and Closed Convex Hull of a Function = 169
      • 3 Local and Global Behaviour of a Convex Function = 173
      • 3.1 Continuity Properties = 173
      • 3.2 Behaviour at Infinity = 178
      • 4 First- and Second-Order Differentiation = 183
      • 4.1 Differentiable Convex Functions = 183
      • 4.2 Nondifferentiable Convex Functions = 188
      • 4.3 Second-Order Differentiation = 190
      • Ⅴ. Sublinearity and Support Functions = 195
      • 1 Sublinear Functions = 197
      • 1.1 Definitions and First Properties = 197
      • 1.2 Some Examples = 201
      • 1.3 The Convex Cone of All Closed Sublinear Functions = 206
      • 2 The Support Function of a Nonempty Set = 208
      • 2.1 Definitions, Interpretations = 208
      • 2.2 Basic Properties = 211
      • 2.3 Examples = 215
      • 3 The Isomorphism Between Closed Convex Sets and Closed Sublinear Functions = 218
      • 3.1 The Fundamental Correspondence = 218
      • 3.2 Example: Norms and Their Duals, Polarity = 220
      • 3.3 Calculus with Support Functions = 225
      • 3.4 Example: Support Functions of Closed Convex Polyhedra = 234
      • Ⅵ. Subdifferentials of Finite Convex Functions = 237
      • 1 The Subdifferential: Definitions and Interpretations = 238
      • 1.1 First Definition: Directional Derivatives = 238
      • 1.2 Second Definition: Minorization by Affine Functions = 241
      • 1.3 Geometric Constructions and Interpretations = 243
      • 1.4 A Constructive Approach to the Existence of a Subgradient = 247
      • 2 Local Properties of the Subdifferential = 249
      • 2.1 First-Order Developments = 249
      • 2.2 Minimality Conditions = 253
      • 2.3 Mean-Value Theorems = 256
      • 3 First Examples = 258
      • 4 Calculus Rules with Subdifferentials = 261
      • 4.1 Positive Combinations of Functions = 261
      • 4.2 Pre-Composition with an Affine Mapping = 263
      • 4.3 Post-Composition with an Increasing Convex Function of Several Variables = 264
      • 4.4 Supremum of Convex Functions = 266
      • 4.5 Image of a Function Under a Linear Mapping = 272
      • 5 Further Examples = 275
      • 5.1 Largest Eigenvalue of a Symmetric Matrix = 275
      • 5.2 Nested Optimization = 277
      • 5.3 Best Approximation of a Continuous Function on a Compact Interval = 278
      • 6 The Subdifferential as a Multifunction = 279
      • 6.1 Monotonicity Properties of the Subdifferential = 280
      • 6.2 Continuity Properties of the Subdifferential = 282
      • 6.3 Subdifferentials and Limits of Gradients = 284
      • Ⅶ. Constrained Convex Minimization Problems : Minimality Conditions, Elements of Duality Theory = 291
      • 1 Abstract Minimality Conditions = 292
      • 1.1 A Geometric Characterization = 293
      • 1.2 Conceptual Exact Penalty = 298
      • 2 Minimality Conditions Involving Constraints Explicitly = 301
      • 2.1 Expressing the Normal and Tangent Cones in Terms of the Constraint-Functions = 303
      • 2.2 Constraint Qualification Conditions = 307
      • 2.3 The Strong Slater Assumption = 311
      • 2.4 Tackling the Minimization Problem with Its Data Directly = 314
      • 3 Properties and Interpretations of the Multipliers = 317
      • 3.1 Multipliers as a Means to Eliminate Constraints: the Lagrange Function = 317
      • 3.2 Multipliers and Exact Penalty = 320
      • 3.3 Multipliers as Sensitivity Parameters with Respect to Perturbations = 323
      • 4 Minimality Conditions and Saddle-Points = 327
      • 4.1 Saddle-Points: Definitions and First Properties = 327
      • 4.2 Mini-Maximization Problems = 330
      • 4.3 An Existence Result = 333
      • 4.4 Saddle-Points of Lagrange Functions = 336
      • 4.5 A First Step into Duality Theory = 338
      • Ⅷ. Descent Theory for Convex Minimization: The Case of Complete Information = 343
      • 1 Descent Directions and Steepest-Descent Schemes = 343
      • 1.1 Basic Definitions = 343
      • 1.2 Solving the Direction-Finding Problem = 347
      • 1.3 Some Particular Cases = 351
      • 1.4 Conclusion = 355
      • 2 Illustration. The Finite Minimax Problem = 356
      • 2.1 The Steepest-Descent Method for Finite Minimax Problems = 357
      • 2.2 Non-Convergence of the Steepest-Descent Method = 363
      • 2.3 Connection with Nonlinear Programming = 366
      • 3 The Practical Value of Descent Schemes = 371
      • 3.1 Large Minimax Problems = 371
      • 3.2 Infinite Minimax Problems = 373
      • 3.3 Smooth but Stiff Functions = 374
      • 3.4 The Steepest-Descent Trajectory = 377
      • 3.5 Conclusion = 383
      • Appendix: Notations = 385
      • 1 Some Facts About Optimization = 385
      • 2 The Set of Extended Real Numbers = 388
      • 3 Linear and Bilinear Algebra = 390
      • 4 Differentiation in a Euclidean Space = 393
      • 5 Set-Valued Analysis = 396
      • 6 A Bird's Eye View of Measure Theory and Integration = 399
      • Bibliographical Comments = 401
      • References = 407
      • Index = 415
      • [Volume. 2]----------
      • CONTENTS
      • Introduction = XV
      • Ⅸ. Inner Construction of the Subdifferential = 1
      • 1 The Elementary Mechanism = 2
      • 2 Convergence Properties = 9
      • 2.1 Convergence = 9
      • 2.2 Speed of Convergence = 15
      • 3 Putting the Mechanism in Perspective = 24
      • 3.1 Bundling as a Substitute for Steepest Descent = 24
      • 3.2 Bundling as an Emergency Device for Descent Methods = 27
      • 3.3 Bundling as a Separation Algorithm = 29
      • Ⅹ. Conjugacy in Convex Analysis = 35
      • 1 The Convex Conjugate of a Function = 37
      • 1.1 Definition and First Examples = 37
      • 1.2 Interpretations = 40
      • 1.3 First Properties = 42
      • 1.4 Subdifferentials of Extended-Valued Functions = 47
      • 1.5 Convexification and Subdifferentiability = 49
      • 2 Calculus Rules on the Conjugacy Operation = 54
      • 2.1 Image of a Function Under a Linear Mapping = 54
      • 2.2 Pre-Composition with an Affine Mapping = 56
      • 2.3 Sum of Two Functions = 61
      • 2.4 Infima and Suprema = 65
      • 2.5 Post-Composition with an Increasing Convex Function = 69
      • 2.6 A Glimpse of Biconjugate Calculus = 71
      • 3 Various Examples = 72
      • 3.1 The Cramer Transformation = 72
      • 3.2 Some Results on the Euclidean Distance to a Closed Set = 73
      • 3.3 The Conjugate of Convex Partially Quadratic Functions = 75
      • 3.4 Polyhedral Functions = 76
      • 4 Differentiability of a Conjugate Function = 79
      • 4.1 First-Order Differentiability = 79
      • 4.2 Towards Second-Order Differentiability = 82
      • XI. Approximate Subdifferentials of Convex Functions = 91
      • 1 The Approximate Subdifferential = 92
      • 1.1 Definition, First Properties and Examples = 92
      • 1.2 Characterization via the Conjugate Function = 95
      • 1.3 Some Useful Properties = 98
      • 2 The Approximate Directional Derivative = 102
      • 2.1 The Support Function of the Approximate Subdifferential = 102
      • 2.2 Properties of the Approximate Difference Quotient = 106
      • 2.3 Behaviour of f′_ε and T_ε as Functions of ε = 110
      • 3 Calculus Rules on the Approximate Subdifferential = 113
      • 3.1 Sum of Functions = 113
      • 3.2 Pre-Composition with an Affine Mapping = 116
      • 3.3 Image and Marginal Functions = 118
      • 3.4 A Study of the Infimal Convolution = 119
      • 3.5 Maximum of Functions = 123
      • 3.6 Post-Composition with an Increasing Convex Function = 125
      • 4 The Approximate Subdifferential as a Multifunction = 127
      • 4.1 Continuity Properties of the Approximate Subdifferential = 127
      • 4.2 Transportation of Approximate Subgradients = 129
      • XII. Abstract Duality for Practitioners = 137
      • 1 The Problem and the General Approach = 137
      • 1.1 The Rules of the Game = 137
      • 1.2 Examples = 141
      • 2 The Necessary Theory = 147
      • 2.1 Preliminary Results: The Dual Problem = 147
      • 2.2 First Properties of the Dual Problem = 150
      • 2.3 Primal-Dual Optimality Characterizations = 154
      • 2.4 Existence of Dual Solutions = 157
      • 3 Illustrations = 161
      • 3.1 The Minimax Point of View = 161
      • 3.2 Inequality Constraints = 162
      • 3.3 Dualization of Linear Programs = 165
      • 3.4 Dualization of Quadratic Programs = 166
      • 3.5 Steepest-Descent Directions = 168
      • 4 Classical Dual Algorithms = 170
      • 4.1 Subgradient Optimization = 171
      • 4.2 The Cutting-Plane Algorithm = 174
      • 5 Putting the Method in Perspective = 178
      • 5.1 The Primal Function = 178
      • 5.2 Augmented Lagrangians = 181
      • 5.3 The Dualization Scheme in Various Situations = 185
      • 5.4 Fenchel's Duality = 190
      • XIII. Methods of ε-Descent = 195
      • 1 Introduction, Identifying the Approximate Subdifferential = 195
      • 1.1 The Problem and Its Solution = 195
      • 1.2 The Line-Search Function = 199
      • 1.3 The Schematic Algorithm = 203
      • 2 A Direct Implementation: Algorithm of ε-Descent = 206
      • 2.1 Iterating the Line-Search = 206
      • 2.2 Stopping the Line-Search = 209
      • 2.3 The ε-Descent Algorithm and Its Convergence = 212
      • 3 Putting the Algorithm in Perspective = 216
      • 3.1 A Pure Separation Form = 216
      • 3.2 A Totally Static Minimization Algorithm = 219
      • XIV. Dynamic Construction of Approximate Subdifferentials: Dual Form of Bundle Methods = 223
      • 1 Introduction: The Bundle of Information = 223
      • 1.1 Motivation = 223
      • 1.2 Constructing the Bundle of Information = 227
      • 2 Computing the Direction = 233
      • 2.1 The Quadratic Program = 233
      • 2.2 Minimality Conditions = 236
      • 2.3 Directional Derivatives Estimates = 241
      • 2.4 The Role of the Cutting-Plane Function = 244
      • 3 The Implementable Algorithm = 248
      • 3.1 Derivation of the Line-Search = 248
      • 3.2 The Implementable Line-Search and Its Convergence = 250
      • 3.3 Derivation of the Descent Algorithm = 254
      • 3.4 The Implementable Algorithm and Its Convergence = 257
      • 4 Numerical Illustrations = 263
      • 4.1 Typical Behaviour = 263
      • 4.2 The Role of ε = 266
      • 4.3 A Variant with Infinite ε : Conjugate Subgradients = 268
      • 4.4 The Role of the Stopping Criterion = 269
      • 4.5 The Role of Other Parameters = 271
      • 4.6 General Conclusions = 273
      • XV. Acceleration of the Cutting-Plane Algorithm: Primal Forms of Bundle Methods = 275
      • 1 Accelerating the Cutting-Plane Algorithm = 275
      • 1.1 Instability of Cutting Planes = 276
      • 1.2 Stabilizing Devices: Leading Principles = 279
      • 1.3 A Digression: Step-Control Strategies = 283
      • 2 A Variety of Stabilized Algorithms = 285
      • 2.1 The Trust-Region Point of View = 286
      • 2.2 The Penalization Point of View = 289
      • 2.3 The Relaxation Point of View = 292
      • 2.4 A Possible Dual Point of View = 295
      • 2.5 Conclusion = 299
      • 3 A Class of Primal Bundle Algorithms = 301
      • 3.1 The General Method = 301
      • 3.2 Convergence = 307
      • 3.3 Appropriate Stepsize Values = 314
      • 4 Bundle Methods as Regularizations = 317
      • 4.1 Basic Properties of the Moreau-Yosida Regularization = 317
      • 4.2 Minimizing the Moreau-Yosida Regularization = 322
      • 4.3 Computing the Moreau-Yosida Regularization = 326
      • Bibliographical Comments = 331
      • References = 337
      • Index = 345

