
E-book: Derivative-Free and Blackbox Optimization

  • Format: PDF+DRM
  • Price: 65,42 €*
  • * This is the final price, i.e., no additional discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned, and money paid for purchased e-books is not refunded.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means that you need to install free software in order to unlock and read it. To read this e-book, you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by one user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which may already be on your computer).

    You cannot read this e-book on an Amazon Kindle.

This book is designed as a textbook, suitable for self-learning or for teaching an upper-year university course on derivative-free and blackbox optimization. 

The book is split into five parts and is designed to be modular; any individual part depends only on the material in Part I. Part I discusses what is meant by derivative-free and blackbox optimization and provides background material and early basics, while Part II focuses on popular heuristic methods (Genetic Algorithms and Nelder-Mead). Part III presents direct search methods (Generalized Pattern Search and Mesh Adaptive Direct Search), and Part IV focuses on model-based methods (Simplex Gradient and Trust Region). Part V discusses dealing with constraints, using surrogates, and biobjective optimization.

End-of-chapter exercises are included throughout, along with 15 end-of-chapter projects and over 40 figures. Benchmarking techniques are also presented in the appendix.
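As a taste of the material, Chapter 3 introduces coordinate search, one of the earliest derivative-free methods covered in Part I. The sketch below is a minimal Python illustration of the idea, not the book's own pseudocode; the step-halving rule, tolerances, and test function are illustrative assumptions.

    import numpy as np

    def coordinate_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
        # Poll f at +/- step along each coordinate axis, move to any
        # improving point, and halve the step after a failed iteration.
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iter):
            if step < tol:
                break  # the poll size is small enough to stop
            improved = False
            for i in range(len(x)):
                for sign in (1.0, -1.0):
                    y = x.copy()
                    y[i] += sign * step
                    fy = f(y)
                    if fy < fx:  # accept any improving trial point
                        x, fx = y, fy
                        improved = True
            if not improved:
                step /= 2.0  # refine the poll size and try again
        return x, fx

    # Minimize a smooth quadratic using only function evaluations:
    x_best, f_best = coordinate_search(
        lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2, x0=[0.0, 0.0])

Note that only function values of f are used; no gradient information is required, which is the defining feature of the blackbox setting the book addresses.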

Reviews

It is a wonderful textbook that can be used entirely or partially to support optimization courses. The authors have achieved gloriously their stated goal of providing a clear grasp of the foundational concepts in derivative-free and blackbox optimization. I wish that it will find its way somehow to the desks of engineering design optimization practitioners. (Michael Kokkolaras, Optimization and Engineering, Vol. 20, 2019)

This book targets two audiences: individuals interested in understanding derivative-free optimization (DFO) and blackbox optimization, and practitioners who have to solve real-world problems that cannot be approached by traditional gradient-based methods. The book is written in a clear style with sufficient details, examples and proofs of theoretical results. The authors pay equal attention to careful theoretical development and analysis of the methods, and to practical details of the algorithms. (Olga Brezhneva, Mathematical Reviews, October, 2018)

The authors present a comprehensive textbook that serves as an introduction to blackbox and derivative-free optimization. The book is for sure a necessary position for students of mathematics, IT or engineering who would like to explore the subject of blackbox and derivative-free optimization. Also, researchers in the area of optimization could treat it as an introductory reading. Finally, the book would also be a good choice for practitioners dealing with such kinds of problems. (Marcin Anholcer, zbMATH 1391.90001, 2018)

Table of Contents

Foreword
Preface
Part 1 Introduction and Background Material
Chapter 1 Introduction: Tools and Challenges in Derivative-Free and Blackbox Optimization
1.1 What Are Derivative-Free and Blackbox Optimization?
1.2 Classifications of Optimization Problems
1.3 Example Applications
1.4 Remarks on Blackbox Optimization Problems
Chapter 2 Mathematical Background
2.1 Vectors, Matrices, and Norms
2.2 Functions and Multivariate Calculus
2.3 Descent Directions
2.4 Basic Properties of Sets
2.5 Convexity
2.6 Simplices
Chapter 3 The Beginnings of DFO Algorithms
3.1 Exhaustive Search
3.2 Grid Search
3.3 Coordinate Search
3.4 Selecting a Starting Point
3.5 Convergence Analysis of Coordinate Search
3.6 Algorithmic Variants of Coordinate Search
3.7 Methods in This Book
Some Remarks on DFO
Part 2 Popular Heuristic Methods
Chapter 4 Genetic Algorithms
4.1 Biology Overview and the GA Algorithm
4.2 Fitness and Selection
4.3 Encoding
4.4 Crossover and Mutation
4.5 Convergence and Stopping
Chapter 5 Nelder-Mead
5.1 The NM Algorithm
5.2 The Nelder-Mead Simplex
5.3 Convergence Study
5.4 The McKinnon Example
Further Remarks on Heuristics
Part 3 Direct Search Methods
Chapter 6 Positive Bases and Nonsmooth Optimization
6.1 Positive Bases
6.2 Constructing Positive Bases
6.3 Positive Bases and Descent Directions
6.4 Optimality Conditions for Unconstrained Problems
6.5 Optimality Conditions for Constrained Problems
Chapter 7 Generalized Pattern Search
7.1 The GPS Algorithm
7.2 Opportunistic Strategy, the search Step, and Starting Points Selection
7.3 The Mesh
7.4 Convergence
7.5 Numerical Experiments with the Rheology Problem
Chapter 8 Mesh Adaptive Direct Search
8.1 The MADS Algorithm
8.2 Opportunistic Strategy, the search Step, and Starting Points Selection
8.3 Convergence Analysis of MADS
8.4 Dense Sets of Polling Directions
8.5 Further Experiments with the Rheology Optimization Problem
Further Remarks on Direct Search Methods
Part 4 Model-Based Methods
Chapter 9 Building Linear and Quadratic Models
9.1 Fully Linear Models
9.2 Linear Interpolation
9.3 Error Bounds for Linear Interpolation
9.4 Linear Regression
9.5 Quadratic Models
Chapter 10 Model-Based Descent
10.1 Controllable Accuracy
10.2 The MBD Algorithm
10.3 Flexibility in the MBD Algorithm and Stopping Criteria
10.4 The MBD Algorithm Has Successful Iterations
10.5 Convergence
10.6 Additional Experiments with the Rheology Problem
Chapter 11 Model-Based Trust Region
11.1 The MBTR Algorithm
11.2 Model Checks and Stopping Conditions
11.3 Solving the Trust Region Subproblem
11.4 Convergence
11.5 More Experiments with the Rheology Optimization Problem
Further Remarks on Model-Based Methods
Part 5 Extensions and Refinements
Chapter 12 Variables and Constraints
12.1 Types of Variables
12.2 Types of Constraints
12.3 The Constraint Violation Function
12.4 Relaxable Constraints by the Progressive Barrier
Chapter 13 Optimization Using Surrogates and Models
13.1 Surrogate Problem and Surrogate Functions
13.2 The Surrogate Management Framework
13.3 Final Experiments with the Rheology Optimization Problem
Chapter 14 Biobjective Optimization
14.1 The Pareto Set and Front
14.2 Single-Objective Approaches
14.3 Biobjective Optimization Algorithm
Final Remarks on DFO/BBO
Appendix A Comparing Optimization Methods
A.1 Test Sets
A.2 Data Collection
A.3 Data Analysis
Solutions to Selected Exercises
References
Index
Dr. Charles Audet is a Professor of Mathematics at the École Polytechnique de Montréal. His research interests include the analysis and development of algorithms for blackbox nonsmooth optimization, and structured global optimization. He obtained a Ph.D. degree in applied mathematics from the École Polytechnique de Montréal, and worked as a post-doc at Rice University in Houston, Texas.

Dr. Warren Hare received his Ph.D. in Mathematical Optimization from Simon Fraser University. He completed postdoctoral research at IMPA (Brazil) and McMaster University (Canada) before joining the University of British Columbia (Canada).