
E-book: Mathematical Foundations of Infinite-Dimensional Statistical Models

Evarist Giné, Richard Nickl (University of Cambridge)
  • Format: PDF + DRM
  • Price: 52,34 €*
  • * This is the final price, i.e. no additional discounts apply
  • Add to cart
  • Add to wish list
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Use:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means that you need to install free software in order to unlock and read it. To read this e-book you need to create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by one user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (this is a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

In nonparametric and high-dimensional statistical models, the classical Gauss-Fisher-Le Cam theory of the optimality of maximum likelihood estimators and Bayesian posterior inference does not apply, and new foundations and ideas have been developed in the past several decades. This book gives a coherent account of the statistical theory in infinite-dimensional parameter spaces. The mathematical foundations include self-contained 'mini-courses' on the theory of Gaussian and empirical processes, approximation and wavelet theory, and the basic theory of function spaces. The theory of statistical inference in such models - hypothesis testing, estimation and confidence sets - is presented within the minimax paradigm of decision theory. This includes the basic theory of convolution kernel and projection estimation, but also Bayesian nonparametrics and nonparametric maximum likelihood estimation. In a final chapter the theory of adaptive inference in nonparametric models is developed, including Lepski's method, wavelet thresholding, and adaptive inference for self-similar functions. Winner of the 2017 PROSE Award for Mathematics.
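
To give a concrete flavour of the estimators whose theory the book develops, the short Python sketch below (an illustration only, not taken from the text; the bandwidth value is a hand-picked assumption) implements a convolution kernel density estimator with a Gaussian kernel. The minimax and adaptive behaviour of such procedures, including data-driven bandwidth choices such as Lepski's method, is what the later chapters analyse.

    import numpy as np

    def kernel_density_estimate(data, grid, h):
        # Convolution kernel density estimator with a standard Gaussian kernel:
        # f_n(x) = (1 / (n * h)) * sum_i K((x - X_i) / h).
        data = np.asarray(data)[:, None]                 # observations, shape (n, 1)
        grid = np.asarray(grid)[None, :]                 # evaluation points, shape (1, m)
        u = (grid - data) / h                            # scaled pairwise differences
        K = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)   # Gaussian kernel values
        return K.mean(axis=0) / h                        # average over the n observations

    # Usage: estimate a standard normal density from n = 500 simulated samples.
    # The bandwidth h = 0.3 is an arbitrary illustrative choice; adaptive,
    # data-driven choices are the subject of the book's final chapter.
    rng = np.random.default_rng(0)
    sample = rng.standard_normal(500)
    grid = np.linspace(-3.0, 3.0, 61)
    f_hat = kernel_density_estimate(sample, grid, h=0.3)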

Additional information

Now in paperback: the new classic on the theory of statistical inference in statistical models with an infinite-dimensional parameter space.
Preface xi
1 Nonparametric Statistical Models 1(14)
1.1 Statistical Sampling Models 2(2)
1.1.1 Nonparametric Models for Probability Measures 2(1)
1.1.2 Indirect Observations 3(1)
1.2 Gaussian Models 4(9)
1.2.1 Basic Ideas of Regression 4(2)
1.2.2 Some Nonparametric Gaussian Models 6(2)
1.2.3 Equivalence of Statistical Experiments 8(5)
1.3 Notes 13(2)
2 Gaussian Processes 15(94)
2.1 Definitions, Separability, 0-1 Law, Concentration 15(11)
2.1.1 Stochastic Processes: Preliminaries and Definitions 15(4)
2.1.2 Gaussian Processes: Introduction and First Properties 19(7)
2.2 Isoperimetric Inequalities with Applications to Concentration 26(10)
2.2.1 The Isoperimetric Inequality on the Sphere 26(4)
2.2.2 The Gaussian Isoperimetric Inequality for the Standard Gaussian Measure on R^N 30(2)
2.2.3 Application to Gaussian Concentration 32(4)
2.3 The Metric Entropy Bound for Suprema of Sub-Gaussian Processes 36(12)
2.4 Anderson's Lemma, Comparison and Sudakov's Lower Bound 48(12)
2.4.1 Anderson's Lemma 48(4)
2.4.2 Slepian's Lemma and Sudakov's Minorisation 52(8)
2.5 The Log-Sobolev Inequality and Further Concentration 60(6)
2.5.1 Some Properties of Entropy: Variational Definition and Tensorisation 60(2)
2.5.2 A First Instance of the Herbst (or Entropy) Method: Concentration of the Norm of a Gaussian Variable about Its Expectation 62(4)
2.6 Reproducing Kernel Hilbert Spaces 66(22)
2.6.1 Definition and Basic Properties 66(6)
2.6.2 Some Applications of RKHS: Isoperimetric Inequality, Equivalence and Singularity, Small Ball Estimates 72(7)
2.6.3 An Example: RKHS and Lower Bounds for Small Ball Probabilities of Integrated Brownian Motion 79(9)
2.7 Asymptotics for Extremes of Stationary Gaussian Processes 88(14)
2.8 Notes 102(7)
3 Empirical Processes 109(182)
3.1 Definitions, Overview and Some Background Inequalities 109(26)
3.1.1 Definitions and Overview 109(4)
3.1.2 Exponential and Maximal Inequalities for Sums of Independent Centred and Bounded Real Random Variables 113(8)
3.1.3 The Lévy and Hoffmann-Jørgensen Inequalities 121(6)
3.1.4 Symmetrisation, Randomisation, Contraction 127(8)
3.2 Rademacher Processes 135(14)
3.2.1 A Comparison Principle for Rademacher Processes 136(3)
3.2.2 Convex Distance Concentration and Rademacher Processes 139(5)
3.2.3 A Lower Bound for the Expected Supremum of a Rademacher Process 144(5)
3.3 The Entropy Method and Talagrand's Inequality 149(22)
3.3.1 The Subadditivity Property of the Empirical Process 149(4)
3.3.2 Differential Inequalities and Bounds for Laplace Transforms of Subadditive Functions and Centred Empirical Processes, λ ≥ 0 153(5)
3.3.3 Differential Inequalities and Bounds for Laplace Transforms of Centred Empirical Processes, λ < 0 158(3)
3.3.4 The Entropy Method for Random Variables with Bounded Differences and for Self-Bounding Random Variables 161(4)
3.3.5 The Upper Tail in Talagrand's Inequality for Nonidentically Distributed Random Variables* 165(6)
3.4 First Applications of Talagrand's Inequality 171(13)
3.4.1 Moment Inequalities 171(2)
3.4.2 Data-Driven Inequalities: Rademacher Complexities 173(2)
3.4.3 A Bernstein-Type Inequality for Canonical U-statistics of Order 2 175(9)
3.5 Metric Entropy Bounds for Suprema of Empirical Processes 184(28)
3.5.1 Random Entropy Bounds via Randomisation 184(11)
3.5.2 Bracketing I: An Expectation Bound 195(11)
3.5.3 Bracketing II: An Exponential Bound for Empirical Processes over Not Necessarily Bounded Classes of Functions 206(6)
3.6 Vapnik-Cervonenkis Classes of Sets and Functions 212(16)
3.6.1 Vapnik-Cervonenkis Classes of Sets 212(5)
3.6.2 VC Subgraph Classes of Functions 217(5)
3.6.3 VC Hull and VC Major Classes of Functions 222(6)
3.7 Limit Theorems for Empirical Processes 228(58)
3.7.1 Some Measurability 229(4)
3.7.2 Uniform Laws of Large Numbers (Glivenko-Cantelli Theorems) 233(9)
3.7.3 Convergence in Law of Bounded Processes 242(8)
3.7.4 Central Limit Theorems for Empirical Processes I: Definition and Some Properties of Donsker Classes of Functions 250(7)
3.7.5 Central Limit Theorems for Empirical Processes II: Metric and Bracketing Entropy Sufficient Conditions for the Donsker Property 257(4)
3.7.6 Central Limit Theorems for Empirical Processes III: Limit Theorems Uniform in P and Limit Theorems for P-Pre-Gaussian Classes 261(25)
3.8 Notes 286(5)
4 Function Spaces and Approximation Theory 291(98)
4.1 Definitions and Basic Approximation Theory 291(14)
4.1.1 Notation and Preliminaries 291(4)
4.1.2 Approximate Identities 295(6)
4.1.3 Approximation in Sobolev Spaces by General Integral Operators 301(3)
4.1.4 Littlewood-Paley Decomposition 304(1)
4.2 Orthonormal Wavelet Bases 305(22)
4.2.1 Multiresolution Analysis of L2 305(7)
4.2.2 Approximation with Periodic Kernels 312(4)
4.2.3 Construction of Scaling Functions 316(11)
4.3 Besov Spaces 327(52)
4.3.1 Definitions and Characterisations 327(11)
4.3.2 Basic Theory of the Spaces B^s_pq 338(9)
4.3.3 Relationships to Classical Function Spaces 347(5)
4.3.4 Periodic Besov Spaces on [0, 1] 352(9)
4.3.5 Boundary-Corrected Wavelet Bases* 361(5)
4.3.6 Besov Spaces on Subsets of R^d 366(6)
4.3.7 Metric Entropy Estimates 372(7)
4.4 Gaussian and Empirical Processes in Besov Spaces 379(7)
4.4.1 Random Gaussian Wavelet Series in Besov Spaces 379(2)
4.4.2 Donsker Properties of Balls in Besov Spaces 381(5)
4.5 Notes 386(3)
5 Linear Nonparametric Estimators 389(78)
5.1 Kernel and Projection-Type Estimators 389(32)
5.1.1 Moment Bounds 391(14)
5.1.2 Exponential Inequalities, Higher Moments and Almost-Sure Limit Theorems 405(6)
5.1.3 A Distributional Limit Theorem for Uniform Deviations* 411(10)
5.2 Weak and Multiscale Metrics 421(18)
5.2.1 Smoothed Empirical Processes 421(13)
5.2.2 Multiscale Spaces 434(5)
5.3 Some Further Topics 439(23)
5.3.1 Estimation of Functionals 439(12)
5.3.2 Deconvolution 451(11)
5.4 Notes 462(5)
6 The Minimax Paradigm 467(74)
6.1 Likelihoods and Information 467(9)
6.1.1 Infinite-Dimensional Gaussian Likelihoods 468(5)
6.1.2 Basic Information Theory 473(3)
6.2 Testing Nonparametric Hypotheses 476(35)
6.2.1 Construction of Tests for Simple Hypotheses 478(7)
6.2.2 Minimax Testing of Uniformity on [0, 1] 485(7)
6.2.3 Minimax Signal-Detection Problems in Gaussian White Noise 492(2)
6.2.4 Composite Testing Problems 494(17)
6.3 Nonparametric Estimation 511(11)
6.3.1 Minimax Lower Bounds via Multiple Hypothesis Testing 512(3)
6.3.2 Function Estimation in L∞ Loss 515(3)
6.3.3 Function Estimation in Lp-Loss 518(4)
6.4 Nonparametric Confidence Sets 522(15)
6.4.1 Honest Minimax Confidence Sets 523(1)
6.4.2 Confidence Sets for Nonparametric Estimators 524(13)
6.5 Notes 537(4)
7 Likelihood-Based Procedures 541(66)
7.1 Nonparametric Testing in Hellinger Distance 542(4)
7.2 Nonparametric Maximum Likelihood Estimators 546(24)
7.2.1 Rates of Convergence in Hellinger Distance 547(4)
7.2.2 The Information Geometry of the Likelihood Function 551(3)
7.2.3 The Maximum Likelihood Estimator over a Sobolev Ball 554(9)
7.2.4 The Maximum Likelihood Estimator of a Monotone Density 563(7)
7.3 Nonparametric Bayes Procedures 570(33)
7.3.1 General Contraction Results for Posterior Distributions 573(5)
7.3.2 Contraction Results with Gaussian Priors 578(4)
7.3.3 Product Priors in Gaussian Regression 582(9)
7.3.4 Nonparametric Bernstein-von Mises Theorems 591(12)
7.4 Notes 603(4)
8 Adaptive Inference 607(60)
8.1 Adaptive Multiple-Testing Problems 607(7)
8.1.1 Adaptive Testing with L2-Alternatives 608(4)
8.1.2 Adaptive Plug-in Tests for L∞-Alternatives 612(2)
8.2 Adaptive Estimation 614(14)
8.2.1 Adaptive Estimation in L2 614(6)
8.2.2 Adaptive Estimation in L∞ 620(8)
8.3 Adaptive Confidence Sets 628(36)
8.3.1 Confidence Sets in Two-Class Adaptation Problems 629(9)
8.3.2 Confidence Sets for Adaptive Estimators I 638(6)
8.3.3 Confidence Sets for Adaptive Estimators II: Self-Similar Functions 644(13)
8.3.4 Some Theory for Self-Similar Functions 657(7)
8.4 Notes 664(3)
References 667(16)
Author Index 683(4)
Index 687
Evarist Giné (1944-2015) was Head of the Department of Mathematics at the University of Connecticut. Giné was a distinguished mathematician who worked on mathematical statistics and probability in infinite dimensions. He was the author of two books and more than 100 articles. Richard Nickl is Professor of Mathematical Statistics in the Statistical Laboratory within the Department of Pure Mathematics and Mathematical Statistics at the University of Cambridge.