
E-book: Probabilistic Networks and Expert Systems: Exact Computational Methods for Bayesian Networks

  • Format: PDF + DRM
  • Price: 130,27 €*
  • * This is the final price, i.e. no additional discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned, and the money paid for purchased e-books is not refunded.

DRM restrictions

  • Copying (copy/paste): not allowed

  • Printing: not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software in order to unlock and read it. To read this e-book you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.


WINNER OF THE 2001 DEGROOT PRIZE!

Probabilistic expert systems are graphical networks that support the modelling of uncertainty and decisions in large complex domains, while retaining ease of calculation. Building on original research by the authors over a number of years, this book gives a thorough and rigorous mathematical treatment of the underlying ideas, structures, and algorithms, emphasizing those cases in which exact answers are obtainable. It covers both the updating of probabilistic uncertainty in the light of new evidence, and statistical inference, about unknown probabilities or unknown model structure, in the light of new data. The careful attention to detail will make this work an important reference source for all those involved in the theory and applications of probabilistic expert systems. This book was awarded the first DeGroot Prize by the International Society for Bayesian Analysis for a book making an important, timely, thorough, and notably original contribution to the statistics literature.
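The "updating of probabilistic uncertainty in the light of new evidence" mentioned above is, at its core, Bayes' theorem applied locally in the network. A minimal sketch, using a hypothetical two-node Disease → Test network with illustrative numbers (not taken from the book):

```python
# Minimal sketch: exact evidence updating in a two-node Bayesian
# network (Disease -> Test). All numbers are hypothetical.

p_disease = 0.01                  # prior P(D = yes)
p_pos_given_d = {True: 0.95,      # P(T = pos | D = yes), sensitivity
                 False: 0.05}     # P(T = pos | D = no), false-positive rate

def posterior_disease_given_positive():
    """Bayes' theorem: P(D = yes | T = pos) = P(pos | D) P(D) / P(pos)."""
    joint_yes = p_pos_given_d[True] * p_disease          # P(D = yes, T = pos)
    joint_no = p_pos_given_d[False] * (1 - p_disease)    # P(D = no,  T = pos)
    return joint_yes / (joint_yes + joint_no)            # normalize

print(round(posterior_disease_given_positive(), 3))      # prints 0.161
```

Even with a 95% sensitive test, the posterior stays modest because the prior is so small; the book's junction-tree machinery generalizes exactly this computation to networks with many interconnected variables.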

Reviews

From the reviews:

JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION

"This important book fills a void in the graphical Markov models literature. The authors have summarized their extensive and influential work in this area and provided a valuable resource both for educators and for practitioners."

Additional information

Springer Book Archives
Preface  v
Introduction  1(4)
  What is this book about?  1(1)
  What is in this book?  2(1)
  What is not in this book?  3(1)
  How should this book be used?  4(1)
Logic, Uncertainty, and Probability  5(20)
  What is an expert system?  5(1)
  Diagnostic decision trees  6(1)
  Production systems  7(1)
  Coping with uncertainty  8(2)
  The naive probabilistic approach  10(1)
  Interpretations of probability  11(2)
  Axioms  13(1)
  Bayes' theorem  14(3)
  Bayesian reasoning in expert systems  17(4)
  A broader context for probabilistic expert systems  21(4)
Building and Using Probabilistic Networks  25(18)
  Graphical modelling of the domain  26(5)
  Qualitative modelling  27(1)
  Probabilistic modelling  28(1)
  Quantitative modelling  29(1)
  Further background to the elicitation process  29(2)
  From specification to inference engine  31(3)
  Moralization  31(2)
  From moral graph to junction tree  33(1)
  The inference process  34(3)
  The clique-marginal representation  36(1)
  Incorporation of evidence  36(1)
  Bayesian networks as expert systems  37(3)
  Background references and further reading  40(3)
  Structuring the graph  40(1)
  Specifying the probability distribution  40(3)
Graph Theory  43(20)
  Basic concepts  43(6)
  Chordal and decomposable graphs  49(3)
  Junction trees  52(3)
  From chain graph to junction tree  55(6)
  Triangulation  57(2)
  Elimination tree  59(2)
  Background references and further reading  61(2)
Markov Properties on Graphs  63(20)
  Conditional independence  63(3)
  Markov fields over undirected graphs  66(4)
  Markov properties on directed acyclic graphs  70(5)
  Markov properties on chain graphs  75(4)
  Current research directions  79(1)
  Markov equivalence  79(1)
  Other graphical representations  80(1)
  Background references and further reading  80(3)
Discrete Networks  83(42)
  An illustration of local computation  84(1)
  Definitions  85(2)
  Basic operations  86(1)
  Local computation on the junction tree  87(8)
  Graphical specification  87(1)
  Numerical specification and initialization  87(1)
  Charges  88(1)
  Flow of information between adjacent cliques  88(1)
  Active flows  89(1)
  Reaching equilibrium  90(2)
  Scheduling of flows  92(1)
  Two-phase propagation  92(1)
  Entering and propagating evidence  93(2)
  A propagation example  95(1)
  Generalized marginalization operations  95(14)
  Maximization  97(2)
  Degeneracy of the most probable configuration  99(1)
  Simulation  99(2)
  Finding the M most probable configurations  101(2)
  Sampling without replacement  103(1)
  Fast retraction  104(2)
  Moments of functions  106(3)
  Example: Ch-Asia  109(11)
  Description  109(1)
  Graphical specification  109(1)
  Numerical specification  109(3)
  Initialization  112(2)
  Propagation without evidence  114(1)
  Propagation with evidence  114(5)
  Max-propagation  119(1)
  Dealing with large cliques  120(3)
  Truncating small numbers  121(1)
  Splitting cliques  122(1)
  Current research directions and further reading  123(2)
Gaussian and Mixed Discrete-Gaussian Networks  125(30)
  CG distributions  126(1)
  Basic operations on CG potentials  127(4)
  Marked graphs and their junction trees  131(4)
  Decomposition of marked graphs  131(2)
  Junction trees with strong roots  133(2)
  Model specification  135(2)
  Operating in the junction tree  137(6)
  Initializing the junction tree  138(1)
  Charges  138(1)
  Entering evidence  139(1)
  Flow of information between adjacent cliques  139(2)
  Two-phase propagation  141(2)
  A simple Gaussian example  143(1)
  Example: Waste  144(6)
  Structural specification  145(1)
  Numerical specification  146(1)
  Strong triangulation  147(1)
  Forming the junction tree  148(1)
  Initializing the junction tree  148(1)
  Entering evidence  149(1)
  Complexity considerations  150(1)
  Numerical instability problems  151(1)
  Exact marginal densities  152(1)
  Current research directions  152(1)
  Background references and further reading  153(2)
Discrete Multistage Decision Networks  155(34)
  The nature of multistage decision problems  156(1)
  Solving the decision problem  157(2)
  Decision potentials  159(4)
  Network specification and solution  163(9)
  Structural and numerical specification  163(2)
  Causal consistency lemma  165(1)
  Making the elimination tree  166(1)
  Initializing the elimination tree  167(1)
  Message passing in the elimination tree  168(1)
  Proof of elimination tree solution  169(3)
  Example: Oil Wildcatter  172(5)
  Specification  172(3)
  Making the elimination tree  175(1)
  Initializing the elimination tree  176(1)
  Collecting evidence  177(1)
  Example: Dec-Asia  177(6)
  Triangulation issues  183(1)
  Asymmetric problems  184(3)
  Background references and further reading  187(2)
Learning About Probabilities  189(36)
  Statistical modelling and parameter learning  189(1)
  Parametrizing a directed Markov model  190(2)
  Maximum likelihood with complete data  192(1)
  Bayesian updating with complete data  193(7)
  Priors for DAG models  193(4)
  Specifying priors: An example  197(2)
  Updating priors with complete data: An example  199(1)
  Incomplete data  200(2)
  Sequential and batch methods  201(1)
  Maximum likelihood with incomplete data  202(2)
  The EM algorithm  202(2)
  Penalized EM algorithm  204(1)
  Bayesian updating with incomplete data  204(12)
  Exact theory  206(1)
  Retaining global independence  207(2)
  Retaining local independence  209(2)
  Reducing the mixtures  211(2)
  Simulation results: full mixture reduction  213(1)
  Simulation results: partial mixture reduction  214(2)
  Using Gibbs sampling for learning  216(5)
  Hyper Markov laws for undirected models  221(1)
  Current research directions and further reading  222(3)
Checking Models Against Data  225(18)
  Scoring rules  226(3)
  Standardization  227(2)
  Parent-child monitors  229(5)
  Batch monitors  232(1)
  Missing data  233(1)
  Node monitors  234(1)
  Global monitors  235(3)
  Example: Child  236(2)
  Simulation experiments  238(3)
  Further reading  241(2)
Structural Learning  243(22)
  Purposes of modelling  244(1)
  Inference about models  244(1)
  Criteria for comparing models  245(6)
  Maximized likelihood  246(1)
  Predictive assessment  247(1)
  Marginal likelihood  248(1)
  Model probabilities  249(1)
  Model selection and model averaging  250(1)
  Graphical models and conditional independence  251(2)
  Classes of models  253(3)
  Models containing only observed quantities  253(1)
  Models with latent or hidden variables  254(1)
  Missing data  255(1)
  Handling multiple models  256(9)
  Search strategies  256(2)
  Probability specification  258(2)
  Prior information on parameters  260(1)
  Variable precision  261(4)
Epilogue  265(16)
Conjugate Analysis for Discrete Data  267(4)
  Bernoulli process  267(2)
  Multinomial process  269(2)
Gibbs Sampling  271(6)
  Gibbs sampling  271(2)
  Sampling from the moral graph  273(1)
  General probability densities  274(1)
  Further reading  275(2)
Information and Software on the World Wide Web  277(4)
  Information about probabilistic networks  277(2)
  Software for probabilistic networks  279(1)
  Markov chain Monte Carlo methods  280(1)
Bibliography 281(26)
Author Index 307(6)
Subject Index 313