
E-book: System Parameter Identification: Information Criteria and Algorithms

By Badong Chen, Yu Zhu, Jinchun Hu, and Jose C. Principe (Department of Electrical and Computer Engineering, University of Florida, Gainesville, FL, USA; Department of Precision Instruments and Mechanology, Tsinghua University, Beijing, China)
  • Format: EPUB+DRM
  • Publication date: 17-Jul-2013
  • Publisher: Elsevier Science Publishing Co Inc
  • Language: eng
  • ISBN-13: 9780124045958
  • Price: 86.85 €*
  • * This is the final price, i.e., no additional discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means that you need to install free software in order to unlock and read it. To read this e-book you must create an Adobe ID. More information is available here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which may already be on your computer).

    You cannot read this e-book on an Amazon Kindle.

Recently, criterion functions based on information theoretic measures (entropy, mutual information, information divergence) have attracted attention and become an emerging area of study in the signal processing and system identification domains. This book presents a systematic framework for system identification and information processing, investigating system identification from an information theory point of view. The book is divided into six chapters, which cover the information needed to understand the theory and application of system parameter identification. The authors' own research provides the basis for the book, but it also incorporates results from the latest international research publications.

  • One of the first books to present system parameter identification with information theoretic criteria so readers can track the latest developments
  • Contains numerous illustrative examples to help the reader grasp basic methods
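The book's minimum error entropy (MEE) criterion replaces the mean square error with the entropy of the identification error, which in practice is estimated from error samples by kernel density estimation (Section 4.2.2 of the book). The following is only an illustrative sketch of that idea, not code from the book; the function names and the kernel bandwidth are our own assumptions:

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    # Empirical quadratic information potential:
    # V(e) = (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(e_i - e_j),
    # where G_s is a zero-mean Gaussian kernel with bandwidth s.
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]          # all pairwise error differences
    s = sigma * np.sqrt(2.0)                # convolved kernel bandwidth
    kernel = np.exp(-diff**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return kernel.mean()

def renyi_quadratic_entropy(errors, sigma=1.0):
    # Quadratic Renyi entropy estimate: H2 = -log V(e).
    # Minimizing H2 is equivalent to maximizing the information potential.
    return -np.log(information_potential(errors, sigma))
```

Minimizing this empirical entropy over the model parameters (equivalently, maximizing the information potential) drives the error samples to concentrate, which is the idea behind the identification algorithms developed in Chapter 4.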

Reviews

"almost all of the variables used in the formulas are defined, something I cannot say about many other mathematical books... I found this book timely, interesting, and very well written. Readers can learn about estimation methodologies, the art of proof, and identification of the parameters assumed by the system architect or designer." --ComputingReviews.com, March 5, 2014

"Chen, Zhu, Hu, and Principe synthesize their recent papers into a single-volume reference on system identification under criteria based on the information theory descriptors of entropy and dissimilarity. They cover information measures, information theoretic parameter estimation, system identification under minimum error entropy criteria, system identification under information divergence criteria, and system identification based on mutual information criteria." --Reference & Research Book News, December 2013

Additional information

Introduces the information criteria and related algorithms for system parameter identification.
About the Authors ix
Preface xi
Symbols and Abbreviations xiii
1 Introduction 1(12)
1.1 Elements of System Identification 1(2)
1.2 Traditional Identification Criteria 3(1)
1.3 Information Theoretic Criteria 4(4)
1.3.1 MEE Criteria 6(1)
1.3.2 Minimum Information Divergence Criteria 7(1)
1.3.3 Mutual Information-Based Criteria 7(1)
1.4 Organization of This Book 8(1)
Appendix A: Unifying Framework of ITL 9(4)
2 Information Measures 13(16)
2.1 Entropy 13(6)
2.2 Mutual Information 19(2)
2.3 Information Divergence 21(2)
2.4 Fisher Information 23(1)
2.5 Information Rate 24(2)
Appendix B: α-Stable Distribution 26(1)
Appendix C: Proof of (2.17) 26(1)
Appendix D: Proof of Cramer-Rao Inequality 27(2)
3 Information Theoretic Parameter Estimation 29(32)
3.1 Traditional Methods for Parameter Estimation 29(5)
3.1.1 Classical Estimation 29(2)
3.1.2 Bayes Estimation 31(3)
3.2 Information Theoretic Approaches to Classical Estimation 34(6)
3.2.1 Entropy Matching Method 34(1)
3.2.2 Maximum Entropy Method 35(2)
3.2.3 Minimum Divergence Estimation 37(3)
3.3 Information Theoretic Approaches to Bayes Estimation 40(16)
3.3.1 Minimum Error Entropy Estimation 40(11)
3.3.2 MC Estimation 51(5)
3.4 Information Criteria for Model Selection 56(1)
Appendix E: EM Algorithm 57(1)
Appendix F: Minimum MSE Estimation 58(1)
Appendix G: Derivation of AIC Criterion 58(3)
4 System Identification Under Minimum Error Entropy Criteria 61(106)
4.1 Brief Sketch of System Parameter Identification 61(11)
4.1.1 Model Structure 62(3)
4.1.2 Criterion Function 65(1)
4.1.3 Identification Algorithm 65(7)
4.2 MEE Identification Criterion 72(10)
4.2.1 Common Approaches to Entropy Estimation 73(3)
4.2.2 Empirical Error Entropies Based on KDE 76(6)
4.3 Identification Algorithms Under MEE Criterion 82(22)
4.3.1 Nonparametric Information Gradient Algorithms 82(4)
4.3.2 Parametric IG Algorithms 86(5)
4.3.3 Fixed-Point Minimum Error Entropy Algorithm 91(2)
4.3.4 Kernel Minimum Error Entropy Algorithm 93(2)
4.3.5 Simulation Examples 95(9)
4.4 Convergence Analysis 104(18)
4.4.1 Convergence Analysis Based on Approximate Linearization 104(2)
4.4.2 Energy Conservation Relation 106(5)
4.4.3 Mean Square Convergence Analysis Based on Energy Conservation Relation 111(11)
4.5 Optimization of φ-Entropy Criterion 122(7)
4.6 Survival Information Potential Criterion 129(14)
4.6.1 Definition of SIP 129(2)
4.6.2 Properties of the SIP 131(5)
4.6.3 Empirical SIP 136(3)
4.6.4 Application to System Identification 139(4)
4.7 Δ-Entropy Criterion 143(18)
4.7.1 Definition of Δ-Entropy 145(3)
4.7.2 Some Properties of the Δ-Entropy 148(4)
4.7.3 Estimation of Δ-Entropy 152(5)
4.7.4 Application to System Identification 157(4)
4.8 System Identification with MCC 161(3)
Appendix H: Vector Gradient and Matrix Gradient 164(3)
5 System Identification Under Information Divergence Criteria 167(38)
5.1 Parameter Identifiability Under KLID Criterion 167(19)
5.1.1 Definitions and Assumptions 168(1)
5.1.2 Relations with Fisher Information 169(4)
5.1.3 Gaussian Process Case 173(3)
5.1.4 Markov Process Case 176(4)
5.1.5 Asymptotic KLID-Identifiability 180(6)
5.2 Minimum Information Divergence Identification with Reference PDF 186(19)
5.2.1 Some Properties 188(8)
5.2.2 Identification Algorithm 196(2)
5.2.3 Simulation Examples 198(3)
5.2.4 Adaptive Infinite Impulse Response Filter with Euclidean Distance Criterion 201(4)
6 System Identification Based on Mutual Information Criteria 205(34)
6.1 System Identification Under the MinMI Criterion 205(11)
6.1.1 Properties of MinMI Criterion 207(4)
6.1.2 Relationship with Independent Component Analysis 211(1)
6.1.3 ICA-Based Stochastic Gradient Identification Algorithm 212(2)
6.1.4 Numerical Simulation Example 214(2)
6.2 System Identification Under the MaxMI Criterion 216(23)
6.2.1 Properties of the MaxMI Criterion 217(5)
6.2.2 Stochastic Mutual Information Gradient Identification Algorithm 222(5)
6.2.3 Double-Criterion Identification Method 227(11)
Appendix I: MinMI Rate Criterion 238(1)
References 239
Badong Chen received the B.S. and M.S. degrees in control theory and engineering from Chongqing University, in 1997 and 2003, respectively, and the Ph.D. degree in computer science and technology from Tsinghua University in 2008. He was a Post-Doctoral Researcher with Tsinghua University from 2008 to 2010, and a Post-Doctoral Associate at the University of Florida Computational NeuroEngineering Laboratory (CNEL) from October 2010 to September 2012. He is currently a professor at the Institute of Artificial Intelligence and Robotics (IAIR), Xi'an Jiaotong University. His research interests are in system identification and control, information theory, machine learning, and their applications in cognition and neuroscience.

Yu Zhu received the B.S. degree in radio electronics in 1983 from Beijing Normal University, the M.S. degree in computer applications in 1993, and the Ph.D. degree in mechanical design and theory in 2001, the latter two from China University of Mining and Technology. He is currently a professor with the Department of Mechanical Engineering, Tsinghua University. His research field mainly covers IC manufacturing equipment development strategy, ultra-precision air/maglev stage machinery design theory and technology, ultra-precision measurement theory and technology, and precision motion control theory and technology. Prof. Zhu has more than 140 research papers and 100 invention patents (48 awarded).

Jinchun Hu, associate professor, born in 1972, graduated from Nanjing University of Science & Technology. He received the B.Eng. and Ph.D. degrees in control science and engineering in 1994 and 1998, respectively. He now works at the Department of Mechanical Engineering, Tsinghua University. His current research interests include modern control theory and control systems, ultra-precision measurement principles and methods, micro/nano motion control system analysis and realization, special driver technology and devices for precision motion systems, and super-precision measurement & control.

Jose C. Principe is a Distinguished Professor of Electrical and Computer Engineering and Biomedical Engineering at the University of Florida, where he teaches advanced signal processing, machine learning, and artificial neural network (ANN) modeling. He is the BellSouth Professor and Founding Director of the University of Florida Computational NeuroEngineering Laboratory (CNEL). His primary research interests are in advanced signal processing with information theoretic criteria (entropy and mutual information) and adaptive models in reproducing kernel Hilbert spaces (RKHS), and the application of these advanced algorithms to brain-machine interfaces (BMI). Dr. Principe is a Fellow of the IEEE and AIMBE. He is a past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering, past Chair of the Technical Committee on Neural Networks of the IEEE Signal Processing Society, and a past President of the International Neural Network Society. He received the IEEE EMBS Career Award and the IEEE Neural Network Pioneer Award. He has more than 600 publications and 30 patents (awarded or filed).