
E-book: Entropy and Information Theory

  • Format: PDF+DRM
  • Publication date: 27-Jan-2011
  • Publisher: Springer-Verlag New York Inc.
  • Language: English
  • ISBN-13: 9781441979704
  • Price: 177.85 €*
  • * This is the final price, i.e., no additional discounts are applied.
  • This e-book is intended for personal use only. E-books cannot be returned, and payments for purchased e-books are not refunded.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software in order to unlock and read it. To read this e-book, you will need to create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by one user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android).

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which may already be on your computer).

    You cannot read this e-book on an Amazon Kindle.

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes, the processes formed by stationary coding of memoryless sources
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.
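To make the phrase "stationary coding of memoryless sources" concrete, here is a minimal sketch, not taken from the book: an i.i.d. (memoryless) binary source is passed through a sliding-block code, a fixed function of a finite window that advances one symbol at a time, so the output process is a B-process in the sense above. The function name, window length, and majority-vote rule are arbitrary illustrative choices.

    import random

    def sliding_block_code(source, window=3):
        # A sliding-block (stationary) code: the output at time i is a fixed
        # function of the window source[i : i + window]. Applying such a code
        # to an i.i.d. source produces a B-process.
        out = []
        for i in range(len(source) - window + 1):
            block = source[i:i + window]
            out.append(1 if 2 * sum(block) > window else 0)  # majority vote
        return out

    random.seed(0)
    memoryless = [random.randint(0, 1) for _ in range(20)]  # i.i.d. fair coin flips
    print(sliding_block_code(memoryless))

Unlike a block code, which maps disjoint input blocks independently, the window here slides one symbol at a time; this is why a sliding-block code of a stationary input is again stationary, the relation the first bullet above refers to.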

This fully updated new edition of the classic work on information theory presents a detailed analysis of the Shannon source and channel coding theorems before moving on to address sources, channels, codes, and the properties of information and distortion measures.
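For orientation, these are the standard textbook definitions of the objects the blurb names, stated here for a memoryless source with a single-letter distortion measure d; the book itself develops far more general versions:

    \[
      H(X) = -\sum_{x} p(x)\,\log p(x),
      \qquad
      \bar{H} = \lim_{n \to \infty} \frac{1}{n}\, H(X_0, \dots, X_{n-1}),
    \]
    \[
      D(R) = \inf_{p(\hat{x} \mid x) \,:\, I(X; \hat{X}) \le R} E\, d(X, \hat{X}).
    \]

Here H(X) is the entropy, H-bar the entropy rate of the process, and D(R) the Shannon distortion-rate function: the smallest average distortion achievable when the coding rate (mutual information) is held below R.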

Reviews

From the book reviews:

This book is the second edition of the classic 1990 text and inherits much of the structure and all of the virtues of the original. … this is a deep and important book, which would reward further study as the focus of a reading group or graduate course, and comes enthusiastically recommended. (Oliver Johnson, Mathematical Reviews, October, 2014)

In Entropy and Information Theory Robert Gray offers an excellent text to stimulate research in this field. Entropy and Information Theory is highly recommended as essential reading to academics and researchers in the field, especially to engineers interested in the mathematical aspects and mathematicians interested in the engineering applications. … it will contribute to further synergy between the two fields and the deepening of research efforts. (Ina Fourie, Online Information Review, Vol. 36 (3), 2012)

The book offers interesting and very important information about the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The main goal is a general development of Shannon's mathematical theory of communication for single-user systems. The author manages to balance the practice with the theory; every chapter is very well structured and has high-value content. (Nicolae Constantinescu, Zentralblatt MATH, Vol. 1216, 2011)

Contents (starting page, with extent in parentheses)

Preface vii
Introduction xvii
1 Information Sources 1(20)
1.1 Probability Spaces and Random Variables 1(4)
1.2 Random Processes and Dynamical Systems 5(2)
1.3 Distributions 7(5)
1.4 Standard Alphabets 12(1)
1.5 Expectation 13(3)
1.6 Asymptotic Mean Stationarity 16(1)
1.7 Ergodic Properties 17(4)
2 Pair Processes: Channels, Codes, and Couplings 21(40)
2.1 Pair Processes 21(1)
2.2 Channels 22(3)
2.3 Stationarity Properties of Channels 25(4)
2.4 Extremes: Noiseless and Completely Random Channels 29(1)
2.5 Deterministic Channels and Sequence Coders 30(1)
2.6 Stationary and Sliding-Block Codes 31(6)
2.7 Block Codes 37(1)
2.8 Random Punctuation Sequences 38(4)
2.9 Memoryless Channels 42(1)
2.10 Finite-Memory Channels 42(1)
2.11 Output Mixing Channels 43(2)
2.12 Block Independent Channels 45(1)
2.13 Conditionally Block Independent Channels 46(1)
2.14 Stationarizing Block Independent Channels 46(2)
2.15 Primitive Channels 48(1)
2.16 Additive Noise Channels 49(1)
2.17 Markov Channels 49(1)
2.18 Finite-State Channels and Codes 50(1)
2.19 Cascade Channels 51(1)
2.20 Communication Systems 52(1)
2.21 Couplings 52(1)
2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem 53(8)
3 Entropy 61(36)
3.1 Entropy and Entropy Rate 61(4)
3.2 Divergence Inequality and Relative Entropy 65(4)
3.3 Basic Properties of Entropy 69(9)
3.4 Entropy Rate 78(3)
3.5 Relative Entropy Rate 81(1)
3.6 Conditional Entropy and Mutual Information 82(8)
3.7 Entropy Rate Revisited 90(1)
3.8 Markov Approximations 91(2)
3.9 Relative Entropy Densities 93(4)
4 The Entropy Ergodic Theorem 97(20)
4.1 History 97(3)
4.2 Stationary Ergodic Sources 100(6)
4.3 Stationary Nonergodic Sources 106(4)
4.4 AMS Sources 110(4)
4.5 The Asymptotic Equipartition Property 114(3)
5 Distortion and Approximation 117(30)
5.1 Distortion Measures 117(3)
5.2 Fidelity Criteria 120(1)
5.3 Average Limiting Distortion 121(2)
5.4 Communications Systems Performance 123(1)
5.5 Optimal Performance 124(1)
5.6 Code Approximation 124(5)
5.7 Approximating Random Vectors and Processes 129(3)
5.8 The Monge/Kantorovich/Vasershtein Distance 132(1)
5.9 Variation and Distribution Distance 132(2)
5.10 Coupling Discrete Spaces with the Hamming Distance 134(1)
5.11 Process Distance and Approximation 135(6)
5.12 Source Approximation and Codes 141(1)
5.13 d-bar Continuous Channels 142(5)
6 Distortion and Entropy 147(26)
6.1 The Fano Inequality 147(3)
6.2 Code Approximation and Entropy Rate 150(2)
6.3 Pinsker's and Marton's Inequalities 152(4)
6.4 Entropy and Isomorphism 156(4)
6.5 Almost Lossless Source Coding 160(8)
6.6 Asymptotically Optimal Almost Lossless Codes 168(1)
6.7 Modeling and Simulation 169(4)
7 Relative Entropy 173(46)
7.1 Divergence 173(16)
7.2 Conditional Relative Entropy 189(13)
7.3 Limiting Entropy Densities 202(2)
7.4 Information for General Alphabets 204(12)
7.5 Convergence Results 216(3)
8 Information Rates 219(18)
8.1 Information Rates for Finite Alphabets 219(2)
8.2 Information Rates for General Alphabets 221(4)
8.3 A Mean Ergodic Theorem for Densities 225(2)
8.4 Information Rates of Stationary Processes 227(7)
8.5 The Data Processing Theorem 234(1)
8.6 Memoryless Channels and Sources 235(2)
9 Distortion and Information 237(28)
9.1 The Shannon Distortion-Rate Function 237(2)
9.2 Basic Properties 239(3)
9.3 Process Definitions of the Distortion-Rate Function 242(8)
9.4 The Distortion-Rate Function as a Lower Bound 250(2)
9.5 Evaluating the Rate-Distortion Function 252(13)
10 Relative Entropy Rates 265(16)
10.1 Relative Entropy Densities and Rates 265(3)
10.2 Markov Dominating Measures 268(4)
10.3 Stationary Processes 272(3)
10.4 Mean Ergodic Theorems 275(6)
11 Ergodic Theorems for Densities 281(14)
11.1 Stationary Ergodic Sources 281(5)
11.2 Stationary Nonergodic Sources 286(4)
11.3 AMS Sources 290(3)
11.4 Ergodic Theorems for Information Densities 293(2)
12 Source Coding Theorems 295(40)
12.1 Source Coding and Channel Coding 295(1)
12.2 Block Source Codes for AMS Sources 296(11)
12.3 Block Source Code Mismatch 307(3)
12.4 Block Coding Stationary Sources 310(2)
12.5 Block Coding AMS Ergodic Sources 312(7)
12.6 Subadditive Fidelity Criteria 319(2)
12.7 Asynchronous Block Codes 321(2)
12.8 Sliding-Block Source Codes 323(10)
12.9 A Geometric Interpretation 333(2)
13 Properties of Good Source Codes 335(24)
13.1 Optimal and Asymptotically Optimal Codes 335(2)
13.2 Block Codes 337(6)
13.3 Sliding-Block Codes 343(16)
14 Coding for Noisy Channels 359(36)
14.1 Noisy Channels 359(2)
14.2 Feinstein's Lemma 361(3)
14.3 Feinstein's Theorem 364(3)
14.4 Channel Capacity 367(5)
14.5 Robust Block Codes 372(3)
14.6 Block Coding Theorems for Noisy Channels 375(2)
14.7 Joint Source and Channel Block Codes 377(3)
14.8 Synchronizing Block Channel Codes 380(4)
14.9 Sliding-Block Source and Channel Coding 384(11)
References 395(10)
Index 405
About the author

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute for Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.