
E-book: Information Theory and Network Coding

3.77/5 (14 ratings by Goodreads)
  • Format - PDF+DRM
  • Price: 65.42 €*
  • * This is the final price, i.e., no additional discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned, and no refunds are given for purchased e-books.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means you need to install free software to unlock and read it. To read this e-book you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which you may already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

Description

This book is an evolution from my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought about a paradigm shift in network communications at large, but also had significant influence on such specific research fields as coding theory, networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner. While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.

Reviews

From the reviews:

"This book could serve as a reference in the general area of information theory and would be of interest to electrical engineers, computer engineers, or computer scientists with an interest in information theory. Each chapter has an appropriate problem set at the end and a brief paragraph that provides insight into the historical significance of the material covered therein. Summing Up: Recommended. Upper-division undergraduate through professional collections." (J. Beidler, Choice, Vol. 46 (9), May, 2009)



"The book consisting of 21 chapters is divided into two parts. Part I, Components of Information Theory . Part II Fundamentals of Network Coding . A comprehensive instructors manual is available. This is a well planned comprehensive book on the subject. The writing style of the author is quite reader friendly. it is a welcome addition to the subject and will be very useful to students as well as to the researchers in the field." (Arjun K. Gupta, Zentralblatt MATH, Vol. 1154, 2009)

Table of Contents

1 The Science of Information
Part I Components of Information Theory
2 Information Measures
2.1 Independence and Markov Chains
2.2 Shannon's Information Measures
2.3 Continuity of Shannon's Information Measures for Fixed Finite Alphabets
2.4 Chain Rules
2.5 Informational Divergence
2.6 The Basic Inequalities
2.7 Some Useful Information Inequalities
2.8 Fano's Inequality
2.9 Maximum Entropy Distributions
2.10 Entropy Rate of a Stationary Source
Appendix 2.A: Approximation of Random Variables with Countably Infinite Alphabets by Truncation
Chapter Summary
Problems
Historical Notes
3 The I-Measure
3.1 Preliminaries
3.2 The I-Measure for Two Random Variables
3.3 Construction of the I-Measure μ*
3.4 μ* Can Be Negative
3.5 Information Diagrams
3.6 Examples of Applications
Appendix 3.A: A Variation of the Inclusion-Exclusion Formula
Chapter Summary
Problems
Historical Notes
4 Zero-Error Data Compression
4.1 The Entropy Bound
4.2 Prefix Codes
4.2.1 Definition and Existence
4.2.2 Huffman Codes
4.3 Redundancy of Prefix Codes
Chapter Summary
Problems
Historical Notes
5 Weak Typicality
5.1 The Weak AEP
5.2 The Source Coding Theorem
5.3 Efficient Source Coding
5.4 The Shannon-McMillan-Breiman Theorem
Chapter Summary
Problems
Historical Notes
6 Strong Typicality
6.1 Strong AEP
6.2 Strong Typicality Versus Weak Typicality
6.3 Joint Typicality
6.4 An Interpretation of the Basic Inequalities
Chapter Summary
Problems
Historical Notes
7 Discrete Memoryless Channels
7.1 Definition and Capacity
7.2 The Channel Coding Theorem
7.3 The Converse
7.4 Achievability
7.5 A Discussion
7.6 Feedback Capacity
7.7 Separation of Source and Channel Coding
Chapter Summary
Problems
Historical Notes
8 Rate-Distortion Theory
8.1 Single-Letter Distortion Measures
8.2 The Rate-Distortion Function R(D)
8.3 The Rate-Distortion Theorem
8.4 The Converse
8.5 Achievability of R_I(D)
Chapter Summary
Problems
Historical Notes
9 The Blahut-Arimoto Algorithms
9.1 Alternating Optimization
9.2 The Algorithms
9.2.1 Channel Capacity
9.2.2 The Rate-Distortion Function
9.3 Convergence
9.3.1 A Sufficient Condition
9.3.2 Convergence to the Channel Capacity
Chapter Summary
Problems
Historical Notes
10 Differential Entropy
10.1 Preliminaries
10.2 Definition
10.3 Joint Differential Entropy, Conditional (Differential) Entropy, and Mutual Information
10.4 The AEP for Continuous Random Variables
10.5 Informational Divergence
10.6 Maximum Differential Entropy Distributions
Chapter Summary
Problems
Historical Notes
11 Continuous-Valued Channels
11.1 Discrete-Time Channels
11.2 The Channel Coding Theorem
11.3 Proof of the Channel Coding Theorem
11.3.1 The Converse
11.3.2 Achievability
11.4 Memoryless Gaussian Channels
11.5 Parallel Gaussian Channels
11.6 Correlated Gaussian Channels
11.7 The Bandlimited White Gaussian Channel
11.8 The Bandlimited Colored Gaussian Channel
11.9 Zero-Mean Gaussian Noise Is the Worst Additive Noise
Chapter Summary
Problems
Historical Notes
12 Markov Structures
12.1 Conditional Mutual Independence
12.2 Full Conditional Mutual Independence
12.3 Markov Random Field
12.4 Markov Chain
Chapter Summary
Problems
Historical Notes
13 Information Inequalities
13.1 The Region Γ*n
13.2 Information Expressions in Canonical Form
13.3 A Geometrical Framework
13.3.1 Unconstrained Inequalities
13.3.2 Constrained Inequalities
13.3.3 Constrained Identities
13.4 Equivalence of Constrained Inequalities
13.5 The Implication Problem of Conditional Independence
Chapter Summary
Problems
Historical Notes
14 Shannon-Type Inequalities
14.1 The Elemental Inequalities
14.2 A Linear Programming Approach
14.2.1 Unconstrained Inequalities
14.2.2 Constrained Inequalities and Identities
14.3 A Duality
14.4 Machine Proving - ITIP
14.5 Tackling the Implication Problem
14.6 Minimality of the Elemental Inequalities
Appendix 14.A: The Basic Inequalities and the Polymatroidal Axioms
Chapter Summary
Problems
Historical Notes
15 Beyond Shannon-Type Inequalities
15.1 Characterizations of Γ*2, Γ*3, and Γ*n
15.2 A Non-Shannon-Type Unconstrained Inequality
15.3 A Non-Shannon-Type Constrained Inequality
15.4 Applications
Chapter Summary
Problems
Historical Notes
16 Entropy and Groups
16.1 Group Preliminaries
16.2 Group-Characterizable Entropy Functions
16.3 A Group Characterization of Γ*n
16.4 Information Inequalities and Group Inequalities
Chapter Summary
Problems
Historical Notes
Part II Fundamentals of Network Coding
17 Introduction
17.1 The Butterfly Network
17.2 Wireless and Satellite Communications
17.3 Source Separation
Chapter Summary
Problems
Historical Notes
18 The Max-Flow Bound
18.1 Point-to-Point Communication Networks
18.2 Examples Achieving the Max-Flow Bound
18.3 A Class of Network Codes
18.4 Proof of the Max-Flow Bound
Chapter Summary
Problems
Historical Notes
19 Single-Source Linear Network Coding: Acyclic Networks
19.1 Acyclic Networks
19.2 Linear Network Codes
19.3 Desirable Properties of a Linear Network Code
19.3.1 Transformation of a Linear Network Code
19.3.2 Implementation of a Linear Network Code
19.4 Existence and Construction
19.5 Generic Network Codes
19.6 Static Network Codes
19.7 Random Network Coding: A Case Study
19.7.1 How the System Works
19.7.2 Model and Analysis
Chapter Summary
Problems
Historical Notes
20 Single-Source Linear Network Coding: Cyclic Networks
20.1 Delay-Free Cyclic Networks
20.2 Convolutional Network Codes
20.3 Decoding of Convolutional Network Codes
Chapter Summary
Problems
Historical Notes
21 Multi-source Network Coding
21.1 The Max-Flow Bounds
21.2 Examples of Application
21.2.1 Multilevel Diversity Coding
21.2.2 Satellite Communication Network
21.3 A Network Code for Acyclic Networks
21.4 The Achievable Information Rate Region
21.5 Explicit Inner and Outer Bounds
21.6 The Converse
21.7 Achievability
21.7.1 Random Code Construction
21.7.2 Performance Analysis
Chapter Summary
Problems
Historical Notes
Bibliography
Index