1 The Science of Information  1 | (6)
Part I Components of Information Theory |
|
|
|
2 Information Measures  7 | (44)
2.1 Independence and Markov Chains  7 | (5)
2.2 Shannon's Information Measures  12 | (6)
2.3 Continuity of Shannon's Information Measures for Fixed Finite Alphabets  18 | (3)
2.4 Chain Rules  21 | (2)
2.5 Informational Divergence  23 | (3)
2.6 The Basic Inequalities  26 | (2)
2.7 Some Useful Information Inequalities  28 | (4)
2.8 Fano's Inequality  32 | (4)
2.9 Maximum Entropy Distributions  36 | (2)
2.10 Entropy Rate of a Stationary Source  38 | (3)
Appendix 2.A: Approximation of Random Variables with Countably Infinite Alphabets by Truncation  41 | (2)
Chapter Summary  43 | (2)
Problems  45 | (5)
Historical Notes  50 | (1)
|
|
3 The I-Measure  51 | (30)
3.1 Preliminaries  52 | (1)
3.2 The I-Measure for Two Random Variables  53 | (2)
3.3 Construction of the I-Measure μ*  55 | (4)
3.4 μ* Can Be Negative  59 | (2)
3.5 Information Diagrams  61 | (6)
3.6 Examples of Applications  67 | (7)
Appendix 3.A: A Variation of the Inclusion-Exclusion Formula  74 | (2)
Chapter Summary  76 | (2)
Problems  78 | (2)
Historical Notes  80 | (1)
|
4 Zero-Error Data Compression  81 | (20)
4.1 The Entropy Bound  81 | (5)
4.2 Prefix Codes  86 | (7)
4.2.1 Definition and Existence  86 | (2)
4.2.2 Huffman Codes  88 | (5)
4.3 Redundancy of Prefix Codes  93 | (4)
Chapter Summary  97 | (1)
Problems  98 | (1)
Historical Notes  99 | (2)
|
|
5 Weak Typicality  101 | (12)
5.1 The Weak AEP  101 | (3)
5.2 The Source Coding Theorem  104 | (2)
5.3 Efficient Source Coding  106 | (1)
5.4 The Shannon-McMillan-Breiman Theorem  107 | (2)
Chapter Summary  109 | (1)
Problems  110 | (2)
Historical Notes  112 | (1)
|
|
6 Strong Typicality  113 | (24)
6.1 Strong AEP  113 | (8)
6.2 Strong Typicality Versus Weak Typicality  121 | (1)
6.3 Joint Typicality  122 | (9)
6.4 An Interpretation of the Basic Inequalities  131 | (1)
Chapter Summary  131 | (1)
Problems  132 | (3)
Historical Notes  135 | (2)
|
7 Discrete Memoryless Channels  137 | (46)
7.1 Definition and Capacity  141 | (8)
7.2 The Channel Coding Theorem  149 | (3)
7.3 The Converse  152 | (5)
7.4 Achievability  157 | (7)
7.5 A Discussion  164 | (3)
7.6 Feedback Capacity  167 | (5)
7.7 Separation of Source and Channel Coding  172 | (3)
Chapter Summary  175 | (1)
Problems  176 | (5)
Historical Notes  181 | (2)
|
|
8 Rate-Distortion Theory  183 | (28)
8.1 Single-Letter Distortion Measures  184 | (3)
8.2 The Rate-Distortion Function R(D)  187 | (5)
8.3 The Rate-Distortion Theorem  192 | (8)
8.4 The Converse  200 | (2)
8.5 Achievability of R_I(D)  202 | (5)
Chapter Summary  207 | (1)
Problems  208 | (1)
Historical Notes  209 | (2)
|
9 The Blahut-Arimoto Algorithms  211 | (18)
9.1 Alternating Optimization  212 | (2)
9.2 The Algorithms  214 | (8)
9.2.1 Channel Capacity  214 | (5)
9.2.2 The Rate-Distortion Function  219 | (3)
9.3 Convergence  222 | (4)
9.3.1 A Sufficient Condition  222 | (3)
9.3.2 Convergence to the Channel Capacity  225 | (1)
Chapter Summary  226 | (1)
Problems  227 | (1)
Historical Notes  228 | (1)
|
|
10 Differential Entropy  229 | (28)
10.1 Preliminaries  231 | (4)
10.2 Definition  235 | (3)
10.3 Joint Differential Entropy, Conditional (Differential) Entropy, and Mutual Information  238 | (7)
10.4 The AEP for Continuous Random Variables  245 | (3)
10.5 Informational Divergence  248 | (1)
10.6 Maximum Differential Entropy Distributions  249 | (3)
Chapter Summary  252 | (3)
Problems  255 | (1)
Historical Notes  256 | (1)
|
11 Continuous-Valued Channels  257 | (42)
11.1 Discrete-Time Channels  257 | (3)
11.2 The Channel Coding Theorem  260 | (2)
11.3 Proof of the Channel Coding Theorem  262 | (8)
11.3.1 The Converse  262 | (3)
11.3.2 Achievability  265 | (5)
11.4 Memoryless Gaussian Channels  270 | (2)
11.5 Parallel Gaussian Channels  272 | (6)
11.6 Correlated Gaussian Channels  278 | (2)
11.7 The Bandlimited White Gaussian Channel  280 | (7)
11.8 The Bandlimited Colored Gaussian Channel  287 | (3)
11.9 Zero-Mean Gaussian Noise Is the Worst Additive Noise  290 | (4)
Chapter Summary  294 | (2)
Problems  296 | (1)
Historical Notes  297 | (2)
|
|
12 Markov Structures  299 | (24)
12.1 Conditional Mutual Independence  300 | (9)
12.2 Full Conditional Mutual Independence  309 | (5)
12.3 Markov Random Field  314 | (3)
12.4 Markov Chain  317 | (2)
Chapter Summary  319 | (1)
Problems  320 | (1)
Historical Notes  321 | (2)
|
13 Information Inequalities  323 | (16)
13.1 The Region Γ*n  325 | (1)
13.2 Information Expressions in Canonical Form  326 | (3)
13.3 A Geometrical Framework  329 | (4)
13.3.1 Unconstrained Inequalities  329 | (1)
13.3.2 Constrained Inequalities  330 | (2)
13.3.3 Constrained Identities  332 | (1)
13.4 Equivalence of Constrained Inequalities  333 | (3)
13.5 The Implication Problem of Conditional Independence  336 | (1)
Chapter Summary  337 | (1)
Problems  338 | (1)
Historical Notes  338 | (1)
|
14 Shannon-Type Inequalities  339 | (22)
14.1 The Elemental Inequalities  339 | (2)
14.2 A Linear Programming Approach  341 | (4)
14.2.1 Unconstrained Inequalities  343 | (1)
14.2.2 Constrained Inequalities and Identities  344 | (1)
14.3 A Duality  345 | (2)
14.4 Machine Proving - ITIP  347 | (4)
14.5 Tackling the Implication Problem  351 | (2)
14.6 Minimality of the Elemental Inequalities  353 | (3)
Appendix 14.A: The Basic Inequalities and the Polymatroidal Axioms  356 | (1)
Chapter Summary  357 | (1)
Problems  358 | (2)
Historical Notes  360 | (1)
|
15 Beyond Shannon-Type Inequalities  361 | (26)
15.1 Characterizations of Γ*2, Γ*3, and Γ*n  361 | (8)
15.2 A Non-Shannon-Type Unconstrained Inequality  369 | (5)
15.3 A Non-Shannon-Type Constrained Inequality  374 | (6)
15.4 Applications  380 | (3)
Chapter Summary  383 | (1)
Problems  383 | (2)
Historical Notes  385 | (2)
|
|
16 Entropy and Groups  387 | (24)
16.1 Group Preliminaries  388 | (5)
16.2 Group-Characterizable Entropy Functions  393 | (5)
16.3 A Group Characterization of Γ*n  398 | (3)
16.4 Information Inequalities and Group Inequalities  401 | (4)
Chapter Summary  405 | (1)
Problems  406 | (2)
Historical Notes  408 | (3)
Part II Fundamentals of Network Coding |
|
|
|
17 Introduction  411 | (10)
17.1 The Butterfly Network  412 | (3)
17.2 Wireless and Satellite Communications  415 | (2)
17.3 Source Separation  417 | (1)
Chapter Summary  418 | (1)
Problems  418 | (1)
Historical Notes  419 | (2)
|
|
18 The Max-Flow Bound  421 | (14)
18.1 Point-to-Point Communication Networks  421 | (3)
18.2 Examples Achieving the Max-Flow Bound  424 | (3)
18.3 A Class of Network Codes  427 | (2)
18.4 Proof of the Max-Flow Bound  429 | (2)
Chapter Summary  431 | (1)
Problems  431 | (3)
Historical Notes  434 | (1)
|
19 Single-Source Linear Network Coding: Acyclic Networks  435 | (50)
19.1 Acyclic Networks  436 | (1)
19.2 Linear Network Codes  437 | (6)
19.3 Desirable Properties of a Linear Network Code  443 | (6)
19.3.1 Transformation of a Linear Network Code  447 | (1)
19.3.2 Implementation of a Linear Network Code  448 | (1)
19.4 Existence and Construction  449 | (11)
19.5 Generic Network Codes  460 | (8)
19.6 Static Network Codes  468 | (5)
19.7 Random Network Coding: A Case Study  473 | (5)
19.7.1 How the System Works  474 | (1)
19.7.2 Model and Analysis  475 | (3)
Chapter Summary  478 | (1)
Problems  479 | (3)
Historical Notes  482 | (3)
|
20 Single-Source Linear Network Coding: Cyclic Networks  485 | (20)
20.1 Delay-Free Cyclic Networks  485 | (3)
20.2 Convolutional Network Codes  488 | (10)
20.3 Decoding of Convolutional Network Codes  498 | (5)
Chapter Summary  503 | (1)
Problems  503 | (1)
Historical Notes  504 | (1)
|
21 Multi-source Network Coding  505 | (36)
21.1 The Max-Flow Bounds  505 | (3)
21.2 Examples of Application  508 | (3)
21.2.1 Multilevel Diversity Coding  508 | (2)
21.2.2 Satellite Communication Network  510 | (1)
21.3 A Network Code for Acyclic Networks  511 | (1)
21.4 The Achievable Information Rate Region  512 | (3)
21.5 Explicit Inner and Outer Bounds  515 | (1)
21.6 The Converse  516 | (5)
21.7 Achievability of R_in  521 | (15)
21.7.1 Random Code Construction  524 | (3)
21.7.2 Performance Analysis  527 | (9)
Chapter Summary  536 | (1)
Problems  537 | (2)
Historical Notes  539 | (2)
Bibliography  541 | (20)
Index  561