Preface  vii
Introduction  xvii
|
|
1 Information Sources  1
1.1 Probability Spaces and Random Variables  1
1.2 Random Processes and Dynamical Systems  5
1.3 Distributions  7
1.4 Standard Alphabets  12
1.5 Expectation  13
1.6 Asymptotic Mean Stationarity  16
1.7 Ergodic Properties  17
|
2 Pair Processes: Channels, Codes, and Couplings  21
2.1 Pair Processes  21
2.2 Channels  22
2.3 Stationarity Properties of Channels  25
2.4 Extremes: Noiseless and Completely Random Channels  29
2.5 Deterministic Channels and Sequence Coders  30
2.6 Stationary and Sliding-Block Codes  31
2.7 Block Codes  37
2.8 Random Punctuation Sequences  38
2.9 Memoryless Channels  42
2.10 Finite-Memory Channels  42
2.11 Output Mixing Channels  43
2.12 Block Independent Channels  45
2.13 Conditionally Block Independent Channels  46
2.14 Stationarizing Block Independent Channels  46
2.15 Primitive Channels  48
2.16 Additive Noise Channels  49
2.17 Markov Channels  49
2.18 Finite-State Channels and Codes  50
2.19 Cascade Channels  51
2.20 Communication Systems  52
2.21 Couplings  52
2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem  53
|
|
3 Entropy  61
3.1 Entropy and Entropy Rate  61
3.2 Divergence Inequality and Relative Entropy  65
3.3 Basic Properties of Entropy  69
3.4 Entropy Rate  78
3.5 Relative Entropy Rate  81
3.6 Conditional Entropy and Mutual Information  82
3.7 Entropy Rate Revisited  90
3.8 Markov Approximations  91
3.9 Relative Entropy Densities  93
|
4 The Entropy Ergodic Theorem  97
4.1 History  97
4.2 Stationary Ergodic Sources  100
4.3 Stationary Nonergodic Sources  106
4.4 AMS Sources  110
4.5 The Asymptotic Equipartition Property  114
|
5 Distortion and Approximation  117
5.1 Distortion Measures  117
5.2 Fidelity Criteria  120
5.3 Average Limiting Distortion  121
5.4 Communications Systems Performance  123
5.5 Optimal Performance  124
5.6 Code Approximation  124
5.7 Approximating Random Vectors and Processes  129
5.8 The Monge/Kantorovich/Vasershtein Distance  132
5.9 Variation and Distribution Distance  132
5.10 Coupling Discrete Spaces with the Hamming Distance  134
5.11 Process Distance and Approximation  135
5.12 Source Approximation and Codes  141
5.13 d-bar Continuous Channels  142
|
|
6 Distortion and Entropy  147
6.1 The Fano Inequality  147
6.2 Code Approximation and Entropy Rate  150
6.3 Pinsker's and Marton's Inequalities  152
6.4 Entropy and Isomorphism  156
6.5 Almost Lossless Source Coding  160
6.6 Asymptotically Optimal Almost Lossless Codes  168
6.7 Modeling and Simulation  169
|
|
7 Relative Entropy  173
7.1 Divergence  173
7.2 Conditional Relative Entropy  189
7.3 Limiting Entropy Densities  202
7.4 Information for General Alphabets  204
7.5 Convergence Results  216
|
|
8 Information Rates  219
8.1 Information Rates for Finite Alphabets  219
8.2 Information Rates for General Alphabets  221
8.3 A Mean Ergodic Theorem for Densities  225
8.4 Information Rates of Stationary Processes  227
8.5 The Data Processing Theorem  234
8.6 Memoryless Channels and Sources  235
|
9 Distortion and Information  237
9.1 The Shannon Distortion-Rate Function  237
9.2 Basic Properties  239
9.3 Process Definitions of the Distortion-Rate Function  242
9.4 The Distortion-Rate Function as a Lower Bound  250
9.5 Evaluating the Rate-Distortion Function  252
|
10 Relative Entropy Rates  265
10.1 Relative Entropy Densities and Rates  265
10.2 Markov Dominating Measures  268
10.3 Stationary Processes  272
10.4 Mean Ergodic Theorems  275
|
11 Ergodic Theorems for Densities  281
11.1 Stationary Ergodic Sources  281
11.2 Stationary Nonergodic Sources  286
11.3 AMS Sources  290
11.4 Ergodic Theorems for Information Densities  293
|
12 Source Coding Theorems  295
12.1 Source Coding and Channel Coding  295
12.2 Block Source Codes for AMS Sources  296
12.3 Block Source Code Mismatch  307
12.4 Block Coding Stationary Sources  310
12.5 Block Coding AMS Ergodic Sources  312
12.6 Subadditive Fidelity Criteria  319
12.7 Asynchronous Block Codes  321
12.8 Sliding-Block Source Codes  323
12.9 A Geometric Interpretation  333
|
13 Properties of Good Source Codes  335
13.1 Optimal and Asymptotically Optimal Codes  335
13.2 Block Codes  337
13.3 Sliding-Block Codes  343
|
14 Coding for Noisy Channels  359
14.1 Noisy Channels  359
14.2 Feinstein's Lemma  361
14.3 Feinstein's Theorem  364
14.4 Channel Capacity  367
14.5 Robust Block Codes  372
14.6 Block Coding Theorems for Noisy Channels  375
14.7 Joint Source and Channel Block Codes  377
14.8 Synchronizing Block Channel Codes  380
14.9 Sliding-block Source and Channel Coding  384
References  395
Index  405