1 Concept of Information, Discrete Entropy and Mutual Information
1.1 The Meaning of Information
1.2 Review of Discrete Random Variables
1.3 Discrete Entropy
1.3.1 Interpretation of Entropy
1.3.3 Conditional Entropy
1.3.4 Properties of the Discrete Entropy
1.5 Mutual Information
1.5.1 Properties of the Mutual Information
1.5.2 Mutual Information Involving More Than Two Random Variables
1.6 Probabilistic Distance
1.9 Conditional Mutual Information
1.9.1 Properties of Conditional Mutual Information
1.9.3 Data Processing Inequality for Mutual Information
1.10 Some Properties of Mutual Information
2 Entropy for Continuous Random Variables, Discrete Channel Capacity, Continuous Channel Capacity
2.1 Entropy for Continuous Random Variables
2.1.1 Differential Entropy
2.1.2 Joint and Conditional Entropies for Continuous Random Variables
2.1.3 The Relative Entropy of Two Continuous Distributions
2.2 Mutual Information for Continuous Random Variables
2.2.1 Properties of Differential Entropy
2.2.2 Conditional Mutual Information for Continuous Random Variables
2.2.3 Data Processing Inequality for Continuous Random Variables
2.3 Channel Capacity
2.3.1 Discrete Channel Capacity
2.4 Capacity for Continuous Channels, i.e., Channels with Continuous Random Variables
2.4.1 Capacity of the Gaussian Channel with Power Constraint
2.5 Bounds and Limiting Cases on AWGN Channel Capacity
2.5.1 Effect of Information Signal Bandwidth on AWGN Channel Capacity
2.5.2 Effect of Signal-to-Noise Ratio on the Capacity of the AWGN Channel
3 Typical Sequences and Data Compression
3.1 Independent Identically Distributed Random Variables (IID Random Variables)
3.1.1 The Weak Law of Large Numbers
3.2 Convergence of Random Variable Sequences
3.2.1 Different Types of Convergence for Sequences of Random Variables
3.3 Asymptotic Equipartition Property Theorem
3.3.1 Typical Sequences and Typical Set
3.3.2 Strongly and Weakly Typical Sequences
3.4 Data Compression or Source Coding
3.4.3 Source Coding for Real Number Sequences
4 Channel Coding Theorem
4.1 Discrete Memoryless Channel
4.2.1 Probability of Error
4.3 Jointly Typical Sequences
4.3.1 Jointly Typical Set
4.3.2 Strongly and Weakly Jointly Typical Sequences
4.3.3 Number of Jointly Typical Sequences and Probability of Typical Sequences
4.4 Channel Coding Theorem
References
Index