
E-book: Information Theory for Electrical Engineers

  • Format: PDF+DRM
  • Price: 65,42 €*
  • * This is the final price, i.e., no additional discounts are applied
  • This e-book is intended for personal use only. E-books cannot be returned, and no refunds are given for purchased e-books.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has supplied this book in encrypted form, which means that you need to install free software in order to unlock and read it. To read this e-book, you must create an Adobe ID. More information here. The e-book can be read and downloaded on up to 6 devices (by a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you will need to install this free app: PocketBook Reader (iOS / Android)

    To download and read this e-book on a PC or Mac, you will need Adobe Digital Editions (a free app designed specifically for e-books; it is not the same as Adobe Reader, which may already be on your computer).

    You cannot read this e-book using an Amazon Kindle.

This book explains the fundamental concepts of information theory so as to help students better understand modern communication technologies. It was written especially for electrical and communication engineers working on communication subjects. The book focuses on making the topics understandable and accordingly uses simple and detailed mathematics, together with a wealth of solved examples.

The book consists of four chapters, the first of which explains the concepts of entropy and mutual information for discrete random variables. Chapter 2 introduces entropy and mutual information for continuous random variables, along with the channel capacity. In turn, Chapter 3 is devoted to typical sequences and data compression. One of Shannon's most important discoveries is the channel coding theorem, and it is critical for electrical and communication engineers to fully comprehend it. As such, Chapter 4 focuses solely on that theorem.

To gain the most from the book, readers should have a fundamental grasp of probability and random variables; otherwise, they will find it nearly impossible to understand the topics discussed.
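
For orientation, the following is a minimal compilable LaTeX sketch of the central quantities these chapters develop. The formulas are the textbook-standard definitions of entropy, mutual information, channel capacity, and the band-limited AWGN capacity; they are offered for illustration and are not excerpted from the book itself.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard definitions of the quantities the book develops;
% textbook-standard forms, not quotations from this book.
The discrete entropy of a random variable $X$ with pmf $p(x)$ is
\[
H(X) = -\sum_{x} p(x) \log_2 p(x) \quad \text{(bits)},
\]
and the mutual information between $X$ and $Y$ is
\[
I(X;Y) = H(X) - H(X \mid Y)
       = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\, p(y)} .
\]
The channel capacity, whose achievability is the subject of the channel coding theorem, is
\[
C = \max_{p(x)} I(X;Y) ,
\]
which for the band-limited AWGN channel treated in Chapter 2 specializes to
\[
C = B \log_2 \left( 1 + \frac{S}{N} \right) \ \text{bits/s} .
\]
\end{document}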
1 Concept of Information, Discrete Entropy and Mutual Information 1(96)
1.1 The Meaning of Information 1(2)
1.2 Review of Discrete Random Variables 3(3)
1.3 Discrete Entropy 6(30)
1.3.1 Interpretation of Entropy 9(1)
1.3.2 Joint Entropy 9(2)
1.3.3 Conditional Entropy 11(12)
1.3.4 Properties of the Discrete Entropy 23(1)
1.3.5 Log-Sum Inequality 24(12)
1.4 Information Channels 36(4)
1.5 Mutual Information 40(14)
1.5.1 Properties of the Mutual Information 46(6)
1.5.2 Mutual Information Involving More Than Two Random Variables 52(2)
1.6 Probabilistic Distance 54(1)
1.7 Jensen's Inequality 55(9)
1.8 Fano's Inequality 64(9)
1.9 Conditional Mutual Information 73(18)
1.9.1 Properties of Conditional Mutual Information 75(4)
1.9.2 Markov Chain 79(1)
1.9.3 Data Processing Inequality for Mutual Information 80(11)
1.10 Some Properties for Mutual Information 91(6)
2 Entropy for Continuous Random Variables, Discrete Channel Capacity, Continuous Channel Capacity 97(78)
2.1 Entropy for Continuous Random Variables 97(7)
2.1.1 Differential Entropy 97(4)
2.1.2 Joint and Conditional Entropies for Continuous Random Variables 101(1)
2.1.3 The Relative Entropy of Two Continuous Distributions 102(2)
2.2 Mutual Information for Continuous Random Variables 104(17)
2.2.1 Properties of Differential Entropy 108(1)
2.2.2 Conditional Mutual Information for Continuous Random Variables 109(1)
2.2.3 Data Processing Inequality for Continuous Random Variables 110(11)
2.3 Channel Capacity 121(34)
2.3.1 Discrete Channel Capacity 122(33)
2.4 Capacity for Continuous Channels, i.e., Continuous Random Variables 155(10)
2.4.1 Capacity of the Gaussian Channel with Power Constraint 162(3)
2.5 Bounds and Limiting Cases on AWGN Channel Capacity 165(10)
2.5.1 Effect of Information Signal Bandwidth on AWGN Channel Capacity 165(2)
2.5.2 Effect of Signal-to-Noise Ratio on the Capacity of the AWGN Channel 167(8)
3 Typical Sequences and Data Compression 175(60)
3.1 Independent Identically Distributed Random Variables (IID Random Variables) 175(3)
3.1.1 The Weak Law of Large Numbers 177(1)
3.2 Convergence of Random Variable Sequences 178(7)
3.2.1 Different Types of Convergence for the Sequence of Random Variables 179(6)
3.3 Asymptotic Equipartition Property Theorem 185(17)
3.3.1 Typical Sequences and Typical Set 186(5)
3.3.2 Strongly and Weakly Typical Sequences 191(11)
3.4 Data Compression or Source Coding 202(33)
3.4.1 Kraft Inequality 206(6)
3.4.2 Optimal Codes 212(8)
3.4.3 Source Coding for Real Number Sequences 220(7)
3.4.4 Huffman Codes 227(8)
4 Channel Coding Theorem 235(38)
4.1 Discrete Memoryless Channel 235(1)
4.2 Communication System 236(8)
4.2.1 Probability of Error 239(2)
4.2.2 Rate Achievability 241(3)
4.3 Jointly Typical Sequences 244(20)
4.3.1 Jointly Typical Set 245(1)
4.3.2 Strongly and Weakly Jointly Typical Sequences 245(10)
4.3.3 Number of Jointly Typical Sequences and Probability for Typical Sequences 255(9)
4.4 Channel Coding Theorem 264(9)
References 273(2)
Index 275
Orhan Gazi is an associate professor in the Electronic and Communication Engineering Department at Cankaya University.

He received his BS, MS, and PhD degrees, all in electrical and electronics engineering, from Middle East Technical University, Ankara, Turkey, in 1996, 2001, and 2007, respectively.

His research interests include signal processing, information theory, and forward error correction. He is currently studying polar channel codes and preparing publications in this area.