
Information Theoretic Principles for Agent Learning 2025 ed. [Hardback]

  • Format: Hardback, 95 pages (IX, 95 p.), height x width: 240x168 mm, 11 illustrations (2 in color, 9 black and white)
  • Series: Synthesis Lectures on Engineering, Science, and Technology
  • Publication date: 06-Aug-2024
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3031653874
  • ISBN-13: 9783031653872
  • Hardback
  • Price: 46,91 €*
  • * This is the final price; no additional discounts apply.
  • Standard price: 55,19 €
  • Save 15%
  • Delivery time is 3-4 weeks if the book is in stock at the publisher's warehouse. If the publisher needs to print a new run, delivery may be delayed.

This book introduces readers to information theoretic techniques for statistical data science and for characterizing the behavior and performance of a learning agent, going beyond the standard results on the fundamental limits of communications and compression. Readers will benefit from the presentation of information theoretic quantities, definitions, and results that provide, or could provide, insights into data science and learning.

Contents:
  • Background and Overview
  • Entropy and Mutual Information
  • Differential Entropy, Entropy Rate, and Maximum Entropy
  • Typical Sequences and The AEP
  • Markov Chains and Cascaded Systems
  • Hypothesis Testing, Estimation, Information, and Sufficient Statistics
  • Information Theoretic Quantities and Learning
  • Estimation and Entropy Power
  • Time Series Analyses
  • Information Bottleneck Principle
  • Channel Capacity
  • Rate Distortion Theory
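The entropy and mutual information topics covered in the contents can be illustrated with a short, self-contained sketch (this example is mine, not taken from the book; the joint distribution is an arbitrary illustrative choice):

```python
import numpy as np

def entropy(p):
    """Shannon entropy, in bits, of a probability vector (zeros are skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint pmf matrix."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X (rows)
    py = joint.sum(axis=0)  # marginal of Y (columns)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Illustrative joint distribution resembling a noisy binary channel:
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(round(mutual_information(joint), 4))  # positive, since X and Y are dependent
```

For independent variables the mutual information is zero, e.g. the uniform joint `[[0.25, 0.25], [0.25, 0.25]]` yields I(X;Y) = 0.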

Jerry D. Gibson is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He has been an Associate Editor of the IEEE Transactions on Communications and the IEEE Transactions on Information Theory. He was an IEEE Communications Society Distinguished Lecturer for 2007-2008. He is an IEEE Fellow, and he has received the Frederick Emmons Terman Award (1990), the 1993 IEEE Signal Processing Society Senior Paper Award, the 2009 IEEE Technical Committee on Wireless Communications Recognition Award, and the 2010 Best Paper Award from the IEEE Transactions on Multimedia. He is the author, coauthor, and editor of several books, the most recent of which are The Mobile Communications Handbook (Editor, 3rd ed., 2012), Rate Distortion Bounds for Voice and Video (Coauthor with Jing Hu, NOW Publishers, 2014), and Information Theory and Rate Distortion Theory for Communications and Compression (Morgan-Claypool, 2014). His research interests are lossy source coding, wireless communications and networks, and digital signal processing.