







Entropy and Information Theory at Meripustak

Entropy and Information Theory by Robert M. Gray, Springer



  • Price: ₹ 29,093.00 (7.00% off)

    Seller Price: ₹ 27,056.00

Estimated Delivery Time : 4-5 Business Days

Sold By: Meripustak

Free Shipping (for orders above ₹ 499) *T&C apply.

In Stock

We deliver across all postal codes in India

Orders Outside India




Outside India Order Estimated Delivery Time: 7-10 Business Days


  • We Deliver Across 100+ Countries

  • MeriPustak’s Books are 100% New & Original
  • General Information
    Author(s): Robert M. Gray
    Publisher: Springer
    ISBN: 9781489981325
    Pages: 409
    Binding: Paperback
    Language: English
    Publish Year: September 2014

    Description

    Entropy and Information Theory by Robert M. Gray (Springer)

    This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

    New in this edition:

      • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
      • Expanded discussion of results from ergodic theory relevant to information theory
      • Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
      • New material on trading off information and distortion, including the Marton inequality
      • New material on the properties of optimal and asymptotically optimal source codes
      • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

    Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.

    Table of Contents: Preface. Introduction. Information Sources. Pair Processes: Channels, Codes, and Couplings. Entropy. The Entropy Ergodic Theorem. Distortion and Approximation. Distortion and Entropy. Relative Entropy. Information Rates. Distortion vs. Rate. Relative Entropy Rates. Ergodic Theorems for Densities. Source Coding Theorems. Coding for Noisy Channels. Bibliography. References. Index.
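    As a quick illustration of the entropy quantity the book is organized around, here is a minimal Python sketch (not drawn from the text) that computes the Shannon entropy of a discrete memoryless source, H(X) = -sum over x of p(x) log2 p(x), in bits per symbol.

    # Minimal sketch (illustrative only, not from the book): Shannon entropy
    # of a discrete memoryless source in bits per symbol.
    import math

    def shannon_entropy(probs):
        """Entropy in bits of a probability distribution given as a list."""
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries 1 bit per flip; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # about 0.469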


