Elements of Information Theory (Record no. 2428)
000 - LEADER | |
---|---|
fixed length control field | 10070nam a22001937a 4500 |
005 - DATE AND TIME OF LATEST TRANSACTION | |
control field | 20240930104436.0 |
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION | |
fixed length control field | 240930b |||||||| |||| 00| 0 eng d |
020 ## - INTERNATIONAL STANDARD BOOK NUMBER | |
International Standard Book Number | 9780471241959 |
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER | |
Classification number | 003.54 |
Item number | COV |
100 ## - MAIN ENTRY--PERSONAL NAME | |
Personal name | Cover, Thomas M. |
245 ## - TITLE STATEMENT | |
Title | Elements of Information Theory |
Statement of responsibility, etc. | Thomas M. Cover, Joy A. Thomas |
250 ## - EDITION STATEMENT | |
Edition statement | 2nd ed. |
260 ## - PUBLICATION, DISTRIBUTION, ETC. | |
Place of publication, distribution, etc. | Hoboken, New Jersey |
Name of publisher, distributor, etc. | Wiley |
Date of publication, distribution, etc. | 2006 |
300 ## - PHYSICAL DESCRIPTION | |
Extent | 748 p. |
505 ## - FORMATTED CONTENTS NOTE | |
Formatted contents note | 1 Introduction and Preview 1 -- 1.1 Preview of the Book 5
2 Entropy, Relative Entropy, and Mutual Information 13 -- 2.1 Entropy 13 -- 2.2 Joint Entropy and Conditional Entropy 16 -- 2.3 Relative Entropy and Mutual Information 19 -- 2.4 Relationship Between Entropy and Mutual Information 20 -- 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information 22 -- 2.6 Jensen’s Inequality and Its Consequences 25 -- 2.7 Log Sum Inequality and Its Applications 30 -- 2.8 Data-Processing Inequality 34 -- 2.9 Sufficient Statistics 35 -- 2.10 Fano’s Inequality 37 -- Summary 41 -- Problems 43 -- Historical Notes 54
3 Asymptotic Equipartition Property 57 -- 3.1 Asymptotic Equipartition Property Theorem 58 -- 3.2 Consequences of the AEP: Data Compression 60 -- 3.3 High-Probability Sets and the Typical Set 62 -- Summary 64 -- Problems 64 -- Historical Notes 69
4 Entropy Rates of a Stochastic Process 71 -- 4.1 Markov Chains 71 -- 4.2 Entropy Rate 74 -- 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph 78 -- 4.4 Second Law of Thermodynamics 81 -- 4.5 Functions of Markov Chains 84 -- Summary 87 -- Problems 88 -- Historical Notes 100
5 Data Compression 103 -- 5.1 Examples of Codes 103 -- 5.2 Kraft Inequality 107 -- 5.3 Optimal Codes 110 -- 5.4 Bounds on the Optimal Code Length 112 -- 5.5 Kraft Inequality for Uniquely Decodable Codes 115 -- 5.6 Huffman Codes 118 -- 5.7 Some Comments on Huffman Codes 120 -- 5.8 Optimality of Huffman Codes 123 -- 5.9 Shannon–Fano–Elias Coding 127 -- 5.10 Competitive Optimality of the Shannon Code 130 -- 5.11 Generation of Discrete Distributions from Fair Coins 134 -- Summary 141 -- Problems 142 -- Historical Notes 157
6 Gambling and Data Compression 159 -- 6.1 The Horse Race 159 -- 6.2 Gambling and Side Information 164 -- 6.3 Dependent Horse Races and Entropy Rate 166 -- 6.4 The Entropy of English 168 -- 6.5 Data Compression and Gambling 171 -- 6.6 Gambling Estimate of the Entropy of English 173 -- Summary 175 -- Problems 176 -- Historical Notes 182
7 Channel Capacity 183 -- 7.1 Examples of Channel Capacity 184 -- 7.1.1 Noiseless Binary Channel 184 -- 7.1.2 Noisy Channel with Nonoverlapping Outputs 185 -- 7.1.3 Noisy Typewriter 186 -- 7.1.4 Binary Symmetric Channel 187 -- 7.1.5 Binary Erasure Channel 188 -- 7.2 Symmetric Channels 189 -- 7.3 Properties of Channel Capacity 191 -- 7.4 Preview of the Channel Coding Theorem 191 -- 7.5 Definitions 192 -- 7.6 Jointly Typical Sequences 195 -- 7.7 Channel Coding Theorem 199 -- 7.8 Zero-Error Codes 205 -- 7.9 Fano’s Inequality and the Converse to the Coding Theorem 206 -- 7.10 Equality in the Converse to the Channel Coding Theorem 208 -- 7.11 Hamming Codes 210 -- 7.12 Feedback Capacity 216 -- 7.13 Source–Channel Separation Theorem 218 -- Summary 222 -- Problems 223 -- Historical Notes 240
8 Differential Entropy 243 -- 8.1 Definitions 243 -- 8.2 AEP for Continuous Random Variables 245 -- 8.3 Relation of Differential Entropy to Discrete Entropy 247 -- 8.4 Joint and Conditional Differential Entropy 249 -- 8.5 Relative Entropy and Mutual Information 250 -- 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information 252 -- Summary 256 -- Problems 256 -- Historical Notes 259
9 Gaussian Channel 261 -- 9.1 Gaussian Channel: Definitions 263 -- 9.2 Converse to the Coding Theorem for Gaussian Channels 268 -- 9.3 Bandlimited Channels 270 -- 9.4 Parallel Gaussian Channels 274 -- 9.5 Channels with Colored Gaussian Noise 277 -- 9.6 Gaussian Channels with Feedback 280 -- Summary 289 -- Problems 290 -- Historical Notes 299
10 Rate Distortion Theory 301 -- 10.1 Quantization 301 -- 10.2 Definitions 303 -- 10.3 Calculation of the Rate Distortion Function 307 -- 10.3.1 Binary Source 307 -- 10.3.2 Gaussian Source 310 -- 10.3.3 Simultaneous Description of Independent Gaussian Random Variables 312 -- 10.4 Converse to the Rate Distortion Theorem 315 -- 10.5 Achievability of the Rate Distortion Function 318 -- 10.6 Strongly Typical Sequences and Rate Distortion 325 -- 10.7 Characterization of the Rate Distortion Function 329 -- 10.8 Computation of Channel Capacity and the Rate Distortion Function 332 -- Summary 335 -- Problems 336 -- Historical Notes 345
11 Information Theory and Statistics 347 -- 11.1 Method of Types 347 -- 11.2 Law of Large Numbers 355 -- 11.3 Universal Source Coding 357 -- 11.4 Large Deviation Theory 360 -- 11.5 Examples of Sanov’s Theorem 364 -- 11.6 Conditional Limit Theorem 366 -- 11.7 Hypothesis Testing 375 -- 11.8 Chernoff–Stein Lemma 380 -- 11.9 Chernoff Information 384 -- 11.10 Fisher Information and the Cramér–Rao Inequality 392 -- Summary 397 -- Problems 399 -- Historical Notes 408
12 Maximum Entropy 409 -- 12.1 Maximum Entropy Distributions 409 -- 12.2 Examples 411 -- 12.3 Anomalous Maximum Entropy Problem 413 -- 12.4 Spectrum Estimation 415 -- 12.5 Entropy Rates of a Gaussian Process 416 -- 12.6 Burg’s Maximum Entropy Theorem 417 -- Summary 420 -- Problems 421 -- Historical Notes 425
13 Universal Source Coding 427 -- 13.1 Universal Codes and Channel Capacity 428 -- 13.2 Universal Coding for Binary Sequences 433 -- 13.3 Arithmetic Coding 436 -- 13.4 Lempel–Ziv Coding 440 -- 13.4.1 Sliding Window Lempel–Ziv Algorithm 441 -- 13.4.2 Tree-Structured Lempel–Ziv Algorithms 442 -- 13.5 Optimality of Lempel–Ziv Algorithms 443 -- 13.5.1 Sliding Window Lempel–Ziv Algorithms 443 -- 13.5.2 Optimality of Tree-Structured Lempel–Ziv Compression 448 -- Summary 456 -- Problems 457 -- Historical Notes 461
14 Kolmogorov Complexity 463 -- 14.1 Models of Computation 464 -- 14.2 Kolmogorov Complexity: Definitions and Examples 466 -- 14.3 Kolmogorov Complexity and Entropy 473 -- 14.4 Kolmogorov Complexity of Integers 475 -- 14.5 Algorithmically Random and Incompressible Sequences 476 -- 14.6 Universal Probability 480 -- 14.7 Kolmogorov Complexity 482 -- 14.8 Ω 484 -- 14.9 Universal Gambling 487 -- 14.10 Occam’s Razor 488 -- 14.11 Kolmogorov Complexity and Universal Probability 490 -- 14.12 Kolmogorov Sufficient Statistic 496 -- 14.13 Minimum Description Length Principle 500 -- Summary 501 -- Problems 503 -- Historical Notes 507
15 Network Information Theory 509 -- 15.1 Gaussian Multiple-User Channels 513 -- 15.1.1 Single-User Gaussian Channel 513 -- 15.1.2 Gaussian Multiple-Access Channel with m Users 514 -- 15.1.3 Gaussian Broadcast Channel 515 -- 15.1.4 Gaussian Relay Channel 516 -- 15.1.5 Gaussian Interference Channel 518 -- 15.1.6 Gaussian Two-Way Channel 519 -- 15.2 Jointly Typical Sequences 520 -- 15.3 Multiple-Access Channel 524 -- 15.3.1 Achievability of the Capacity Region for the Multiple-Access Channel 530 -- 15.3.2 Comments on the Capacity Region for the Multiple-Access Channel 532 -- 15.3.3 Convexity of the Capacity Region of the Multiple-Access Channel 534 -- 15.3.4 Converse for the Multiple-Access Channel 538 -- 15.3.5 m-User Multiple-Access Channels 543 -- 15.3.6 Gaussian Multiple-Access Channels 544 -- 15.4 Encoding of Correlated Sources 549 -- 15.4.1 Achievability of the Slepian–Wolf Theorem 551 -- 15.4.2 Converse for the Slepian–Wolf Theorem 555 -- 15.4.3 Slepian–Wolf Theorem for Many Sources 556 -- 15.4.4 Interpretation of Slepian–Wolf Coding 557 -- 15.5 Duality Between Slepian–Wolf Encoding and Multiple-Access Channels 558 -- 15.6 Broadcast Channel 560 -- 15.6.1 Definitions for a Broadcast Channel 563 -- 15.6.2 Degraded Broadcast Channels 564 -- 15.6.3 Capacity Region for the Degraded Broadcast Channel 565 -- 15.7 Relay Channel 571 -- 15.8 Source Coding with Side Information 575 -- 15.9 Rate Distortion with Side Information 580 -- 15.10 General Multiterminal Networks 587 -- Summary 594 -- Problems 596 -- Historical Notes 609
16 Information Theory and Portfolio Theory 613 -- 16.1 The Stock Market: Some Definitions 613 -- 16.2 Kuhn–Tucker Characterization of the Log-Optimal Portfolio 617 -- 16.3 Asymptotic Optimality of the Log-Optimal Portfolio 619 -- 16.4 Side Information and the Growth Rate 621 -- 16.5 Investment in Stationary Markets 623 -- 16.6 Competitive Optimality of the Log-Optimal Portfolio 627 -- 16.7 Universal Portfolios 629 -- 16.7.1 Finite-Horizon Universal Portfolios 631 -- 16.7.2 Horizon-Free Universal Portfolios 638 -- 16.8 Shannon–McMillan–Breiman Theorem (General AEP) 644 -- Summary 650 -- Problems 652 -- Historical Notes 655
17 Inequalities in Information Theory 657 -- 17.1 Basic Inequalities of Information Theory 657 -- 17.2 Differential Entropy 660 -- 17.3 Bounds on Entropy and Relative Entropy 663 -- 17.4 Inequalities for Types 665 -- 17.5 Combinatorial Bounds on Entropy 666 -- 17.6 Entropy Rates of Subsets 667 -- 17.7 Entropy and Fisher Information 671 -- 17.8 Entropy Power Inequality and Brunn–Minkowski Inequality 674 -- 17.9 Inequalities for Determinants 679 -- 17.10 Inequalities for Ratios of Determinants 683 |
520 ## - SUMMARY, ETC. | |
Summary, etc. | The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. |
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM | |
Topical term or geographic name entry element | Information Theory; Data Compression; Entropy, Relative Entropy, and Mutual Information |
942 ## - ADDED ENTRY ELEMENTS (KOHA) | |
Source of classification or shelving scheme | Dewey Decimal Classification |
Koha item type | Books |
952 ## - LOCATION AND ITEM INFORMATION (KOHA) | |
-- | 7984 |
Source of classification or shelving scheme | Dewey Decimal Classification |
Collection code | Non-fiction |
Home library | IIITDM Kurnool |
Current library | IIITDM Kurnool |
Shelving location | ELECTRONICS COMMUNICATION ENGINEERING |
Date acquired | 30.09.2024 |
Source of acquisition | New India Book Agency |
Cost, normal purchase price | 132.95 |
Inventory number | 5395 dt 31-8-2024 |
Total Checkouts | 1 |
Full call number | 003.54 COV |
Barcode | 0006966 |
Checked out | 07.02.2026 |
Date last seen | 09.12.2024 |
Date last checked out | 09.12.2024 |
Cost, replacement price | 132.95 |
Price effective from | 30.09.2024 |
Currency | USD |
Koha item type | Books |