An Introduction to Information Theory (Dover Books on Mathematics) by Fazlollah M. Reza
About the Book

Written for an engineering audience, this book has a threefold purpose: (1) to present elements of modern probability theory - discrete, continuous, and stochastic; (2) to present elements of information theory with emphasis on its basic roots in probability theory; and (3) to present elements of coding theory.

The emphasis throughout is on such basic concepts as sets, the probability measure associated with sets, sample space, random variables, information measure, and capacity. These concepts proceed from set theory to probability theory and then to information and coding theories. No formal prerequisites are required other than the usual undergraduate mathematics included in an engineering or science program. However, since these programs may not include a course in probability, the author presents an introductory treatment of probability for those who wish to pursue the general study of the statistical theory of communications.

The book is divided into four parts: memoryless discrete schemes, the memoryless continuum, schemes with memory, and an outline of some recent developments. An appendix contains notes to help familiarize the reader with the literature in the field, while the inclusion of many reference tables and an extensive bibliography of some 200 entries makes this an excellent resource for any student in the field.
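As an illustrative aside (a sketch of my own, not an excerpt from the book), two of the quantities at the heart of the text - the entropy of a discrete source in binary units (bits) and the capacity of a binary symmetric channel - reduce to a few lines of Python:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    with crossover probability p: C = 1 - H(p, 1-p)."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin carries exactly one binary unit (bit) of information per toss:
print(entropy([0.5, 0.5]))   # 1.0
# A noiseless binary channel (p = 0) attains full capacity:
print(bsc_capacity(0.0))     # 1.0
```

With crossover probability 0.5 the channel output is independent of the input, and the same formula gives capacity zero.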

Table of Contents:
PREFACE

CHAPTER 1. Introduction
1-1. Communication Processes; 1-2. A Model for a Communication System; 1-3. A Quantitative Measure of Information; 1-4. A Binary Unit of Information; 1-5. Sketch of the Plan; 1-6. Main Contributors to Information Theory; 1-7. An Outline of Information Theory

Part 1: Discrete Schemes without Memory

CHAPTER 2. Basic Concepts of Probability
2-1. Intuitive Background; 2-2. Sets; 2-3. Operations on Sets; 2-4. Algebra of Sets; 2-5. Functions; 2-6. Sample Space; 2-7. Probability Measure; 2-8. Frequency of Events; 2-9. Theorem of Addition; 2-10. Conditional Probability; 2-11. Theorem of Multiplication; 2-12. Bayes's Theorem; 2-13. Combinatorial Problems in Probability; 2-14. Trees and State Diagrams; 2-15. Random Variables; 2-16. Discrete Probability Functions and Distribution; 2-17. Bivariate Discrete Distributions; 2-18. Binomial Distribution; 2-19. Poisson's Distribution; 2-20. Expected Value of a Random Variable

CHAPTER 3. Basic Concepts of Information Theory: Memoryless Finite Schemes
3-1. A Measure of Uncertainty; 3-2. An Intuitive Justification; 3-3. Formal Requirements for the Average Uncertainty; 3-4. H Function as a Measure of Uncertainty; 3-5. An Alternative Proof That the Entropy Function Possesses a Maximum; 3-6. Sources and Binary Sources; 3-7. Measure of Information for Two-dimensional Discrete Finite Probability Schemes; 3-8. Conditional Entropies; 3-9. A Sketch of a Communication Network; 3-10. Derivation of the Noise Characteristics of a Channel; 3-11. Some Basic Relationships among Different Entropies; 3-12. A Measure of Mutual Information; 3-13. Set-theory Interpretation of Shannon's Fundamental Inequalities; 3-14. Redundancy, Efficiency, and Channel Capacity; 3-15. Capacity of Channels with Symmetric Noise Structure; 3-16. BSC and BEC; 3-17. Capacity of Binary Channels; 3-18. Binary Pulse Width Communication Channel; 3-19. Uniqueness of the Entropy Function

CHAPTER 4. Elements of Encoding
4-1. The Purpose of Encoding; 4-2. Separable Binary Codes; 4-3. Shannon-Fano Encoding; 4-4. Necessary and Sufficient Conditions for Noiseless Coding; 4-5. A Theorem on Decodability; 4-6. Average Length of Encoded Messages; 4-7. Shannon's Binary Encoding; 4-8. Fundamental Theorem of Discrete Noiseless Coding; 4-9. Huffman's Minimum-redundancy Code; 4-10. Gilbert-Moore Encoding; 4-11. Fundamental Theorem of Discrete Encoding in Presence of Noise; 4-12. Error-detecting and Error-correcting Codes; 4-13. Geometry of the Binary Code Space; 4-14. Hamming's Single-error Correcting Code; 4-15. Elias's Iteration Technique; 4-16. A Mathematical Proof of the Fundamental Theorem of Information Theory for Discrete BSC; 4-17. Encoding the English Alphabet

Part 2: Continuum without Memory

CHAPTER 5. Continuous Probability Distribution and Density
5-1. Continuous Sample Space; 5-2. Probability Distribution Functions; 5-3. Probability Density Function; 5-4. Normal Distribution; 5-5. Cauchy's Distribution; 5-6. Exponential Distribution; 5-7. Multidimensional Random Variables; 5-8. Joint Distribution of Two Variables: Marginal Distribution; 5-9. Conditional Probability Distribution and Density; 5-10. Bivariate Normal Distribution; 5-11. Functions of Random Variables; 5-12. Transformation from Cartesian to Polar Coordinate System

CHAPTER 6. Statistical Averages
6-1. Expected Values: Discrete Case; 6-2. Expectation of Sums and Products of a Finite Number of Independent Discrete Random Variables; 6-3. Moments of a Univariate Random Variable; 6-4. Two Inequalities; 6-5. Moments of Bivariate Random Variables; 6-6. Correlation Coefficient; 6-7. Linear Combination of Random Variables; 6-8. Moments of Some Common Distribution Functions; 6-9. Characteristic Function of a Random Variable; 6-10. Characteristic Function and Moment-generating Function of Random Variables; 6-11. Density Functions of the Sum of Two Random Variables

CHAPTER 7. Normal Distributions and Limit Theorems
7-1. Bivariate Normal Considered as an Extension of One-dimensional Normal Distribution; 7-2. Multinormal Distribution; 7-3. Linear Combination of Normally Distributed Independent Random Variables; 7-4. Central-limit Theorem; 7-5. A Simple Random-walk Problem; 7-6. Approximation of the Binomial Distribution by the Normal Distribution; 7-7. Approximation of Poisson Distribution by a Normal Distribution; 7-8. The Laws of Large Numbers

CHAPTER 8. Continuous Channel without Memory
8-1. Definition of Different Entropies; 8-2. The Nature of Mathematical Difficulties Involved; 8-3. Infiniteness of Continuous Entropy; 8-4. The Variability of the Entropy in the Continuous Case with Coordinate Systems; 8-5. A Measure of Information in the Continuous Case; 8-6. Maximization of the Entropy of a Continuous Random Variable; 8-7. Entropy Maximization Problems; 8-8. Gaussian Noisy Channels; 8-9. Transmission of Information in the Presence of Additive Noise; 8-10. Channel Capacity in Presence of Gaussian Additive Noise and Specified Transmitter and Noise Average Power; 8-11. Relation Between the Entropies of Two Related Random Variables; 8-12. Note on the Definition of Mutual Information

CHAPTER 9. Transmission of Band-limited Signals
9-1. Introduction; 9-2. Entropies of Continuous Multivariate Distributions; 9-3. Mutual Information of Two Gaussian Random Vectors; 9-4. A Channel-capacity Theorem for Additive Gaussian Noise; 9-5. Digression; 9-6. Sampling Theorem; 9-7. A Physical Interpretation of the Sampling Theorem; 9-8. The Concept of a Vector Space; 9-9. Fourier-series Signal Space; 9-10. Band-limited Signal Space; 9-11. Band-limited Ensembles; 9-12. Entropies of Band-limited Ensemble in Signal Space; 9-13. A Mathematical Model for Communication of Continuous Signals; 9-14. Optimal Decoding; 9-15. A Lower Bound for the Probability of Error; 9-16. An Upper Bound for the Probability of Error; 9-17. Fundamental Theorem of Continuous Memoryless Channels in Presence of Additive Noise; 9-18. Thomasian's Estimate

Part 3: Schemes with Memory

CHAPTER 10. Stochastic Processes
10-1. Stochastic Theory; 10-2. Examples of a Stochastic Process; 10-3. Moments and Expectations; 10-4. Stationary Processes; 10-5. Ergodic Processes; 10-6. Correlation Coefficients and Correlation Functions; 10-7. Example of a Normal Stochastic Process; 10-8. Examples of Computation of Correlation Functions; 10-9. Some Elementary Properties of Correlation Functions of Stationary Processes; 10-10. Power Spectra and Correlation Functions; 10-11. Response of Linear Lumped Systems to Ergodic Excitation; 10-12. Stochastic Limits and Convergence; 10-13. Stochastic Differentiation and Integration; 10-14. Gaussian-process Example of a Stationary Process; 10-15. The Over-all Mathematical Structure of the Stochastic Processes; 10-16. A Relation between Positive Definite Functions and Theory of Probability

CHAPTER 11. Communication under Stochastic Regimes
11-1. Stochastic Nature of Communication; 11-2. Finite Markov Chains; 11-3. A Basic Theorem on Regular Markov Chains; 11-4. Entropy of a Simple Markov Chain; 11-5. Entropy of a Discrete Stationary Source; 11-6. Discrete Channels with Finite Memory; 11-7. Connection of the Source and the Discrete Channel with Memory; 11-8. Connection of a Stationary Source to a Stationary Channel

Part 4: Some Recent Developments

CHAPTER 12. The Fundamental Theorem of Information Theory
Preliminaries: 12-1. A Decision Scheme; 12-2. The Probability of Error in a Decision Scheme; 12-3. A Relation between Error Probability and Equivocation; 12-4. The Extension of Discrete Memoryless Noisy Channels
Feinstein's Proof: 12-5. On Certain Random Variables Associated with a Communication System; 12-6. Feinstein's Lemma; 12-7. Completion of the Proof
Shannon's Proof: 12-8. Ensemble Codes; 12-9. A Relation between Transinformation and Error Probability; 12-10. An Exponential Bound for Error Probability
Wolfowitz's Proof: 12-11. The Code Book; 12-12. A Lemma and Its Application; 12-13. Estimation of Bounds; 12-14. Completion of Wolfowitz's Proof

CHAPTER 13. Group Codes
13-1. Introduction; 13-2. The Concept of a Group; 13-3. Fields and Rings; 13-4. Algebra for Binary n-Digit Words; 13-5. Hamming's Codes; 13-6. Group Codes; 13-7. A Detection Scheme for Group Codes; 13-8. Slepian's Technique for Single-error Correcting Group Codes; 13-9. Further Notes on Group Codes; 13-10. Some Bounds on the Number of Words in a Systematic Code

APPENDIX: Additional Notes and Tables
N-1. The Gambler with a Private Wire; N-2. Some Remarks on Sampling Theorem; N-3. Analytic Signals and the Uncertainty Relation; N-4. Elias's Proof of the Fundamental Theorem for BSC; N-5. Further Remarks on Coding Theory; N-6. Partial Ordering of Channels; N-7. Information Theory and Radar Problems
T-1. Normal Probability Integral; T-2. Normal Distributions; T-3. A Summary of Some Common Probability Functions; T-4. Probability of No Error for Best Group Code; T-5. Parity-check Rules for Best Group Alphabets; T-6. Logarithms to the Base 2; T-7. Entropy of a Discrete Binary Source

BIBLIOGRAPHY

NAME INDEX

SUBJECT INDEX
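To give a flavor of the encoding material in Chapter 4, here is a minimal sketch (mine, not the book's) of Huffman's minimum-redundancy code (section 4-9), built with Python's heapq; the symbols and probabilities are made up for illustration:

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a binary Huffman code for a {symbol: probability} mapping.
    Returns {symbol: bit string}; the code is prefix-free by construction."""
    if len(freqs) == 1:                  # degenerate single-symbol source
        return {next(iter(freqs)): "0"}
    tiebreak = count()                   # stable ordering for equal weights
    heap = [(p, next(tiebreak), {s: ""}) for s, p in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, lo = heapq.heappop(heap)  # merge the two least probable subtrees
        p2, _, hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo.items()}
        merged.update({s: "1" + c for s, c in hi.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

# Dyadic probabilities: average code length equals the source entropy, 1.75 bits
codes = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(sorted(len(c) for c in codes.values()))   # [1, 2, 3, 3]
```

When the probabilities are negative powers of two, as here, the code meets the noiseless-coding bound exactly; in general the average length exceeds the entropy by less than one bit.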




Product Details
  • ISBN-13: 9780486682105
  • Publisher: Dover Publications Inc.
  • Publisher Imprint: Dover Publications Inc.
  • Height: 215 mm
  • No of Pages: 528
  • Spine Width: 214 mm
  • Width: 135 mm
  • ISBN-10: 0486682102
  • Publisher Date: 28 Mar 2003
  • Binding: Paperback
  • Language: English
  • Series Title: Dover Books on Mathematics
  • Weight: 526 g

