About the Book
Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Pages: 150. Chapters: Quantum computer, Nyquist-Shannon sampling theorem, Kolmogorov complexity, Entropy, Bra-ket notation, Metcalfe's law, Shannon-Hartley theorem, Nevanlinna Prize, Unicity distance, Channel, Harry Nyquist, Distributed source coding, Fisher information, MIMO, Mutual information, PU2RC, Maximum entropy thermodynamics, Multi-user MIMO, Semiotic information theory, Information algebra, Network performance, Random number generation, Kelly criterion, Algorithmic information theory, Differential entropy, Noisy-channel coding theorem, Physical information, Interaction information, 3G MIMO, Asymptotic equipartition property, Lovász number, Spectral efficiency, Theil index, Structural information theory, Compressed sensing, Principle of least privilege, Network coding, Information seeking behavior, Channel state information, Error exponent, Information flow, Spatial correlation, Information geometry, Hirschman uncertainty, Inequalities in information theory, Operator Grammar, Quantum t-design, Quantities of information, Gambling and information theory, Shannon's source coding theorem, Rate-distortion theory, Proebsting's paradox, Extreme physical information, Channel capacity, Information theory and measure theory, History of information theory, Infonomics, LIFO, Shannon index, Rényi entropy, Typical set, Conditional mutual information, Entropic gravity, Zero-forcing precoding, Timeline of information theory, Total correlation, Kullback's inequality, Linear partial information, Bandwidth extension, Information velocity, Chain rule for Kolmogorov complexity, Entropy estimation, Self-information, Minimum Fisher information, Map communication model, Oversampling, Karl Küpfmüller, Spatial multiplexing, The Use of Knowledge in Society, Constant-weight code, Redundancy, Conditional entropy, Cheung-Marks theorem, Relay...