Linear Statistical Inference and its Applications 2nd Edition
About the Book

"C. R. Rao would be found in almost any statistician's list of five outstanding workers in the world of Mathematical Statistics today. His book represents a comprehensive account of the main body of results that comprise modern statistical theory." -W. G. Cochran "[C. R. Rao is] one of the pioneers who laid the foundations of statistics which grew from ad hoc origins into a firmly grounded mathematical science." -B. Efrom Translated into six major languages of the world, C. R. Rao's Linear Statistical Inference and Its Applications is one of the foremost works in statistical inference in the literature. Incorporating the important developments in the subject that have taken place in the last three decades, this paperback reprint of his classic work on statistical inference remains highly applicable to statistical analysis. Presenting the theory and techniques of statistical inference in a logically integrated and practical form, it covers: * The algebra of vectors and matrices * Probability theory, tools, and techniques * Continuous probability models * The theory of least squares and the analysis of variance * Criteria and methods of estimation * Large sample theory and methods * The theory of statistical inference * Multivariate normal distribution Written for the student and professional with a basic knowledge of statistics, this practical paperback edition gives this industry standard new life as a key resource for practicing statisticians and statisticians-in-training.

Table of Contents:
Chapter 1: Algebra of Vectors and Matrices. 1a. Vector Spaces. 1a.1 Definition of Vector Spaces and Subspaces. 1a.2 Basis of a Vector Space. 1a.3 Linear Equations. 1a.4 Vector Spaces with an Inner Product. Complements and Problems. 1b. Theory of Matrices and Determinants. 1b.1 Matrix Operations. 1b.2 Elementary Matrices and Diagonal Reduction of a Matrix. 1b.3 Determinants. 1b.4 Transformations. 1b.5 Generalized Inverse of a Matrix. 1b.6 Matrix Representation of Vector Spaces, Bases, etc. 1b.7 Idempotent Matrices. 1b.8 Special Products of Matrices. Complements and Problems. 1c. Eigenvalues and Reduction of Matrices. 1c.1 Classification and Transformation of Quadratic Forms. 1c.2 Roots of Determinantal Equations. 1c.3 Canonical Reduction of Matrices. 1c.4 Projection Operator. 1c.5 Further Results on g-Inverse. 1c.6 Restricted Eigenvalue Problem. 1d. Convex Sets in Vector Spaces. 1d.1 Definitions. 1d.2 Separation Theorems for Convex Sets. 1e. Inequalities. 1e.1 Cauchy-Schwarz (C-S) Inequality. 1e.2 Hölder's Inequality. 1e.3 Hadamard's Inequality. 1e.4 Inequalities Involving Moments. 1e.5 Convex Functions and Jensen's Inequality. 1e.6 Inequalities in Information Theory. 1e.7 Stirling's Approximation. 1f. Extrema of Quadratic Forms. 1f.1 General Results. 1f.2 Results Involving Eigenvalues and Vectors. 1f.3 Minimum Trace Problems. Complements and Problems.
Chapter 2: Probability Theory, Tools and Techniques. 2a. Calculus of Probability. 2a.1 The Space of Elementary Events. 2a.2 The Class of Subsets (Events). 2a.3 Probability as a Set Function. 2a.4 Borel Field (σ-field) and Extension of Probability Measure. 2a.5 Notion of a Random Variable and Distribution Function. 2a.6 Multidimensional Random Variable. 2a.7 Conditional Probability and Statistical Independence. 2a.8 Conditional Distribution of a Random Variable. 2b. Mathematical Expectation and Moments of Random Variables. 2b.1 Properties of Mathematical Expectation. 2b.2 Moments. 2b.3 Conditional Expectation. 2b.4 Characteristic Function (c.f.). 2b.5 Inversion Theorems. 2b.6 Multivariate Moments. 2c. Limit Theorems. 2c.1 Kolmogorov Consistency Theorem. 2c.2 Convergence of a Sequence of Random Variables. 2c.3 Law of Large Numbers. 2c.4 Convergence of a Sequence of Distribution Functions. 2c.5 Central Limit Theorems. 2c.6 Sums of Independent Random Variables. 2d. Family of Probability Measures and Problems of Statistics. 2d.1 Family of Probability Measures. 2d.2 The Concept of a Sufficient Statistic. 2d.3 Characterization of Sufficiency. Appendix 2A. Stieltjes and Lebesgue Integrals. Appendix 2B. Some Important Theorems in Measure Theory and Integration. Appendix 2C. Invariance. Appendix 2D. Statistics, Subfields, and Sufficiency. Appendix 2E. Non-Negative Definiteness of a Characteristic Function. Complements and Problems.
Chapter 3: Continuous Probability Models. 3a. Univariate Models. 3a.1 Normal Distribution. 3a.2 Gamma Distribution. 3a.3 Beta Distribution. 3a.4 Cauchy Distribution. 3a.5 Student's t Distribution. 3a.6 Distributions Describing Equilibrium States in Statistical Mechanics. 3a.7 Distribution on a Circle. 3b. Sampling Distributions. 3b.1 Definitions and Results. 3b.2 Sum of Squares of Normal Variables. 3b.3 Joint Distribution of the Sample Mean and Variance. 3b.4 Distribution of Quadratic Forms. 3b.5 Three Fundamental Theorems of the Least Squares Theory. 3b.6 The p-Variate Normal Distribution. 3b.7 The Exponential Family of Distributions. 3c. Symmetric Normal Distribution. 3c.1 Definition. 3c.2 Sampling Distributions. 3d. Bivariate Normal Distribution. 3d.1 General Properties. 3d.2 Sampling Distributions. Complements and Problems.
Chapter 4: The Theory of Least Squares and Analysis of Variance. 4a. Theory of Least Squares (Linear Estimation). 4a.1 Gauss-Markoff Setup (Y, Xβ, σ²I). 4a.2 Normal Equations and Least Squares (l.s.) Estimators. 4a.3 g-Inverse and a Solution of the Normal Equation. 4a.4 Variances and Covariances of l.s. Estimators. 4a.5 Estimation of σ². 4a.6 Other Approaches to the l.s. Theory (Geometric Solution). 4a.7 Explicit Expressions for Correlated Observations. 4a.8 Some Computational Aspects of the l.s. Theory. 4a.9 Least Squares Estimation with Restrictions on Parameters. 4a.10 Simultaneous Estimation of Parametric Functions. 4a.11 Least Squares Theory when the Parameters Are Random Variables. 4a.12 Choice of the Design Matrix. 4b. Tests of Hypotheses and Interval Estimation. 4b.1 Single Parametric Function (Inference). 4b.2 More than One Parametric Function (Inference). 4b.3 Setup with Restrictions. 4c. Problems of a Single Sample. 4c.1 The Test Criterion. 4c.2 Asymmetry of Right and Left Femora (Paired Comparison). 4d. One-Way Classified Data. 4d.1 The Test Criterion. 4d.2 An Example. 4e. Two-Way Classified Data. 4e.1 Single Observation in Each Cell. 4e.2 Multiple but Equal Numbers in Each Cell. 4e.3 Unequal Numbers in Cells. 4f. A General Model for Two-Way Data and Variance Components. 4f.1 A General Model. 4f.2 Variance Components Model. 4f.3 Treatment of the General Model. 4g. The Theory and Application of Statistical Regression. 4g.1 Concept of Regression (General Theory). 4g.2 Measurement of Additional Association. 4g.3 Prediction of Cranial Capacity (a Practical Example). 4g.4 Test for Equality of the Regression Equations. 4g.5 The Test for an Assigned Regression Function. 4g.6 Restricted Regression. 4h. The General Problem of Least Squares with Two Sets of Parameters. 4h.1 Concomitant Variables. 4h.2 Analysis of Covariance. 4h.3 An Illustrative Example. 4i. Unified Theory of Linear Estimation. 4i.1 A Basic Lemma on Generalized Inverse. 4i.2 The General Gauss-Markoff Model (GGM). 4i.3 The Inverse Partitioned Matrix (IPM) Method. 4i.4 Unified Theory of Least Squares. 4j. Estimation of Variance Components. 4j.1 Variance Components Model. 4j.2 MINQUE Theory. 4j.3 Computation under the Euclidean Norm. 4k. Biased Estimation in Linear Models. 4k.1 Best Linear Estimator (BLE). 4k.2 Best Linear Minimum Bias Estimation (BLIMBE). Complements and Problems.
Chapter 5: Criteria and Methods of Estimation. 5a. Minimum Variance Unbiased Estimation. 5a.1 Minimum Variance Criterion. 5a.2 Some Fundamental Results on Minimum Variance Estimation. 5a.3 The Case of Several Parameters. 5a.4 Fisher's Information Measure. 5a.5 An Improvement of Unbiased Estimators. 5b. General Procedures. 5b.1 Statement of the General Problem (Bayes Theorem). 5b.2 Joint d.f. of (θ, x) Completely Known. 5b.3 The Law of Equal Ignorance. 5b.4 Empirical Bayes Estimation Procedures. 5b.5 Fiducial Probability. 5b.6 Minimax Principle. 5b.7 Principle of Invariance. 5c. Criteria of Estimation in Large Samples. 5c.1 Consistency. 5c.2 Efficiency. 5d. Some Methods of Estimation in Large Samples. 5d.1 Method of Moments. 5d.2 Minimum Chi-Square and Associated Methods. 5d.3 Maximum Likelihood. 5e. Estimation of the Multinomial Distribution. 5e.1 Nonparametric Case. 5e.2 Parametric Case. 5f. Estimation of Parameters in the General Case. 5f.1 Assumptions and Notations. 5f.2 Properties of m.l. Equation Estimators. 5g. The Method of Scoring for the Estimation of Parameters. Complements and Problems.
Chapter 6: Large Sample Theory and Methods. 6a. Some Basic Results. 6a.1 Asymptotic Distribution of Quadratic Functions of Frequencies. 6a.2 Some Convergence Theorems. 6b. Chi-Square Tests for the Multinomial Distribution. 6b.1 Test of Departure from a Simple Hypothesis. 6b.2 Chi-Square Test for Goodness of Fit. 6b.3 Test for Deviation in a Single Cell. 6b.4 Test Whether the Parameters Lie in a Subset. 6b.5 Some Examples. 6b.6 Test for Deviations in a Number of Cells. 6c. Tests Relating to Independent Samples from Multinomial Distributions. 6c.1 General Results. 6c.2 Test of Homogeneity of Parallel Samples. 6c.3 An Example. 6d. Contingency Tables. 6d.1 The Probability of an Observed Configuration and Tests in Large Samples. 6d.2 Tests of Independence in a Contingency Table. 6d.3 Tests of Independence in Small Samples. 6e. Some General Classes of Large Sample Tests. 6e.1 Notations and Basic Results. 6e.2 Test of a Simple Hypothesis. 6e.3 Test of a Composite Hypothesis. 6f. Order Statistics. 6f.1 The Empirical Distribution Function. 6f.2 Asymptotic Distribution of Sample Fractiles. 6g. Transformation of Statistics. 6g.1 A General Formula. 6g.2 Square Root Transformation of the Poisson Variate. 6g.3 sin⁻¹ Transformation of the Square Root of the Binomial Proportion. 6g.4 tanh⁻¹ Transformation of the Correlation Coefficient. 6h. Standard Errors of Moments and Related Statistics. 6h.1 Variances and Covariances of Raw Moments. 6h.2 Asymptotic Variances and Covariances of Central Moments. 6h.3 Exact Expressions for Variances and Covariances of Central Moments. Complements and Problems.
Chapter 7: Theory of Statistical Inference. 7a. Testing of Statistical Hypotheses. 7a.1 Statement of the Problem. 7a.2 Neyman-Pearson Fundamental Lemma and Generalizations. 7a.3 Simple H₀ against Simple H. 7a.4 Locally Most Powerful Tests. 7a.5 Testing a Composite Hypothesis. 7a.6 Fisher-Behrens Problem. 7a.7 Asymptotic Efficiency of Tests. 7b. Confidence Intervals. 7b.1 The General Problem. 7b.2 A General Method of Constructing a Confidence Set. 7b.3 Set Estimators for Functions of θ. 7c. Sequential Analysis. 7c.1 Wald's Sequential Probability Ratio Test. 7c.2 Some Properties of the S.P.R.T. 7c.3 Efficiency of the S.P.R.T. 7c.4 An Example of Economy of Sequential Testing. 7c.5 The Fundamental Identity of Sequential Analysis. 7c.6 Sequential Estimation. 7c.7 Sequential Tests with Power One. 7d. Problem of Identification - Decision Theory. 7d.1 Statement of the Problem. 7d.2 Randomized and Nonrandomized Decision Rules. 7d.3 Bayes Solution. 7d.4 Complete Class of Decision Rules. 7d.5 Minimax Rule. 7e. Nonparametric Inference. 7e.1 Concept of Robustness. 7e.2 Distribution-Free Methods. 7e.3 Some Nonparametric Tests. 7e.4 Principle of Randomization. 7f. Ancillary Information. Complements and Problems.
Chapter 8: Multivariate Analysis. 8a. Multivariate Normal Distribution. 8a.1 Definition. 8a.2 Properties of the Distribution. 8a.3 Some Characterizations of N_p. 8a.4 Density Function of the Multivariate Normal Distribution. 8a.5 Estimation of Parameters. 8a.6 N_p as a Distribution with Maximum Entropy. 8b. Wishart Distribution. 8b.1 Definition and Notation. 8b.2 Some Results on Wishart Distribution. 8c. Analysis of Dispersion. 8c.1 The Gauss-Markoff Setup for Multiple Measurements. 8c.2 Estimation of Parameters. 8c.3 Tests of Linear Hypotheses, Analysis of Dispersion (A.D.). 8c.4 Test for Additional Information. 8c.5 The Distribution of A. 8c.6 Test for Dimensionality (Structural Relationship). 8c.7 Analysis of Dispersion with Structural Parameters (Growth Model). 8d. Some Applications of Multivariate Tests. 8d.1 Test for Assigned Mean Values. 8d.2 Test for a Given Structure of Mean Values. 8d.3 Test for Differences between Mean Values of Two Populations. 8d.4 Test for Differences in Mean Values between Several Populations. 8d.5 Barnard's Problem of Secular Variations in Skull Characters. 8e. Discriminatory Analysis (Identification). 8e.1 Discriminant Scores for Decision. 8e.2 Discriminant Analysis in Research Work. 8e.3 Discrimination between Composite Hypotheses. 8f. Relation between Sets of Variates. 8f.1 Canonical Correlations. 8f.2 Properties of Canonical Variables. 8f.3 Effective Number of Common Factors. 8f.4 Factor Analysis. 8g. Orthonormal Basis of a Random Variable. 8g.1 The Gram-Schmidt Basis. 8g.2 Principal Component Analysis. Complements and Problems.
Publications of the Author. Author Index. Subject Index.

About the Author:
C. RADHAKRISHNA RAO is a former director at the Indian Statistical Institute and a Professor Emeritus in the Department of Statistics at Pennsylvania State University. For his academic achievements, Dr. Rao has received numerous awards. A past president of the International Statistical Institute and other leading statistical organizations, Dr. Rao has been made a Fellow of the Royal Society (U.K. Academy of Sciences), a Member of the U.S. National Academy of Sciences, a Fellow of the American Academy of Arts and Sciences, the Indian National Science Academy, and the Third World Academy of Sciences, and a Foreign Member of the Lithuanian Academy of Sciences.




Product Details
  • ISBN-13: 9780470316436
  • Publisher: John Wiley and Sons Ltd
  • Publisher Imprint: Wiley-Blackwell
  • Height: 250 mm
  • No of Pages: 656
  • Weight: 666 g
  • ISBN-10: 0470316438
  • Publisher Date: 27 May 2008
  • Binding: Other digital
  • Language: English
  • Spine Width: 15 mm
  • Width: 150 mm

