About the Book
Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online.

Pages: 106.

Chapters: Extreme value theory, Entropy, Statistical inference, Likelihood-ratio test, Bayesian inference, Statistical model, Statistical population, Statistical assumption, Maximum likelihood, Convergence of random variables, Design of experiments, Uncertainty, Accuracy and precision, Sufficient statistic, Optimal design, Kullback-Leibler divergence, Robust statistics, Window function, Principle of maximum entropy, Ergodic theory, Galton's problem, Invariant estimator, Binomial proportion confidence interval, Bias of an estimator, Behrens-Fisher problem, Fiducial inference, Sensitivity and specificity, Response surface methodology, Information geometry, Factorial experiment, Consistent estimator, Parametric model, Errors and residuals in statistics, Loss function, Edgeworth series, Fractional factorial design, Efficiency, Peirce's criterion, Efficient estimator, Asymptotic theory, Completeness, Pivotal quantity, Fisher consistency, Mathematical statistics, Recursive partitioning, Ancillary statistic, Fisher transformation, Independent and identically distributed random variables, Parameter space, Model selection, Sampling distribution, Semiparametric model, Spatial dependence, Restricted maximum likelihood, Nuisance parameter, Shrinkage estimator, Winsorising, Analytic and enumerative statistical studies, Statistical parameter, Frequency, Conditionality principle, Neutral vector, Studentization, Coherence, Exponential dispersion model, A priori probability, Berkson error model, Youden's J statistic.

Excerpt: Bayesian inference is a method of statistical inference in which evidence is used to estimate parameters and predictions in a probability model. In Bayesian inference, all uncertainty is summarized by a "posterior distribution," which is a probability distribution for all uncertain quantities, given the data and the mod...
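The excerpt's idea of a posterior distribution can be sketched with the textbook Beta-Binomial conjugate pair: a Beta prior over a coin's heads probability, updated by binomial data, yields a Beta posterior in closed form. This is a minimal illustration of Bayesian updating, not code from the book; the function names and the particular prior and data are chosen for the example.

```python
def posterior_params(a, b, k, n):
    """Beta-Binomial conjugate update.

    Given a Beta(a, b) prior on a success probability and k successes
    observed in n trials, the posterior is Beta(a + k, b + n - k).
    """
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution: a / (a + b)."""
    return a / (a + b)

# Uniform prior Beta(1, 1); observe 7 heads in 10 flips.
a_post, b_post = posterior_params(1, 1, 7, 10)
print(a_post, b_post)             # posterior is Beta(8, 4)
print(beta_mean(a_post, b_post))  # posterior mean 8/12 ≈ 0.667
```

Here the single posterior distribution Beta(8, 4) carries all remaining uncertainty about the unknown probability, exactly the role the excerpt assigns to the posterior.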