Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples (Wiley Series in Computational Statistics)

About the Book

Markov Chain Monte Carlo (MCMC) methods are now an indispensable tool in scientific computing. This book discusses recent developments of MCMC methods, with an emphasis on those that make use of past sample information during simulations. The application examples are drawn from diverse fields such as bioinformatics, machine learning, social science, combinatorial optimization, and computational physics.

Key Features:
  • Expanded coverage of the stochastic approximation Monte Carlo and dynamic weighting algorithms, which are essentially immune to local trap problems.
  • A detailed discussion of the Monte Carlo Metropolis-Hastings algorithm, which can be used for sampling from distributions with intractable normalizing constants.
  • Up-to-date accounts of recent developments of the Gibbs sampler.
  • Comprehensive overviews of the population-based MCMC algorithms and the MCMC algorithms with adaptive proposals.

This book can be used as a textbook or a reference book for a one-semester graduate course in statistics, computational biology, engineering, and computer sciences. Applied or theoretical researchers will also find this book beneficial.
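For readers new to the area, here is a minimal random-walk Metropolis-Hastings sketch in Python, the basic algorithm that the book's more advanced methods build on. It is not code from the book; the two-mode Gaussian target, the proposal scale, and the function names are illustrative assumptions.

```python
import numpy as np

def log_target(x):
    # Illustrative unnormalized log-density: an equal-weight mixture of
    # N(-2, 1) and N(2, 1). MH only needs the density up to a constant.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def random_walk_mh(log_target, x0=0.0, n_samples=10_000, step=1.0, seed=0):
    """Generic random-walk Metropolis-Hastings sampler (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()        # symmetric Gaussian proposal
        log_alpha = log_target(proposal) - log_target(x)
        if np.log(rng.uniform()) < log_alpha:     # accept with prob. min(1, alpha)
            x = proposal
        samples[i] = x
    return samples

draws = random_walk_mh(log_target)
print(draws.mean(), draws.std())
```

With a small step size such a sampler can linger in one of the two modes for long stretches; this is the "local trap" problem that the stochastic approximation Monte Carlo and dynamic weighting algorithms covered in the book are designed to overcome.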

Table of Contents:
Preface xiii
Acknowledgments xvii
Publisher’s Acknowledgments xix

1 Bayesian Inference and Markov Chain Monte Carlo 1
1.1 Bayes 1
1.1.1 Specification of Bayesian Models 2
1.1.2 The Jeffreys Priors and Beyond 2
1.2 Bayes Output 4
1.2.1 Credible Intervals and Regions 4
1.2.2 Hypothesis Testing: Bayes Factors 5
1.3 Monte Carlo Integration 8
1.3.1 The Problem 8
1.3.2 Monte Carlo Approximation 9
1.3.3 Monte Carlo via Importance Sampling 9
1.4 Random Variable Generation 10
1.4.1 Direct or Transformation Methods 1
1.4.2 Acceptance-Rejection Methods 11
1.4.3 The Ratio-of-Uniforms Method and Beyond 14
1.4.4 Adaptive Rejection Sampling 18
1.4.5 Perfect Sampling 18
1.5 Markov Chain Monte Carlo 18
1.5.1 Markov Chains 18
1.5.2 Convergence Results 20
1.5.3 Convergence Diagnostics 23
Exercises 24

2 The Gibbs Sampler 27
2.1 The Gibbs Sampler 27
2.2 Data Augmentation 30
2.3 Implementation Strategies and Acceleration Methods 33
2.3.1 Blocking and Collapsing 33
2.3.2 Hierarchical Centering and Reparameterization 34
2.3.3 Parameter Expansion for Data Augmentation 35
2.3.4 Alternating Subspace-Spanning Resampling 43
2.4 Applications 45
2.4.1 The Student-t Model 45
2.4.2 Robit Regression or Binary Regression with the Student-t Link 47
2.4.3 Linear Regression with Interval-Censored Responses 50
Exercises 54
Appendix 2A: The EM and PX-EM Algorithms 56

3 The Metropolis-Hastings Algorithm 59
3.1 The Metropolis-Hastings Algorithm 59
3.1.1 Independence Sampler 62
3.1.2 Random Walk Chains 63
3.1.3 Problems with Metropolis-Hastings Simulations 63
3.2 Variants of the Metropolis-Hastings Algorithm 65
3.2.1 The Hit-and-Run Algorithm 65
3.2.2 The Langevin Algorithm 65
3.2.3 The Multiple-Try MH Algorithm 66
3.3 Reversible Jump MCMC Algorithm for Bayesian Model Selection Problems 67
3.3.1 Reversible Jump MCMC Algorithm 67
3.3.2 Change-Point Identification 70
3.4 Metropolis-Within-Gibbs Sampler for ChIP-chip Data Analysis 75
3.4.1 Metropolis-Within-Gibbs Sampler 75
3.4.2 Bayesian Analysis for ChIP-chip Data 76
Exercises 83

4 Auxiliary Variable MCMC Methods 85
4.1 Simulated Annealing 86
4.2 Simulated Tempering 88
4.3 The Slice Sampler 90
4.4 The Swendsen-Wang Algorithm 91
4.5 The Wolff Algorithm 93
4.6 The Møller Algorithm 95
4.7 The Exchange Algorithm 97
4.8 The Double MH Sampler 98
4.8.1 Spatial Autologistic Models 99
4.9 Monte Carlo MH Sampler 103
4.9.1 Monte Carlo MH Algorithm 103
4.9.2 Convergence 107
4.9.3 Spatial Autologistic Models (Revisited) 110
4.9.4 Marginal Inference 111
4.10 Applications 113
4.10.1 Autonormal Models 114
4.10.2 Social Networks 116
Exercises 121

5 Population-Based MCMC Methods 123
5.1 Adaptive Direction Sampling 124
5.2 Conjugate Gradient Monte Carlo 125
5.3 Sample Metropolis-Hastings Algorithm 126
5.4 Parallel Tempering 127
5.5 Evolutionary Monte Carlo 128
5.5.1 Evolutionary Monte Carlo in Binary-Coded Space 129
5.5.2 Evolutionary Monte Carlo in Continuous Space 132
5.5.3 Implementation Issues 133
5.5.4 Two Illustrative Examples 134
5.5.5 Discussion 139
5.6 Sequential Parallel Tempering for Simulation of High Dimensional Systems 140
5.6.1 Build-up Ladder Construction 141
5.6.2 Sequential Parallel Tempering 142
5.6.3 An Illustrative Example: the Witch’s Hat Distribution 142
5.6.4 Discussion 145
5.7 Equi-Energy Sampler 146
5.8 Applications 148
5.8.1 Bayesian Curve Fitting 148
5.8.2 Protein Folding Simulations: 2D HP Model 153
5.8.3 Bayesian Neural Networks for Nonlinear Time Series Forecasting 156
Exercises 162
Appendix 5A: Protein Sequences for 2D HP Models 163

6 Dynamic Weighting 165
6.1 Dynamic Weighting 165
6.1.1 The IWIW Principle 165
6.1.2 Tempering Dynamic Weighting Algorithm 167
6.1.3 Dynamic Weighting in Optimization 171
6.2 Dynamically Weighted Importance Sampling 173
6.2.1 The Basic Idea 173
6.2.2 A Theory of DWIS 174
6.2.3 Some IWIWp Transition Rules 176
6.2.4 Two DWIS Schemes 179
6.2.5 Weight Behavior Analysis 180
6.2.6 A Numerical Example 183
6.3 Monte Carlo Dynamically Weighted Importance Sampling 185
6.3.1 Sampling from Distributions with Intractable Normalizing Constants 185
6.3.2 Monte Carlo Dynamically Weighted Importance Sampling 186
6.3.3 Bayesian Analysis for Spatial Autologistic Models 191
6.4 Sequentially Dynamically Weighted Importance Sampling 195
Exercises 197

7 Stochastic Approximation Monte Carlo 199
7.1 Multicanonical Monte Carlo 200
7.2 1/k-Ensemble Sampling 202
7.3 The Wang-Landau Algorithm 204
7.4 Stochastic Approximation Monte Carlo 207
7.5 Applications of Stochastic Approximation Monte Carlo 218
7.5.1 Efficient p-Value Evaluation for Resampling-Based Tests 218
7.5.2 Bayesian Phylogeny Inference 222
7.5.3 Bayesian Network Learning 227
7.6 Variants of Stochastic Approximation Monte Carlo 233
7.6.1 Smoothing SAMC for Model Selection Problems 233
7.6.2 Continuous SAMC for Marginal Density Estimation 239
7.6.3 Annealing SAMC for Global Optimization 244
7.7 Theory of Stochastic Approximation Monte Carlo 253
7.7.1 Convergence 253
7.7.2 Convergence Rate 267
7.7.3 Ergodicity and its IWIW Property 271
7.8 Trajectory Averaging: Toward the Optimal Convergence Rate 275
7.8.1 Trajectory Averaging for a SAMCMC Algorithm 277
7.8.2 Trajectory Averaging for SAMC 279
7.8.3 Proof of Theorems 7.8.2 and 7.8.3 281
Exercises 296
Appendix 7A: Test Functions for Global Optimization 298

8 Markov Chain Monte Carlo with Adaptive Proposals 305
8.1 Stochastic Approximation-Based Adaptive Algorithms 306
8.1.1 Ergodicity and Weak Law of Large Numbers 307
8.1.2 Adaptive Metropolis Algorithms 309
8.2 Adaptive Independent Metropolis-Hastings Algorithms 312
8.3 Regeneration-Based Adaptive Algorithms 315
8.3.1 Identification of Regeneration Times 315
8.3.2 Proposal Adaptation at Regeneration Times 317
8.4 Population-Based Adaptive Algorithms 317
8.4.1 ADS, EMC, NKC and More 317
8.4.2 Adaptive EMC 318
8.4.3 Application to Sensor Placement Problems 323
Exercises 324

References 327
Index 353

About the Authors:
Faming Liang, Associate Professor, Department of Statistics, Texas A&M University. Chuanhai Liu, Professor, Department of Statistics, Purdue University. Raymond J. Carroll, Distinguished Professor, Department of Statistics, Texas A&M University.

Reviews:
"The book is suitable as a textbook for one-semester courses on Monte Carlo methods, offered at the advanced postgraduate level." (Mathematical Reviews, 1 December 2012)
"Researchers working in the field of applied statistics will profit from this easy-to-access presentation. Further illustration is done by discussing interesting examples and relevant applications. The valuable reference list includes technical reports which are hard to find by searching in public databases." (Zentralblatt MATH, 2011)
"This book can be used as a textbook or a reference book for a one-semester graduate course in statistics, computational biology, engineering, and computer sciences. Applied or theoretical researchers will also find this book beneficial." (Breitbart.com: Business Wire, 1 February 2011)
"The Markov Chain Monte Carlo method has now become the dominant methodology for solving many classes of computational problems in science and technology." (SciTech Book News, December 2010)


Product Details
  • ISBN-13: 9780470748268
  • Publisher: John Wiley & Sons Inc
  • Publisher Imprint: John Wiley & Sons Inc
  • Height: 233 mm
  • No of Pages: 384
  • Returnable: N
  • Spine Width: 26 mm
  • Weight: 737 g
  • ISBN-10: 0470748265
  • Publisher Date: 16 Jul 2010
  • Binding: Hardback
  • Language: English
  • Series Title: Wiley Series in Computational Statistics
  • Sub Title: Learning from Past Samples
  • Width: 163 mm

