A First Course in Bayesian Statistical Methods

A self-contained introduction to probability, exchangeability and Bayes’ rule provides a theoretical understanding of the applied material. Numerous examples with R-code that can be run "as-is" allow the reader to perform the data analyses themselves. The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data analysis examples provides motivation for these computational methods.

This compact, self-contained introduction to the theory and application of Bayesian statistical methods is accessible to those with a basic familiarity with probability, yet allows advanced readers to grasp the principles underlying Bayesian theory and method.

Preface

This book originated from a set of lecture notes for a one-quarter graduate-level course taught at the University of Washington. The purpose of the course is to familiarize the students with the basic concepts of Bayesian theory and to quickly get them performing their own data analyses using Bayesian computational tools. The audience for this course includes non-statistics graduate students who did well in their department’s graduate-level introductory statistics courses and who also have an interest in statistics. Additionally, first- and second-year statistics graduate students have found this course to be a useful introduction to statistical modeling. Like the course, this book is intended to be a self-contained and compact introduction to the main concepts of Bayesian theory and practice. By the end of the text, readers should have the ability to understand and implement the basic tools of Bayesian statistical methods for their own data analysis purposes. The text is not intended as a comprehensive handbook for advanced statistical researchers, although it is hoped that this latter category of readers could use this book as a quick introduction to Bayesian methods and as a preparation for more comprehensive and detailed studies.

Computing

Monte Carlo summaries of posterior distributions play an important role in the way data analyses are presented in this text. My experience has been that once a student understands the basic idea of posterior sampling, their data analyses quickly become more creative and meaningful, using relevant posterior predictive distributions and interesting functions of parameters. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R-code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
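The "Computing" paragraph above summarizes the workflow the book teaches: draw samples from a posterior distribution, then summarize the draws, including posterior predictive quantities and functions of parameters. As a minimal sketch of that style in R (illustrative code written for this page, not taken from the book or its companion website; the data and prior values are invented), consider a one-parameter beta-binomial analysis:

```r
## Illustrative sketch only: a beta-binomial analysis with y successes
## in n binary trials and a conjugate beta(a, b) prior on the success
## probability theta. All numbers are invented for the example.
y <- 57; n <- 100                 # hypothetical observed data
a <- 1;  b <- 1                   # uniform beta(1, 1) prior

set.seed(1)
S <- 10000
theta <- rbeta(S, a + y, b + n - y)       # posterior is beta(a + y, b + n - y)

mean(theta)                               # Monte Carlo posterior mean
quantile(theta, c(0.025, 0.975))          # 95% posterior interval

## A function of the parameter: posterior of the log-odds
log.odds <- log(theta / (1 - theta))
quantile(log.odds, c(0.025, 0.975))

## Posterior predictive distribution of successes in 20 future trials
y.pred <- rbinom(S, size = 20, prob = theta)
mean(y.pred >= 15)                        # e.g., Pr(15 or more future successes)
```

Because the beta prior is conjugate to the binomial likelihood, the posterior here has a known closed form; the value of the sampling approach is that the same one-line summaries carry over to functions of theta and to predictive quantities for which closed forms are awkward.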
Acknowledgments

The presentation of material in this book, and my teaching style in general, have been heavily influenced by the diverse set of students taking CSSS-STAT 564 at the University of Washington. My thanks to them for improving my teaching. I also thank Chris Hoffman, Vladimir Minin, Xiaoyue Niu and Marc Suchard for their extensive comments, suggestions and corrections for this book, and to Adrian Raftery for bibliographic suggestions. Finally, I thank my wife Jen for her patience and support.

Seattle, WA
Peter Hoff
March 2009

Contents

1 Introduction and examples (p. 1)
  1.1 Introduction (p. 1)
  1.2 Why Bayes? (p. 2)
    1.2.1 Estimating the probability of a rare event (p. 3)
    1.2.2 Building a predictive model (p. 8)
  1.3 Where we are going (p. 11)
  1.4 Discussion and further references (p. 12)
2 Belief, probability and exchangeability (p. 13)
  2.1 Belief functions and probabilities (p. 13)
  2.2 Events, partitions and Bayes’ rule (p. 14)
  2.3 Independence (p. 17)
  2.4 Random variables (p. 17)
    2.4.1 Discrete random variables (p. 18)
    2.4.2 Continuous random variables (p. 19)
    2.4.3 Descriptions of distributions (p. 21)
  2.5 Joint distributions (p. 23)
  2.6 Independent random variables (p. 26)
  2.7 Exchangeability (p. 27)
  2.8 de Finetti’s theorem (p. 29)
  2.9 Discussion and further references (p. 30)
3 One-parameter models (p. 31)
  3.1 The binomial model (p. 31)
    3.1.1 Inference for exchangeable binary data (p. 35)
    3.1.2 Confidence regions (p. 41)
  3.2 The Poisson model (p. 43)
    3.2.1 Posterior inference (p. 45)
    3.2.2 Example: Birth rates (p. 48)
  3.3 Exponential families and conjugate priors (p. 51)
  3.4 Discussion and further references (p. 52)
4 Monte Carlo approximation (p. 53)
  4.1 The Monte Carlo method (p. 53)
  4.2 Posterior inference for arbitrary functions (p. 57)
  4.3 Sampling from predictive distributions (p. 60)
  4.4 Posterior predictive model checking (p. 62)
  4.5 Discussion and further references (p. 65)
5 The normal model (p. 67)
  5.1 The normal model (p. 67)
  5.2 Inference for the mean, conditional on the variance (p. 69)
  5.3 Joint inference for the mean and variance (p. 73)
  5.4 Bias, variance and mean squared error (p. 79)
  5.5 Prior specification based on expectations
Chapters: Introduction and examples; Belief, probability and exchangeability; One-parameter models; Monte Carlo approximation; The normal model; Posterior approximation with the Gibbs sampler; The multivariate normal model; Group comparisons and hierarchical modeling; Linear regression; Nonconjugate priors and Metropolis-Hastings algorithms; Linear and generalized linear mixed effects models; Latent variable methods for ordinal data.

From the reviews:

This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods, but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti’s theorem, through the nitty-gritty of Bayesian computation, to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity. (Econometrics Journal)

“Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author goes beyond a description of the MCMC algorithms and also provides insight into why the algorithms work. … I believe this text would be an excellent choice for my Bayesian class since it seems to cover a good number of introductory topics and give the student a good introduction to the modern computational tools for Bayesian inference, with illustrations using R.” (Journal of the American Statistical Association, June 2010, Vol. 105, No. 490)

“[The intended readers are] statisticians and applied scientists. The book is accessible to readers having a basic familiarity with probability theory and a grounding in statistical methods. The author has succeeded in writing an acceptable introduction to the theory and application of Bayesian statistical methods which is modern and covers both the theory and practice. … this book can be useful as a quick introduction to Bayesian methods for self study. In addition, I highly recommend this book as a text for a course in Bayesian statistics.” (Lasse Koskinen, International Statistical Review, Vol. 78 (1), 2010)

“The book under review covers a balanced choice of topics … presented with a focus on the interplay between Bayesian thinking and the underlying mathematical concepts. … the book by Peter D. Hoff appears to be an excellent choice for a main reading in an introductory course. After studying this text the student can go in a direction of his liking at the graduate level.” (Krzysztof Łatuszyński, Mathematical Reviews, Issue 2011m)

“The book is a good introductory treatment of methods of Bayes analysis. It should especially appeal to the reader who has had some statistical courses in estimation and modeling, and wants to understand the Bayesian interpretation of those methods. Also, readers who are primarily interested in modeling data and who are working in areas outside of statistics should find this to be a good reference book. … should appeal to the reader who wants to keep up with modern approaches to data analysis.” (Richard P. Heydorn, Technometrics, Vol. 54 (1), February 2012)

This book provides a compact, self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes’ rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice. Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R-code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.

Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.

Provides a nice introduction to Bayesian statistics with sufficient grounding in the Bayesian framework without being distracted by more esoteric points. The material is well organized, weaving applications, background material and discussion of computation throughout the book. R examples illustrate how the approaches work.
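Reviewers above single out the book’s treatment of Gibbs sampling and Metropolis-Hastings algorithms (see "Posterior approximation with the Gibbs sampler" and "Nonconjugate priors and Metropolis-Hastings algorithms" in the chapter list). As a hedged illustration of the flavor of such methods, and not the book’s own code, the sketch below runs a random-walk Metropolis sampler in R for the mean of a normal model with known variance; all numeric settings are invented for the example:

```r
## Illustrative sketch only: a random-walk Metropolis sampler for the
## posterior of a normal mean theta, with known variance sigma2 and a
## normal(mu0, tau20) prior. Settings are invented for the example.
set.seed(1)
y <- rnorm(25, mean = 10, sd = 2)         # fake data
sigma2 <- 4; mu0 <- 0; tau20 <- 100       # known variance, vague prior

log.post <- function(theta) {             # log posterior, up to a constant
  sum(dnorm(y, theta, sqrt(sigma2), log = TRUE)) +
    dnorm(theta, mu0, sqrt(tau20), log = TRUE)
}

S <- 10000; delta <- 1                    # chain length, proposal sd
theta <- numeric(S); theta[1] <- mean(y)  # start the chain at the sample mean
for (s in 2:S) {
  theta.star <- rnorm(1, theta[s - 1], delta)            # symmetric proposal
  log.r <- log.post(theta.star) - log.post(theta[s - 1]) # log acceptance ratio
  theta[s] <- if (log(runif(1)) < log.r) theta.star else theta[s - 1]
}

keep <- theta[-(1:1000)]                  # discard burn-in draws
mean(keep); quantile(keep, c(0.025, 0.975))
```

With a symmetric proposal the Metropolis acceptance ratio reduces to a ratio of posterior densities, which is why only differences of log.post appear; in this conjugate setup the exact posterior is also available in closed form, so the chain’s output can be checked against it.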


270 pages

 

References (104)

Publisher
Springer New York
Copyright
Copyright © Springer Basel AG
DOI
10.1007/978-0-387-92407-6
Publisher site
See Book on Publisher Site


Published: Jun 2, 2009
