A self-contained introduction to probability, exchangeability and Bayes' rule provides a theoretical understanding of the applied material. Numerous examples with R-code that can be run "as-is" allow the reader to perform the data analyses themselves. The development of Monte Carlo and Markov chain Monte Carlo methods in the context of data analysis examples provides motivation for these computational methods.

This compact, self-contained introduction to the theory and application of Bayesian statistical methods is accessible to those with a basic familiarity with probability, yet allows advanced readers to grasp the principles underlying Bayesian theory and method.

From the preface: This book originated from a set of lecture notes for a one-quarter graduate-level course taught at the University of Washington. The purpose of the course is to familiarize the students with the basic concepts of Bayesian theory and to quickly get them performing their own data analyses using Bayesian computational tools. The audience for this course includes non-statistics graduate students who did well in their department's graduate-level introductory statistics courses and who also have an interest in statistics. Additionally, first- and second-year statistics graduate students have found this course to be a useful introduction to statistical modeling. Like the course, this book is intended to be a self-contained and compact introduction to the main concepts of Bayesian theory and practice. By the end of the text, readers should have the ability to understand and implement the basic tools of Bayesian statistical methods for their own data analysis purposes. The text is not intended as a comprehensive handbook for advanced statistical researchers, although it is hoped that this latter category of readers could use this book as a quick introduction to Bayesian methods and as a preparation for more comprehensive and detailed studies.

Computing: Monte Carlo summaries of posterior distributions play an important role in the way data analyses are presented in this text. My experience has been that once a student understands the basic idea of posterior sampling, their data analyses quickly become more creative and meaningful, using relevant posterior predictive distributions and interesting functions of parameters. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models, and example R-code is provided throughout the text. Much of the example code can be run "as is" in R, and essentially all of it can be run after downloading the relevant data sets from the companion website for this book.

Acknowledgments: The presentation of material in this book, and my teaching style in general, have been heavily influenced by the diverse set of students taking CSSS-STAT 564 at the University of Washington. My thanks to them for improving my teaching. I also thank Chris Hoffman, Vladimir Minin, Xiaoyue Niu and Marc Suchard for their extensive comments, suggestions and corrections for this book, and to Adrian Raftery for bibliographic suggestions. Finally, I thank my wife Jen for her patience and support. Seattle, WA, Peter Hoff, March 2009.

Contents: Introduction and examples; Belief, probability and exchangeability; One-parameter models; Monte Carlo approximation; The normal model; Posterior approximation with the Gibbs sampler; The multivariate normal model; Group comparisons and hierarchical modeling; Linear regression; Nonconjugate priors and Metropolis-Hastings algorithms; Linear and generalized linear mixed effects models; Latent variable methods for ordinal data.

From the reviews:

"This is an excellent book for its intended audience: statisticians who wish to learn Bayesian methods. Although designed for a statistics audience, it would also be a good book for econometricians who have been trained in frequentist methods, but wish to learn Bayes. In relatively few pages, it takes the reader through a vast amount of material, beginning with deep issues in statistical methodology such as de Finetti's theorem, through the nitty-gritty of Bayesian computation, to sophisticated models such as generalized linear mixed effects models and copulas. And it does so in a simple manner, always drawing parallels and contrasts between Bayesian and frequentist methods, so as to allow the reader to see the similarities and differences with clarity." (Econometrics Journal)

"Generally, I think this is an excellent choice for a text for a one-semester Bayesian course. It provides a good overview of the basic tenets of Bayesian thinking for the common one- and two-parameter distributions and gives introductions to Bayesian regression, multivariate-response modeling, hierarchical modeling, and mixed effects models. The book includes an ample collection of exercises for all the chapters. A strength of the book is its good discussion of Gibbs sampling and Metropolis-Hastings algorithms. The author goes beyond a description of the MCMC algorithms and also provides insight into why the algorithms work. ... I believe this text would be an excellent choice for my Bayesian class since it seems to cover a good number of introductory topics and gives the student a good introduction to the modern computational tools for Bayesian inference, with illustrations using R." (Journal of the American Statistical Association, June 2010, Vol. 105, No. 490)

"The book is accessible to readers having a basic familiarity with probability theory and a grounding in statistical methods. The author has succeeded in writing an acceptable introduction to the theory and application of Bayesian statistical methods which is modern and covers both the theory and practice. ... this book can be useful as a quick introduction to Bayesian methods for self study. In addition, I highly recommend this book as a text for a course in Bayesian statistics." (Lasse Koskinen, International Statistical Review, Vol. 78 (1), 2010)

"The book under review covers a balanced choice of topics ... presented with a focus on the interplay between Bayesian thinking and the underlying mathematical concepts. ... the book by Peter D. Hoff appears to be an excellent choice for a main reading in an introductory course. After studying this text the student can go in a direction of his liking at the graduate level." (Krzysztof Łatuszyński, Mathematical Reviews, Issue 2011m)

"The book is a good introductory treatment of methods of Bayes analysis. It should especially appeal to the reader who has had some statistical courses in estimation and modeling, and wants to understand the Bayesian interpretation of those methods. Also, readers who are primarily interested in modeling data and who are working in areas outside of statistics should find this to be a good reference book. ... should appeal to the reader who wants to keep with modern approaches to data analysis." (Richard P. Heydorn, Technometrics, Vol. 54 (1), February 2012)

This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice.

Peter Hoff is an Associate Professor of Statistics and Biostatistics at the University of Washington. He has developed a variety of Bayesian methods for multivariate data, including covariance and copula estimation, cluster analysis, mixture modeling and social network analysis. He is on the editorial board of the Annals of Applied Statistics.

The book provides a nice introduction to Bayesian statistics with sufficient grounding in the Bayesian framework without being distracted by more esoteric points. The material is well organized, weaving applications, background material and computation discussions throughout, and the R examples illustrate how the approaches work in practice.
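The Monte Carlo approach to posterior summaries described above can be sketched in a few lines. The book's examples use R; as a language-neutral illustration, here is the same idea for a beta-binomial model in Python, with hypothetical data (y = 2 successes in n = 10 trials under a uniform Beta(1, 1) prior):

```python
import random

random.seed(1)

# Hypothetical data: y = 2 successes in n = 10 trials, uniform Beta(1, 1) prior.
# Conjugacy gives the posterior in closed form: Beta(1 + 2, 1 + 8) = Beta(3, 9).
samples = [random.betavariate(3, 9) for _ in range(100_000)]

# Monte Carlo approximations of posterior summaries:
post_mean = sum(samples) / len(samples)                    # exact value is 3/12 = 0.25
p_lt_half = sum(s < 0.5 for s in samples) / len(samples)   # Pr(theta < 0.5 | y)

# Equal-tailed 95% credible interval from the sample quantiles.
samples.sort()
ci = (samples[int(0.025 * len(samples))],
      samples[int(0.975 * len(samples))])
```

The appeal of the Monte Carlo method is that this same recipe, drawing from the posterior and summarizing the draws, still works for quantities such as posterior predictive probabilities or functions of parameters that have no convenient closed form.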
Published: Jun 2, 2009