Convergence of alternating Markov chains

Date
2005
Abstract
Suppose we have two Markov chains defined on the same state space. What happens if we alternate them? If they both converge to the same stationary distribution, will the chain obtained by alternating them also converge? These questions are motivated by the possible use of two different updating schemes for MCMC estimation, where much faster convergence can sometimes be achieved by alternating the two schemes than by using either one on its own.
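To make the construction concrete, here is a minimal numerical sketch (not taken from the paper) of alternating two transition kernels. It uses two hypothetical 3-state transition matrices P and Q that are both doubly stochastic, so each has the uniform distribution pi as its stationary distribution, and tracks the marginal distribution of the chain that applies P and Q in turn.

```python
import numpy as np

# Hypothetical example: two 3-state transition matrices P and Q that share
# the same stationary distribution pi (uniform here, because both matrices
# are doubly stochastic). These matrices are illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])
Q = np.array([[0.6, 0.2, 0.2],
              [0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6]])

pi = np.array([1 / 3, 1 / 3, 1 / 3])
assert np.allclose(pi @ P, pi) and np.allclose(pi @ Q, pi)

# Marginal distribution after n alternating updates P, Q, P, Q, ...
dist = np.array([1.0, 0.0, 0.0])  # start deterministically in state 0
for step in range(50):
    dist = dist @ (P if step % 2 == 0 else Q)

print("distribution after alternating updates:", dist)
print("total variation distance to pi:", 0.5 * np.abs(dist - pi).sum())
```

In this toy case the alternating chain's marginal distribution rapidly approaches pi; whether and when this holds in general is the question the paper addresses.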
Keywords
Markov chains
Citation
Jones, G., Alexander, D.L.J. (2005), Convergence of alternating Markov chains, Research Letters in the Information and Mathematical Sciences, 8, 197-202