#StackBounty: #gibbs #dirichlet-process #infinite-mixture-model Implementation of a blocked Gibbs sampler for a mixture model with a Di…

Bounty: 50

I am trying to understand and implement the blocked Gibbs sampler described on page 552 of Bayesian Data Analysis by Gelman et al. in the context of using a Dirichlet process as a prior in a mixture model. The three steps are as follows:

  1. Update $S_i \in \{1, \dots, N\}$ (cluster allocations) by multinomial sampling.
  2. Update the stick-breaking weights $V_c$, $c = 1, \dots, N - 1$, from a suitable beta distribution.
  3. Update $\theta_c^*$, $c = 1, \dots, N$ (cluster parameters), exactly as in the finite mixture model, with the parameters of unoccupied clusters ($n_c = 0$) sampled from the prior $P_0$.
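For concreteness, here is a minimal sketch of steps 1 and 2 for a univariate Gaussian mixture, assuming a truncated stick-breaking representation with $V_N = 1$; all function names, the truncation level `N`, and the concentration parameter `alpha` are illustrative, not part of the book's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

def update_allocations(y, mu, tau, V):
    """Step 1: multinomial sampling of the labels S_i.
    pi_c = V_c * prod_{l<c} (1 - V_l) are the stick-breaking weights."""
    log_pi = np.log(V) + np.concatenate(([0.0], np.cumsum(np.log1p(-V[:-1]))))
    # log N(y_i | mu_c, 1/tau_c) for every (observation, cluster) pair
    log_lik = 0.5 * np.log(tau) - 0.5 * tau * (y[:, None] - mu) ** 2
    log_p = log_pi + log_lik
    p = np.exp(log_p - log_p.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return np.array([rng.choice(len(V), p=row) for row in p])

def update_sticks(S, N, alpha):
    """Step 2: conjugate beta update of the stick-breaking weights,
    V_c ~ Beta(1 + n_c, alpha + sum_{l>c} n_l), with V_N fixed at 1."""
    n = np.bincount(S, minlength=N)
    tail = np.concatenate((np.cumsum(n[::-1])[::-1][1:], [0.0]))
    V = rng.beta(1 + n, alpha + tail)
    V[-1] = 1.0  # truncation: the last stick absorbs the remaining mass
    return V
```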

The first two steps are more or less clear. The third one, however, is not, and I would appreciate it if someone could spell it out. How exactly are the parameters updated?

For more context, if needed, I assume a Gaussian distribution for the data with an unknown mean and an unknown precision (hence, $\theta_i^* = (\mu_i, \tau_i)$) and use a Gaussian–gamma distribution as a conjugate prior (which is $P_0$).
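Under that conjugate setup, step 3 could be sketched as below. This is one standard parameterization of the normal–gamma update, not necessarily the book's: the prior is $\tau \sim \mathrm{Gamma}(a_0, b_0)$ and $\mu \mid \tau \sim \mathcal{N}(m_0, 1/(\kappa_0 \tau))$, and the hyperparameter values are illustrative. With $n_c = 0$ the posterior formulas reduce to the prior, so empty clusters are automatically resampled from $P_0$.

```python
import numpy as np

rng = np.random.default_rng(1)

def update_cluster_params(y, S, N, m0=0.0, kappa0=1.0, a0=1.0, b0=1.0):
    """Step 3: draw (mu_c, tau_c) for every cluster c = 0, ..., N-1 from the
    normal-gamma conjugate posterior given the observations assigned to c.
    Empty clusters (n_c = 0) fall back to the prior P0 automatically."""
    mu = np.empty(N)
    tau = np.empty(N)
    for c in range(N):
        yc = y[S == c]
        n = len(yc)
        ybar = yc.mean() if n > 0 else 0.0
        ss = ((yc - ybar) ** 2).sum() if n > 0 else 0.0
        # standard normal-gamma posterior hyperparameters
        kappa_n = kappa0 + n
        m_n = (kappa0 * m0 + n * ybar) / kappa_n
        a_n = a0 + n / 2.0
        b_n = b0 + 0.5 * ss + kappa0 * n * (ybar - m0) ** 2 / (2.0 * kappa_n)
        tau[c] = rng.gamma(a_n, 1.0 / b_n)  # tau_c | y ~ Gamma(a_n, rate b_n)
        mu[c] = rng.normal(m_n, 1.0 / np.sqrt(kappa_n * tau[c]))  # mu_c | tau_c, y
    return mu, tau
```

The key point is that the update is cluster-by-cluster and conditionally independent given the allocations: occupied clusters get a posterior draw using only their own data, and unoccupied ones get a fresh prior draw.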

