#StackBounty: #expectation-maximization EM Algorithm Derivation, Discrete Case

Bounty: 50

Just wanted to ask whether the following derivation is correct:

Suppose $X$ is a vector of observed random variables, $Z$ is a vector of unobserved random variables and $\theta$ is a vector of parameters. Let $S$ be the set of values $Z$ could take on; we’ll assume this is discrete. We’ll also assume the following are known:

$f(x|z,\theta)$ = the probability density function of $X$ given $Z$ and $\theta$
$P(Z=z|\theta)$ = the probability that $Z=z$ given $\theta$

The likelihood function I want is

$\prod_{z\in S}\left[f(x|z,\theta)\,P(Z=z|\theta)\right]^{1_{Z=z}}$

Taking the logarithm:

$\sum_{z\in S}1_{Z=z}\left[\log f(x|z,\theta)+\log P(Z=z|\theta)\right]$
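
(As a quick sanity check on the indicator notation: if, say, $S=\{0,1\}$ and the realized value happens to be $Z=1$, this sum reduces to $\log f(x|1,\theta)+\log P(Z=1|\theta)$, i.e. the complete-data log-likelihood of the pair $(x,Z)$.)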

Now suppose I’m performing the EM algorithm and my current estimate of $\theta$ is $\theta^{(n)}$. Then I need to apply $E_{Z|\theta^{(n)},x}$ to the above expression. This gives:

$\sum_{z\in S}P(Z=z|\theta^{(n)})\left[\log f(x|z,\theta)+\log P(Z=z|\theta)\right]$

That is the expression I need to maximize with respect to $\theta$ in the M step.
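
To make sure I understand the mechanics, here is a toy sketch of the kind of update I would end up implementing (a two-component Gaussian mixture, so $S=\{0,1\}$; the data and the names `pi`, `mu`, `sigma` are made up purely for illustration). The weights in the E step are the expectations of the indicators $1_{Z=z}$ under $E_{Z|\theta^{(n)},x}$, and the M step maximizes the resulting weighted sum over $\theta$:

```python
import numpy as np
from scipy.stats import norm

# Toy model: f(x|z, theta) is N(mu_z, sigma_z^2) and P(Z=z|theta) = pi_z, z in {0, 1}.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 300)])  # observed data

# Initial estimate theta^(0)
pi = np.array([0.5, 0.5])      # P(Z=z|theta)
mu = np.array([-1.0, 1.0])     # means of f(x|z,theta)
sigma = np.array([1.0, 1.0])   # standard deviations of f(x|z,theta)

for _ in range(50):
    # E step: expectation of the indicator 1_{Z=z} for each observation x_i
    dens = np.stack([pi[z] * norm.pdf(x, mu[z], sigma[z]) for z in (0, 1)], axis=1)
    w = dens / dens.sum(axis=1, keepdims=True)   # shape (n_obs, 2)

    # M step: maximize sum_i sum_z w[i,z] * [log f(x_i|z,theta) + log P(Z=z|theta)]
    # over theta; for this toy model the maximizers are available in closed form.
    pi = w.mean(axis=0)
    mu = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
    sigma = np.sqrt((w * (x[:, None] - mu) ** 2).sum(axis=0) / w.sum(axis=0))

print(pi, mu, sigma)
```

In this toy case the M step has closed-form updates; in general I would expect to maximize the weighted sum numerically.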

Is that correct, or did I make a mistake somewhere? Thanks!

