#StackBounty: #probability #distributions #mathematical-statistics #estimation PDF of sum of multinomial and Gaussian distributions

Bounty: 50

I have a model in which the signal $y_n \in \mathcal{R}$ (a real-valued signal) can be expressed as
\begin{align}
y_n &= s_n * h_n + v_n = \sum_{k=0}^{L-1} h_k s_{n-k} + v_n,
\label{Eq1}
\end{align}
where $*$ is the convolution operator and $v_n$ is zero-mean AWGN (this builds on an earlier question: http://dsp.stackexchange.com/questions/37698/help-in-proper-notations-and-mathematical-formulation).

The input information source $s_n$ is an i.i.d. multinomial process: let there be $m$ distinct symbols $a_1, a_2, \ldots, a_m$ in the sequence, occurring with probabilities $p_1, \ldots, p_m \in (0,1)$, respectively.
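For concreteness, here is a minimal NumPy sketch of this signal model. The alphabet `a`, probabilities `p`, channel taps `h`, noise level, and sample size are all illustrative choices, not values given in the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the question):
a = np.array([-1.0, 1.0])        # symbol alphabet a_1, ..., a_m (m = 2)
p = np.array([0.3, 0.7])         # occurrence probabilities p_1, ..., p_m
h = np.array([1.0, 0.5, 0.2])    # channel taps h_0, ..., h_{L-1} (L = 3)
sigma_v = 0.1                    # noise standard deviation
N = 1000                         # number of observations

# i.i.d. multinomial source: each s_n drawn from {a_1, ..., a_m} with probs p
s = rng.choice(a, size=N, p=p)

# y_n = sum_k h_k s_{n-k} + v_n  (convolution plus AWGN),
# with s_n taken as zero for n < 0
v = rng.normal(0.0, sigma_v, size=N)
y = np.convolve(s, h)[:N] + v
```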
Rewriting in vector form, with $\mathbf{h} = [h_0, \ldots, h_{L-1}]^T$ and $\mathbf{s}_n = [s_n, s_{n-1}, \ldots, s_{n-L+1}]^T$:
\begin{align}
y_n &= \mathbf{h}^T \mathbf{s}_n + v_n
\end{align}
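Continuing the sketch above, the vector form can be checked against the convolution form by explicitly stacking the regressors $\mathbf{s}_n = [s_n, s_{n-1}, \ldots, s_{n-L+1}]^T$ (the zero-padding convention for $n < L-1$ matches the truncated `np.convolve` output used earlier):

```python
L = len(h)

# Stack the regressor vectors s_n = [s_n, s_{n-1}, ..., s_{n-L+1}]^T,
# padding with zeros for n < L-1.
s_pad = np.concatenate([np.zeros(L - 1), s])
S = np.stack([s_pad[n:n + L][::-1] for n in range(N)])  # shape (N, L)

y_vec = S @ h + v            # y_n = h^T s_n + v_n
assert np.allclose(y_vec, y) # agrees with the convolution form
```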

The unknowns are the channel coefficients, the input sequence, the symbol probabilities, and the noise variance, so the parameter vector is $\boldsymbol{\theta} = [\mathbf{h}^T, \mathbf{s}^T, p_1, \ldots, p_m, \sigma_v^2]^T$.

Since the input is also unknown, the Fisher information must account for it as well. However, I don't know how to write the log-likelihood expression so that the Fisher information matrix includes terms for the unknown input. This is what I have tried, but I don't know if it is correct.

The conditional probability density function of $\mathbf{y}$ can be written as:
\begin{align}
P(\mathbf{y}|\boldsymbol{\theta}) &= \prod_{n=1}^{N} P(y_n|\mathbf{s}_n) \nonumber\\
&= (2\pi\sigma_v^2)^{-N/2} \exp\left(-\frac{\sum_{n=1}^{N} (y_n - \mathbf{h}^T \mathbf{s}_n)^2}{2\sigma_v^2}\right)
\label{Eq15}
\end{align}
The log-likelihood, i.e., the logarithm of the joint conditional PDF, is:
\begin{align}
F &= -\frac{N}{2} \ln(2\pi\sigma_v^2) - \frac{1}{2\sigma_v^2} \sum_{n=1}^{N} (y_n - \mathbf{h}^T \mathbf{s}_n)^2
\label{Eq16}
\end{align}
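Again continuing the sketch (with `S`, `y`, `h`, and `sigma_v` from above), this expression can be evaluated numerically as a sanity check:

```python
def log_likelihood(y, S, h, sigma_v2):
    """F = -(N/2) ln(2 pi sigma_v^2) - (1/(2 sigma_v^2)) sum_n (y_n - h^T s_n)^2."""
    N = len(y)
    resid = y - S @ h  # residuals y_n - h^T s_n
    return -0.5 * N * np.log(2 * np.pi * sigma_v2) - resid @ resid / (2 * sigma_v2)

F = log_likelihood(y, S, h, sigma_v**2)
```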

