#StackBounty: #normal-distribution #poisson-distribution #likelihood #sufficient-statistics Finding the form $g(T(\mathbf{y}), \lambda)$…

Bounty: 50

I’m studying some notes that present examples of sufficiency:

Let $Y_1, \dots, Y_n$ be i.i.d. $N(\mu, \sigma^2)$. Note that $\sum_{i = 1}^n (y_i - \mu)^2 = \sum_{i = 1}^n (y_i - \bar{y})^2 + n(\bar{y} - \mu)^2$. Hence

$$\begin{align} L(\mu, \sigma; \mathbf{y}) &= \prod_{i = 1}^n \dfrac{1}{\sqrt{2\pi \sigma^2}}e^{-\frac{1}{2\sigma^2}(y_i - \mu)^2} \\ &= \dfrac{1}{(2\pi \sigma^2)^{n/2}}e^{-\frac{1}{2\sigma^2}\sum_{i = 1}^n (y_i - \bar{y})^2}e^{-\frac{1}{2\sigma^2}n(\bar{y} - \mu)^2} \end{align}$$

From Theorem 1, it follows that $T(\mathbf{Y}) = (\bar{Y}, \sum_{i = 1}^n (Y_i - \bar{Y})^2)$ is a sufficient statistic for $(\mu, \sigma)$.
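To make sure I’m reading this correctly, here is a quick numerical sanity check I wrote (my own code, not from the notes; it assumes `numpy` and arbitrary parameter values) confirming both the sum-of-squares identity and the factorization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with arbitrary parameter values, chosen only for the check
n, mu, sigma = 10, 1.5, 2.0
y = rng.normal(mu, sigma, size=n)
ybar = y.mean()

# Sum-of-squares identity: sum (y_i - mu)^2 = sum (y_i - ybar)^2 + n (ybar - mu)^2
lhs = np.sum((y - mu) ** 2)
rhs = np.sum((y - ybar) ** 2) + n * (ybar - mu) ** 2
print(np.isclose(lhs, rhs))  # True

# Likelihood as a product of normal densities ...
L = np.prod(np.exp(-((y - mu) ** 2) / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2))

# ... equals the factorized form, which depends on the data only through
# ybar and sum (y_i - ybar)^2, with h(y) = 1
g = (
    (2 * np.pi * sigma**2) ** (-n / 2)
    * np.exp(-np.sum((y - ybar) ** 2) / (2 * sigma**2))
    * np.exp(-n * (ybar - mu) ** 2 / (2 * sigma**2))
)
print(np.isclose(L, g))  # True
```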

Theorem 1 is presented as follows:

A statistic $T(\mathbf{Y})$ is sufficient for $\theta$ if, and only if, for all $\theta \in \Theta$,

$$L(\theta; \mathbf{y}) = g(T(\mathbf{y}), \theta) \times h(\mathbf{y})$$

where the function $g(\cdot)$ depends on $\theta$ and the statistic $T(\mathbf{Y})$, while the function $h(\cdot)$ does not contain $\theta$.

Theorem 1 implies that if the likelihood $L(\theta; \mathbf{y})$ depends on the data only through $T(\mathbf{y})$, then $T(\mathbf{Y})$ is a sufficient statistic for $\theta$, with $h(\mathbf{y}) \equiv 1$.

For comparison, here is a Poisson example that I recently posted:

Let $Y_1, \dots, Y_n$ be i.i.d. $\text{Pois}(\lambda)$. Then

$$\begin{align} L(\lambda; \mathbf{y}) &= \prod_{i = 1}^n e^{-\lambda} \dfrac{\lambda^{y_i}}{y_i!} \\ &= e^{-\lambda n} \dfrac{\lambda^{\sum_{i = 1}^n y_i}}{\prod_{i = 1}^n y_i!} \\ &= g(T(\mathbf{y}), \lambda) \times h(\mathbf{y}) \end{align}$$

where $T(\mathbf{y}) = \sum_{i = 1}^n y_i$, $g(T(\mathbf{y}), \lambda) = e^{-\lambda n} \lambda^{T(\mathbf{y})}$, and $h(\mathbf{y}) = \dfrac{1}{\prod_{i = 1}^n y_i!}$.
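As with the normal example, a quick numerical check (again my own code, assuming `numpy` and an arbitrary rate) confirms that the product of Poisson pmfs equals $g(T(\mathbf{y}), \lambda) \times h(\mathbf{y})$:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

# Simulated counts with an arbitrary rate, chosen only for the check
n, lam = 8, 3.0
y = rng.poisson(lam, size=n)
T = y.sum()  # T(y) = sum of the observations

# Likelihood as a product of Poisson pmfs
L = np.prod([np.exp(-lam) * lam ** int(yi) / factorial(int(yi)) for yi in y])

# Factorized form: g depends on the data only through T(y); h is free of lambda
g = np.exp(-lam * n) * lam ** T
h = 1 / np.prod([factorial(int(yi)) for yi in y])
print(np.isclose(L, g * h))  # True
```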

There are three things that I don’t understand here:

  1. How is it that $\sum_{i = 1}^n (y_i - \mu)^2 = \sum_{i = 1}^n (y_i - \bar{y})^2 + n(\bar{y} - \mu)^2$? (The sanity check above confirms it numerically, but I don’t see the algebra.)

  2. If, for $L(\theta; \mathbf{y})$, we require the form $g(T(\mathbf{y}), \theta) \times h(\mathbf{y})$, then, for $L(\mu, \sigma; \mathbf{y})$, what form do we require? Trying to think of this myself, I thought of three potentially correct forms: $g(T(\mathbf{y}), (\mu, \sigma)) \times h(\mathbf{y})$, $g(T(\mathbf{y}), (\sigma, \mu)) \times h(\mathbf{y})$, or $g(T(\mathbf{y}), \mu, \sigma) \times h(\mathbf{y})$.

  3. Related to 2., comparing the first example to the Poisson example, I don’t understand the conclusion of the first example. How does $T(\mathbf{Y}) = (\bar{Y}, \sum_{i = 1}^n (Y_i - \bar{Y})^2)$ satisfy the form $g(T(\mathbf{y}), \lambda) \times h(\mathbf{y})$?

I would greatly appreciate it if people would please take the time to clarify these points.

