# Generalized Least Squares When the Disturbance Covariance Matrix Is Rank-Deficient

### Bounty: 50

I cannot find any general results on the following Generalized Least Squares (GLS) problem.

Let $$Y = X\beta + E$$, where $$X$$ is deterministic and of full column rank $$k$$, and $$E$$ has zero mean and an $$n \times n$$ covariance matrix $$V$$ of rank $$r < n$$. Let $$V^+$$ be the (unique) Moore-Penrose inverse of $$V$$. It is further assumed that $$X'V^+X$$ is invertible. Question: what is the best linear unbiased estimator (BLUE) of $$\beta$$?

You would think the answer must be $$(X'V^+X)^{-1}X'V^+Y$$. But that actually would be wrong.
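A small numerical sketch shows one way the candidate estimator can fail. The numbers below are hypothetical, chosen so that one observation has zero disturbance variance (an "infinitely precise" observation). Because $$V^+$$ has a zero wherever $$V$$ does, the estimator $$(X'V^+X)^{-1}X'V^+Y$$ simply drops that observation instead of fitting it exactly:

```python
import numpy as np

# Hypothetical design: row 4 is an exact (zero-variance) observation.
X = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
beta = np.array([2.0, 3.0])          # true parameter
E = np.array([0.5, -0.5, 0.2, 0.0])  # disturbance lies in col(V): E_4 = 0
Y = X @ beta + E
V = np.diag([1.0, 1.0, 1.0, 0.0])    # singular covariance, rank 3

# Candidate estimator (X' V^+ X)^{-1} X' V^+ Y
Vp = np.linalg.pinv(V)               # = diag(1, 1, 1, 0)
b = np.linalg.solve(X.T @ Vp @ X, X.T @ Vp @ Y)

# The zero in V^+ means row 4 drops out of the normal equations, so the
# fit need not reproduce y_4 even though y_4 = x_4' beta with probability one.
print(X[3] @ b - Y[3])     # nonzero: the exact restriction is violated
print(X[3] @ beta - Y[3])  # zero (up to rounding): the true beta satisfies it
```

Here the true $$\beta$$ satisfies the exact observation by construction, but the candidate estimator does not, so it cannot be BLUE in this case.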

Please note: this has nothing to do with multicollinearity. The problem arises when we have "redundant" observations and/or infinitely precise observations. For the former, we generally just drop the redundant observations. For the latter, which are essentially linear restrictions on the parameters, we generally re-parameterize to incorporate the restrictions directly. I would like to find a unified approach. My guess is that $$(X'V^+X)^{-1}X'V^+Y$$ works for the former case only.
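For comparison, one candidate unified approach in the literature (Rao's "unified theory of least squares") replaces $$V^+$$ with $$W^{-1}$$, where $$W = V + XX'$$, which is invertible whenever the columns of $$X$$ and $$V$$ together span $$\mathbb{R}^n$$. The sketch below reuses the same hypothetical numbers as above (row 4 an exact observation); whether this construction is the BLUE in full generality is of course exactly what the question asks, so treat it as an illustration rather than a proof:

```python
import numpy as np

# Same hypothetical setup: row 4 is an exact (zero-variance) observation.
X = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
beta = np.array([2.0, 3.0])
E = np.array([0.5, -0.5, 0.2, 0.0])  # disturbance lies in col(V)
Y = X @ beta + E
V = np.diag([1.0, 1.0, 1.0, 0.0])

# Rao-style weighting: W = V + X X' is positive definite here (rank 4),
# so W^{-1} exists even though V^+ does not equal any inverse of V.
W = V + X @ X.T
Wi = np.linalg.inv(W)
b = np.linalg.solve(X.T @ Wi @ X, X.T @ Wi @ Y)

# Unlike the V^+ candidate, this estimator reproduces the exact
# observation: x_4' b = y_4 up to rounding.
print(X[3] @ b - Y[3])
```

In this example the $$W$$-weighted estimator enforces the exact restriction automatically, handling the "infinitely precise observation" case without any re-parameterization.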

