#StackBounty: #regression #pca #asymptotics Is high dimensional PCA regression consistent?

Bounty: 50

Consider a set of observations $(y_i, x_i)$, $i=1,\ldots,n$. The OLS estimator, which is a $\sqrt{n}$-consistent estimator of $\beta$, is obtained as $$\hat\beta=(X^tX)^{-1}X^ty.$$

Now perform a PCA on the matrix $X\in\mathbb{R}^{n\times p}$, $n>p$. Consider the matrix of principal components $Q\in\mathbb{R}^{p\times p}$, defined so that the first principal component has the largest possible variance, and each succeeding component has the largest possible variance under the constraint that it is orthogonal to the preceding components. From an algebraic perspective, $Q$ is the matrix of eigenvectors of $X^tX$ and defines an orthogonal change of basis that maximizes the variance of $X$.

Consider $Z=XQ\in\mathbb{R}^{n\times p}$, the projection of $X$ onto the subspace generated by $Q$, and solve
$$\tilde\beta=\arg\min_b \|y-Zb\|_2^2=(Z^tZ)^{-1}Z^ty.$$
It is trivial to see that $$\hat{\beta}=Q\tilde\beta$$ is the OLS estimator, and thus it is oracle. Now my question is: what happens when $p>n$, that is, in a high-dimensional framework in which the OLS estimator cannot be computed? We can still perform a PCA and obtain a matrix of principal components, which in this case satisfies $$Q\in\mathbb{R}^{p\times n},$$ meaning that in order to explain $100\%$ of the original variability we only require $n$ PCs. Define again $$\hat{\beta}=Q\tilde\beta.$$
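The $n>p$ claim above is easy to verify numerically. Below is a minimal sketch (the data-generating setup is assumed, not from the post) checking that regressing $y$ on the full set of principal component scores $Z=XQ$ and mapping back via $Q$ recovers the OLS estimator exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 5  # assumed dimensions for illustration, n > p
X = rng.standard_normal((n, p))
beta = np.arange(1.0, p + 1)
y = X @ beta + rng.standard_normal(n)

# OLS estimator on the original design
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# PCA: the right singular vectors of X are the eigenvectors of X^t X
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Q = Vt.T                      # p x p orthogonal matrix of principal directions
Z = X @ Q                     # principal component scores
beta_tilde = np.linalg.lstsq(Z, y, rcond=None)[0]

beta_pcr = Q @ beta_tilde     # map back to the original coordinates
print(np.allclose(beta_pcr, beta_ols))  # True: Q @ tilde_beta is the OLS estimator
```

The equality holds because $Q$ is orthogonal, so $Z$ spans the same column space as $X$ and the change of basis is exactly undone by left-multiplying with $Q$.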

This $\hat\beta$ will not be the OLS estimator in the high-dimensional framework, but is it still $\sqrt{n}$-consistent?
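For what it's worth, in the $p>n$ case the estimator above has a closed form: retaining all $n$ principal components makes $Q\tilde\beta$ coincide with the minimum-norm least-squares solution $X^{+}y$, which interpolates the training data. A hedged sketch (the sparse true $\beta$ and dimensions are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 200                 # assumed high-dimensional setup, p > n
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0                 # assumed sparse truth, for illustration only
y = X @ beta + 0.1 * rng.standard_normal(n)

# Thin SVD: only n nonzero singular values, hence only n usable PCs
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Q = Vt.T                       # p x n matrix of principal directions
Z = X @ Q                      # n x n score matrix, full rank a.s.
beta_tilde = np.linalg.solve(Z.T @ Z, Z.T @ y)
beta_pcr = Q @ beta_tilde

# Q @ tilde_beta equals the Moore-Penrose pseudoinverse solution ...
print(np.allclose(beta_pcr, np.linalg.pinv(X) @ y))  # True
# ... which fits the training sample exactly (zero residuals)
print(np.allclose(X @ beta_pcr, y))                  # True
```

This identity is why the consistency question is nontrivial: interpolating the noise is exactly the behavior that standard $\sqrt{n}$-consistency arguments rule out.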
