# #StackBounty: #variance #kernel-smoothing Variance of Multivariate KDE

### Bounty: 300

I have been struggling with this for two hours now and have decided to give up. I want to compute the variance of the KDE $$\hat f_H(x) = n^{-1}\sum_{i=1}^n \det(H^{-1})\,K(H^{-1}(x - X_i)).$$
My steps:

I got to the point where

$$\begin{aligned}\frac{1}{n\det(H)^2}\,\mathbb E\big[K(H^{-1}(x - X_i))^2\big] &= \dots \\ &= \dots \\ &= \frac{1}{n\det(H)}\int K(u)^2 f(x+Hu)\,\mathrm du.\end{aligned}$$
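For reference, the elided change of variables can be written out. Substituting $y = x + Hu$ (so $\mathrm dy = \det(H)\,\mathrm du$, assuming $\det(H) > 0$ as usual for a bandwidth matrix) and using the standard symmetry assumption $K(-u) = K(u)$, one $\det(H)$ cancels, leaving a single $\det(H)$ in the denominator:

$$\begin{aligned}\frac{1}{n\det(H)^2}\,\mathbb E\big[K(H^{-1}(x - X_i))^2\big] &= \frac{1}{n\det(H)^2}\int K(H^{-1}(x - y))^2 f(y)\,\mathrm dy \\ &= \frac{1}{n\det(H)^2}\int K(-u)^2 f(x + Hu)\,\det(H)\,\mathrm du \\ &= \frac{1}{n\det(H)}\int K(u)^2 f(x + Hu)\,\mathrm du.\end{aligned}$$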
A first-order Taylor expansion of $$f(x + Hu)$$ yields (?) $$f(x + Hu) = f(x) + \nabla f(x)^\top Hu + O(\Vert Hu\Vert^2),$$
but then I got stuck because I can't get rid of the big-$O$ term. (I tried its definition and the definition of $\Vert\cdot\Vert$, which leads to a term like $$\mathrm{trace}\left(\int K(u)^2\, uu^\top\,\mathrm du\right)H^\top H,$$ but the integral inside the trace may not exist. My only assumptions are $$\int K(u)\,uu^\top\,\mathrm du = \nu I \quad\text{and}\quad \int K(u)^2\,\mathrm du = R(K).$$)
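As a sanity check on those kernel constants, here is a quick Monte Carlo sketch for the standard bivariate Gaussian kernel (the kernel choice is my assumption for illustration): since $K$ is a density, integrals against $K$ become expectations under $u \sim K$, so $\int K(u)^2\,\mathrm du = \mathbb E_{u\sim K}[K(u)]$ and so on. For this kernel the integral inside the trace does exist.

```python
import numpy as np

# Monte Carlo check of the kernel constants for the standard bivariate
# Gaussian kernel K(u) = (2*pi)^(-d/2) * exp(-||u||^2 / 2).
# (Kernel choice is an assumption for illustration; any K with finite
#  R(K) and finite second moments behaves the same way.)
rng = np.random.default_rng(0)
d, n = 2, 2_000_000

def K(u):
    """Standard d-variate Gaussian kernel density."""
    return (2 * np.pi) ** (-d / 2) * np.exp(-0.5 * np.sum(u ** 2, axis=-1))

u = rng.standard_normal((n, d))  # u ~ K, so integrals become expectations
w = K(u)

# R(K) = int K(u)^2 du = E_{u~K}[K(u)]; closed form (4*pi)^(-d/2)
R_K = w.mean()
print(R_K, (4 * np.pi) ** (-d / 2))

# int K(u) u u' du = E_{u~K}[u u'] = nu * I with nu = 1 for this kernel
second_moment = (u[:, :, None] * u[:, None, :]).mean(axis=0)
print(second_moment)

# int K(u)^2 u u' du = E_{u~K}[K(u) u u'] -- finite here (= R(K)/2 * I),
# so the integral inside the trace does exist for the Gaussian kernel
M = (w[:, None, None] * u[:, :, None] * u[:, None, :]).mean(axis=0)
print(M)
```

With two million draws the estimates match the closed forms to about three decimal places.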

I also have $$\begin{aligned}\frac{1}{n\det(H)^2}\,\mathbb E\big[K(H^{-1}(x - X_i))\big]^2 &= \dots \\ &= \dots \\ &= \frac{1}{n}\left(f(x)^2 + f(x)\,O(\Vert H\Vert^2) + O(\Vert H\Vert^2)^2\right) \\ &= \frac{1}{n}\left(f(x)^2 + O(\Vert H\Vert^2)\right)\end{aligned}$$
(which is hopefully correct). According to this page https://bookdown.org/egarpor/NP-UC3M/kde-ii-asymp.html the result should be $$\frac{R(K)}{n\det(H)}\,f(x) + o\!\left(\frac{1}{n\det(H)}\right).$$ Note that the page defines $$H$$ slightly differently, but that should be negligible.
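That target formula can also be checked by simulation. The sketch below (my choices, for illustration: $f$ the standard bivariate normal, the Gaussian kernel, $x = 0$, and a scalar bandwidth $H = hI$ with $h$ small) compares the empirical variance of $\hat f_H(x)$ over many replications against $R(K)f(x)/(n\det(H))$:

```python
import numpy as np

# Simulation check of the claimed asymptotic variance
#   Var f_hat_H(x) ~ R(K) f(x) / (n det(H)).
# Assumptions for illustration: f = standard bivariate normal,
# Gaussian kernel K, x = 0, bandwidth H = h * I (so det(H) = h^d).
rng = np.random.default_rng(1)
d, n, reps, h = 2, 100, 20_000, 0.15
H_det = h ** d
x = np.zeros(d)

def K(u):
    """Standard d-variate Gaussian kernel."""
    return (2 * np.pi) ** (-d / 2) * np.exp(-0.5 * np.sum(u ** 2, axis=-1))

# reps independent samples of size n from f, and the KDE at x for each
X = rng.standard_normal((reps, n, d))
f_hat = K((x - X) / h).mean(axis=1) / H_det

R_K = (4 * np.pi) ** (-d / 2)  # int K^2 for the Gaussian kernel
f_x = (2 * np.pi) ** (-d / 2)  # f(0) for the standard bivariate normal
asymptotic = R_K * f_x / (n * H_det)

# empirical variance across replications vs the asymptotic formula
print(f_hat.var(ddof=1), asymptotic)
```

For this bandwidth the two values agree to within roughly five percent; the gap is the $o(1/(n\det(H)))$ remainder and shrinks as $h \to 0$.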

