# #StackBounty: #regression #bayesian #gaussian-process #smoothing #semiparametric Probabilistic interpretation of Thin Plate Smoothing S…

### Bounty: 100

TLDR: Do thin plate regression splines have a probabilistic/Bayesian interpretation?

Given input-output pairs $(x_i,y_i)$, $i=1,\dots,n$, I want to estimate a function $f(\cdot)$ as follows:
\begin{equation}f(x)\approx u(x)=\phi(x)^T\beta +\sum_{i=1}^n \alpha_i k(x,x_i),\end{equation}
where $k(\cdot,\cdot)$ is a kernel function and $\phi(x)$ is a feature vector of size $m<n$. The coefficients $\alpha_i$ and $\beta_i$ can be found by solving
\begin{equation}
\min_{\alpha\in\mathbb{R}^{n},\,\beta\in\mathbb{R}^{m}}\; \|Y-\Phi\beta -K\alpha\|_{\mathbb{R}^{n}}^{2}+\lambda\,\alpha^{T}K\alpha,
\end{equation}
where the rows of $\Phi$ are given by $\phi(x_i)^T$ and, with some abuse of notation, the $(i,j)$'th entry of the kernel matrix $K$ is $k(x_i,x_j)$. This gives
\begin{equation}
\alpha^*=\lambda^{-1}(I+\lambda^{-1}K)^{-1}(Y-\Phi\beta^*),
\end{equation}
\begin{equation}
\beta^*=\left\{\Phi^T(I+\lambda^{-1}K)^{-1}\Phi\right\}^{-1}\Phi^T(I+\lambda^{-1}K)^{-1}Y.
\end{equation}
Assuming that $k(\cdot,\cdot)$ is a positive definite kernel function, this solution can be seen as the Best Linear Unbiased Predictor (BLUP) for the following Bayesian model:
\begin{equation}
y \mid \beta, h(\cdot) \sim N\left(\phi(x)^T\beta+h(x),\,\sigma^2\right),
\end{equation}
\begin{equation}
h(\cdot) \sim GP\left(0,\,\tau k(\cdot,\cdot)\right),
\end{equation}
\begin{equation}
\beta\propto 1,
\end{equation}
where $\sigma^2/\tau=\lambda$ and $GP$ denotes a Gaussian process. See, for example, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2665800/.
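(For a positive definite kernel this equivalence is easy to check numerically: the universal-kriging/BLUP predictor, built from the GLS estimate of $\beta$ under $\mathrm{Cov}(Y)=\tau K+\sigma^2 I$ and the GP conditional mean for $h$, coincides with the penalized solution. A sketch, again with an assumed squared-exponential kernel and synthetic data:)

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau, sigma2 = 25, 1.0, 0.05
lam = sigma2 / tau
x = np.sort(rng.uniform(-1, 1, n))
Y = np.sin(3 * x) + np.sqrt(sigma2) * rng.standard_normal(n)
Phi = np.column_stack([np.ones(n), x])
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.3) ** 2)   # assumed RBF kernel

# BLUP route: GLS estimate of beta under Cov(Y) = tau*K + sigma2*I,
# then the GP conditional mean of h given the residuals.
Sigma = tau * K + sigma2 * np.eye(n)
Si = np.linalg.inv(Sigma)
beta_gls = np.linalg.solve(Phi.T @ Si @ Phi, Phi.T @ Si @ Y)
u_blup = Phi @ beta_gls + tau * K @ Si @ (Y - Phi @ beta_gls)

# Penalized route: the closed-form alpha*, beta* from the text
A = np.linalg.inv(np.eye(n) + K / lam)
beta = np.linalg.solve(Phi.T @ A @ Phi, Phi.T @ A @ Y)
alpha = A @ (Y - Phi @ beta) / lam
u_pen = Phi @ beta + K @ alpha

print(np.max(np.abs(u_blup - u_pen)))  # ~0: the two predictors coincide
```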

My question is as follows. Suppose that I let $k(x,x'):=|x-x'|^2 \ln(|x-x'|)$ and $\phi(x)^T=(1,x)$, i.e. thin plate spline regression. Now $k(\cdot,\cdot)$ is not a positive semidefinite function, so the above interpretation does not work. Do the above model and its solution still have a probabilistic interpretation, as in the case where $k(\cdot,\cdot)$ is positive semidefinite?
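(The premise is easy to verify numerically: the thin plate kernel matrix $K$ is indefinite, but, as far as I understand, $r^2\ln r$ is conditionally positive semidefinite of order 2, i.e. $v^TKv\ge 0$ for every $v$ with $\Phi^Tv=0$, where $\Phi$ has rows $(1,x_i)$. A small sketch with assumed 1-D inputs:)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
x = np.sort(rng.uniform(0, 1, n))
r = np.abs(x[:, None] - x[None, :])

# Thin plate kernel k(x,x') = |x-x'|^2 ln|x-x'|, with the limit 0 on the diagonal
with np.errstate(divide="ignore", invalid="ignore"):
    K = np.where(r > 0, r**2 * np.log(r), 0.0)
Phi = np.column_stack([np.ones(n), x])

# K itself is indefinite (trace is zero, so it has negative eigenvalues)
eig_K = np.linalg.eigvalsh(K)
print(eig_K.min())  # negative

# Restricted to { v : Phi^T v = 0 }, the quadratic form is non-negative
Q, _ = np.linalg.qr(Phi)
P = np.eye(n) - Q @ Q.T          # projector onto the orthogonal complement of col(Phi)
eig_PKP = np.linalg.eigvalsh(P @ K @ P)
print(eig_PKP.min())  # ~0, i.e. non-negative up to roundoff
```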
