I’m interested in how we evaluate the performance of Bayesian regression models (linear, multiple, logistic, etc.). The posterior distribution captures the relative plausibility of every parameter combination, so a 2D heatmap of, say, the coefficients B1 and B2 might give us some insight into their relationship.
Recently, a colleague of mine mentioned that the posterior’s covariance matrix is effectively "all you need." My question: is this oversimplifying the matter, and (even if so) what does the posterior covariance matrix actually tell you?
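To make the setup concrete, here is a small sketch of what I mean. It fits a conjugate Bayesian linear regression (known noise scale, Gaussian prior) on simulated data and prints the 2x2 posterior covariance of B1 and B2 — the data, prior scale, and noise level are all made-up values for illustration, not part of any real analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = 1.5*x1 - 0.8*x2 + noise (illustrative values only)
n = 200
X = rng.normal(size=(n, 2))
beta_true = np.array([1.5, -0.8])
sigma = 1.0  # assumed known noise standard deviation
y = X @ beta_true + rng.normal(scale=sigma, size=n)

# Conjugate posterior with prior beta ~ N(0, tau^2 I):
#   posterior covariance = (X'X / sigma^2 + I / tau^2)^{-1}
#   posterior mean       = post_cov @ X'y / sigma^2
tau = 10.0  # weak prior scale (an assumption)
post_cov = np.linalg.inv(X.T @ X / sigma**2 + np.eye(2) / tau**2)
post_mean = post_cov @ (X.T @ y) / sigma**2

print(post_mean)  # point estimates for B1, B2
print(post_cov)   # diagonal: marginal variances; off-diagonal: Cov(B1, B2)
```

A heatmap of the bivariate normal density with this mean and covariance would be exactly the B1-vs-B2 picture I described above.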
My guesses are:
(1) Along the diagonal you get each parameter’s marginal variance. The lower the value, the more confidence we have in the estimate, whereas high variance indicates that we’re less confident in our estimate.
(2) Covariance between parameters might be trickier to interpret. The sign (+/-) of the covariance indicates the nature of the relationship: is an increase in one parameter associated with an increase, a decrease, or neither in the other?
(3) The magnitude of the covariance gives me pause. Does a small value imply high confidence in the relationship, or little to no association? (Very different meanings!)
(4) I can imagine a situation where the variance of B1 is quite small (so we’re confident in that estimate) while the variance of B2 is rather large (so we’re less confident). I’m not sure how this asymmetry would affect our interpretation of the covariance’s direction and magnitude.
*All the above assumes a proper analysis: no multicollinearity, collider bias, etc.
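To illustrate why guess (3) worries me, here is a toy comparison. The two hypothetical posterior covariance matrices below are invented numbers, chosen so that the raw off-diagonal entries are identical; normalizing each into a correlation matrix (dividing by the marginal standard deviations) separates "small covariance because the marginals are tight" from "small covariance because there is little association":

```python
import numpy as np

# Two hypothetical posterior covariance matrices with the SAME off-diagonal
# value of 0.005 (numbers are made up purely to illustrate guess 3).
tight = np.array([[0.01, 0.005],
                  [0.005, 0.01]])  # small marginal variances
loose = np.array([[1.0, 0.005],
                  [0.005, 1.0]])   # large marginal variances

def posterior_correlation(cov):
    """Normalize a covariance matrix into a correlation matrix."""
    sd = np.sqrt(np.diag(cov))
    return cov / np.outer(sd, sd)

# Same covariance, very different implied association between the parameters:
print(posterior_correlation(tight)[0, 1])  # 0.5   -> strong association
print(posterior_correlation(loose)[0, 1])  # 0.005 -> near-zero association
```

So at least part of my confusion seems to be that the raw covariance mixes scale and association, which the correlation disentangles.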