#StackBounty: #estimation #mutual-information Recommended Mutual Information Estimator for Continuous Variable

Bounty: 50

The mutual information seems to be quite an interesting measure of the relationship between variables. As such, I wanted to apply it to investigate the relationship of two continuous variables $$X$$ and $$Y$$ for which I only have a hundred observations. In particular, I would like to obtain a normalized version of the mutual information that equals $$1$$ in the case of perfect dependence. I guess this means that the entropies of $$X$$ and $$Y$$ also need to be estimated.
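To make the normalization concrete: one common convention is $$NMI = I(X;Y) / \sqrt{H(X)H(Y)}$$, which reaches $$1$$ when one variable determines the other. Below is a minimal sketch of this using a simple equal-width histogram (plug-in) estimator; the function name `normalized_mi` and the choice of 10 bins are my own illustrative assumptions, not a recommendation from the literature, and histogram estimators are known to be biased for small samples like $$n = 100$$.

```python
import math

def normalized_mi(xs, ys, bins=10):
    """Plug-in estimate of NMI = I(X;Y) / sqrt(H(X) * H(Y)) using an
    equal-width joint histogram. Hypothetical sketch: bin count and
    normalization convention are illustrative choices."""
    n = len(xs)

    def bin_index(v, lo, hi):
        # Map a value into [0, bins - 1]; the maximum lands in the last bin.
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)

    xlo, xhi = min(xs), max(xs)
    ylo, yhi = min(ys), max(ys)
    joint, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        i, j = bin_index(x, xlo, xhi), bin_index(y, ylo, yhi)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1

    # I(X;Y) = sum p(i,j) * log( p(i,j) / (p(i) p(j)) ), in nats.
    mi = sum(c / n * math.log(c * n / (px[i] * py[j]))
             for (i, j), c in joint.items())
    # Marginal entropies H(X), H(Y) from the same binning.
    hx = -sum(c / n * math.log(c / n) for c in px.values())
    hy = -sum(c / n * math.log(c / n) for c in py.values())
    return mi / math.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0
```

With `ys = xs` (perfect dependence) this returns exactly $$1$$, since the joint histogram collapses onto the diagonal and $$I(X;Y) = H(X) = H(Y)$$.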

After doing some research, I realized that estimating the (unnormalized) mutual information of two continuous variables is highly nontrivial, and as a result multiple competing approaches exist. https://journals.aps.org/pre/pdf/10.1103/PhysRevE.76.026209 provides an overview of some of them, compares them under different settings, and makes recommendations on when to use which approach. However, that paper is already 12 years old, and new estimators have been developed since then, for example https://arxiv.org/pdf/1801.04062.pdf. So, is anybody active in this field who can recommend which estimator is currently preferred (and in which situation)? Ideally, I would also like to obtain a confidence interval for the normalized mutual information.
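On the confidence-interval part of the question: a generic (if imperfect) route is a percentile bootstrap around whatever point estimator one settles on. The sketch below takes the estimator as a callable argument, so it is agnostic to which method is chosen; the function name `bootstrap_ci` and all parameters are illustrative assumptions. One caveat worth hedging on: resampling with replacement creates duplicate points, which can noticeably bias nearest-neighbour-based estimators, so for those a subsampling scheme (without replacement) may be safer than the plain bootstrap shown here.

```python
import random

def bootstrap_ci(xs, ys, estimator, n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for any statistic
    computed as estimator(xs, ys). Illustrative sketch only: for
    kNN-style MI estimators, duplicates from resampling with
    replacement can bias the result."""
    rng = random.Random(seed)
    n = len(xs)
    stats = []
    for _ in range(n_boot):
        # Resample index pairs with replacement, keeping (x, y) aligned.
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(estimator([xs[i] for i in idx],
                               [ys[i] for i in idx]))
    stats.sort()
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[min(int((1 - alpha / 2) * n_boot), n_boot - 1)]
    return lo, hi
```

For example, plugging in any normalized-MI estimator as `estimator` yields an approximate 95% interval from the 2.5th and 97.5th percentiles of the bootstrap replicates.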