# #StackBounty: #information-theory #mutual-information A measure of redundancy in mutual information

### Bounty: 100

Mutual information quantifies to what degree $X$ decreases the uncertainty about $Y$. However, to my understanding, it does not quantify "in how many ways" $X$ decreases the uncertainty. E.g., consider the case where $X$ is a 3D vector, and compare $X_1=[Y,0,0]$ with $X_2 = [Y, Y^2, 3.5Y]$. Intuitively, $X_2$ contains "more information" about $Y$, or is more redundant with respect to $Y$, than $X_1$; but if I understand correctly, both have the same mutual information. Is there an alternative information-theoretic measure that can quantify this difference?
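To make the "same mutual information" claim concrete, here is a minimal numerical sketch with a plug-in estimator for discrete samples. It assumes $Y$ is discrete and uniform on $\{1,2,3,4\}$, and scales $3.5Y$ to $35Y$ to keep values integer (any injective map of $Y$ yields the same joint counts, hence the same estimate):

```python
import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        mi += p * np.log2(p / ((px[x] / n) * (py[y] / n)))
    return mi

rng = np.random.default_rng(0)
y = rng.integers(1, 5, size=100_000)  # Y uniform on {1,2,3,4}, so H(Y) = 2 bits

# X1 = [Y, 0, 0] vs. X2 = [Y, Y^2, 35*Y]; both determine and are determined by Y.
x1 = [tuple(v) for v in np.stack([y, 0 * y, 0 * y], axis=1)]
x2 = [tuple(v) for v in np.stack([y, y ** 2, 35 * y], axis=1)]

print(mutual_information(x1, list(y)))  # close to 2 bits
print(mutual_information(x2, list(y)))  # identical estimate
```

Because each $X_i$ is a bijective function of $Y$ here, both joint distributions are relabelings of the same one, so the two estimates coincide exactly, which is precisely why mutual information alone cannot distinguish the two cases.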

Thanks!

