Untangling aleatoric and epistemic uncertainty
Insightful post! One question I had: why quantify aleatoric uncertainty via variance instead of, e.g., entropy? Is there a principled argument, or is it just a matter of modeling preference?
Thanks!
As a statistician, variance seems to me a "natural" choice when you frame aleatoric uncertainty as the conditional probabilistic model P(Y|X=x).
But entropy should also be a sensible option.
Makes sense. Are they monotonically correlated? If a distribution has higher variance, does it also have higher entropy?
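For what it's worth, a quick numerical check (my own sketch, not from the post) suggests the two orderings need not agree across distribution families. A Uniform(-2, 2) has higher variance than a standard Gaussian but lower differential entropy:

```python
import math

# Closed-form differential entropies (in nats):
#   N(0, sigma^2):  h = 0.5 * ln(2 * pi * e * sigma^2)
#   Uniform(-c, c): h = ln(2c), with variance c^2 / 3

var_gauss = 1.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * var_gauss)  # ~1.419 nats

c = 2.0
var_unif = c**2 / 3        # ~1.333 > var_gauss
h_unif = math.log(2 * c)   # ~1.386 < h_gauss

# Higher variance, yet lower entropy: the orderings disagree.
assert var_unif > var_gauss
assert h_unif < h_gauss
```

Within a single location-scale family (e.g., all Gaussians) entropy is monotone in variance, but across families it is not: the Gaussian maximizes entropy among all distributions with a given variance, so a non-Gaussian can have more variance and still less entropy.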