Once we have modeled the PDF of some random variable, we might want to quantify how certain we are about the PDF's parameters.

There are two common ways to do this:

Shannon Information: the information content (surprisal) of observing an outcome x is I(x) = -log p(x); rare outcomes carry more information than common ones.
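A minimal sketch of the surprisal -log2 p(x), using base-2 logs so the result is in bits; the function name is mine.

```python
import math

def surprisal(p):
    """Shannon information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

# Rare outcomes carry more information than common ones.
print(surprisal(0.5))    # a fair coin flip: 1 bit
print(surprisal(0.125))  # a 1-in-8 outcome: 3 bits
```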

Mutual Information: I(X; Y) = sum over x, y of p(x, y) log [ p(x, y) / (p(x) p(y)) ], which measures how much knowing one variable tells us about the other.

When X and Y are independent, then I(X; Y) = 0.

Any dependence between X and Y makes I(X; Y) greater than 0.
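The two cases above can be checked numerically. This is a sketch for discrete distributions (the function name and the joint-table representation are mine, not from the notes): an independent joint gives 0, a perfectly dependent one gives 1 bit.

```python
import math

def mutual_information(joint):
    """I(X; Y) in bits for a discrete joint distribution.

    joint[i][j] = p(x_i, y_j); rows must sum to the marginal p(x_i).
    """
    px = [sum(row) for row in joint]            # marginal p(x)
    py = [sum(col) for col in zip(*joint)]      # marginal p(y)
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # 0 * log 0 is taken as 0
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Independent: joint is the outer product of the marginals -> I(X; Y) = 0
independent = [[0.25, 0.25],
               [0.25, 0.25]]

# Fully dependent: X and Y are always equal -> I(X; Y) = 1 bit
dependent = [[0.5, 0.0],
             [0.0, 0.5]]

print(mutual_information(independent))  # 0.0
print(mutual_information(dependent))    # 1.0
```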

Together, these compare the joint probability p(x, y) against the product of the marginals p(x) p(y): mutual information is zero exactly when the joint distribution factorizes.