Renormalized Mutual Information for Extraction of Continuous Features

05/04/2020
by Leopoldo Sarra, et al.

We derive a well-defined renormalized version of mutual information that allows one to estimate the dependence between continuous random variables in the important case in which one is a deterministic function of the other. This is the situation relevant for feature extraction and for information processing in artificial neural networks. In basic examples, we illustrate how the renormalized mutual information can be used not only to compare the usefulness of different ansatz features, but also to automatically extract optimal features of a system in an unsupervised dimensionality-reduction scenario.
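The problem the abstract refers to can be seen directly in a standard histogram (plug-in) estimate of mutual information: when one variable is a deterministic function of the other, the estimate diverges as the binning grid is refined, so ordinary mutual information is ill-defined in this setting. The following minimal sketch (not from the paper; the function name and the choice of feature `y = x**2` are illustrative assumptions) demonstrates this divergence:

```python
import numpy as np

def binned_mutual_information(x, y, bins):
    """Plug-in estimate of I(X;Y) in nats from a 2D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                       # joint distribution
    px = pxy.sum(axis=1, keepdims=True)         # marginal of X
    py = pxy.sum(axis=0, keepdims=True)         # marginal of Y
    mask = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100_000)
y = x**2  # a deterministic feature of x (illustrative choice)

# With a deterministic dependence, the estimate keeps growing as the
# grid is refined (roughly like log(bins)) instead of converging:
for bins in (8, 32, 128):
    print(bins, binned_mutual_information(x, y, bins))
```

This unbounded growth is what motivates a renormalized quantity that stays finite for deterministic features and can therefore be compared across candidate features.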
