A summary of the notes I have taken while studying the intersection between these two topics
NOTE: This is still a work in progress, but please follow along if you are interested in the topic
This section is intended as a guide to reading these notes more effectively
Introduction to the connection between Riemannian Geometry and Information Theory (Information Geometry)
Let's take the perspective of Information Geometry to build a connection between Probability Density Functions (PDFs) and Riemannian Manifolds. The idea is to work on statistics-related questions in the geometric domain, so we first have to build a bridge between the two domains
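A standard way to make this bridge concrete is to view a parametric family of PDFs as a manifold whose points are the individual distributions and whose coordinates are the parameters; for instance, univariate Gaussians form a 2-dimensional manifold:

$$
\mathcal{M} = \{\, p_\theta \mid \theta \in \Theta \subseteq \mathbb{R}^n \,\}, \qquad \text{e.g. } p_\theta(x) = \mathcal{N}(x \mid \mu, \sigma^2), \quad \theta = (\mu, \sigma)
$$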
The next step is to define a proper metric in the geometric domain, i.e., an inner product on each tangent space of the manifold
Let's recall that an inner product will give us notions of length and angle for tangent vectors, and hence, by integrating along curves, a notion of distance between points of the manifold
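Concretely, given a metric tensor $G(\theta)$ (a positive-definite matrix varying smoothly with the point), the inner product, the norm, and the length of a curve $\gamma$ are

$$
\langle u, v \rangle_\theta = u^\top G(\theta)\, v, \qquad \|u\|_\theta = \sqrt{\langle u, u \rangle_\theta}, \qquad L(\gamma) = \int_0^1 \sqrt{\dot\gamma(t)^\top G(\gamma(t))\, \dot\gamma(t)}\, dt
$$

and the distance between two points is the length of the shortest curve (geodesic) joining them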
In the PDF domain, a commonly used similarity measure between PDFs is the Kullback–Leibler (KL) divergence; what can the related one be in the geometric domain?
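For reference, the KL divergence between two PDFs $p$ and $q$ is

$$
D_{\mathrm{KL}}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx
$$

note that it is not symmetric and does not satisfy the triangle inequality, which is why it is only a pseudo-distance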
Actually, it is worth observing that the Fisher Information Matrix can be interpreted as the curvature (Hessian) of the relative entropy between a pair of PDFs that are infinitesimally close (for more details see here), so choosing it as the Riemannian metric for our Manifolds builds a solid connection with the statistical side, and it also upgrades the KL divergence from a pseudo-distance to an actual distance: the geodesic distance induced by the Fisher metric (the Fisher–Rao distance) is symmetric and satisfies the triangle inequality
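A minimal numerical sketch of this interpretation for the Gaussian family (assuming NumPy is available; the closed-form KL between univariate Gaussians is standard, while the helper names below are mine): the check confirms the second-order expansion $D_{\mathrm{KL}}(p_\theta \,\|\, p_{\theta + d\theta}) \approx \tfrac{1}{2}\, d\theta^\top F(\theta)\, d\theta$, i.e. that the Hessian of the KL divergence at $d\theta = 0$ matches the Fisher Information Matrix $F(\theta)$:

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """Closed-form KL divergence D_KL(N(mu0, s0^2) || N(mu1, s1^2))."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

def fisher_gauss(mu, s):
    """Fisher Information Matrix of N(mu, s^2) in (mu, s) coordinates."""
    return np.diag([1.0 / s**2, 2.0 / s**2])

theta = np.array([0.0, 1.0])  # base point (mu, sigma) on the manifold
eps = 1e-3

# Numerical Hessian of d -> D_KL(p_theta || p_{theta + d}) at d = 0,
# via central finite differences.
H = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        def f(di, dj):
            d = np.zeros(2)
            d[i] += di
            d[j] += dj
            return kl_gauss(theta[0], theta[1], theta[0] + d[0], theta[1] + d[1])
        H[i, j] = (f(eps, eps) - f(eps, -eps) - f(-eps, eps) + f(-eps, -eps)) / (4 * eps**2)

print(np.round(H, 4))        # ~ [[1. 0.] [0. 2.]]
print(fisher_gauss(*theta))  # exactly [[1. 0.] [0. 2.]]
```

The two printed matrices agree (up to finite-difference error), which is the local sense in which the Fisher matrix is the natural Riemannian metric induced by the KL divergence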