The Maximum Entropy of a Metric Space
Abstract

We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These generalize the Shannon and Rényi entropies of information theory. We prove that on any space X, there is a single probability measure maximizing all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows, and its asymptotics determine geometric information about X, including the volume and dimension. The large-scale limit of the maximizing measure itself provides an answer to the question: what is the canonical measure on a metric space? Primarily, we work not with entropy itself but with its exponential, which in its finite form is already in use as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.
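To make the finite case concrete, here is a minimal sketch (not from the paper; the function name is illustrative) of the exponentials of the Rényi entropies, often called Hill numbers in ecology, for a finite probability distribution. On a uniform distribution over n points, every order gives the value n, illustrating how one measure can maximize all the entropies simultaneously in the simplest setting.

```python
import math

def hill_number(p, q):
    """Exponential of the Rényi entropy of order q for a finite distribution p.

    D_q(p) = (sum_i p_i^q)^(1/(1-q)) for q != 1;
    the limit as q -> 1 is exp(Shannon entropy).
    """
    p = [x for x in p if x > 0]  # zero-probability classes contribute nothing
    if q == 1:
        return math.exp(-sum(x * math.log(x) for x in p))
    return sum(x ** q for x in p) ** (1.0 / (1.0 - q))

# The uniform distribution over 4 points has every Hill number equal to 4,
# so it maximizes all orders q at once (the similarity-free finite case).
uniform = [0.25] * 4
for q in (0, 1, 2):
    print(q, hill_number(uniform, q))
```

The q = 0 value counts the support, and larger q discounts rare classes more heavily; the paper's entropies refine this picture by also weighting points according to their similarity.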