Detecting independence of random vectors I. Generalized distance covariance and Gaussian covariance
Distance covariance is a quantity for measuring the dependence of two random vectors. We show that the original concept introduced and developed by Székely, Rizzo and Bakirov can be embedded into a more general framework based on symmetric Lévy measures and the corresponding real-valued continuous negative definite functions. The Lévy measures replace the weight functions used in the original definition of distance covariance. All essential properties of distance covariance are preserved in this new framework and some proofs are streamlined. From a practical point of view, this allows less restrictive moment conditions on the underlying random variables and one can use distance functions other than the Euclidean distance, e.g. the Minkowski distance. Most importantly, it serves as the basic building block for distance multivariance, a quantity to measure and estimate dependence of multiple random vectors, which is introduced in the companion paper to the present article [Detecting independence of random vectors II: Distance multivariance and Gaussian multivariance].
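To illustrate the generalization described above, the following is a sketch of the two definitions under assumed notation (not fixed in this abstract): $f_X$, $f_Y$, $f_{(X,Y)}$ denote characteristic functions, $c_p$, $c_q$ the normalizing constants of the original definition, and $\rho$, $\mu$ symmetric Lévy measures on $\mathbb{R}^p$ and $\mathbb{R}^q$. The original distance covariance of Székely, Rizzo and Bakirov reads
\[
\mathcal{V}^2(X,Y)=\frac{1}{c_p c_q}\int_{\mathbb{R}^p\times\mathbb{R}^q}\frac{\bigl|f_{(X,Y)}(s,t)-f_X(s)\,f_Y(t)\bigr|^2}{|s|^{p+1}\,|t|^{q+1}}\,\mathrm{d}s\,\mathrm{d}t,
\]
and the generalized framework replaces the fixed weights $c_p^{-1}|s|^{-(p+1)}\mathrm{d}s$ and $c_q^{-1}|t|^{-(q+1)}\mathrm{d}t$ by the Lévy measures,
\[
V^2(X,Y)=\int_{\mathbb{R}^p\times\mathbb{R}^q}\bigl|f_{(X,Y)}(s,t)-f_X(s)\,f_Y(t)\bigr|^2\,\rho(\mathrm{d}s)\,\mu(\mathrm{d}t),
\]
so that the classical case corresponds to one particular choice of $\rho$ and $\mu$.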