EPL, 88 (2009) 68003
DOI: 10.1209/0295-5075/88/68003

Volume 88, Number 6, December 2009 - Article Number 68003 - 6 pages
Section: Interdisciplinary Physics and Related Areas of Science and Technology

An information-theoretic approach to statistical dependence: Copula information

R. S. Calsaverini1 and R. Vicente2

1   Departamento de Física Geral, Instituto de Física, Universidade de São Paulo, CP 66318, 05315-970 São Paulo-SP, Brazil
2   Complex Systems, Escola de Artes, Ciências e Humanidades, Universidade de São Paulo, 03828-020 São Paulo-SP, Brazil

received 23 September 2009; accepted in final form 20 November 2009; published online 22 December 2009

We discuss the connection between information theory and copula theory by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The notion of marginal-invariant dependence measures is also discussed and used to show that the empirical linear correlation underestimates the amplitude of the actual correlation when the marginals are non-Gaussian. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework with a financial data set.
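As a minimal sketch of the decomposition mentioned above, stated in standard copula notation that need not match the paper's own symbols: by Sklar's theorem an absolutely continuous bivariate density factorizes as

p(x,y) = p_X(x)\, p_Y(y)\, c\bigl(F_X(x), F_Y(y)\bigr),

where F_X and F_Y are the marginal distribution functions and c is the copula density. Taking the expectation of -\log p(X,Y) term by term gives

H(X,Y) = H(X) + H(Y) + H_c, \qquad H_c = -\int_0^1\!\!\int_0^1 c(u,v)\,\log c(u,v)\,\mathrm{d}u\,\mathrm{d}v,

so the joint entropy splits into purely marginal terms and a dependence term carried by the copula alone, and the mutual information is I(X,Y) = -H_c, i.e. minus the copula entropy.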

PACS:
89.70.Cf - Entropy and other measures of information
89.75.-k - Complex systems
89.65.Gh - Economics; econophysics, financial markets, business and management

© EPLA 2009