Volume 88, Number 6, December 2009
Number of pages: 6
Section: Interdisciplinary Physics and Related Areas of Science and Technology
Published online: 22 December 2009
An information-theoretic approach to statistical dependence: Copula information
1 Departamento de Física Geral, Instituto de Física, Universidade de São Paulo, CP 66318, 05315-970, São Paulo-SP, Brazil
2 Complex Systems, Escola de Artes, Ciências e Humanidades, Universidade de São Paulo 03828-020, São Paulo-SP, Brazil
Corresponding author: email@example.com
Accepted: 20 November 2009
We discuss the connection between information and copula theories by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The idea of marginal-invariant dependence measures is also discussed and used to show that the empirical linear correlation underestimates the amplitude of the actual correlation in the case of non-Gaussian marginals. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework with a financial data set.
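Two claims in the abstract can be checked numerically: for a Gaussian copula with parameter ρ the mutual information (the dependence part of the entropy decomposition) has the closed form I = −½ ln(1 − ρ²), and replacing Gaussian marginals with heavy-tailed ones leaves the copula — and hence the mutual information — unchanged while shrinking the empirical Pearson correlation. The sketch below, which is an illustration assuming a Gaussian copula with lognormal marginals (not the paper's financial data), demonstrates both points:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # Gaussian-copula parameter (assumed for illustration)
n = 200_000      # sample size

# Sample from a bivariate Gaussian; its copula is the Gaussian copula with parameter rho.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

# Mutual information of a Gaussian copula (closed form): I = -1/2 * ln(1 - rho^2).
mi = -0.5 * np.log(1.0 - rho**2)

# Monotone marginal transforms change the marginals but not the copula,
# so the dependence component (mutual information) is invariant.
x, y = np.exp(z[:, 0]), np.exp(z[:, 1])   # lognormal (non-Gaussian) marginals

pearson_gaussian = np.corrcoef(z[:, 0], z[:, 1])[0, 1]
pearson_lognormal = np.corrcoef(x, y)[0, 1]

print(f"copula mutual information   : {mi:.3f}")
print(f"Pearson, Gaussian marginals : {pearson_gaussian:.3f}")
print(f"Pearson, lognormal marginals: {pearson_lognormal:.3f}")
```

With these parameters the lognormal-marginal Pearson coefficient comes out noticeably below ρ, illustrating how the empirical linear correlation underestimates the dependence encoded in the copula when the marginals are non-Gaussian.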
PACS: 89.70.Cf – Entropy and other measures of information / 89.75.-k – Complex systems / 89.65.Gh – Economics; econophysics, financial markets, business and management
© EPLA, 2009