Date of Original Version

12-2011

Type

Conference Proceeding

Abstract or Description

A fundamental problem in statistics is the estimation of dependence between random variables. While information theory provides standard measures of dependence (e.g., Shannon, Rényi, and Tsallis mutual information), it is still unknown how to estimate these quantities from i.i.d. samples in the most efficient way. In this presentation we review some of our recent results on copula-based nonparametric dependence estimators and demonstrate their robustness to outliers, both theoretically in terms of finite-sample breakdown points and numerically in experiments on independent subspace analysis and image registration.
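
The abstract gives no implementation details, so the following is only a minimal sketch of the general idea it refers to: rank-transform the samples to pseudo-observations from the empirical copula, then estimate mutual information from those transformed points. The histogram plug-in estimator, the function names, and the bin count below are illustrative assumptions, not the authors' estimator.

```python
# Sketch (assumption, not the authors' method): copula-based mutual
# information estimation via the empirical copula transform followed by
# a simple histogram plug-in estimate on [0, 1]^2.
import numpy as np
from scipy.stats import rankdata

def empirical_copula(x):
    """Rank-transform each column of x (n x d) to pseudo-observations in (0, 1)."""
    n = x.shape[0]
    return np.column_stack([rankdata(col) / (n + 1.0) for col in x.T])

def copula_mi_histogram(x, bins=16):
    """Plug-in mutual information estimate (in nats) for 2-D data.
    On uniform margins, I(X;Y) equals the negative differential entropy of the
    copula density, approximated here by a histogram over the copula samples."""
    u = empirical_copula(x)
    counts, _, _ = np.histogram2d(u[:, 0], u[:, 1], bins=bins, range=[[0, 1], [0, 1]])
    p = counts / counts.sum()
    nz = p > 0
    # Each bin has area (1/bins)^2, so I ~= sum p * log(p / area) = sum p * log(p * bins^2).
    return float(np.sum(p[nz] * np.log(p[nz] * bins * bins)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Correlated Gaussian pair; true MI = -0.5 * log(1 - rho^2).
    rho = 0.8
    x = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
    print("estimated MI:", copula_mi_histogram(x))
    print("true MI     :", -0.5 * np.log(1.0 - rho ** 2))
```

Because the estimator operates only on ranks, it is invariant to monotone transformations of the margins, which is one intuition behind the outlier robustness discussed in the abstract.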

Published In

Proceedings of the NIPS 2011 Workshop on Copulas in Machine Learning.