Date of Original Version

6-2013

Type

Conference Proceeding

Rights Management

Copyright 2013 by the author(s).

Abstract or Description

In this paper, we develop new dependence and conditional dependence measures and provide their estimators. An attractive property of these measures and estimators is that they are invariant to any monotone increasing transformations of the random variables, which is important in many applications including feature selection. Under certain conditions we show the consistency of these estimators, derive upper bounds on their convergence rates, and show that the estimators do not suffer from the curse of dimensionality. However, when the conditions are less restrictive, we derive a lower bound which proves that in the worst case the convergence can be arbitrarily slow, similarly to some other estimators. Numerical illustrations demonstrate the applicability of our method.
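The invariance property highlighted in the abstract can be illustrated with any rank-based dependence statistic: strictly increasing maps preserve the ranks of each variable, so a statistic computed from ranks is unchanged. The sketch below uses Spearman's rho merely as a stand-in for such a rank-based measure; it is not the paper's estimator, and the data and transformations are made up for illustration.

```python
# Minimal sketch (not the paper's estimator): rank-based statistics are
# invariant to monotone increasing transformations of each variable,
# the property emphasized in the abstract.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = x + 0.5 * rng.normal(size=1000)          # a dependent pair (x, y)

rho_raw, _ = spearmanr(x, y)
rho_mapped, _ = spearmanr(np.exp(x), y ** 3)  # strictly increasing maps of each variable

print(rho_raw, rho_mapped)  # equal: ranks, and hence the statistic, are unchanged
```

The same reasoning underlies copula-based approaches in general: transforming each coordinate by a monotone increasing function leaves the copula, and therefore any measure defined on it, unchanged.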

Published In

Journal of Machine Learning Research: Workshop and Conference Proceedings, 28, 1355-1363.