Copyright 2012 by the authors
Abstract
In this paper we propose new nonparametric estimators for a family of conditional mutual information measures and divergences. The estimators are easy to compute: they rely only on simple k-nearest-neighbor statistics. We prove that the proposed conditional information and divergence estimators are consistent under certain conditions, and we demonstrate their consistency and applicability through numerical experiments on both simulated and real data.
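To illustrate the kind of k-nearest-neighbor statistics the abstract refers to, here is a minimal sketch of a standard kNN-based estimator of the (unconditional) Kullback-Leibler divergence between two sampled distributions. This is an illustrative example, not the paper's exact conditional estimators; the function name and parameter choices are assumptions made for the sketch.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Estimate KL divergence D(p || q) from samples x ~ p (shape n x d)
    and y ~ q (shape m x d) using k-nearest-neighbor distances."""
    n, d = x.shape
    m = y.shape[0]
    xtree = cKDTree(x)
    ytree = cKDTree(y)
    # rho: distance from each x_i to its k-th nearest neighbor among the
    # other x samples (query k+1 because the nearest point is x_i itself)
    rho = xtree.query(x, k=k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor among the y samples
    nu = ytree.query(x, k=k)[0]
    if nu.ndim > 1:
        nu = nu[:, -1]
    # plug-in estimate built from the log distance ratios; estimators of this
    # form are consistent as n, m grow for fixed k
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

For identical distributions the estimate concentrates near zero, while a shift between the two samples drives it up toward the true divergence, which is the behavior the consistency results in the paper formalize for the conditional versions.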
Journal of Machine Learning Research: Workshop and Conference Proceedings, 22, 914-923.