Document Type

Article

Publication Date

2011

College/Unit

Eberly College of Arts and Sciences

Department/Program/Center

Statistics

Abstract

A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross-entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with its moment-based counterpart via simulations, and the results show that the two methods perform comparably.
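To illustrate the general idea behind knn entropy estimation on a hypersphere, the sketch below implements a Kozachenko-Leonenko-style estimator on the unit sphere S^2, using geodesic (arc-length) distances and spherical-cap areas in place of Euclidean ball volumes. This is a minimal illustration of the knn approach, not a reproduction of the authors' exact estimator; the function names and the choice of k are assumptions for the example. For the uniform distribution on S^2 the true entropy is log(4*pi), which the estimate should approach as the sample size grows.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def digamma_int(k):
    # Digamma function psi(k) for a positive integer k:
    # psi(k) = -gamma + sum_{j=1}^{k-1} 1/j
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, k))

def uniform_sphere(n, rng):
    # Draw n points uniformly on S^2 by normalizing Gaussian vectors.
    pts = []
    for _ in range(n):
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in v))
        pts.append([x / norm for x in v])
    return pts

def knn_entropy_sphere(pts, k):
    # Kozachenko-Leonenko-style estimator adapted to S^2:
    # H_hat = psi(n) - psi(k) + (1/n) * sum_i log A(rho_i),
    # where rho_i is the geodesic distance from point i to its k-th
    # nearest neighbor and A(rho) = 2*pi*(1 - cos(rho)) is the area
    # of the spherical cap of angular radius rho.
    n = len(pts)
    total_log_area = 0.0
    for i, p in enumerate(pts):
        dists = []
        for j, q in enumerate(pts):
            if i == j:
                continue
            dot = sum(a * b for a, b in zip(p, q))
            dot = max(-1.0, min(1.0, dot))  # guard against rounding
            dists.append(math.acos(dot))
        dists.sort()
        rho = dists[k - 1]
        cap_area = 2.0 * math.pi * (1.0 - math.cos(rho))
        total_log_area += math.log(cap_area)
    return digamma_int(n) - digamma_int(k) + total_log_area / n

if __name__ == "__main__":
    rng = random.Random(42)
    sample = uniform_sphere(1000, rng)
    estimate = knn_entropy_sphere(sample, k=5)
    print("estimate:", estimate, "true:", math.log(4.0 * math.pi))
```

The brute-force O(n^2) neighbor search keeps the sketch dependency-free; in practice a spatial index (e.g. a k-d tree on the embedding coordinates) would be used for larger samples.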

Source Citation

Li, S., Mnatsakanov, R. M., & Andrew, M. E. (2011). k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions. Entropy, 13(3), 650–667. https://doi.org/10.3390/e13030650

Comments

© 2011 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).
