International Journal For Multidisciplinary Research

E-ISSN: 2582-2160     Impact Factor: 9.24

A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal


A Comprehensive Study on Non-Parametric Measures of Entropy and Their Mathematical Characteristics

Author(s): Anuradha Swarnkar, Dr. Rohit Verma
Country: India
Abstract: Entropy serves as a fundamental measure of uncertainty and information content within probability distributions. Traditional entropy models such as Shannon, Rényi, and Tsallis often rely on parametric assumptions or order parameters that limit their general applicability. This research paper develops a comprehensive theoretical and analytical framework for non-parametric measures of entropy, emphasizing parameter-free, distribution-independent formulations that retain mathematical rigor and practical interpretability. The study revisits classical non-parametric functionals, including Shannon entropy, differential entropy, Cumulative Residual Entropy (CRE), and Cumulative Past Entropy (CPE), and investigates their structural properties such as non-negativity, symmetry, continuity, concavity, and Schur-concavity.
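As a concrete illustration of two of the functionals named above, the following is a minimal sketch, assuming the standard textbook definitions: Shannon entropy of a discrete distribution in nats, and CRE of a non-negative random variable as the integral of −S(x) log S(x), where S is the survival function, here evaluated against the empirical survival function of a sample. This is an illustrative sketch, not the paper's own implementation.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability vector p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cumulative_residual_entropy(sample):
    """Empirical Cumulative Residual Entropy of a non-negative sample:
    CRE(X) = -∫ S(x) log S(x) dx, with S the empirical survival function,
    which is piecewise constant between consecutive order statistics."""
    x = sorted(sample)
    n = len(x)
    cre = 0.0
    for i in range(n - 1):
        s = (n - 1 - i) / n  # survival level on the interval [x[i], x[i+1])
        if s > 0:
            cre += -(s * math.log(s)) * (x[i + 1] - x[i])
    return cre
```

For example, the uniform distribution over four outcomes attains the maximum Shannon entropy log 4, consistent with the Schur-concavity property discussed in the paper.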
Two fundamental theorems are established: the Schur-concavity theorem, demonstrating that entropy is maximized under randomization and uniformity, and the concavity theorem, proving that mixing of distributions increases overall uncertainty. The paper further discusses estimation techniques for non-parametric entropy from finite samples, including plug-in estimators, kernel and spacing methods, and k-nearest-neighbor approaches, along with their bias–variance characteristics and asymptotic behavior. Practical implications for reliability analysis, statistical learning, and information theory are examined, highlighting the robustness and universality of non-parametric entropy in complex, data-driven environments.
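Of the estimation techniques listed, the spacing method admits a particularly compact sketch. Below is a hypothetical implementation of the classical Vasicek m-spacing estimator of differential entropy, which averages logarithms of scaled gaps between order statistics; the heuristic m ≈ √n window is an assumption, not a prescription from the paper.

```python
import math
import random

def vasicek_entropy(sample, m=None):
    """Vasicek m-spacing estimator of differential entropy (in nats).
    Non-parametric: built purely from order statistics of the sample."""
    x = sorted(sample)
    n = len(x)
    if m is None:
        m = max(1, int(round(math.sqrt(n))))  # common heuristic window
    h = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]          # clamp spacings at the boundaries
        hi = x[min(i + m, n - 1)]
        h += math.log(n / (2 * m) * (hi - lo))
    return h / n
```

A quick sanity check of its non-parametric character: scaling a sample by a factor a shifts the estimate by exactly log a, matching the identity H(aX) = H(X) + log a for differential entropy.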
This work provides a unified foundation for entropy analysis that is model-agnostic, mathematically consistent, and computationally adaptable, positioning non-parametric entropy as a core tool for modern uncertainty quantification and information-theoretic inference.
Published In: Volume 7, Issue 5, September-October 2025
Published On: 2025-10-31
DOI: https://doi.org/10.36948/ijfmr.2025.v07i05.59457
