International Journal For Multidisciplinary Research (IJFMR) · E-ISSN: 2582-2160 · Impact Factor: 9.24
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
A Comprehensive Study on Non-Parametric Measures of Entropy and Their Mathematical Characteristics
| Author(s) | Anuradha Swarnkar, Dr. Rohit Verma |
|---|---|
| Country | India |
| Abstract | Entropy serves as a fundamental measure of uncertainty and information content within probability distributions. Traditional entropy models such as Shannon, Rényi, and Tsallis often rely on parametric assumptions or order parameters that limit their general applicability. This research paper develops a comprehensive theoretical and analytical framework for non-parametric measures of entropy, emphasizing parameter-free, distribution-independent formulations that retain mathematical rigor and practical interpretability. The study revisits classical non-parametric functionals, including Shannon entropy, differential entropy, Cumulative Residual Entropy (CRE), and Cumulative Past Entropy (CPE), and investigates their structural properties such as non-negativity, symmetry, continuity, concavity, and Schur-concavity. Two fundamental theorems are established: the Schur-concavity theorem, demonstrating that entropy is maximized under randomization and uniformity, and the concavity theorem, proving that mixing of distributions increases overall uncertainty. The paper further discusses estimation techniques for non-parametric entropy from finite samples, including plug-in estimators, kernel and spacing methods, and k-nearest-neighbor approaches, along with their bias–variance characteristics and asymptotic behavior. Practical implications for reliability analysis, statistical learning, and information theory are examined, highlighting the robustness and universality of non-parametric entropy in complex, data-driven environments. This work provides a unified foundation for entropy analysis that is model-agnostic, mathematically consistent, and computationally adaptable, positioning non-parametric entropy as a core tool for modern uncertainty quantification and information-theoretic inference. |
| Published In | Volume 7, Issue 5, September-October 2025 |
| Published On | 2025-10-31 |
| DOI | https://doi.org/10.36948/ijfmr.2025.v07i05.59457 |
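The abstract mentions plug-in estimators and the Cumulative Residual Entropy (CRE) among the non-parametric functionals studied. As a rough illustration (not taken from the paper itself), the sketch below shows a standard plug-in Shannon entropy estimator for discrete samples and an empirical CRE computed piecewise from order statistics; the function names and implementation details are our own assumptions, not the authors' code.

```python
# Illustrative sketches of two non-parametric entropy estimators discussed
# in the abstract. These are generic textbook constructions, hypothetical
# with respect to the paper's actual formulations.
import math
from collections import Counter

def plugin_shannon_entropy(samples):
    """Plug-in estimator: substitute empirical frequencies p_i = c_i / n
    into H = -sum p_i log p_i (natural log, i.e. nats)."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def empirical_cre(samples):
    """Empirical Cumulative Residual Entropy:
    CRE = -integral of S(x) log S(x) dx, where S is the empirical
    survival function, constant between consecutive order statistics."""
    xs = sorted(samples)
    n = len(xs)
    total = 0.0
    for i in range(n - 1):
        s = (n - (i + 1)) / n  # survival probability on [x_i, x_{i+1})
        if 0 < s < 1:
            total -= s * math.log(s) * (xs[i + 1] - xs[i])
    return total
```

For continuous data in higher dimensions, k-nearest-neighbor estimators in the Kozachenko–Leonenko family (also cited in the abstract) are the usual alternative to plug-in and spacing methods, trading the histogram's binning bias for a distance-based local density estimate.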
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.