
Bayesian estimation of information-theoretic metrics for sparsely sampled distributions

  • Identifying data

    Identifier: imarina:9386996
    Authors:
    Piga, Angelo; Font-Pomarol, Lluc; Sales-Pardo, Marta; Guimera, Roger
    Abstract:
    Estimating the Shannon entropy of a discrete distribution from which we have only observed a small sample is challenging. Estimating other information-theoretic metrics, such as the Kullback-Leibler divergence between two sparsely sampled discrete distributions, is even harder. Here, we propose a fast, semi-analytical estimator for sparsely sampled distributions. Its derivation is grounded in probabilistic considerations and uses a hierarchical Bayesian approach to extract as much information as possible from the few observations available. Our approach provides estimates of the Shannon entropy with precision at least comparable to the benchmarks we consider, and most often higher; it does so across diverse distributions with very different properties. Our method can also be used to obtain accurate estimates of other information-theoretic metrics, including the notoriously challenging Kullback-Leibler divergence. Here, again, our approach has less bias, overall, than the benchmark estimators we consider.
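    The estimation problem described in the abstract can be made concrete with a minimal sketch (this is not the paper's hierarchical estimator): it contrasts the naive "plug-in" Shannon entropy estimate, which is strongly biased downward when the sample is small relative to the alphabet, with the posterior-mean entropy under a fixed symmetric Dirichlet prior (the Wolpert-Wolf closed form). The alphabet size, the concentration parameter alpha, and the synthetic data below are assumptions chosen purely for illustration.

    # Sketch: plug-in vs. fixed-Dirichlet-prior entropy estimates on a sparse sample.
    import numpy as np
    from scipy.special import digamma

    def plugin_entropy(counts):
        """Maximum-likelihood ('plug-in') entropy in nats; biased low for sparse samples."""
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def dirichlet_entropy(counts, alpha=1.0):
        """Posterior-mean entropy (nats) under a symmetric Dirichlet(alpha) prior on a known alphabet."""
        counts = np.asarray(counts, dtype=float)
        K = counts.size
        a = counts + alpha                 # posterior Dirichlet parameters
        A = counts.sum() + K * alpha       # posterior concentration
        return digamma(A + 1.0) - np.sum((a / A) * digamma(a + 1.0))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_p = rng.dirichlet(np.ones(1000))        # a 1000-symbol distribution (illustrative choice)
        true_H = -np.sum(true_p * np.log(true_p))
        sample = rng.multinomial(50, true_p)         # only 50 observations: the sparse regime
        print(f"true entropy   = {true_H:.3f} nats")
        print(f"plug-in        = {plugin_entropy(sample):.3f} nats (typically underestimates)")
        print(f"Dirichlet(1.0) = {dirichlet_entropy(sample, alpha=1.0):.3f} nats")

    The same sparse-sample bias affects plug-in estimates of the Kullback-Leibler divergence, the second quantity targeted by the paper's estimator; the sketch above only covers the entropy case.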
  • Other:

    Authors as listed in the article: Piga, Angelo; Font-Pomarol, Lluc; Sales-Pardo, Marta; Guimera, Roger
    Department: Chemical Engineering
    URV author(s): Font Pomarol, Lluc / Guimera Manrique, Roger / Piga, Angelo / Sales Pardo, Marta
    Keywords: Bayesian estimation; Entropy estimation; Inference; Information theory; Kullback-Leibler divergence; Shannon entropy; Sparse sampling
    Subject areas: Applied mathematics; Astronomy / physics; Computer science; Biological sciences I; Biological sciences II; Law; Economics; Engineering I; Engineering II; Engineering III; Engineering IV; General mathematics; General physics and astronomy; Geosciences; Interdisciplinary; Mathematics / probability and statistics; Materials; Mathematical physics; Mathematics (all); Mathematics (miscellaneous); Mathematics, applied; Mathematics, interdisciplinary applications; Physics; Physics and astronomy (all); Physics and astronomy (miscellaneous); Physics, mathematical; Physics, multidisciplinary; Chemistry; Statistical and nonlinear physics
    Access to the usage licence: https://creativecommons.org/licenses/by/3.0/es/
    Author e-mail addresses: marta.sales@urv.cat; lluc.fonti@estudiants.urv.cat; roger.guimera@urv.cat
    Author identifiers: 0000-0002-8140-6525; 0000-0002-3597-4310
    Record registration date: 2024-10-19
    Version of the deposited article: info:eu-repo/semantics/publishedVersion
    Article reference according to the original source: Chaos, Solitons & Fractals, 180, 114564
    Item reference in APA style: Piga, Angelo; Font-Pomarol, Lluc; Sales-Pardo, Marta; Guimera, Roger (2024). Bayesian estimation of information-theoretic metrics for sparsely sampled distributions. Chaos, Solitons & Fractals, 180, 114564. DOI: 10.1016/j.chaos.2024.114564
    Licence document URL: https://repositori.urv.cat/ca/proteccio-de-dades/
    Entity: Universitat Rovira i Virgili
    Journal publication year: 2024
    Publication type: Journal Publications
  • Keywords:

    Applied Mathematics; Mathematical Physics; Mathematics (Miscellaneous); Mathematics, Applied; Mathematics, Interdisciplinary Applications; Physics; Physics and Astronomy (Miscellaneous); Physics, Mathematical; Physics, Multidisciplinary; Statistical and Nonlinear Physics
    Bayesian estimation
    Entropy estimation
    Inference
    Information theory
    Kullback-Leibler divergence
    Shannon entropy
    Sparse sampling
  • Documents:
