A COMPOUND POISSON MODEL FOR LEARNING DISCRETE BAYESIAN NETWORKS
Received date: 2012-05-12
Revised date: 2012-11-03
Online published: 2013-11-20
We introduce the concept of Bayesian networks under a compound Poisson model. A Bayesian network provides a graphical modeling framework that encodes the joint probability distribution of a set of random variables through a directed acyclic graph. We propose a new mixed implicit estimator and show that the implicit approach, applied to the compound Poisson model, is attractive because it learns directly from the data and requires no prior information. A comparative study between the estimates produced by the implicit approach and by the standard Bayesian approach is presented. Under some conditions, and based on mean squared error calculations, we show that the mixed implicit estimator outperforms both the standard Bayesian and the maximum likelihood estimators. We illustrate the approach with a simulation study in the context of mobile communication networks.
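The mean-squared-error comparison described above can be sketched in simulation. The following is a minimal illustration, not the paper's mixed implicit estimator: it compares the maximum likelihood estimator of the Poisson event-rate in a compound Poisson model with a standard Bayesian posterior-mean estimator under a conjugate Gamma prior; the hyperparameters `a`, `b`, the true rate, and the sample sizes are all hypothetical choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

true_rate = 3.0      # Poisson intensity of the number of events (hypothetical)
n, reps = 50, 2000   # sample size and Monte Carlo replications
a, b = 2.0, 1.0      # hypothetical Gamma(a, b) prior hyperparameters

def simulate_counts(rate, n):
    """Event counts N_i of a compound Poisson sum S_i = sum_{j=1}^{N_i} X_j."""
    return rng.poisson(rate, size=n)

mse_mle, mse_bayes = 0.0, 0.0
for _ in range(reps):
    counts = simulate_counts(true_rate, n)
    mle = counts.mean()                    # maximum likelihood estimate of the rate
    bayes = (a + counts.sum()) / (b + n)   # posterior mean under the Gamma prior
    mse_mle += (mle - true_rate) ** 2
    mse_bayes += (bayes - true_rate) ** 2

mse_mle /= reps      # approx. Var(mle) = rate / n when the model is correct
mse_bayes /= reps    # trades a small bias for reduced variance
print(f"MSE(MLE)   = {mse_mle:.5f}")
print(f"MSE(Bayes) = {mse_bayes:.5f}")
```

With a prior that is not grossly misspecified, the posterior mean typically attains a slightly lower MSE than the MLE at small sample sizes, which is the kind of comparison the abstract carries out for its implicit estimator.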
Abdelaziz GHRIBI, Afif MASMOUDI. A compound Poisson model for learning discrete Bayesian networks[J]. Acta Mathematica Scientia, Series B, 2013, 33(6): 1767-1784. DOI: 10.1016/S0252-9602(13)60122-8