Learned posterior probabilities


Post by yoavp81 »

Hi everyone,

I'm using a BN to model water infiltration, but I'm new to the field of BNs.
I have a data file whose cases I used to learn the parameters of my network.
However, since not all combinations of states of the different variables are covered in the data, some columns in the resulting CPT of the child node came out with equal probabilities (0.333, 0.333, 0.333, for example).

My question is: how can I overcome these results, given that I know the nature of the relationships between the variables?

Thank you,
Yoav

Re: Learned posterior probabilities

Post by marek [BayesFusion] »

I will be glad to help here. Please create the probability tables based on your knowledge of the interactions between the parents and the children. It is OK to make these approximate. If you know the nature of the relationships between the parents and the children in terms of equations, there is a way of doing this semi-automatically: create continuous nodes with the equations and then use discretization.
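
For concreteness, here is what such a hand-specified table might look like written out in plain Python (the node names, states, and numbers below are all made up for illustration; in GeNIe you would type the numbers into the node's definition table):

[code]
import numpy as np

# Hypothetical example: parent SoilType {sand, clay},
# child Infiltration {high, medium, low}.
# Each column is P(Infiltration | SoilType) and must sum to 1;
# the numbers are rough, approximate expert guesses.
prior_cpt = np.array([
    #  sand  clay
    [0.70, 0.20],  # high
    [0.20, 0.50],  # medium
    [0.10, 0.30],  # low
])
assert np.allclose(prior_cpt.sum(axis=0), 1.0)
[/code]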

In any case, once you have these parameters, use the data to learn the ultimate parameters, but set the confidence a little higher than 1. The confidence parameter corresponds to the number of records on which the current parameters are based, and it is used to weigh the current parameters against the data set that you are using to learn/refine them. Just make a judgment about the imaginary number of records that the network has seen. When you do all that, you should never see uniform distributions, which are placed in a CPT column whenever no records in the data file correspond to that column and the priors are uniform (or the confidence in the current parameters is minimal).
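
To make the weighting concrete, here is a small sketch of the idea in plain Python (an illustration of the arithmetic, not the exact code that runs inside GeNIe): with confidence (equivalent sample size) ESS, each CPT column is updated roughly as (ESS * prior + counts) / (ESS + records for that column), so a column with no matching records in the data file keeps its prior instead of falling back to uniform.

[code]
import numpy as np

prior_cpt = np.array([  # hand-specified priors; columns sum to 1
    [0.70, 0.20],
    [0.20, 0.50],
    [0.10, 0.30],
])
counts = np.array([     # child-state counts per parent configuration;
    [14.0, 0.0],        # the second column has no records in the data file
    [ 5.0, 0.0],
    [ 1.0, 0.0],
])

ess = 10.0              # confidence: imaginary number of records behind the priors
n = counts.sum(axis=0)  # records observed for each parent configuration
posterior = (ess * prior_cpt + counts) / (ess + n)

print(posterior)
# The first column blends the prior with the data; the second column
# stays at the prior (0.70, 0.20, 0.10) instead of becoming uniform.
[/code]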
I hope this helps,

Marek