Unexpected evaluation result

sverrevr
Posts: 7
Joined: Mon Aug 24, 2020 7:37 am

Unexpected evaluation result

Post by sverrevr » Tue Oct 13, 2020 2:29 pm

Hi! I'm struggling with the evaluation of the network not acting the way I would expect.
See the following diagram:
odd_evaluation.png
This diagram mimics a dynamic network with 2 time-slices. The node res2 is of interest; its definition requires both a2 and b2 to be in State1 for it to be in State1 (see the opened definition). There is evidence on the res1 and side1 nodes.

I would have imagined that the following holds with this definition:
P(res2=state1) = P(a2=state1) * P(b2=state1)
as long as there is no evidence affecting res2 or its descendants (the nodes it points at).
If this were the case, then P(res2=state1) = 0.13, which it is not (it is 0.083).

The reason I hold this belief is that I thought this network should uphold the first-order Markov assumption, which states that the states at time-step 2 should only be affected by the posterior states at time-step 1 and the measurements at time-step 2. Should it then not be equivalent to the following network:
correct_evaluation.png
This network acts the way I expect.

What is the cause of the discrepancy between what I expect and what happens? What am I not seeing?

Best regards
Sverre
Attachments
unexpected_evaluation.xdsl

marek [BayesFusion]
Site Admin
Posts: 306
Joined: Tue Dec 11, 2007 4:24 pm

Re: Unexpected evaluation result

Post by marek [BayesFusion] » Tue Oct 13, 2020 3:59 pm

Hi Sverre,

First of all, I believe you are using an approximate algorithm, as the results are slightly off from the exact computation by the clustering algorithm. This, however, should not matter much; it is just a matter of precision.

The error that you are making with your calculation/formula is assuming that a2 and b2 are independent of each other, and they are not. Because there is evidence in res1, a1 and b1 are dependent on each other, and they make a2 and b2 dependent as well. You can check this by observing one of the two variables (a2 or b2) and seeing that the posterior marginal changes in the other. The "Update immediately" option will work best. The dependence between a1 and b1 is through an observed common descendant, res1 (a child of both in this case). I hope this explains why the result is not what you expected.
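This explaining-away effect can be checked numerically with a minimal sketch. The CPT numbers below are illustrative choices of mine, not the ones in the attached network; the only structural assumption carried over from the thread is that res1 = 1 only when both a1 and b1 are in state 1 (AND), and that res1 is observed.

```python
from itertools import product

# Toy CPTs (my own numbers): a1, b1 are binary root nodes,
# res1 is their deterministic AND, and res1 = 0 is observed.
p_a1 = 0.6
p_b1 = 0.5

def joint(a1, b1, res1):
    p = (p_a1 if a1 else 1 - p_a1) * (p_b1 if b1 else 1 - p_b1)
    return p if res1 == (a1 and b1) else 0.0

# Condition on the common child: res1 = 0.
z = sum(joint(a, b, 0) for a, b in product([0, 1], repeat=2))
p_a_given = sum(joint(1, b, 0) for b in [0, 1]) / z
p_b_given = sum(joint(a, 1, 0) for a in [0, 1]) / z
p_ab_given = joint(1, 1, 0) / z   # a1 = b1 = 1 is impossible given res1 = 0

print(p_ab_given)                 # 0.0
print(p_a_given * p_b_given)      # > 0, so a1 and b1 are not independent
```

The joint posterior of (a1, b1) is not the product of the marginals, which is exactly why P(res2) cannot be computed as P(a2) * P(b2).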

An unrelated remark: screen shots were the right thing to do this time, but you can copy selected nodes and then use Paste Special in another Windows program to obtain a picture without the grid and the "Academic Use Only" watermark. Picture (Enhanced Metafile) will give you the highest quality when you later resize the picture.

I have also noticed that you may be using an older version of GeNIe. I recommend GeNIe 3.0, which we released a few weeks ago.

I hope this helps,

Marek

sverrevr

Re: Unexpected evaluation result

Post by sverrevr » Wed Oct 14, 2020 7:59 am

Hi, Marek, thanks for the great response again :)

You are of course correct. If I set evidence on side2, it will affect b2, which will affect b1, which will affect our belief about whether a1 or b1 caused res1.
I was for some reason convinced that the Markov assumption would hold with a factored representation, which it apparently does not...

Do you have any recommendations on limiting the number of time-steps if the model is applied on-line? I am currently deleting the oldest time-step when a new one is introduced, and using the posterior (which I saved when the time-step was generated, so without any information from later time-steps, but with information from prior time-steps) as the prior for the new oldest time-step. For a short horizon this will, as you pointed out, lead to wrong evaluations. But maybe when the horizon is long enough, the effect the newest time-step has on the oldest will be small enough not to matter much. This will of course depend on the CPTs.
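The hope that this influence decays with the horizon can be sketched on a toy binary chain. The transition noise level and the perfect observation k steps ahead are assumptions of mine, not the model from the thread; with a noisy transition, the shift that distant evidence induces on the oldest node shrinks geometrically, as (1 - 2*eps)^k.

```python
# Toy binary chain: x_{t+1} flips x_t with probability eps.
eps = 0.2          # transition noise (assumed)
p0 = 0.5           # prior P(x0 = 1)

def propagate(p, steps):
    # forward marginal P(x_steps = 1) after `steps` noisy transitions
    for _ in range(steps):
        p = p * (1 - eps) + (1 - p) * eps
    return p

for k in [1, 3, 5, 10]:
    # likelihoods of observing x_k = 1 (perfectly) given x0 = 1 vs x0 = 0
    like1 = propagate(1.0, k)   # P(x_k = 1 | x0 = 1)
    like0 = propagate(0.0, k)   # P(x_k = 1 | x0 = 0)
    post = p0 * like1 / (p0 * like1 + (1 - p0) * like0)
    print(k, round(post - p0, 4))   # shift on x0 shrinks as k grows
```

So for sufficiently mixing CPTs, the truncation error of dropping old slices does become small, though the rate depends entirely on how deterministic the transitions are.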

I was btw using the Clustering algorithm, not that it matters :)

Thanks for the tip regarding copying the figure, I did not know about that! GeNIe is now upgraded ;)

Thanks for the quick and thorough response
Sverre

marek [BayesFusion]

Re: Unexpected evaluation result

Post by marek [BayesFusion] » Wed Oct 14, 2020 8:57 am

Hi Sverre,

I have never tried it myself, but you could possibly reduce previous steps of a multi-step DBN by means of the "Marginalize" operator. This would work well in all situations when you have the time to perform this operation, for example because you are waiting for input from the user. Marginalization would preserve the numerical properties of the remaining nodes while simplifying the network and de facto removing nodes that are no longer needed in practice, e.g., nodes in the previous time steps. The removal should not be too complex computationally, as long as there is no evidence in the future slices. I don't know your problem, but you could also keep the number of steps at the minimum, which is how far into the future you want to perform inference. You could have some kind of "sliding window" and recreate a smaller network at each step of your reasoning. This is something that you would need to do conceptually outside of GeNIe/SMILE.
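At the CPT level, marginalizing a parentless node amounts to absorbing its prior into each child's table: P(res | b) = sum over a of P(res | a, b) * P(a). A minimal sketch with toy numbers of my own (not GeNIe's actual implementation):

```python
# Absorb a parentless node a into its child's CPT by summing it out.
# Note that evidence plays no role here, which is why the operator
# disregards it: it is a model-building step, not an inference step.
p_a = [0.7, 0.3]                                  # P(a)

# P(res | a, b), indexed [a][b][res]
cpt_res = [[[0.9, 0.1], [0.8, 0.2]],
           [[0.4, 0.6], [0.0, 1.0]]]

# New table P(res | b) after summing a out
cpt_res_marg = [[sum(p_a[a] * cpt_res[a][b][r] for a in range(2))
                 for r in range(2)]
                for b in range(2)]

print(cpt_res_marg)   # each row still sums to 1
```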

I hope this helps,

Marek

sverrevr

Re: Unexpected evaluation result

Post by sverrevr » Wed Oct 14, 2020 10:15 am

Hi Marek :)

I might not completely understand how marginalization works. For example, I am not allowed to marginalize b1 or a1, but if I cut the connection from b1 to res1 (and redefine res1), then I can marginalize them. Furthermore, it does not seem like marginalization considers the inserted evidence? If I marginalize res1 with evidence, the definitions of a1 and b1 do not change.

I'm working on using a DBN for making decisions. Each time-step is when a new decision must be made. The network must reason a bit backward in time, to identify if a mistake was made, and a bit forward in time. For each new decision, the number of time-steps is therefore increased. The challenging point with the sliding window is how to set the initial conditions. Preferably these should be set such that they in some way keep the knowledge of everything that is no longer in the window. The way I do it now is: whenever a new time-step is made, save the values of the nodes (a and b). When the first time-step is removed, update the initial values to be the values saved for the second time-step. Clear all evidence, and apply the same evidence one time-step earlier (dropping the first evidence). The last time-step is now a "new" time-step without any evidence. This method has the problem presented above.

I noticed it because I had a "real" network with a long horizon, and a "simulation" network with a window of 2, to simulate forward in time more quickly. The computational time of the "real" network became very long, so I wanted to avoid having to re-evaluate it; I was planning to use it only when I was interested in past states. Anyway, the simulation net and the real net calculated different results, which resulted in poor behaviour.

marek [BayesFusion]

Re: Unexpected evaluation result

Post by marek [BayesFusion] » Sat Oct 17, 2020 10:44 am

Hi Sverre,

Sorry for a couple of days' delay -- I wanted to check what is happening, because something funny was actually happening with the marginalization of nodes a1 and b1 in your network. We have identified a minor bug in the code that prevents GeNIe from marginalizing these two nodes. In general, it should be possible to marginalize any node, so the fact that GeNIe did not marginalize them is an error. What happened is that some combinations of the new parents of node a2 became impossible through the determinism in node a1. We will fix it in the next release, but for now you can do one of two things: (1) replace the 0s and 1s in nodes a1 and a2 with epsilon and (1 - epsilon), where epsilon is some very small number, or (2) deactivate relevance (through the Network menu, Algorithm, and then Deactivate Relevance).

You stated that marginalization disregards evidence. This is true, as it is a model-building operation: it gets rid of the node in question while preserving the probability distribution over the other nodes (all evidence is disregarded in this process!). Essentially, marginalizing a node that has been observed will get rid of that observation as well. So it seems that it is not the right operation for you.

I have noticed, however, that in your network the evidence nodes, such as a1 and b1, have no parents. In that case, you could get rid of them by just modifying the CPTs of their children. So, suppose that you observe the value State0 in a1. Then you can shrink the CPT in res1 by removing the part of the table that corresponds to State1. Do the same in node a2: just remove the part of the CPT that corresponds to State1 in a1. Once you have done this, you can simply remove a1. If a1 has parents in addition to children, the situation gets more complex.
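This CPT-shrinking step can be sketched in a few lines; the table values below are hypothetical, and the indexing convention [a1][b1][res1] is my own choice:

```python
# a1 is a parentless node observed in State0, so instead of marginalizing
# we keep only the slice of each child's CPT corresponding to the observed
# state, and then delete a1 from the network.

# P(res1 | a1, b1), indexed [a1][b1][res1] (hypothetical values)
cpt_res1 = [[[0.9, 0.1], [0.8, 0.2]],
            [[0.4, 0.6], [0.0, 1.0]]]

observed_a1 = 0                       # a1 = State0
cpt_res1_new = cpt_res1[observed_a1]  # P(res1 | b1): a1's axis is gone

print(cpt_res1_new)   # [[0.9, 0.1], [0.8, 0.2]]
```

Unlike marginalization, this keeps the information that the observation carried, because the retained slice is exactly the conditional distribution given the evidence.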

Does this put you on a good track?

Marek

sverrevr

Re: Unexpected evaluation result

Post by sverrevr » Tue Oct 20, 2020 3:02 pm

Hi Marek, thanks for checking things out; no worries about the delay ;) Interesting that this uncovered a bug!

If I'm not misunderstanding you, I think there is a misunderstanding about where I'm putting the evidence. I'm putting the evidence on the res and side nodes, not on the a and b nodes. In this case, I don't think what you propose can be done? I guess there is then not much one can do without losing information?

Just something that might be interesting for someone: if you make super-nodes consisting of all possible state combinations, as I have done in the picture below, then the first-order Markov assumption holds, and we do not need to consider the evidence at time-step 1 if we update to the posterior value of x_1. This is of course useless for any larger network.
super_state_correct.png
Updated file with super-nodes:
unexpected_evaluation.xdsl
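The super-node idea can be sketched numerically. The toy CPTs and the identity transitions between slices are assumptions of mine; the point is that the joint posterior over the super-node is a sufficient statistic under the first-order Markov assumption, while the factored marginals are not.

```python
from itertools import product

# Collapse (a, b) into one super-node x with four states (a, b) in {0,1}^2.
p_a, p_b = 0.6, 0.5
prior_x = {(a, b): (p_a if a else 1 - p_a) * (p_b if b else 1 - p_b)
           for a, b in product([0, 1], repeat=2)}

# Evidence res1 = 0, with res1 = AND(a, b): the (1, 1) state is ruled out.
post_x = {s: (0.0 if s == (1, 1) else p) for s, p in prior_x.items()}
z = sum(post_x.values())
post_x = {s: p / z for s, p in post_x.items()}

# Assuming identity transitions to slice 2, P(res2 = 1) = P(x2 = (1, 1)).
exact = post_x[1, 1]                              # 0.0 -- correct

# A factored window would instead multiply the saved marginals:
marg_a = post_x[1, 0] + post_x[1, 1]
marg_b = post_x[0, 1] + post_x[1, 1]
print(exact, marg_a * marg_b)                     # 0.0 vs roughly 0.12
```

Saving the posterior over x and using it as the prior for the next slice therefore reproduces the exact result, at the cost of a state space that grows exponentially in the number of collapsed chains.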

marek [BayesFusion]

Re: Unexpected evaluation result

Post by marek [BayesFusion] » Thu Oct 22, 2020 12:42 pm

Hi Sverre,

If the nodes a1 and b1 are not evidence nodes, you can marginalize them to simplify your model. This should work fine. Please just remember that marginalization removes the two nodes while preserving the properties of the rest of the model. After you have marginalized these nodes, entered evidence, and updated the model, you should get the same results as when entering evidence and updating in the original model. To avoid the bug that I wrote about before, please turn off relevance or modify the tables so that they are not deterministic. We will fix this bug in the next (minor) release.

Your super-nodes get priors equal to their posteriors before removing the nodes from the previous slices, correct?
Cheers,

Marek

shooltz[BayesFusion]
Site Admin
Posts: 1270
Joined: Mon Nov 26, 2007 5:51 pm

Re: Unexpected evaluation result

Post by shooltz[BayesFusion] » Fri Nov 06, 2020 2:04 pm

The problems with marginalization were fixed in GeNIe 3.0.5905, which is now available for download.
