Dear sirs,
To illustrate the query, please see the attached GeNIe example. The noisyOR collider on the left gives the trivially correct result: if two independent and deterministic causes of an effect are each 50% probable, then the effect has a 75% chance of occurring. In the example on the right the collider is almost identical, but the two causes are generated by a common ancestor, in such a way that the probability of each cause is still 50%. The probability of the effect should then still be 75%, and yet it is 59%! I tried again and again to understand why, but I failed. Is it perhaps some strange characteristic of the algorithm that implements the noisyMAX nodes, or is it a bug? Thanks a lot if you can help me.
query about noisyMAX algorithm
-
- Site Admin
- Posts: 430
- Joined: Tue Dec 11, 2007 4:24 pm
Re: query about noisyMAX algorithm
The problem is quite simple and quite independent of whether the node Effect is a NoisyOR or is just a regular chance variable. P(E) is equal to:
P(E) = P(E|AB)P(AB) + P(E|A~B)P(A~B) + P(E|~AB)P(~AB) + P(E|~A~B)P(~A~B)
In the first case, i.e., without the Ancestor, A and B are independent of each other and we can write:
P(E) = P(E|AB)P(A)P(B) + P(E|A~B)P(A)P(~B) + P(E|~AB)P(~A)P(B) + P(E|~A~B)P(~A)P(~B)
We cannot do that in the second case, because A and B are not independent, so the formulas for calculating P(E) differ between the two cases, and hence the numerical results will in general also differ. You can check this by modifying the CPTs of A and B. These CPTs, along with the prior distribution over Ancestor, determine the joint distribution of A and B and the final value of P(E). Does this make sense?
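To make the difference concrete, here is a minimal Python check with assumed CPTs (the numbers below are illustrative, not those of the attached model): a common ancestor makes A and B tend to agree, so for a deterministic OR effect, P(E) drops below 0.75 even though both marginals stay at 50%.

```python
# Sketch: deterministic OR effect (E = A or B), binary causes A and B.
# The CPT numbers below are illustrative assumptions, not the original model's.

p_c = 0.5                       # P(C=1) for the common ancestor
p_a_given_c = {1: 0.8, 0: 0.2}  # P(A=1 | C); the same CPT is assumed for B

# Marginal P(A=1) (and P(B=1)) is still 0.5 by symmetry:
p_a = p_c * p_a_given_c[1] + (1 - p_c) * p_a_given_c[0]

# Independent-causes case: P(E) = 1 - P(~A)P(~B)
p_e_indep = 1 - (1 - p_a) * (1 - p_a)

# Common-ancestor case: sum over C to get the joint P(~A, ~B)
p_not_a_not_b = sum(
    pc * (1 - p_a_given_c[c]) * (1 - p_a_given_c[c])
    for c, pc in ((1, p_c), (0, 1 - p_c))
)
p_e_dep = 1 - p_not_a_not_b

print(round(p_a, 2), round(p_e_indep, 2), round(p_e_dep, 2))  # 0.5 0.75 0.66
```

With these assumed numbers the marginals match (0.5 each), yet P(E) is 0.66 instead of 0.75, which is exactly the kind of gap observed in the attached example.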
Marek
-
- Posts: 26
- Joined: Thu Mar 24, 2022 9:00 am
Re: query about noisyMAX algorithm
Hi Marek, you're totally right (I redid the calculations in Excel to be sure 🙂).
Thanks a lot.
Re: query about noisyMAX algorithm
Hi, I am a beginner trying to learn Bayesian networks (especially the usage of Noisy-OR gates, as I have child nodes with more than 10 parent nodes). I was wondering about the same issue and why the results differ. I understand now that A and B are dependent in the second case due to the ancestor node, but can you please show me the calculation for this (preferably in Noisy-OR)? It would help me a lot. Thank you in advance.
-
- Site Admin
- Posts: 430
- Joined: Tue Dec 11, 2007 4:24 pm
Re: query about noisyMAX algorithm
Don't the formulas in my earlier post help?
P(E) = P(E|AB)P(AB) + P(E|A~B)P(A~B) + P(E|~AB)P(~AB) + P(E|~A~B)P(~A~B)
vs.
P(E) = P(E|AB)P(A)P(B) + P(E|A~B)P(A)P(~B) + P(E|~AB)P(~A)P(B) + P(E|~A~B)P(~A)P(~B)
You can construct a simple model in GeNIe and plug in some numbers in the CPTs to see whether the theoretical difference is indeed working in practice. I bet it will :-).
Cheers,
Marek
Re: query about noisyMAX algorithm
Hi Marek, thank you for your reply. While I know the calculation when they are independent, e.g., P(A)P(B), what should the values be when they are dependent, e.g., P(AB)? Does it transform to P(A|B)P(B|C), which is P(A)P(B|C), where C is the ancestor node? I would really appreciate it if you could clear this up for me or share any relevant literature about this.
-
- Site Admin
- Posts: 430
- Joined: Tue Dec 11, 2007 4:24 pm
Re: query about noisyMAX algorithm
Please look at a good Bayesian networks textbook. Pearl 1988 perhaps?
When two adjacent variables are dependent, you can see it in the CPT: the distributions are different for different states of the parent. P(A,B)=P(A|B)P(B) for A and B being dependent. When they are independent, P(A|B)=P(A), i.e., learning B tells you nothing about A. I hope this helps.
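A quick way to see the chain rule and the independence test in action is to read both off a joint distribution. The 2x2 joint below is a made-up assumption, purely for illustration:

```python
# Chain-rule check P(A,B) = P(A|B)P(B) on an assumed 2x2 joint distribution.
joint = {(1, 1): 0.3, (1, 0): 0.2, (0, 1): 0.1, (0, 0): 0.4}  # P(A=a, B=b)

p_b1 = joint[(1, 1)] + joint[(0, 1)]   # P(B=1)
p_a1_given_b1 = joint[(1, 1)] / p_b1   # P(A=1 | B=1), read off the joint

# The chain rule recovers the joint entry:
print(round(p_a1_given_b1 * p_b1, 4))  # 0.3

# Dependence shows up as P(A=1) differing from P(A=1|B=1):
p_a1 = joint[(1, 1)] + joint[(1, 0)]   # P(A=1)
print(round(p_a1, 4), round(p_a1_given_b1, 4))  # 0.5 0.75
```

Here learning B=1 moves P(A=1) from 0.5 up to 0.75, so A and B are dependent; for independent variables the two printed values would coincide.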
Marek
Re: query about noisyMAX algorithm
Yes, it helps, thanks! What I don't understand is that in the example the dependency of A and B comes from the ancestor node (let's say node C). So does P(A,B)=P(A|B)P(B) for dependent A and B then become P(A|B,C)P(B|C)? I am still not sure how you account for node C.
-
- Site Admin
- Posts: 430
- Joined: Tue Dec 11, 2007 4:24 pm
Re: query about noisyMAX algorithm
I see. I misunderstood your example, but you have now clarified it sufficiently. When A and B have an unobserved common parent C, you can write the following:
P(A,B) = Σ_i P(A|B,C_i) P(B|C_i) P(C_i)
The sum runs over all possible states C_i of C. Because both A and B depend on C, you cannot simplify this formula further, so you still have the situation that P(A,B) is not equal to P(A)P(B). Does this help?
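In Python, with assumed numbers, and using the fact that A and B are conditionally independent given their common parent (so P(A|B,C_i) = P(A|C_i)), the sum over C looks like this:

```python
# Sketch with assumed CPTs: common parent C, A and B conditionally
# independent given C. None of these numbers come from the original model.
p_c = {1: 0.5, 0: 0.5}           # prior over the ancestor C
p_a_given_c = {1: 0.9, 0: 0.1}   # P(A=1 | C)
p_b_given_c = {1: 0.7, 0: 0.3}   # P(B=1 | C)

# Joint P(A=1, B=1) = sum_i P(A=1|C_i) P(B=1|C_i) P(C_i)
p_ab = sum(p_a_given_c[c] * p_b_given_c[c] * p_c[c] for c in (0, 1))

# Marginals P(A=1) and P(B=1), each also obtained by summing over C
p_a = sum(p_a_given_c[c] * p_c[c] for c in (0, 1))
p_b = sum(p_b_given_c[c] * p_c[c] for c in (0, 1))

print(round(p_ab, 4), round(p_a * p_b, 4))  # 0.33 0.25
```

The joint (0.33) exceeds the product of the marginals (0.25): the unobserved ancestor induces the dependence, exactly as the formula above predicts.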
Marek
Re: query about noisyMAX algorithm
Hi again,
Yes! That clears everything. Thank you so much!