Entropy reduction

Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Entropy reduction

Post by Lotte Yanore »

Good afternoon,

I am looking for a way to do entropy reduction using SMILE in R.
Is this possible? I do not see any info about it in the manual.

Thanks for your response!
Regards,
Lotte
marek [BayesFusion]
Site Admin
Posts: 430
Joined: Tue Dec 11, 2007 4:24 pm

Re: Entropy reduction

Post by marek [BayesFusion] »

Hi Lotte,

We support entropy-based value-of-information calculation, which is based on cross-entropy. Cross-entropy between two nodes T and E (Target and Evidence) is the expected reduction of the entropy of T given that you will observe E. You can play with this in the diagnostic extensions of GeNIe, QGeNIe, and also BayesMobile. Is this what you are looking for?
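
For reference, in standard information-theoretic notation this quantity (usually called mutual information) can be sketched as follows. This is the textbook definition, which may differ in detail from what SMILE computes:

Code:

I(T;E) = H(T) - H(T \mid E)
       = -\sum_t P(t)\log P(t) + \sum_e P(e)\sum_t P(t \mid e)\log P(t \mid e)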
Cheers,

Marek
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

Yes, but I would like to do this in R.

I am adjusting the model and studying the results at the same time. It is a lot of work to keep setting the faults and observations in GeNIe, which is necessary to run the diagnostics, because I have to do it again every time I adjust the model through R.

Is there a way around this?

Thanks for your response!
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

I was wondering if what you describe here is the same as the "percent entropy reduction" used in this paper:
https://www.sciencedirect.com/science/a ... 1X13000887
Attachment: Entropy reduction.PNG (screenshot of the equation from the paper)
marek [BayesFusion]
Site Admin
Posts: 430
Joined: Tue Dec 11, 2007 4:24 pm

Re: Entropy reduction

Post by marek [BayesFusion] »

Hi Lotte,

Somebody else at BayesFusion will answer your R question. Let me just comment on the entropy-reduction question. The formula used in GeNIe/SMILE is somewhat different: we calculate cross-entropy, which is the expected difference between the entropy of the target and the entropy of the target given the observation. I recognize the left-hand side of the equation, but not the right-hand side, so unless I am missing something obvious, our calculations differ. Also, GeNIe calculates it for any node, not just a parent of the target variable.

I hope this helps,

Marek
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

Hi Marek,

Thanks a lot for your feedback.
I am looking forward to getting a response about the entropy reduction in R.

Best,
Lotte
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: Entropy reduction

Post by shooltz[BayesFusion] »

Below you can find an example using rSMILE's diagnostic functionality. The code loads the HeparII model (one of the example networks distributed with GeNIe). The pursued fault is set to Cirrhosis=compensate and two observations are instantiated. During diagnosis, always use DiagNetwork$instantiateObservation instead of Network$setEvidence. You can specify the node as a node handle or node identifier, and the outcome as an outcome index or outcome identifier.

You can call setPursuedFault, instantiateObservation, or releaseObservation repeatedly on the same DiagNetwork object; call DiagNetwork$update to get refreshed results.

Code:

library(rSMILE)  # SMILE wrapper for R

net <- Network()
net$readFile("HeparII.xdsl")  # example network distributed with GeNIe

diag <- DiagNetwork(net)

# Pursue the fault Cirrhosis=compensate.
faultIndex <- diag$getFaultIndex("Cirrhosis", "compensate")
diag$setPursuedFault(faultIndex)

# Instantiate observations; the outcome can be an identifier or an index.
diag$instantiateObservation("age", "age31_50")
diag$instantiateObservation("sex", 1L)

res <- diag$update()

cat("Fault probabilities:\n")
for (fi in res$faults) {
  cat(sprintf("P(%s=%s)=%f\n",
              net$getNodeId(fi$node),
              net$getOutcomeId(fi$node, fi$outcome),
              fi$probability))
}

cat("Diagnostic value of observations:\n")
for (oi in res$observations) {
  cat(sprintf("DiagValue(%s)=%f\n", net$getNodeId(oi$node), oi$infoGain))
}
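
As a quick follow-up sketch (continuing with the diag object above; it is assumed here that releaseObservation accepts the same node identifiers as instantiateObservation):

Code:

# Release one observation and refresh the results on the same object.
diag$releaseObservation("age")
res2 <- diag$update()
cat(sprintf("Observations available after release: %d\n", length(res2$observations)))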
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

Thank you for your response. I didn't find this in the manual; is it there? Can I find more information about it somewhere else?

I tried to apply your code to my network, but I keep getting the following error when I use getFaultIndex or instantiateObservation. Do you have any idea where I am going wrong?

Code:

 diag <- DiagNetwork(net)
 faultIndex <- diag$getFaultIndex("Investment Timing", "Invest_Now")

Error in diag$getFaultIndex("Investment Timing", "Invest_Now") : 
RSmile error occured
SMILE Error Occured. Invalid node handle: -2

diag$instantiateObservation("Perceived Policy Uncertainty", "High")

Error in diag$instantiateObservation("Perceived Policy Uncertainty", "High") : 
  RSmile error occured
SMILE Error Occured. Invalid node handle: -2
Moreover, I would like to see the effect of 4 parent nodes on all 3 states of the child node 'simultaneously'. In GeNIe I can select the three states I have set as faults and then run the diagnosis to see the effect of each parent node. Can you advise me on how to get this done?

If you prefer to explain in an online meeting, please let me know or email me at lotte.yanore@wur.nl.
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: Entropy reduction

Post by shooltz[BayesFusion] »

Error in diag$instantiateObservation("Perceived Policy Uncertainty", "High") :
RSmile error occured
SMILE Error Occured. Invalid node handle: -2
The first argument to DiagNetwork$instantiateObservation is either an integer node handle or the node identifier, which cannot contain spaces. I believe you tried to use the node name. Node names have no constraints on their contents, but they cannot be used to identify a node in the SMILE API; the name is just a human-readable string.
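
A minimal sketch of both options; the identifier "PercPolUnc" below is hypothetical (substitute the actual identifier of your node, visible in GeNIe's node properties), and it is assumed that Network$getNode maps an identifier to its integer handle:

Code:

# "PercPolUnc" is a hypothetical identifier standing in for the real id
# of the "Perceived Policy Uncertainty" node.
diag$instantiateObservation("PercPolUnc", "High")  # by node identifier
h <- net$getNode("PercPolUnc")                     # assumed id -> handle lookup
diag$instantiateObservation(h, "High")             # by integer node handle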

To pursue more than one fault, use DiagNetwork$setPursuedFaults.
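
For example (a sketch: the outcome identifiers below are illustrative, and setPursuedFaults is assumed to accept an integer vector of fault indices):

Code:

# Pursue two faults at once; the outcome ids are illustrative.
f1 <- diag$getFaultIndex("Cirrhosis", "compensate")
f2 <- diag$getFaultIndex("Cirrhosis", "decompensate")
diag$setPursuedFaults(c(f1, f2))
res <- diag$update()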

We'll be updating the SMILE Wrappers manual to include the information on diagnostic features.
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

I should have mentioned this before, but I also tried using "i" and "InvTim" instead of "Investment Timing", and neither of them works.
This function was created using the example in Tutorial 1 from the manual.

This is how I defined this node:

Code:

i <- createNMNode(net, "InvTim", "Investment Timing",
                  c("Invest_Now","Invest_Later","No_Investment"), 260L, 420L)
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: Entropy reduction

Post by shooltz[BayesFusion] »

To further troubleshoot this, please post your XDSL file. If the file contains confidential information, send me a private message.
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: Entropy reduction

Post by shooltz[BayesFusion] »

The model you've sent in the private message has no nodes with defined diagnostic roles (all nodes are diagnostic auxiliaries by default). Please refer to the GeNIe manual's section 'Defining diagnostic information' for a general overview.

In SMILE wrappers, including rSMILE, the methods to set these attributes are Network$setNodeDiagType, Network$setFaultOutcome, and Network$setRanked. There are also corresponding getters (getNodeDiagType, isFaultOutcome, and isRanked).

As an example, consider changing the diagnostic role of the 'age' variable in the HeparII.xdsl example (note that this does not make much real-world sense; age is an observation).

Code:

library(rSMILE)

net <- Network()
net$readFile("HeparII.xdsl")

net$getNodeDiagType("age")                          # current role (auxiliary by default)
net$setNodeDiagType("age", net$NodeDiagType$FAULT)  # mark 'age' as a fault node
net$setFaultOutcome("age", "age0_30", TRUE)         # flag the fault outcome
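
A quick sketch of reading these attributes back with the getters mentioned above (assuming the snippet above has just run):

Code:

net$getNodeDiagType("age")            # should now report the FAULT role
net$isFaultOutcome("age", "age0_30")  # TRUE after the setter call above
net$isRanked("age")                   # FALSE unless setRanked was called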
If you build your model interactively in GeNIe and later load it into R, you can of course use the node properties window to change the diagnostic attributes. There is no need to call setNodeDiagType in that case, as this information is already in the .xdsl file.
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

Hi Shooltz,

It is working now!
I am very grateful for all of your help and the great service.
Would you have any advice on where to find what would be a good cutting value (i.e., when is it high/low)?

Best regards,
Lotte
marek [BayesFusion]
Site Admin
Posts: 430
Joined: Tue Dec 11, 2007 4:24 pm

Re: Entropy reduction

Post by marek [BayesFusion] »

Hi Lotte,

I'd love to help, but I'm not 100% sure what cutting value you are referring to. Do you mean the value of cross-entropy that makes an observation worthwhile? I'm quite sure that GeNIe does not support any such threshold directly. As far as value of information in general is concerned, the cutting value is the one that makes it more attractive to pursue information than not; there should be an explanation of that in the value-of-information material for influence diagrams. Cross-entropy is a funny measure that has no units, and it is valuable only in the context of information coming from other possible sources. I don't think it is possible, even theoretically, to find such a threshold.
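
For influence diagrams, the standard criterion can be sketched like this (a general decision-theoretic textbook formulation, not a GeNIe feature): observing E pays off when its expected value of information exceeds its cost.

Code:

\mathrm{EVI}(E) = \mathbb{E}_{e \sim P(E)}\Big[\max_d \mathrm{EU}(d \mid e)\Big] - \max_d \mathrm{EU}(d),
\qquad \text{observe } E \iff \mathrm{EVI}(E) > \mathrm{cost}(E)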

Please let me know in case I misunderstood your question.
Cheers,

Marek
Lotte Yanore
Posts: 12
Joined: Thu Feb 25, 2021 3:23 pm

Re: Entropy reduction

Post by Lotte Yanore »

Hi Marek,
I would like to thank you again so much for the assistance!
One more question: would you be able to share some suggestions for literature I could use to describe the cross-entropy reduction in my papers?
I was also wondering whether the entropy reduction in your software has previously been used in scientific publications.
Thanks a lot and happy holidays.
Cheers,
Lotte