Entropy/Value of Information

The engine.
MarcinK
Posts: 8
Joined: Thu Oct 16, 2014 3:11 pm

Entropy/Value of Information

Post by MarcinK »

Hello,

I would like to get the entropy for all unobserved observable variables in a Bayesian network with given target node(s) (as in the Test View in GeNIe). As far as I can tell from the header files, it could be obtained from the DIAG_network class via the DIAG_testInfo structure.

GetTestStatistics() returns an empty vector. I tried calling ComputeTestStrengths() and UpdateFaultBeliefs() after setting the target as a fault - still nothing.

How can I get the entropy? Do I have to implement it myself?

--Marcin
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: Entropy/Value of Information

Post by shooltz[BayesFusion] »

DIAG_network (as implemented in the SMILE library that you have) requires rather complex initialization. Can you post the code you're using to initialize the DIAG_network object before ComputeTestStrengths?

setting the target as a fault - still nothing.
Did you call DIAG_network::SetPursuedFault or DIAG_network::SetPursuedFaults?
MarcinK

Re: Entropy/Value of Information

Post by MarcinK »

Here is the code that I was experimenting with:

Code:

DIAG_network diagForVOI;
diagForVOI.LinkToNetwork(&workingNet);
DSL_intArray testsNodes;
for (int handle : targets) testsNodes.Add(handle);
diagForVOI.SetPursuedFaults(testsNodes);
diagForVOI.UpdateFaultBeliefs();

//diagForVOI.ComputeTestStrengths();
DSL_intArray unperformedTests = diagForVOI.GetUnperformedTests();
std::vector<DIAG_testInfo> testsStats = diagForVOI.GetTestStatistics();
for (size_t i = 0; i < testsStats.size(); ++i) {
    printf("%d: \t%lf\t%lf\n", unperformedTests[(int)i], testsStats[i].strength, testsStats[i].cost);
}
targets contains all the target node handles of the network.

I saw that there are the structures DIAG_faultyState and DIAG_faultInfo. As I understand it, I should use them, but I have no idea how to pass them to DIAG_network. Should I use CalculateRankedFaults? If so, how?
shooltz[BayesFusion]

Re: Entropy/Value of Information

Post by shooltz[BayesFusion] »

Test nodes are not fault nodes. Let's check the basics first: did you create the network in GeNIe, or is it output from SMILE? If it's the latter, did you call DSL_extraDefinition::SetType to indicate which nodes are diagnostic faults and which are observations?
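For a network produced purely from SMILE code, setting those roles might look roughly like this (a sketch only: the file and node names are hypothetical, and the exact DSL_extraDefinition::SetType enum member names should be verified against the SMILE headers you have):

```cpp
// Sketch: marking diagnostic roles on a programmatically built network.
// "mynet.xdsl", "Disease" and "TestResult" are placeholder names; check
// the role enum values against your SMILE headers before relying on this.
DSL_network net;
net.ReadFile("mynet.xdsl");

int faultHandle = net.FindNode("Disease");
int testHandle = net.FindNode("TestResult");

// mark the fault node as a diagnostic target...
net.GetNode(faultHandle)->ExtraDefinition()->SetType(DSL_extraDefinition::target);
// ...and the test node as an observation
net.GetNode(testHandle)->ExtraDefinition()->SetType(DSL_extraDefinition::observation);
```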
MarcinK

Re: Entropy/Value of Information

Post by MarcinK »

It was created with GeNIe (I am not the creator, just a user); the parameters were relearned with SMILE using artificial data. One of the nodes has its outcomes checked as targets in the "General" tab.
shooltz[BayesFusion]

Re: Entropy/Value of Information

Post by shooltz[BayesFusion] »

What's the breakdown of 'Nodes by diagtype' in Network Properties/Summary page?
MarcinK

Re: Entropy/Value of Information

Post by MarcinK »

Code:

Node count: 27
Avg indegree: 1.889
Max indegree: 2
Avg outcomes: 4.074
Max outcomes: 8

Nodes                27   110   1134 / 865
  Chance - General   27   110   1134 / 865
Nodes by diagtype
  Target              1     2      2 / 1
  Observation        26   108   1132 / 864
Arcs                 51
Edit: I just made a correction - the above is the summary for the network.
shooltz[BayesFusion]

Re: Entropy/Value of Information

Post by shooltz[BayesFusion] »

Try this:

Code:

diagnet->LinkToNetwork(net);
diagnet->CollectNetworkInfo();
diagnet->SetDefaultStates();
diagnet->UpdateFaultBeliefs();
int mostLikelyFault = diagnet->FindMostLikelyFault();
diagnet->SetPursuedFault(mostLikelyFault);
diagnet->UpdateFaultBeliefs();
int res = diagnet->ComputeTestStrengths();
if (DSL_OKAY == res)
{
    const std::vector<DIAG_testInfo> &stats = diagnet->GetTestStatistics();
    // do something with stats
}
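Once ComputeTestStrengths succeeds, the statistics can be walked in parallel with the unperformed-test handles, for example (a sketch that reuses only the strength and cost members already shown earlier in the thread; anything beyond those is an assumption):

```cpp
// Sketch: print value-of-information statistics per unperformed test.
// Assumes the initialization sequence above has completed with DSL_OKAY.
DSL_intArray unperformed = diagnet->GetUnperformedTests();
const std::vector<DIAG_testInfo> &stats = diagnet->GetTestStatistics();
for (size_t i = 0; i < stats.size(); ++i)
{
    printf("test handle %d: strength=%f, cost=%f\n",
        unperformed[(int)i], stats[i].strength, stats[i].cost);
}
```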
MarcinK

Re: Entropy/Value of Information

Post by MarcinK »

Thank you! Works like a charm!

Some other questions (not urgent):

If there are multiple targets (a variable and one of its outcomes) and I use the same code snippet, then only one is taken into account when calculating the statistics. Is it the one that is more probable, i.e., the one with the largest posterior probability given the evidence present in the network?
shooltz[BayesFusion]

Re: Entropy/Value of Information

Post by shooltz[BayesFusion] »

Is it the one that is more probable, i.e., the one with the largest posterior probability given the evidence present in the network?
That's correct - note the DIAG_network::FindMostLikelyFault call. You can pass any other fault index to DIAG_network::SetPursuedFault, and you can pursue more than one fault using DIAG_network::SetPursuedFaults. Note that you're dealing with fault indices here, not node handles. To get the fault information, use DIAG_network::GetFaults. To convert a node handle/outcome index pair to a fault index, use DIAG_network::FindFault.
If there are multiple targets (a variable and one of its outcomes)
The network may have more than one fault outcome per fault node. This approach may be useful when faults are mutually exclusive.
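Putting those calls together, the handle-to-fault-index conversion might look like this (a sketch: the node name and outcome index are placeholders, and the FindFault argument order should be checked against your SMILE headers):

```cpp
// Sketch: pursue one or more specific faults by fault index, using the
// API calls named above. "Disease" and outcome index 0 are placeholders.
int nodeHandle = net.FindNode("Disease");
int faultIdx = diagnet->FindFault(nodeHandle, 0); // (node handle, outcome index) -> fault index
DSL_intArray pursued;
pursued.Add(faultIdx);
// add further fault indices here to pursue multiple faults at once
diagnet->SetPursuedFaults(pursued);
diagnet->UpdateFaultBeliefs();
```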
MarcinK

Re: Entropy/Value of Information

Post by MarcinK »

Thanks a lot! It is very helpful!