Decision Analysis

jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Decision Analysis

Post by jdtoellner »

The typical method for decision analysis is to define criteria with associated weights and then score alternatives. It's a flawed technique, for several reasons (a sketch of the conventional calculation follows the list):

1. Weights and scores reduce subjective assessments to discrete values.
2. If multiple criteria are not independent, you end up inadvertently overweighting them.
3. Because the overall weighted score is a single discrete value, a lot of information is lost.
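
For reference, here is a minimal Python sketch of the conventional weighted-scoring calculation criticized above; the criteria names, weights, and scores are invented for illustration and do not come from any attached model.

Code:

# Hypothetical criteria weights and 1-5 scores, purely for illustration.
weights = {"Cost": 0.4, "Schedule": 0.3, "Risk": 0.3}
scores = {
    "Alternative A": {"Cost": 4, "Schedule": 3, "Risk": 2},
    "Alternative B": {"Cost": 3, "Schedule": 4, "Risk": 3},
}

# Each alternative collapses to a single weighted sum; any information about
# uncertainty, and any dependence between criteria, is lost at this point.
for alt, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(alt, round(total, 2))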

I've discovered that a network of Noisy Adder nodes provides a better alternative. The attached network contains ten Criteria nodes (chance nodes) that feed into Noisy Adder (Assessment) nodes. I split the criteria between two intermediate nodes (Assessment_1 and Assessment_2), because ten parents is too many for a Noisy Adder node with eight states; in fact, seven parents is probably too many. Those two intermediate nodes are in turn the parents of a single Assessment node, which aggregates the overall score. Each Noisy Adder node carries a weight for each of its parents.

The states are: Negative, NegativeNeutral, NeutralNegative, Neutral, NeutralPositive, PositiveNeutral, Positive, and Disregard. Setting a node to Disregard nullifies its influence on its Noisy Adder child; a nice feature.

The overall Assessment node aggregates scores based on the weights in the Assessment nodes. Since the top-level node has two intermediate parents, you have to multiply the weights to get a criterion's overall weight: its weight in Assessment_1 or Assessment_2 times that intermediate node's weight in the top-level Assessment node. If you set the top-level Assessment node as a Target, you can use the Sensitivity Analysis feature to see, graphically, the influence of the Criteria nodes.
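
As a quick illustration of that weight multiplication, here is a minimal Python sketch; the node names mirror the description above, but the weights are invented rather than taken from the attached file.

Code:

# Hypothetical weights of each criterion within its intermediate Assessment node,
# and of the intermediate nodes within the top-level Assessment node.
assessment_1 = {"Criteria1": 0.4, "Criteria2": 0.3, "Criteria3": 0.3}
assessment_2 = {"Criteria4": 0.5, "Criteria5": 0.5}
top_level = {"Assessment_1": 0.6, "Assessment_2": 0.4}

# A criterion's overall weight is its weight in the intermediate node
# multiplied by that node's weight in the top-level node.
overall = {}
for name, criteria in (("Assessment_1", assessment_1), ("Assessment_2", assessment_2)):
    for crit, w in criteria.items():
        overall[crit] = w * top_level[name]

print(overall)  # e.g. Criteria1 -> 0.4 * 0.6 = 0.24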

I added an equation node called Distribution as a parent of the overall Assessment node. (If you want to run a Sensitivity Analysis, you have to delete the Distribution node.)

The Distribution equation node gives you the mean and standard deviation of the Assessment score. Rather than an over-simplified discrete score, GeNIe gives you a distribution. You might prefer an alternative with a slightly lower mean score but a tighter distribution.
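
To make that concrete, here is a small Python sketch of the kind of summary the Distribution node provides; the state-to-score mapping and the posterior probabilities are made up for illustration.

Code:

import math

# Hypothetical numeric scores for the assessment states and a hypothetical
# posterior distribution over them (both purely illustrative).
score = {"Negative": -3, "NegativeNeutral": -2, "NeutralNegative": -1, "Neutral": 0,
         "NeutralPositive": 1, "PositiveNeutral": 2, "Positive": 3}
posterior = {"Negative": 0.05, "NegativeNeutral": 0.10, "NeutralNegative": 0.15,
             "Neutral": 0.20, "NeutralPositive": 0.25, "PositiveNeutral": 0.15,
             "Positive": 0.10}

mean = sum(posterior[s] * score[s] for s in posterior)
variance = sum(posterior[s] * (score[s] - mean) ** 2 for s in posterior)
print(f"mean = {mean:.2f}, std dev = {math.sqrt(variance):.2f}")

Two alternatives with the same mean can then be told apart by their standard deviations: the tighter one carries less uncertainty about the final score.
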
Evaluation Model.xdsl
(11.98 KiB) Downloaded 627 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Something else you can do with a GeNIe BBN: you may have non-independent criteria that you want to keep. With a BBN you can tier the dependent criteria into a single child node so that they don't over-influence the outcome.
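
A toy Python sketch of that tiering idea, with hypothetical criteria names, weights, and scores (nothing here comes from the attached models):

Code:

# Two hypothetical criteria that largely measure the same thing, e.g. "Cost" and
# "LifeCycleCost". Scoring them separately with their own weights double-counts
# that aspect; tiering them under one combined node gives the pair a single weight.
def combined_cost(cost_score, lifecycle_cost_score):
    # The dependent criteria are aggregated first (here a simple average)...
    return (cost_score + lifecycle_cost_score) / 2

weights = {"CombinedCost": 0.34, "Schedule": 0.33, "Risk": 0.33}
scores = {"CombinedCost": combined_cost(4, 2), "Schedule": 3, "Risk": 5}

# ...and only the combined node enters the overall score, with one weight.
print(sum(weights[c] * scores[c] for c in weights))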

One other point: the use of Noisy Adder nodes in the attached network doesn't take advantage of that node's ability to be noisy. Simple weighting lets you determine a criterion's influence, and the attached network does this with CPTs that are all 1s and 0s. If you use more refined probabilities, you can be more sophisticated about how a criterion influences the outcome.
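
To make that last point concrete, here is a toy comparison of a deterministic (all 1s and 0s) column and a noisier one for a single parent state; the numbers are invented, and the Disregard state is omitted for brevity.

Code:

# Child-state probabilities given that one parent criterion is "Positive".
states = ["Negative", "NegativeNeutral", "NeutralNegative", "Neutral",
          "NeutralPositive", "PositiveNeutral", "Positive"]

# Deterministic column: a Positive criterion always contributes Positive.
deterministic = dict(zip(states, [0, 0, 0, 0, 0, 0, 1.0]))

# Noisy column: a Positive criterion usually, but not always, contributes positively.
noisy = dict(zip(states, [0, 0, 0, 0.05, 0.15, 0.30, 0.50]))

assert abs(sum(noisy.values()) - 1.0) < 1e-9  # still a valid probability distribution
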
marek [BayesFusion]
Site Admin
Posts: 430
Joined: Tue Dec 11, 2007 4:24 pm

Re: Decision Analysis

Post by marek [BayesFusion] »

This is a creative way of using the Noisy Adder. I have always been a proponent of making things explicit, so in decision analysis I usually suggest influence diagrams with decision and utility nodes. Multi-Attribute Utility (MAU) nodes are able to model any relationship among attributes, while Additive-Linear Utility (ALU) nodes simplify the relationships to linear combinations/dependences. It would be interesting to translate your model to an ALU model or, if this does not work, to a general-purpose MAU. GeNIe offers the most comprehensive treatment of utility functions that I am aware of among graphical modeling software. Software based on decision trees may be more advanced here and there, although I am not even sure about that, and decision trees are completely unsuitable for any serious computer-based application because of their exponential growth. GeNIe's MAU nodes are pretty much a precursor of the current hybrid nodes, where you can use any function to express utility over the parent nodes.
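
As a rough illustration of the distinction (plain Python rather than GeNIe's API; the functions and numbers are hypothetical):

Code:

# Additive-linear utility (ALU): a weighted linear combination of attribute utilities.
def alu(weights, utilities):
    return sum(w * u for w, u in zip(weights, utilities))

# A general multi-attribute utility (MAU) can be any function of the attribute
# utilities, e.g. a multiplicative form that strongly penalizes one weak attribute.
def mau_multiplicative(utilities):
    result = 1.0
    for u in utilities:
        result *= u
    return result

print(alu([0.6, 0.4], [0.8, 0.5]))     # 0.68
print(mau_multiplicative([0.8, 0.5]))  # 0.4
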
Cheers,

Marek
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Here's a network that does simple scoring with utility nodes and a decision node. Weights are equal.
Attachments
Evaluation Model (simple scoring).xdsl
(3.22 KiB) Downloaded 623 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Here's a model that does weighted scoring.
Attachments
Evaluation Model (weighted scoring).xdsl
(3.66 KiB) Downloaded 613 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Here's a model that contains non-independent criteria. I added MAU nodes that compute a weighted average.
Attachments
Evaluation Model (combined scoring).xdsl
(4.11 KiB) Downloaded 597 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Here's one with options, sub-options, and sub-sub-options, with criteria for each.
Attachments
Evaluation Model (multiple options).xdsl
(6.52 KiB) Downloaded 592 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

Here's one where chance nodes influence the utility nodes.
Attachments
Evaluation Model (chance influenced).xdsl
(3.36 KiB) Downloaded 589 times
jdtoellner
Posts: 71
Joined: Mon Aug 01, 2016 9:45 pm

Re: Decision Analysis

Post by jdtoellner »

This gives me the building blocks. Am I missing anything?

If not, I now need to create a real-life example.
marek [BayesFusion]
Site Admin
Posts: 430
Joined: Tue Dec 11, 2007 4:24 pm

Re: Decision Analysis

Post by marek [BayesFusion] »

Each of the models that you have uploaded here is a somewhat unusual/specialized influence diagram.

The theory says that you need a single childless utility node (this can be a single utility node or an ALU/MAU node that collects/combines utilities from multiple utility nodes). Early on (1990s), when building GeNIe, we decided to relax this requirement and assumed that when there is more than one childless utility node, there is an implicit ALU node that combines the values from these utility nodes with equal weights of 1.0 each. This, we believed (and still believe), can be handy when building a complex, perhaps periodic/repetitive, model that essentially collects rewards/penalties with equal weights (for example, monetary rewards/penalties). Your models work because of this assumption -- the terminal utility nodes are weighted equally to come up with the ultimate expected utility at the decision node.

Aside from that, some elements of your model are classical:

(1) Evaluation Model (simple scoring).xdsl is equivalent to Evaluation Model (weighted scoring).xdsl, when all weights in the ALU node are equal to 1. Evaluation Model (weighted scoring).xdsl itself is a classical collection of utility nodes that are combined into a MAU/ALU node.

(2) Evaluation Model (combined scoring).xdsl is somewhat unorthodox in that it has multiple terminal nodes, but it is equivalent to a model with an additional ALU node with four weights of 1.0 each (please note that Criteria4 and Criteria5 are terminal nodes in addition to the two Combined Criteria nodes).

(3) Evaluation Model (multiple options).xdsl has three independent decision nodes but the ultimate diagram that it represents is one in which all the utility nodes are combined with weights of 1.0.

(4) Evaluation Model (chance influenced).xdsl is exactly what influence diagrams do -- combine probabilities with utilities through mathematical expectation. Two remarks here: (a) it also has an implicit ALU node, like the models above, and (b) because the decision does not influence the chance node, it is a diagnostic decision.
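
For readers new to influence diagrams, here is a minimal Python sketch of that arithmetic, with hypothetical decisions, probabilities, and utilities (none of these numbers come from the uploaded models).

Code:

# Expected utility is the probability-weighted sum of utilities, and childless
# utility nodes are combined by the implicit ALU with equal weights of 1.0.
p_outcome = {"Good": 0.7, "Bad": 0.3}  # chance node; not influenced by the decision

# Two childless utility nodes, each depending on the decision and the chance outcome.
utility_1 = {("Accept", "Good"): 10.0, ("Accept", "Bad"): -5.0,
             ("Reject", "Good"): 0.0, ("Reject", "Bad"): 0.0}
utility_2 = {("Accept", "Good"): 2.0, ("Accept", "Bad"): 1.0,
             ("Reject", "Good"): 1.0, ("Reject", "Bad"): 1.0}

def expected_utility(decision):
    return sum(p * (utility_1[(decision, s)] + utility_2[(decision, s)])  # implicit ALU
               for s, p in p_outcome.items())

best = max(("Accept", "Reject"), key=expected_utility)
print(best, expected_utility(best))  # Accept: 0.7*(10+2) + 0.3*(-5+1) = 7.2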

I have found Clemen's book, which I use in my Decision Analysis class, quite good at explaining influence diagram modeling. It may be a little too elementary for you (it is an advanced undergraduate textbook). Ron Howard has also published, or intended to publish, a textbook on influence diagrams.
I hope this helps,

Marek
charlie
Posts: 66
Joined: Wed Aug 09, 2017 10:55 pm

Re: Decision Analysis

Post by charlie »

Thanks for those posts and examples. It really helps.