k2 ordering

frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

k2 ordering

Post by frito »

How does the K2 structure learning algorithm choose its ordering in GeNIe?
mark
Posts: 179
Joined: Tue Nov 27, 2007 4:02 pm

Post by mark »

We don't have the K2 structure learning algorithm in GeNIe (or SMILE). We do have the K2 prior over parameters, which is used in Greedy Thick Thinning and Naive Bayes, but in neither case is an ordering required.
frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

Post by frito »

mark wrote:We don't have the K2 structure learning algorithm in GeNIe (or SMILE). We do have the K2 prior over parameters, which is used in Greedy Thick Thinning and Naive Bayes, but in neither case is an ordering required.
So it is not related to Cooper's K2 algorithm? Please see here: http://www.cs.ubc.ca/~murphyk/Software/ ... ge.html#K2
That implementation needs an a priori ordering.
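For context on why that implementation needs an ordering: in Cooper's K2 algorithm the node ordering restricts each node's candidate parents to its predecessors, and parents are added greedily while the score improves. A rough sketch of that search (the function names and the score-function signature are my own illustration, not anything from GeNIe/SMILE):

```python
def k2_search(data, order, r, max_parents, log_score):
    """Greedy K2 parent search (after Cooper & Herskovits).

    order       : the a priori node ordering; node i may only take
                  parents from among its predecessors in this list.
    log_score   : callable log_score(data, i, parents, r) returning the
                  log marginal likelihood of node i given a parent set.
    Returns a dict mapping each node to its chosen parent list.
    """
    parents = {i: [] for i in order}
    for pos, i in enumerate(order):
        best = log_score(data, i, parents[i], r)
        improved = True
        while improved and len(parents[i]) < max_parents:
            improved = False
            # Candidate parents: predecessors not already chosen.
            candidates = [z for z in order[:pos] if z not in parents[i]]
            scored = [(log_score(data, i, parents[i] + [z], r), z)
                      for z in candidates]
            if scored:
                s, z = max(scored)
                if s > best:
                    best, improved = s, True
                    parents[i].append(z)
    return parents
```

The ordering is what keeps the search acyclic by construction: since parents always precede children, no edge can point backwards.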
mark
Posts: 179
Joined: Tue Nov 27, 2007 4:02 pm

Post by mark »

They are two different things. (Although it may be that Cooper also used the K2 prior in his K2 algorithm and that's how it got its name. Not sure though.)
frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

Post by frito »

mark wrote:They are two different things. (Although it may be that Cooper also used the K2 prior in his K2 algorithm and that's how it got its name. Not sure though.)

Here's Cooper's algorithm (http://citeseerx.ist.psu.edu/viewdoc/su ... .1.88.4436).

The g(·,·) function is, I think, the marginal likelihood for the i-th node and its parents.
Attachment: K2 algorithm cooper herskovits.PNG
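For reference, the g(i, π_i) score from the Cooper-Herskovits paper, g = ∏_j [(r_i − 1)! / (N_ij + r_i − 1)!] ∏_k N_ijk!, can be sketched in log space to avoid factorial overflow (my own illustrative implementation, not code from GeNIe):

```python
import math
from collections import Counter

def log_g(data, i, parents, r):
    """Log of the Cooper–Herskovits score g(i, parents).

    data    : iterable of tuples of integer states, one tuple per case
    i       : index of the child node
    parents : list of parent node indices
    r       : list of arities (number of states) per node
    Uses lgamma, with n! = Gamma(n + 1), to stay in log space.
    """
    ri = r[i]
    counts = Counter()          # N_ijk: (parent config, child state) counts
    parent_counts = Counter()   # N_ij: parent config counts
    for row in data:
        pa = tuple(row[p] for p in parents)
        counts[(pa, row[i])] += 1
        parent_counts[pa] += 1
    score = 0.0
    for pa, nij in parent_counts.items():
        # log[(r_i - 1)! / (N_ij + r_i - 1)!]
        score += math.lgamma(ri) - math.lgamma(nij + ri)
        # sum_k log(N_ijk!)
        for k in range(ri):
            score += math.lgamma(counts[(pa, k)] + 1)
    return score
```

For a tiny dataset of four cases where a binary child copies a binary parent, each parent configuration contributes 1!/3! · 2!·0! = 1/3, so g = 1/9.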
frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

Post by frito »

mark wrote:We don't have the K2 structure learning algorithm in GeNIe (or SMILE). We do have the K2 prior over parameters, which is used in Greedy Thick Thinning and Naive Bayes, but in neither case is an ordering required.
Can you please elaborate on the K2 prior? I can't find any alternative structure learning method that doesn't refer to Cooper's method, but his method requires an ordering...? :shock:
Thanks
mark
Posts: 179
Joined: Tue Nov 27, 2007 4:02 pm

Post by mark »

Look at Heckerman's tutorial on learning with Bayesian networks for approaches that don't require an ordering over the variables.
frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

Post by frito »

mark wrote:Look at Heckerman's tutorial on learning with Bayesian networks for approaches that don't require an ordering over the variables.
I am writing a thesis and I used hill climbing wiht k2 from Genie for some stuff, I need to write a little note about the methods I used so if u please can tell me more about your k2 algorithm implemented in Genie?


thx
Last edited by frito on Wed Oct 01, 2008 2:09 am, edited 1 time in total.
mark
Posts: 179
Joined: Tue Nov 27, 2007 4:02 pm

Post by mark »

The GreedyThickThinning algorithm in GeNIe is based on Heckerman's tutorial. The tutorial also discusses the BDe prior, one of the options for a prior in GeNIe, the other being the K2 prior. The K2 prior is defined as follows: the prior distribution of every parameter associated with a node and a combination of its parents is Dirichlet(1,...,1). In LaTeX notation it would be something like P(\theta_{X_i \mid Pa(X_i)}) = \mathrm{Dirichlet}(1, \ldots, 1). The number of ones depends on the number of states of the node. Let me know if you still have questions.
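As a concrete illustration of the prior described above: under a Dirichlet(1,...,1) prior, the posterior mean of each CPT parameter reduces to Laplace smoothing, (N_k + 1) / (N + r). A minimal sketch (the function name is my own, not part of the GeNIe/SMILE API):

```python
from collections import Counter

def k2_posterior_means(child_states, num_states):
    """Posterior mean of CPT parameters for one parent configuration,
    under a Dirichlet(1, ..., 1) (K2) prior.

    child_states : observed child states (ints) for this parent config
    num_states   : number of states r of the child node
    With all Dirichlet hyperparameters equal to 1, the posterior mean
    is (N_k + 1) / (N + r), i.e. Laplace (add-one) smoothing.
    """
    counts = Counter(child_states)
    n = len(child_states)
    return [(counts[k] + 1) / (n + num_states) for k in range(num_states)]
```

Note that unseen states get nonzero probability, which is the practical effect of the uniform prior.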
frito
Posts: 32
Joined: Wed Dec 12, 2007 8:51 pm

Post by frito »

mark wrote:The GreedyThickThinning algorithm in GeNIe is based on Heckerman's tutorial. The tutorial also discusses the BDe prior, one of the options for a prior in GeNIe, the other being the K2 prior. The K2 prior is defined as follows: the prior distribution of every parameter associated with a node and a combination of its parents is Dirichlet(1,...,1). In LaTeX notation it would be something like P(\theta_{X_i \mid Pa(X_i)}) = \mathrm{Dirichlet}(1, \ldots, 1). The number of ones depends on the number of states of the node. Let me know if you still have questions.
Many thanks, Mark!