EM-based learning problem

The engine.
Looney
Posts: 12
Joined: Sat Jul 11, 2009 9:23 pm

Post by Looney »

Hey mate, thanks for your reply. So far, from the initial tests I have run, I found that AIS and EPIS give the highest accuracy and precision in my model, but even that was only close to 58%. I also found some inconsistencies in my data and have set out to improve that; I will probably provide more feedback soon. I'd still appreciate it if you could help find someone to help with Gibbs vs. EPIS.
Thanks very much in advance.
:D
shooltz[BayesFusion]
Site Admin
Posts: 1473
Joined: Mon Nov 26, 2007 5:51 pm

Post by shooltz[BayesFusion] »

Looney wrote:Hey mate, thanks for your reply. So far, from the initial tests I have run, I found that AIS and EPIS give the highest accuracy and precision in my model, but even that was only close to 58%.
Did you try to run EM without calling SetDefaultBNAlgorithm first to keep the default, exact inference algorithm?
Looney
Posts: 12
Joined: Sat Jul 11, 2009 9:23 pm

Post by Looney »

Yes boss, I did do that, though the output didn't turn out to have higher accuracy than AIS and EPIS. I can't quite remember the exact figure off the top of my head, but it was in the 40%-ish range. Like I said, I have found some issues with my input data: erroneous and missing items in the instance vector were also being marked as false. I think that is a big mistake, as it would have a negative effect. I am running some simulations that should produce better data, and I'll retry the learning and testing and provide more feedback.
Cheers
marek [BayesFusion]
Site Admin
Posts: 449
Joined: Tue Dec 11, 2007 4:24 pm

Post by marek [BayesFusion] »

Looney wrote:Also, out of curiosity, I am interested to find out if there are any plans to implement Gibbs sampling for the belief network side of things, and if not, is it possible to extend SMILE to add it? My mentor had originally recommended I use Gibbs sampling; that's the only reason I am interested to find out.

I know Gibbs sampling is applicable when the joint distribution is not known explicitly but the conditional distribution of each variable is known. Knowing my model now, could you please shed some light on whether EPIS sampling would be better than Gibbs sampling, or is it hard to say? That is considering that the default algorithm does not outperform either of them.
Hi, we have no immediate plans to add Gibbs sampling to SMILE. To put this statement in perspective, we did test the performance of Gibbs sampling and compared it to algorithms based on importance sampling. While it should perform better than importance sampling in theory, our experiments showed the opposite. So I suspect that EPIS is the fastest sampling algorithm known to us.
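For anyone following along, here is a minimal sketch of the idea being discussed. This is plain Python, not the SMILE API, and the network, probability tables, and function names are all made up for illustration: Gibbs sampling never touches the joint distribution directly; it repeatedly resamples one variable at a time from that variable's conditional given the current state of the others.

```python
import random

# Tiny V-structure network: A -> C <- B, all binary.
# Priors P(A=1) = P(B=1) = 0.5; P(C=1 | A, B) given by a table.
p_c1 = {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 0.9}

def gibbs_p_a1_given_c1(n_samples=50_000, burn_in=1_000, seed=0):
    """Estimate P(A=1 | C=1) by Gibbs sampling: each step only needs
    the conditional of one variable, never the full joint."""
    rng = random.Random(seed)
    a, b = 0, 0          # arbitrary initial state; evidence C=1 stays fixed
    hits = 0
    for t in range(burn_in + n_samples):
        # Resample A from P(A | B=b, C=1), proportional to P(A) * P(C=1 | A, b)
        w1, w0 = 0.5 * p_c1[(1, b)], 0.5 * p_c1[(0, b)]
        a = 1 if rng.random() < w1 / (w1 + w0) else 0
        # Resample B from P(B | A=a, C=1), proportional to P(B) * P(C=1 | a, B)
        w1, w0 = 0.5 * p_c1[(a, 1)], 0.5 * p_c1[(a, 0)]
        b = 1 if rng.random() < w1 / (w1 + w0) else 0
        if t >= burn_in:
            hits += a
    return hits / n_samples

# Exact answer by enumeration is P(A=1 | C=1) = 0.35 / 0.50 = 0.7;
# the Gibbs estimate converges toward it.
print(gibbs_p_a1_given_c1())
```

The trade-off Marek describes is that importance-sampling schemes like EPIS draw independent weighted samples, whereas consecutive Gibbs samples are correlated, which can slow convergence in practice even when theory favors Gibbs.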

As far as your network and speed problems are concerned, the structure of your network is the culprit. I suggest that you move away from the V-structure, as the CPT in the child node grows exponentially in the number of parent nodes. Once you have done this, I advise using the default exact algorithm in EM. It should perform better than sampling.
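The exponential growth mentioned above is easy to quantify: a discrete child node stores one probability distribution per combination of its parents' states, so each additional binary parent doubles the CPT. A quick back-of-the-envelope sketch in plain Python (illustrative only, not SMILE code; the function name is made up):

```python
def cpt_size(child_states, parent_state_counts):
    """Number of probability entries in a child's CPT:
    child_states multiplied by the product of the parents' state counts."""
    size = child_states
    for k in parent_state_counts:
        size *= k
    return size

# A binary child with 20 binary parents in a V-structure:
print(cpt_size(2, [2] * 20))   # 2 * 2**20 = 2097152 entries

# Versus three binary parents:
print(cpt_size(2, [2] * 3))    # 2 * 2**3 = 16 entries
```

This is why restructuring the network to keep the in-degree of each node small pays off for both parameter learning (fewer CPT entries for EM to estimate) and inference speed.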

I hope this helps.
Cheers,

Marek
Looney
Posts: 12
Joined: Sat Jul 11, 2009 9:23 pm

Post by Looney »

Thanks very much for your very informative post. I will try learning the structure of my model this time, so I can hopefully come up with a better topology, and then try EM with the default exact inference algorithm. I will post back with new findings in due time.

Thanks once again.