File size

jon.a.cooper
Posts: 1
Joined: Sun Aug 24, 2008 8:41 pm

File size

Post by jon.a.cooper »

Hi,

I am a new SMILE/GeNIe user. I am currently building a Bayesian network and quickly ran into a "Memory Full" error as I tried to add arcs to the network.

I am using a PC with 4 GB RAM, so I was surprised to max out so quickly. Are there common ways of dealing with this?
- e.g., algorithms that SMILE can use to minimize memory usage
- External hosting (e.g., Amazon J2EE hosting)
- Other solutions?

Also, regarding cloud computing: my understanding is that Amazon requires changes to the source code, since their framework virtualizes the hardware. Can the SMILE/GeNIe source code accommodate this? If so, what changes or add-ins are needed to use cloud computing?

Thanks for your help.
Jon
shooltz[BayesFusion]
Site Admin
Posts: 1417
Joined: Mon Nov 26, 2007 5:51 pm

Re: File size

Post by shooltz[BayesFusion] »

jon.a.cooper wrote:I am a new SMILE/GeNIe user and am currently building a Bayesian network and quickly reached my "Memory Full" capacity as I try to add arcs to the network. I am using a PC with 4 GB RAM so I was surprised to max out so quickly.
It's very easy to create a model which is impossible to solve (or even to represent): with all binary nodes, each arc added doubles the size of the CPT in the child node.
Are there common ways of dealing with this?
- e.g., algorithms that SMILE can use to minimize memory usage
As far as memory used for inference (not representation) is concerned, and assuming you're interested only in the beliefs of a subset of the nodes in the network, you can mark the nodes of interest as relevance targets (the 'Set Target' command in GeNIe and the DSL_network::SetTarget method in SMILE). This may help, but there's no guarantee, since the amount of memory used is not a simple function of the node/outcome/arc count.
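In SMILE that might look like the sketch below. This is a minimal illustration, not compilable on its own (it needs the SMILE headers and library); SetTarget, ClearAllTargets, and UpdateBeliefs are DSL_network methods, while the function name and its arguments are made up for the example.

```cpp
#include "smile.h"  // SMILE library header (assumed installed)

// Mark only one node of interest as a relevance target before inference,
// so SMILE need not compute beliefs for the whole network.
void updateBeliefsForTargetOnly(DSL_network &net, int nodeOfInterest) {
    net.ClearAllTargets();          // drop any previously set targets
    net.SetTarget(nodeOfInterest);  // only this node's beliefs are required
    net.UpdateBeliefs();            // inference may now use less memory
}
```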

Also, this topic on our forum contains some information - see the 'parent divorcing' technique:
http://genie.sis.pitt.edu/forum/viewtop ... =divorcing
- External hosting (i.e., Amazon J2EE hosting)
I don't think this could help you. Usually, when a model is too complex, the amount of memory required simply goes through the roof and reaches the level of petabytes.