Search found 433 matches
- Fri Mar 18, 2022 1:38 pm
- Forum: GeNIe
- Topic: problem with unrolling DBN
- Replies: 4
- Views: 2458
Re: problem with unrolling DBN
Hi Bahman, Most definitely, you can edit the unrolled network any way you please. You can also edit the equations. We do not support equations in DBNs yet but it is a matter of time -- we just haven't had time to do that but we will most certainly return to this issue. Essentially, we need to extend...
- Tue Mar 15, 2022 10:35 pm
- Forum: SMILE
- Topic: inference algorithm with forward operation
- Replies: 1
- Views: 4227
Re: inference algorithm with forward operation
Hi Bahman, You should be able to create a network like the one in the picture using SMILE. It may be a challenging network in terms of the number of parameters and perhaps inference but well, SMILE is very, very fast, so you may not even notice the time that it takes to update this network. If the s...
- Thu Mar 10, 2022 1:23 pm
- Forum: GeNIe
- Topic: how to interpret the result of sensitivity analysis
- Replies: 4
- Views: 2604
Re: how to interpret the result of sensitivity analysis
Hi Yan, First of all, if you want your results to be repeatable, please set the random number seed to something that is not zero. A zero seed pretty much uses the computer clock as the seed to the random number generators. The parameters will change when you change the seed (zero seed means that you...
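The repeatability point above can be illustrated in a few lines of plain Python (nothing GeNIe-specific; the function name is made up): a fixed non-zero seed makes stochastic results repeatable, while seeding from the clock, as a zero seed effectively does, does not.

```python
# Illustrative sketch: a dedicated random generator with a fixed seed
# reproduces the same sequence on every run.
import random

def noisy_estimate(seed):
    rng = random.Random(seed)        # dedicated generator for repeatability
    return [rng.random() for _ in range(3)]

run_a = noisy_estimate(seed=42)
run_b = noisy_estimate(seed=42)
assert run_a == run_b                # same seed, identical results
```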
- Wed Feb 23, 2022 9:28 pm
- Forum: GeNIe
- Topic: PRIOR LINK PROBABILITY parameter of the Bayesian Search algorithm
- Replies: 1
- Views: 1762
Re: PRIOR LINK PROBABILITY parameter of the Bayesian Search algorithm
Hi, The "prior link probability" parameter influences the likelihood of the Bayesian Search algorithm producing arcs, as higher values favor arcs. So, it is not really surprising that when you increased its value, you got a denser network. My answer to the question whether you can modify t...
- Thu Feb 10, 2022 7:56 pm
- Forum: GeNIe
- Topic: Ordinal Data > Soft Observations
- Replies: 3
- Views: 2647
Re: Ordinal Data > Soft Observations
Dear Marco, I doubt the automatic discretization of continuous variables will be of much help to you. What it does is turn a continuous variable into a discrete one (producing a CPT for it based on the definitions of the parents and the node in question). Since you don't have a parametric de...
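For readers unfamiliar with what discretization does here, a minimal plain-Python sketch (boundaries and data are made up; this is the concept, not GeNIe's implementation): continuous values are mapped to interval states and the resulting state frequencies form a CPT row.

```python
# Hedged sketch of discretization: map continuous values to interval
# states, then tabulate frequencies for a parentless node's CPT row.
import bisect

def discretize(value, boundaries):
    """Return the index of the interval state containing value."""
    return bisect.bisect_right(boundaries, value)

boundaries = [0.33, 0.66]                # three states: low / medium / high
samples = [0.1, 0.4, 0.5, 0.9, 0.95]
counts = [0, 0, 0]
for v in samples:
    counts[discretize(v, boundaries)] += 1
probs = [c / len(samples) for c in counts]   # frequency-based CPT row
```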
- Wed Feb 09, 2022 10:37 pm
- Forum: GeNIe
- Topic: continuous nodes
- Replies: 1
- Views: 1695
Re: continuous nodes
Hi, The EM algorithm is general enough to handle all possible cases in theory; some are easy and straightforward, others complex. It is just that we have not yet implemented anything outside the discrete case. We will do it at some point, but not in the next few months, due to other priorities. I guess y...
- Tue Feb 08, 2022 3:52 pm
- Forum: GeNIe
- Topic: Ordinal Data > Soft Observations
- Replies: 3
- Views: 2647
Re: Ordinal Data > Soft Observations
Hi, Let me try to help, although I realize that I may not know enough to give useful suggestions. I know that you want to use discrete variables but have you considered continuous variables with Gaussian distributions and linear relationships? Because your scores are averages and are continuous, the...
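The alternative suggested above, continuous variables with Gaussian distributions and linear relationships, amounts to a conditional linear Gaussian model. A plain-Python sketch (all coefficients and names invented for illustration):

```python
# Hedged sketch: the child is a linear function of a Gaussian parent
# plus Gaussian noise, i.e. child = a*parent + b + N(0, noise_sd^2).
import random

rng = random.Random(7)

def sample_child(n, a=2.0, b=1.0, noise_sd=0.5):
    """Draw n (parent, child) pairs from the linear Gaussian model."""
    pairs = []
    for _ in range(n):
        parent = rng.gauss(0.0, 1.0)             # parent ~ N(0, 1)
        child = a * parent + b + rng.gauss(0.0, noise_sd)
        pairs.append((parent, child))
    return pairs

pairs = sample_child(5)   # a handful of joint samples
```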
- Tue Jan 25, 2022 7:39 pm
- Forum: GeNIe
- Topic: Influential samples
- Replies: 1
- Views: 1267
Re: Influential samples
No, I'm afraid there is nothing like that in GeNIe. The idea is interesting, though!
- Sat Jan 22, 2022 12:17 pm
- Forum: GeNIe
- Topic: learn parameters with EM
- Replies: 3
- Views: 1777
Re: learn parameters with EM
Yes, you could view it this way.
Cheers,
Marek
- Mon Jan 10, 2022 3:35 pm
- Forum: GeNIe
- Topic: learn parameters with EM
- Replies: 3
- Views: 1777
Re: learn parameters with EM
Hi Heyifan, What you wrote is essentially correct, although the mechanics of this are somewhat more sophisticated than reducing the existing parameters to 100 records. If we did that, there would be a chance of never having records for some combination of values of the parent nodes. The way it is impleme...
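One common way such an "equivalent sample size" scheme works (this is my reading of the idea, not SMILE's actual code) is that each existing CPT row is weighted as if it were backed by ESS records and then combined with the counts from the new data:

```python
# Hypothetical sketch: combine a prior CPT row, weighted by an
# equivalent sample size (ESS), with counts observed in new data.
def update_row(prior_probs, data_counts, ess=100):
    n = sum(data_counts)
    return [(ess * p + c) / (ess + n) for p, c in zip(prior_probs, data_counts)]

row = update_row([0.5, 0.5], [30, 10])   # 40 new records vs. ESS = 100
# the prior still dominates: (100*0.5 + 30) / 140
```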
- Thu Dec 30, 2021 4:00 pm
- Forum: SMILE
- Topic: Complexity of large nodes vs many nodes
- Replies: 3
- Views: 5635
Re: Complexity of large nodes vs many nodes
Hi Sverre, Zeros are dangerous not because of the possibility of division by zero but rather because of multiplication. In Bayes theorem, the posterior is derived by multiplying the prior by an expression that represents the strength of evidence. If the prior is zero, then the posterior HAS TO be zero. For exa...
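The absorbing-zero behavior described above is easy to show in a few lines of plain Python (illustrative only): the posterior is proportional to prior times likelihood, so a zero prior can never recover, however strong the evidence.

```python
# Minimal Bayes update over a two-state variable: posterior is the
# normalized element-wise product of prior and likelihood.
def posterior(prior, likelihood):
    joint = [p * l for p, l in zip(prior, likelihood)]
    z = sum(joint)
    return [j / z for j in joint]

post = posterior([0.0, 1.0], [0.99, 0.01])  # strong evidence for state 0
# post[0] stays exactly 0.0: the zero prior absorbs any evidence
```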
- Tue Dec 21, 2021 12:35 pm
- Forum: GeNIe
- Topic: learning parameter with missing values
- Replies: 3
- Views: 2024
Re: learning parameter with missing values
Having no missing values is just a special case for the EM algorithm, so there is no need for a separate algorithm for complete data. I hope this helps,
Marek
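To see why complete data is a special case, consider a simplified E-step for one binary variable (a made-up sketch, not SMILE's implementation): a missing value contributes fractional counts split by the current parameter, while an observed value contributes a count of 1. With no missing values every weight is 0 or 1, so EM degenerates to plain frequency counting.

```python
# Hypothetical one-variable E-step: observed records get full weight,
# missing records split their weight by the current belief p_true.
def e_step_counts(records, p_true):
    counts = {"T": 0.0, "F": 0.0}
    for r in records:
        if r is None:                    # missing: fractional counts
            counts["T"] += p_true
            counts["F"] += 1 - p_true
        else:
            counts[r] += 1.0             # observed: full weight
    return counts

c = e_step_counts(["T", "T", "F", "T"], p_true=0.5)
# complete data: c == {"T": 3.0, "F": 1.0}, identical to simple counting
```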
- Sat Dec 18, 2021 4:38 pm
- Forum: SMILE
- Topic: Entropy reduction
- Replies: 15
- Views: 15354
Re: Entropy reduction
Hi Lotte, Glad to be of help. I don't have any reference handy but you can easily find references to cross entropy using Google search. I tried "cross-entropy" and "cross-entropy" application with many interesting results. "cross-entropy" "SMILE" BayesFusion w...
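Since the thread offers no handy reference, a quick plain-Python definition of the quantity being discussed, cross entropy H(p, q) between two discrete distributions:

```python
# Cross entropy H(p, q) = -sum_i p_i * log(q_i); terms with p_i == 0
# are skipped by convention (0 * log q contributes nothing).
import math

def cross_entropy(p, q):
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

h = cross_entropy([0.5, 0.5], [0.5, 0.5])   # equals the entropy of p: ln 2
```

By Gibbs' inequality, H(p, q) is minimized when q = p, where it reduces to the entropy of p.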
- Tue Dec 14, 2021 10:25 pm
- Forum: SMILE
- Topic: Complexity of large nodes vs many nodes
- Replies: 3
- Views: 5635
Re: Complexity of large nodes vs many nodes
Hi Sverre, I don't believe I could answer your question in general. My feeling is that the network with individual nodes will be better, but to make an informed judgment I would need to see and examine the two networks. You mention ease of getting the numbers. This is a very important factor, as speed ...
- Tue Dec 14, 2021 10:45 am
- Forum: GeNIe
- Topic: Creating a node which sums the results of parent nodes' states
- Replies: 3
- Views: 4626
Re: Creating a node which sums the results of parent nodes' states
I will be happy to look at this problem. Would you be willing to share the piece of your model that produces this behavior?
Cheers,
Marek