Search found 430 matches
- Wed Feb 23, 2022 9:28 pm
- Forum: GeNIe
- Topic: PRIOR LINK PROBABILITY parameter of the Bayesian Search algorithm
- Replies: 1
- Views: 1544
Re: PRIOR LINK PROBABILITY parameter of the Bayesian Search algorithm
Hi, The "prior link probability" parameter influences the likelihood of the Bayesian Search algorithm producing arcs, as higher values favor arcs. So, it is not really surprising that when you increased its value, you got a denser network. My answer to the question whether you can modify t...
- Thu Feb 10, 2022 7:56 pm
- Forum: GeNIe
- Topic: Ordinal Data > Soft Observations
- Replies: 3
- Views: 2355
Re: Ordinal Data > Soft Observations
Dear Marco, I doubt the automatic discretization of continuous variables will be of much help to you. What it does is turn a continuous variable into a discrete variable (producing a CPT for it based on the definitions of the parents and the node in question). Since you don't have a parametric de...
- Wed Feb 09, 2022 10:37 pm
- Forum: GeNIe
- Topic: continuous nodes
- Replies: 1
- Views: 1448
Re: continuous nodes
Hi, The EM algorithm is general enough to handle all possible cases in theory; some are easy and straightforward, others complex. It is just that we have not yet implemented anything outside the discrete case. We will do it at some point, but not in the next few months due to other priorities. I guess y...
- Tue Feb 08, 2022 3:52 pm
- Forum: GeNIe
- Topic: Ordinal Data > Soft Observations
- Replies: 3
- Views: 2355
Re: Ordinal Data > Soft Observations
Hi, Let me try to help, although I realize that I may not know enough to give useful suggestions. I know that you want to use discrete variables but have you considered continuous variables with Gaussian distributions and linear relationships? Because your scores are averages and are continuous, the...
- Tue Jan 25, 2022 7:39 pm
- Forum: GeNIe
- Topic: Influential samples
- Replies: 1
- Views: 983
Re: Influential samples
No, I'm afraid there is nothing like that in GeNIe. The idea is interesting, though!
- Sat Jan 22, 2022 12:17 pm
- Forum: GeNIe
- Topic: learn parameters with EM
- Replies: 3
- Views: 1319
Re: learn parameters with EM
Yes, you could view it this way.
Cheers,
Marek
- Mon Jan 10, 2022 3:35 pm
- Forum: GeNIe
- Topic: learn parameters with EM
- Replies: 3
- Views: 1319
Re: learn parameters with EM
Hi Heyifan, What you wrote is essentially correct, although the mechanics of this are somewhat more sophisticated than reducing the existing parameters to 100 records. If we did that, there would be a chance of never having records for some combination of values of the parent nodes. The way it is impleme...
- Thu Dec 30, 2021 4:00 pm
- Forum: SMILE
- Topic: Complexity of large nodes vs many nodes
- Replies: 3
- Views: 3912
Re: Complexity of large nodes vs many nodes
Hi Sverre, Zeros are dangerous not because of the possibility of division by zero but because of multiplication. In Bayes theorem, the posterior is derived by multiplying the prior by an expression that represents the strength of the evidence. If the prior is zero, then the posterior HAS TO be zero. For exa...
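The point about a zero prior forcing a zero posterior can be illustrated with a minimal Bayes update in plain Python (the numbers below are hypothetical, not taken from any particular network):

```python
# Bayes update for a binary hypothesis H given evidence e:
#   P(H|e) = P(e|H) * P(H) / P(e)
def posterior(prior_h, like_e_given_h, like_e_given_not_h):
    # Total probability of the evidence (law of total probability).
    p_e = like_e_given_h * prior_h + like_e_given_not_h * (1.0 - prior_h)
    return like_e_given_h * prior_h / p_e

# With a non-zero prior, strong evidence moves the posterior.
print(posterior(0.01, 0.9, 0.1))   # prior 1% -> posterior ~8.3%

# With a zero prior, no amount of evidence can change the conclusion:
# the numerator contains the factor P(H) = 0, so the posterior is 0.
print(posterior(0.0, 0.99, 0.01))  # always 0.0
```

This is why seeding a CPT with exact zeros effectively makes a state impossible forever, regardless of how strongly the evidence supports it.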
- Tue Dec 21, 2021 12:35 pm
- Forum: GeNIe
- Topic: learning parameter with missing values
- Replies: 3
- Views: 1500
Re: learning parameter with missing values
Data with no missing values is just a special case for the EM algorithm, so there is no need for a separate algorithm for complete data. I hope this helps,
Marek
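As a sketch of why complete data needs no separate algorithm: with no missing values, the expected counts computed in EM's E-step are simply the observed counts, so the first M-step already yields the relative-frequency (maximum-likelihood) estimates. A minimal illustration in plain Python, using hypothetical records for a single binary variable:

```python
from collections import Counter

# Complete records for a single binary variable X -- no missing values.
records = ["true", "true", "false", "true"]

# With complete data, the E-step's "expected" counts equal the observed
# counts, so EM's M-step reduces to simple relative-frequency estimation.
counts = Counter(records)
total = sum(counts.values())
params = {state: n / total for state, n in counts.items()}
print(params)  # {'true': 0.75, 'false': 0.25}
```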
- Sat Dec 18, 2021 4:38 pm
- Forum: SMILE
- Topic: Entropy reduction
- Replies: 15
- Views: 12474
Re: Entropy reduction
Hi Lotte, Glad to be of help. I don't have any reference handy, but you can easily find references to cross-entropy using a Google search. I tried "cross-entropy" and "cross-entropy" application, with many interesting results. "cross-entropy" "SMILE" BayesFusion w...
- Tue Dec 14, 2021 10:25 pm
- Forum: SMILE
- Topic: Complexity of large nodes vs many nodes
- Replies: 3
- Views: 3912
Re: Complexity of large nodes vs many nodes
Hi Sverre, I don't believe I could answer your question in general. My feeling is that the network with individual nodes will be better, but to make an informed judgment I should see and examine the two networks. You mention the ease of getting the numbers. This is a very important factor, as speed ...
- Tue Dec 14, 2021 10:45 am
- Forum: GeNIe
- Topic: Creating a node which sums the results of parent nodes' states
- Replies: 3
- Views: 4262
Re: Creating a node which sums the results of parent nodes' states
I will be happy to look at this problem. Would you be willing to share the piece of our model that produces this behavior?
Cheers,
Marek
- Thu Dec 09, 2021 10:55 pm
- Forum: GeNIe
- Topic: Urgent Please - Imbalanced Data in Genie Software
- Replies: 4
- Views: 1149
Re: Urgent Please - Imbalanced Data in Genie Software
There are plenty of references, too many to list. A Google search on "imbalanced dataset handling" will do the job -- you will find review articles that recommend some methods and offer references.
Good luck!
Marek
- Thu Dec 09, 2021 8:49 pm
- Forum: GeNIe
- Topic: Urgent Please - Imbalanced Data in Genie Software
- Replies: 4
- Views: 1149
Re: Urgent Please - Imbalanced Data in Genie Software
GeNIe does not offer any special treatment of imbalanced data at the moment. We are planning to return to this issue, quite likely in mid-to-late 2022. For now, you will have to do the pre-processing (balancing, boosting, or whatever you choose) yourself before you read your data into GeNIe. I hop...
- Mon Nov 29, 2021 3:24 pm
- Forum: GeNIe
- Topic: learning parameter with missing values
- Replies: 3
- Views: 1500
Re: learning parameter with missing values
Hi HeyFan, Uniform distributions are as good (or as bad) as a randomly chosen starting point, and you should not view them as the best! On the contrary, they often lead to worse convergence. If you are worried about different convergence points, I suggest that you try a few randomizations and compa...
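The advice to try a few randomizations can be sketched as a generic random-restart loop in plain Python. Here `learn` is a hypothetical stand-in for an EM-style routine (not the GeNIe/SMILE API) that takes an initial distribution and returns learned parameters with their log-likelihood; the restart pattern then keeps the best-scoring run:

```python
import random

# Generic random-restart pattern: run a learning procedure from several
# random starting points and keep the result with the highest score.
# `learn` is a hypothetical callable: start_distribution -> (params, loglik).
def random_restarts(learn, n_states, n_runs=5, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_runs):
        # Draw a random starting distribution instead of a uniform one,
        # normalizing the raw draws so they sum to 1.
        raw = [rng.random() for _ in range(n_states)]
        start = [x / sum(raw) for x in raw]
        params, loglik = learn(start)
        if best is None or loglik > best[1]:
            best = (params, loglik)
    return best
```

Comparing the log-likelihoods across restarts, as suggested above, gives a sense of whether EM is converging to the same point or getting stuck in different local optima.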