You could try to use weka.classifiers.trees.M5P as the base learner in
AdditiveRegression. This normally works at least as well as AMT in terms of
accuracy (although the model will be larger and less interpretable).
In both AMT and AdditiveRegression, tuning the shrinkage parameter is
normally required for best results. The number of iterations also
matters: the smaller the shrinkage parameter, the more iterations are
normally required.
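As a rough sketch of what that looks like from the command line (the dataset name and the specific shrinkage/iteration values here are just placeholders; check the scheme's -h output for the exact options in your Weka version):

```
# AdditiveRegression with M5P as the base learner:
#   -S  shrinkage rate (smaller values need more iterations)
#   -I  number of boosting iterations
#   -W  full class name of the base learner
#   -t  training file, -x  cross-validation folds
java -cp weka.jar weka.classifiers.meta.AdditiveRegression \
    -S 0.1 -I 100 \
    -W weka.classifiers.trees.M5P \
    -t training.arff -x 10
```

The same kind of grid (shrinkage crossed with iterations) can be explored in the Experimenter or with CVParameterSelection if you want it automated.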
The IterativeClassifierOptimizer is the best tool to automatically tune the
number of iterations for AMT and AdditiveRegression.
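A hedged sketch of wrapping AdditiveRegression in IterativeClassifierOptimizer (the option layout follows the usual Weka convention of passing base-scheme options after "--", but the exact flags may differ by version, so verify against -h):

```
# IterativeClassifierOptimizer picks the number of iterations via
# internal cross-validation; you then only tune shrinkage (-S).
java -cp weka.jar weka.classifiers.meta.IterativeClassifierOptimizer \
    -W weka.classifiers.meta.AdditiveRegression \
    -t training.arff \
    -- -S 0.1 -W weka.classifiers.trees.M5P
```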
On Fri, Feb 26, 2021 at 11:41 PM forky <padrelodewikileaks(a)gmail.com> wrote:
Understood. I found the AdditiveRegression in Weka.
I've run AMT with good
results (both accuracy and time consumption) and I'm wondering if it makes
sense to check all possible techniques within AR or AMT is somehow