Bagging soft decision trees
Date
2016
Journal Title
Journal ISSN
Volume Title
Publisher
Springer Verlag
Access Rights
info:eu-repo/semantics/closedAccess
Abstract
The decision tree is one of the earliest predictive models in machine learning. In the soft decision tree, based on the hierarchical mixture of experts model, internal binary nodes take soft decisions and choose both children with probabilities given by a sigmoid gating function. Hence, for an input, all paths to all leaves are traversed, and all leaves contribute to the final decision, but with different probabilities given by the gating values on the path. Tree induction is incremental: the tree grows when needed by replacing leaves with subtrees, and the parameters of the newly added nodes are learned using gradient descent. We have previously shown that such soft trees generalize better than hard trees; here, we propose to bag such soft decision trees for higher accuracy. On 27 two-class classification data sets (ten of which are from the medical domain) and 26 regression data sets, we show that bagged soft trees generalize better than single soft trees and bagged hard trees. This contribution falls in the scope of research track 2 listed in the editorial, namely, machine learning algorithms.
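The prediction rule described above — every internal node soft-selects both children via a sigmoid gate, and the final output averages all leaves weighted by the product of gating values along each path — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation; the class names (`Leaf`, `SoftNode`), the linear-gate parameterization, and the `bagged_predict` helper are assumptions for the example.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class Leaf:
    """A leaf stores a constant response value."""
    def __init__(self, value):
        self.value = value

    def predict(self, x):
        return self.value

class SoftNode:
    """Internal node: a sigmoid gate over a linear function of the input
    soft-selects between the two children, so both subtrees contribute."""
    def __init__(self, w, b, left, right):
        self.w, self.b = w, b
        self.left, self.right = left, right

    def predict(self, x):
        # Gating probability of taking the left child for this input.
        g = sigmoid(sum(wi * xi for wi, xi in zip(self.w, x)) + self.b)
        # Recursion multiplies gating values down each path, so every
        # leaf contributes with its path probability.
        return g * self.left.predict(x) + (1.0 - g) * self.right.predict(x)

def bagged_predict(trees, x):
    """Bagging at prediction time: average the outputs of trees that were
    each trained on a separate bootstrap sample of the training set."""
    return sum(t.predict(x) for t in trees) / len(trees)
```

For a strongly positive input the gate saturates near 1 and the left leaf dominates; near the decision boundary both leaves contribute almost equally, which is what makes the soft tree's response smooth.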
Description
Keywords
Bagging, Decision trees, Regression trees, Regularization
Source
WoS Quartile
Q4
Scopus Quartile
Q3
Volume
9605
Issue
Citation
Yıldız, O. T., İrsoy, O. & Alpaydın, A. İ. E. (2016). Bagging soft decision trees. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9605, 25-36. doi:10.1007/978-3-319-50478-0_2