VC-dimension of univariate decision trees

Date

2015-02-25

Publisher

Institute of Electrical and Electronics Engineers Inc. (IEEE)

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

In this paper, we derive and prove lower bounds on the Vapnik-Chervonenkis (VC) dimension of the univariate decision tree hypothesis class. The VC-dimension of a univariate decision tree depends on the VC-dimensions of its subtrees and on the number of inputs. Using a search algorithm that computes the VC-dimension of univariate decision trees exhaustively, we show that our VC-dimension bounds are tight for simple trees. To demonstrate that the VC-dimension bounds are useful, we also use them to obtain VC generalization bounds for complexity control via structural risk minimization in decision trees, i.e., pruning. Our simulation results show that structural risk minimization pruning using the VC-dimension bounds finds trees that are more accurate than those pruned using cross-validation.
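To illustrate the exhaustive-search idea mentioned in the abstract, the sketch below checks shattering for the simplest univariate tree, a depth-1 decision stump on one-dimensional data. This is a minimal, self-contained example, not the paper's actual algorithm or bounds; the function names (`stump_labelings`, `shatters`) are illustrative. A set of points is shattered when the hypothesis class realizes all 2^m labelings of it, and the VC-dimension is the size of the largest shattered set.

```python
def stump_labelings(xs):
    """All labelings of points xs achievable by a 1-D decision stump:
    points <= threshold t get one class, the remaining points the other."""
    xs = sorted(xs)
    # Candidate thresholds: below all points, between neighbors, above all.
    thresholds = [xs[0] - 1] + [(a + b) / 2 for a, b in zip(xs, xs[1:])] + [xs[-1] + 1]
    labelings = set()
    for t in thresholds:
        for left in (0, 1):  # both orientations of the split
            labelings.add(tuple(left if x <= t else 1 - left for x in xs))
    return labelings

def shatters(xs):
    """True iff stumps realize every one of the 2^m labelings of xs."""
    return len(stump_labelings(xs)) == 2 ** len(xs)

# Two points are shattered, three are not (label pattern 0,1,0 is
# unreachable by a single threshold), so the stump's VC-dimension is 2.
print(shatters([0.0, 1.0]))       # True
print(shatters([0.0, 1.0, 2.0]))  # False
```

An exhaustive VC-dimension search generalizes this check: enumerate point sets of increasing size and report the largest size for which some set is shattered. For deeper trees the labeling enumeration grows quickly, which is why the paper's analytical bounds in terms of subtree VC-dimensions and the number of inputs are valuable.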

Keywords

Computation theory, Decision trees, Learning, Machine learning, Supervised learning, Vapnik-Chervonenkis (VC)-dimension, Model selection, Classifiers, Regression, Complexity, Bounds

WoS Q Value

Q1

Volume

26

Issue

2

Citation

Yıldız, O. T. (2015). VC-dimension of univariate decision trees. IEEE Transactions on Neural Networks and Learning Systems, 26(2), 378-387. doi:10.1109/TNNLS.2014.2385837