On the VC-dimension of univariate decision trees


Date

2012

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

In this paper, we give and prove lower bounds on the VC-dimension of the univariate decision tree hypothesis class. The VC-dimension of a univariate decision tree depends on the VC-dimension values of its subtrees and on the number of inputs. In our previous work (Aslan et al., 2009), we proposed a search algorithm that calculates the VC-dimension of univariate decision trees exhaustively. Using the experimental results of that work, we show that our VC-dimension bounds are tight. To verify that the VC-dimension bounds are useful, we also use them to obtain VC-generalization bounds for complexity control using SRM in decision trees, i.e., pruning. Our simulation results show that SRM-pruning using the VC-dimension bounds finds trees that are more accurate than those pruned using cross-validation.
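The exhaustive computation mentioned above (Aslan et al., 2009) can be illustrated on the simplest case. The following sketch is not the authors' algorithm; it is a minimal, self-contained example of the underlying idea: a point set is shattered by a hypothesis class if the class can realize all 2^n labelings, and the VC-dimension is the largest n for which some point set is shattered. Here the class is a depth-1 univariate tree (a decision stump) on one-dimensional data; all names are illustrative.

```python
def stump_labelings(points):
    """All labelings of 1-D points realizable by a decision stump,
    i.e. a single threshold test with either leaf polarity."""
    xs = sorted(set(points))
    # Candidate thresholds: below all points, between neighbors, above all.
    thresholds = [xs[0] - 1.0]
    thresholds += [(a + b) / 2.0 for a, b in zip(xs, xs[1:])]
    thresholds += [xs[-1] + 1.0]
    labelings = set()
    for t in thresholds:
        for left_label in (0, 1):
            labelings.add(tuple(left_label if x <= t else 1 - left_label
                                for x in points))
    return labelings


def is_shattered(points):
    """True if the stump class realizes every one of the 2^n labelings."""
    return len(stump_labelings(points)) == 2 ** len(points)


def vc_dimension_stump_1d(max_n=4):
    """Exhaustive search for the VC-dimension of 1-D decision stumps.
    In 1-D, any n distinct points are equivalent up to ordering,
    so testing the set {0, ..., n-1} suffices."""
    vc = 0
    for n in range(1, max_n + 1):
        if is_shattered(list(range(n))):
            vc = n
    return vc
```

Running `vc_dimension_stump_1d()` returns 2: a stump shatters any two distinct points but cannot produce the alternating labeling of three ordered points. The paper's setting generalizes this kind of search to full univariate trees, where the count of realizable labelings depends on the subtrees' VC-dimensions and the number of inputs.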

Keywords

Decision trees, VC-dimension, Pattern recognition, Cross validation, Lower bounds, Search algorithms, Subtrees, Univariate

Source

ICPRAM 2012 - Proceedings of the 1st International Conference on Pattern Recognition Applications and Methods

Scopus Q Value

N/A

Volume

1

Citation

Yıldız, O. T. (2012). On the VC-dimension of univariate decision trees. Paper presented at the ICPRAM 2012 - Proceedings of the 1st International Conference on Pattern Recognition Applications and Methods, 205-210. doi:10.5220/0003777202050210