Author: Yıldız, Olcay Taner
Date accessioned: 2015-07-14
Date available: 2015-07-14
Date issued: 2015-02-25
Citation: Yıldız, O. T. (2015). VC-dimension of univariate decision trees. IEEE Transactions on Neural Networks and Learning Systems, 26(2), 378-387. doi:10.1109/TNNLS.2014.2385837
ISSN: 2162-237X (print), 2162-2388 (online)
URI: https://hdl.handle.net/11729/581
DOI: http://dx.doi.org/10.1109/TNNLS.2014.2385837

Abstract: In this paper, we give and prove lower bounds on the Vapnik-Chervonenkis (VC)-dimension of the univariate decision tree hypothesis class. The VC-dimension of a univariate decision tree depends on the VC-dimension values of its subtrees and on the number of inputs. Via a search algorithm that calculates the VC-dimension of univariate decision trees exhaustively, we show that our VC-dimension bounds are tight for simple trees. To verify that the VC-dimension bounds are useful, we also use them to obtain VC-generalization bounds for complexity control using structural risk minimization (SRM) in decision trees, i.e., pruning. Our simulation results show that SRM pruning using the VC-dimension bounds finds trees that are more accurate than those pruned using cross-validation.

Language: en
Rights: info:eu-repo/semantics/closedAccess
Keywords: Computation theory; Decision trees; Learning; Machine learning; Supervised learning; Vapnik-Chervonenkis (VC)-dimension; Model selection; Classifiers; Regression; Complexity; Bounds
Title: VC-dimension of univariate decision trees
Type: Article
Volume: 26; Issue: 2; Pages: 378-387
Quartile (WOS): Q1
WOS ID: WOS:000348856200015
PubMed ID: 25594983
Scopus ID: 2-s2.0-84921446924
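
Note: the SRM pruning summarized in the abstract relies on a VC-type generalization bound. A standard form of such a bound (Vapnik's classical bound, shown here as an illustrative sketch, not necessarily the exact expression used in the paper) is

R(h) <= R_emp(h) + sqrt( ( V * (ln(2N/V) + 1) + ln(4/delta) ) / N ),

where R(h) is the expected risk, R_emp(h) the empirical (training) error, V the VC-dimension of the hypothesis class, N the number of training examples, and 1 - delta the confidence level. SRM pruning then selects the subtree minimizing this right-hand side, with V supplied by the VC-dimension bounds proved in the paper.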