Search Results
Showing 1 - 7 of 7
Publication: Calculating the VC-dimension of decision trees (IEEE, 2009)
Aslan, Özlem; Yıldız, Olcay Taner; Alpaydın, Ahmet İbrahim Ethem
We propose an exhaustive search algorithm that calculates the VC-dimension of univariate decision trees with binary features. The VC-dimension of the univariate decision tree with binary features depends on (i) the VC-dimension values of the left and right subtrees, (ii) the number of inputs, and (iii) the number of nodes in the tree. From a training set of example trees whose VC-dimensions are calculated by exhaustive search, we fit a general regressor to estimate the VC-dimension of any binary tree. These VC-dimension estimates are then used to obtain VC-generalization bounds for complexity control using SRM in decision trees, i.e., pruning. Our simulation results show that SRM-pruning using the estimated VC-dimensions finds trees that are as accurate as those pruned using cross-validation.

Publication: Soft decision trees (IEEE, 2012)
İrsoy, Ozan; Yıldız, Olcay Taner; Alpaydın, Ahmet İbrahim Ethem
We discuss a novel decision tree architecture with soft decisions at the internal nodes, where we choose both children with probabilities given by a sigmoid gating function. Our algorithm is incremental: new nodes are added when needed, and parameters are learned using gradient descent. We visualize the soft tree fit on a toy data set and then compare it with the canonical, hard decision tree over ten regression and classification data sets. Our proposed model achieves significantly higher accuracy using fewer nodes.

Publication: VC-dimension of rule sets (IEEE Computer Soc, 2014-12-04)
Yıldız, Olcay Taner
In this paper, we give and prove lower bounds on the VC-dimension of the rule set hypothesis class where the input features are binary or continuous. The VC-dimension of the rule set depends on the VC-dimension values of its rules and the number of inputs.

Publication: Budding trees (IEEE Computer Soc, 2014-08-24)
İrsoy, Ozan; Yıldız, Olcay Taner; Alpaydın, Ahmet İbrahim Ethem
We propose a new decision tree model, named the budding tree, where a node can be both a leaf and an internal decision node. Each bud node starts as a leaf node, can then grow children, and later, if necessary, its children can be pruned. This contrasts with traditional tree construction algorithms, which only grow the tree during the training phase and prune it in a separate pruning phase. We use a soft tree architecture and show that the tree and its parameters can be trained using gradient descent. Our experimental results on regression, binary classification, and multi-class classification data sets indicate that our newly proposed model has better performance than traditional trees in terms of accuracy while inducing trees of comparable size.

Publication: Regularizing soft decision trees (Springer, 2013)
Yıldız, Olcay Taner; Alpaydın, Ahmet İbrahim Ethem
Recently, we have proposed a new decision tree family called soft decision trees, where a node chooses both its left and right children with different probabilities as given by a gating function, unlike a hard decision node, which chooses one of the two. In this paper, we extend the original algorithm by introducing local dimension reduction via L-1 and L-2 regularization for feature selection and smoother fitting. We compare our novel approach with the standard decision tree algorithms over 27 classification data sets.
We see that both regularized versions have similar generalization ability with less complexity in terms of number of nodes, where L-2 seems to work slightly better than L-1.

Publication: Searching for the optimal ordering of classes in rule induction (IEEE, 2012-11-15)
Ata, Sezin; Yıldız, Olcay Taner
Rule induction algorithms such as Ripper solve a K > 2 class problem by converting it into a sequence of K - 1 two-class problems. As a common heuristic, the classes are fed into the algorithm in order of increasing prior probabilities. In this paper, we propose two algorithms to improve this heuristic. The first algorithm starts with the ordering the heuristic provides and searches for better orderings by swapping consecutive classes. The second algorithm transforms the ordering search problem into an optimization problem and uses the solution of the optimization problem to extract the optimal ordering. We compared our algorithms with the original Ripper on 8 datasets from the UCI repository [2]. Simulation results show that our algorithms produce rule sets that are significantly better than those produced by Ripper proper.

Publication: Müşterilerin GSP analizi kullanarak kümelenmesi (Clustering customers using GSP analysis) (Institute of Electrical and Electronics Engineers Inc., 2018-07-05)
Pakyürek, Muhammet; Sezgin, Mehmet Selman; Kestepe, Sedat; Bora, Büşra; Düzağaç, Remzi; Yıldız, Olcay Taner
In this study, we used existing guest and reservation data to detect natural clusters and thereby identify guest behaviors. We also customized the offered services and sales strategies according to these behaviors. After clustering guests with K-means, the key characteristics underlying these clusters were extracted using a decision-tree approach. These characteristics were found to include the guest's purchase channel, specific product preferences, reservation lead time, seasonal preference, and so on. The fact that these characteristics vary markedly across clusters indicates that the solution is broadly sound and that the characteristics were chosen well. This work plays an important role in designing campaigns and product packages tailored to group characteristics.
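Several of the abstracts above (soft decision trees, budding trees, regularized soft trees) share the same core mechanism: an internal node does not route an input to exactly one child but blends both children's responses with weights given by a sigmoid gating function. A minimal illustrative sketch of that mechanism follows; the class names and the fixed gate weights are invented for illustration, whereas the papers learn these parameters by gradient descent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SoftNode:
    """A soft decision node: instead of routing x to one child, it blends
    both children's responses with a sigmoid gating weight.
    (Illustrative sketch; the weights here are fixed, not learned.)"""
    def __init__(self, w, b, left, right):
        self.w, self.b = np.asarray(w, float), float(b)
        self.left, self.right = left, right

    def response(self, x):
        g = sigmoid(self.w @ np.asarray(x, float) + self.b)  # "probability" of the left child
        return g * self.left.response(x) + (1.0 - g) * self.right.response(x)

class Leaf:
    def __init__(self, value):
        self.value = float(value)
    def response(self, x):
        return self.value

# A depth-1 soft tree over a 2-D input: the leaves predict 0 and 1,
# and the root gate softly interpolates between them.
tree = SoftNode(w=[4.0, 0.0], b=0.0, left=Leaf(1.0), right=Leaf(0.0))

print(tree.response([ 2.0, 0.0]))  # close to 1: the gate strongly favors the left leaf
print(tree.response([-2.0, 0.0]))  # close to 0: the gate strongly favors the right leaf
print(tree.response([ 0.0, 0.0]))  # exactly 0.5: the gate is undecided
```

Because the response is a smooth function of the gate parameters, the whole tree is differentiable, which is what makes the gradient-descent training described in those abstracts possible.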
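The last abstract describes a two-stage pipeline: cluster guests with K-means, then extract the cluster-defining characteristics with a decision tree. A rough sketch of that pipeline on synthetic data follows; the feature names and the planted two-group structure are invented, and a one-level decision stump stands in for the full decision-tree characterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "guest" features: [reservation lead time (days), stay length (nights)].
# Two planted behavior groups stand in for real guest/reservation data.
early_planners = rng.normal([60.0, 7.0], [5.0, 1.0], size=(50, 2))
last_minute    = rng.normal([ 3.0, 2.0], [1.0, 0.5], size=(50, 2))
X = np.vstack([early_planners, last_minute])

def kmeans(X, centroids, iters=20):
    """Plain K-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    centroids = centroids.copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Seed the two centroids far apart: the first point, and the point farthest from it.
c0 = X[0]
c1 = X[np.argmax(np.linalg.norm(X - c0, axis=1))]
labels, centroids = kmeans(X, np.stack([c0, c1]))

def best_stump(X, labels):
    """A one-level decision 'tree': the single feature/threshold pair that best
    separates the clusters, standing in for the full tree-based characterization."""
    best = (0, 0.0, 0.0)                      # (feature, threshold, purity)
    for f in range(X.shape[1]):
        for t in X[:, f]:
            pred = (X[:, f] > t).astype(int)
            purity = max((pred == labels).mean(), (pred != labels).mean())
            if purity > best[2]:
                best = (f, t, purity)
    return best

feature, threshold, purity = best_stump(X, labels)
print(f"clusters split on feature {feature} at {threshold:.1f} (purity {purity:.2f})")
```

On real data, a deeper tree would surface several characteristics at once (purchase channel, lead time, season), which is what lets each cluster be described in business terms.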












