Example of a decision tree for 3 numbers
I am taking the course "Algorithms & Data Structures" at the Faculty of Mathematics and Physics, University of Ljubljana.
I would like to have and understand the decision tree for 3 numbers (let's say a, b and c). Can someone do that for me, please?
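For what it's worth, one common reading of this exercise (an assumption on my part, since the question doesn't say) is the comparison tree that sorts the three numbers: each internal node compares two of them, and each leaf is one of the 3! = 6 possible orderings. A minimal Python sketch of that tree:

```python
def sort3(a, b, c):
    """Comparison decision tree for three numbers.

    Each if/else is one internal node of the tree; each return
    statement is one of the 3! = 6 leaves (possible orderings).
    """
    if a <= b:
        if b <= c:
            return (a, b, c)          # a <= b <= c
        else:                         # c < b
            if a <= c:
                return (a, c, b)      # a <= c < b
            else:
                return (c, a, b)      # c < a <= b
    else:                             # b < a
        if a <= c:
            return (b, a, c)          # b < a <= c
        else:                         # c < a
            if b <= c:
                return (b, c, a)      # b <= c < a
            else:
                return (c, b, a)      # c < b < a
```

Every input reaches a leaf after at most three comparisons, which matches the comparison-sort lower bound ⌈log₂(3!)⌉ = 3 for three elements.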
See also questions close to this topic

Decision stump in Python
I am trying to port Matlab code to Python and get exactly the same result. In the Matlab code, I have a decision stump as follows:
switch Model
    case 'DecisionStump'
        Weights = Pt; % Train the weak learner with weights Pt
        tree = fitctree(X, Y, 'minparent', size(X,1)*sum(Weights==0), ...
            'prune', 'off', 'mergeleaves', 'off', 'Weights', Weights, ...
            'CategoricalPredictors', CategoricalPredictors);
        h = compact(tree);
I used the Python code below to construct what should be the same decision stump:
clf_tree = DecisionTreeClassifier(max_depth = 1)
However, I get slightly different results from the two programs, and it would be great if anyone could tell me whether I am missing anything. (Perhaps categorical predictors should be included in my Python code, but I do not know how!)
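Not a full answer, but one likely gap: the Matlab call passes the boosting weights via 'Weights', while the Python snippet never uses them. scikit-learn's DecisionTreeClassifier accepts per-sample weights through fit(..., sample_weight=...), which is the closest analogue. A sketch under that assumption (X, y, Pt below are toy stand-ins for the Matlab variables; exact equivalence of the split criteria between fitctree and sklearn is not guaranteed):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy data standing in for the Matlab X, Y; Pt are the boosting weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Pt = np.full(len(y), 1.0 / len(y))   # uniform weights, as at round 1

# max_depth=1 gives a stump; sample_weight mirrors 'Weights' in fitctree.
clf_tree = DecisionTreeClassifier(max_depth=1)
clf_tree.fit(X, y, sample_weight=Pt)

pred = clf_tree.predict(X)
```

As for the categorical predictors: DecisionTreeClassifier has no direct counterpart of 'CategoricalPredictors', so categorical columns would need to be encoded (e.g. one-hot) beforehand, which can itself produce different splits than Matlab's native categorical handling.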

How can I find the path a sample took down the decision tree using the DecisionTree class in Accord.NET
I am trying to find out the path a sample took down a DecisionTree I have created in Accord.NET, but I am not sure how to do this. I have looked through the documentation and couldn't find it, but since Accord.NET is used for machine learning, I'm sure this functionality is available. It would be most useful to get totals of the paths taken from cross-validation data; however, if this is not possible, finding the path taken by new samples after the tree has been built could still work.
Below is an example of how I build a new decision tree and test a new sample:
C45Learning c45 = new C45Learning();
var c45Tree = c45.Learn(input, output);
int[] predicted = c45Tree.Decide(testSamples);
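I don't know of a one-call Accord.NET API for this off-hand, but the general technique is a manual walk from the root: at each node, evaluate its split on the sample, record the node, and descend into the matching child until a leaf is reached. Sketched here in Python over a plain dict tree (the dict structure and field names are hypothetical; in Accord.NET you would walk the tree's node objects and their children the same way):

```python
def trace_path(node, sample):
    """Walk a sample down a tree, returning the list of decisions taken.

    Internal nodes are dicts with 'feature', 'threshold', 'left', 'right';
    leaves have only 'label'. Each recorded tuple is one node visited.
    """
    path = []
    while 'label' not in node:
        decision = 'L' if sample[node['feature']] <= node['threshold'] else 'R'
        path.append((node['feature'], node['threshold'], decision))
        node = node['left'] if decision == 'L' else node['right']
    path.append(('leaf', node['label']))
    return path

# Hypothetical stump: split on feature 0 at threshold 2.5.
tree = {
    'feature': 0, 'threshold': 2.5,
    'left':  {'label': 'A'},
    'right': {'label': 'B'},
}
```

For the cross-validation totals, the same idea applies: run every validation sample through trace_path (or its Accord.NET equivalent) and count the resulting path tuples, e.g. with a Counter keyed on the path.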

How to predict a value from a decision tree formed by list() in R?
I am relatively new to R and tried to build a regression tree using list(), which actually worked, more or less. But now I have a problem with my prediction function. To build the tree I use a recursive function that searches for the best split, splits the data (based on one attribute of the X data), and saves the splitting point in a list. The same happens for the two new groups (saved as lists inside the first list) until a stopping limit is reached; there the recursion ends, and the value of the leaf is the average of the Y values. I tried to predict my values as follows:
tree_predict <- function(Tree, X) {
  # Pseudocode:
  # if the node is a leaf: YPred = value of the leaf
  # else: split at the splitting point saved in the list earlier, then
  #       call this function again on both of the split groups
  return(YPred[, 1])
}
But I always get the error message: "Error in YPred[IdxSplit1, 1] <- tree_predict(Tree[["child1"]], X[IdxSplit1, : number of items to replace is not a multiple of replacement length"
I hope someone can help me, thanks! :)
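That error typically means the recursive call returns a vector whose length differs from the index subset it is assigned into, so each recursion level has to return exactly one prediction per row it received. A sketch of that shape-safe pattern in Python, with nested dicts standing in for the R lists (the field names 'feature', 'split', 'child1', 'child2', 'value' are made up to mirror the question):

```python
import numpy as np

def tree_predict(tree, X):
    """Recursively predict for each row of X from a nested-dict tree.

    Internal nodes: {'feature', 'split', 'child1', 'child2'};
    leaves: {'value'}. Always returns one value per row of X, so the
    assignment back into y_pred[idx] can never mismatch in length.
    """
    y_pred = np.empty(len(X))
    if 'value' in tree:                      # leaf: constant prediction
        y_pred[:] = tree['value']
        return y_pred
    idx1 = X[:, tree['feature']] <= tree['split']
    idx2 = ~idx1
    # Each recursive call receives idx.sum() rows and returns exactly
    # that many predictions, so the lengths always match.
    if idx1.any():
        y_pred[idx1] = tree_predict(tree['child1'], X[idx1])
    if idx2.any():
        y_pred[idx2] = tree_predict(tree['child2'], X[idx2])
    return y_pred

# Hypothetical two-leaf tree: split on column 0 at 0.5.
tree = {'feature': 0, 'split': 0.5,
        'child1': {'value': 1.0}, 'child2': {'value': 2.0}}
X = np.array([[0.2], [0.9], [0.4]])
```

The R fix is the same invariant: make tree_predict return a vector of length nrow(X) at every level, and assign with matching subsets, e.g. YPred[IdxSplit1, 1] <- tree_predict(Tree[["child1"]], X[IdxSplit1, , drop = FALSE]).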