Example of Decision Tree - for 3 numbers
I am taking the course "Algorithms & Data Structures" at the Faculty of Mathematics and Physics at the University of Ljubljana.
I would like to have, and understand, a decision tree for 3 numbers (let's say a, b, and c). Can someone do that for me, please?
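For reference, the comparison decision tree for three numbers that is usually drawn in algorithms courses can be written out directly as nested conditionals, where each `if` is an internal node and each `return` is a leaf. A minimal Python sketch of one such tree, sorting a, b, c (one possible tree among several equivalent ones):

```python
def sort3(a, b, c):
    """Decision tree for sorting three numbers.

    Each comparison is one internal node of the tree; each return is a
    leaf holding one of the 3! = 6 possible orderings. At most three
    comparisons are made, matching the lower bound ceil(log2(3!)) = 3.
    """
    if a <= b:
        if b <= c:
            return (a, b, c)      # a <= b <= c
        elif a <= c:
            return (a, c, b)      # a <= c < b
        else:
            return (c, a, b)      # c < a <= b
    else:
        if a <= c:
            return (b, a, c)      # b < a <= c
        elif b <= c:
            return (b, c, a)      # b <= c < a
        else:
            return (c, b, a)      # c < b <= a
```

Drawing this on paper gives the usual picture: three levels of comparison nodes with six leaves, one per permutation.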
See also questions close to this topic
How to tell a machine learning algorithm to make multiple predictions instead of one?
At this moment, I have built several machine learning models. They achieve good accuracy, but I would like to investigate to what extent the accuracy increases if an algorithm makes multiple predictions instead of one. For example, if the input is A and the algorithm predicts B (based on B having the highest output probability), I would like the algorithm to also predict C and D. If the true test output is B, C, or D, then I want that to count as a correct prediction. As a result, the accuracy will increase. Can you help me with this problem? I work with sklearn and write code in Spyder (Python), and I would prefer to keep using sklearn, Spyder, and Python to implement this.
I have tried
.predict_proba(), which gives you an overview of all output options and their corresponding probabilities. It does so in table form, but it does not provide the class of the input data (or at least that is my understanding). If you use .classes_ afterwards, you can see the classes of the output data, but not the class of the input data. Also, I would prefer to simply tell the algorithm that it should make multiple predictions instead of building another table:
import pandas as pd
from sklearn import tree
from sklearn.model_selection import train_test_split

#train the model
X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size = 0.2, random_state = 263)
model = tree.DecisionTreeClassifier()
model.fit(X_train, y_train)

#test the model; the model.predict_proba output is turned into a dataframe to make it more accessible
df2 = pd.DataFrame(model.predict_proba(X_test))
model.score(X_test, y_test)
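One way to get the behaviour described above is "top-k accuracy": keep the k most probable classes from .predict_proba() and count a sample as correct when the true label is among them. A hedged sketch of that idea (the iris data set is used purely for illustration; newer scikit-learn versions also ship sklearn.metrics.top_k_accuracy_score, which computes the same thing):

```python
import numpy as np
from sklearn import tree
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Illustrative data set; any classification data would do
X, Y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, Y, test_size=0.2, random_state=263)

model = tree.DecisionTreeClassifier()
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)      # shape (n_samples, n_classes)

k = 3
# column indices of the k highest-probability classes, one row per sample
topk_idx = np.argsort(proba, axis=1)[:, -k:]
# map column index -> actual class label
topk_classes = model.classes_[topk_idx]

# a sample counts as correct when the true label is among its top-k classes
topk_correct = np.array([y in row for y, row in zip(y_test, topk_classes)])
print("top-%d accuracy: %.3f" % (k, topk_correct.mean()))
```

With k = 1 this reduces to the ordinary model.score(); larger k gives the "B, C or D all count" behaviour from the question.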
Use of linear regression in decision tree
Can we use Linear Regression while training a decision tree? I am asking because I saw a YouTube video in which Linear Regression was used to train a data set. Note that the video was specifically about training and testing data. Another YouTube video, on the other hand, used a Decision Tree Classifier to train the decision tree.
So can I use Linear Regression instead of the Decision Tree Classifier while training a decision tree?
If not, then why?
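For context, in scikit-learn Linear Regression and decision trees are separate estimators that happen to share the same fit/predict interface, so one is used instead of (not inside) the other; which one applies depends on the target, not on the training/testing split. A hedged sketch on made-up data (the data and parameters below are purely illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Toy regression data: y = 2x + noise (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2 * X.ravel() + rng.normal(0, 0.5, size=100)

# Two different estimators, trained through the same interface
lin = LinearRegression().fit(X, y)
dtr = DecisionTreeRegressor(max_depth=3).fit(X, y)

print("linear R^2:", lin.score(X, y))
print("tree   R^2:", dtr.score(X, y))
```

Both calls "train on the data", which may be why the two videos looked interchangeable; swapping one estimator for the other changes the model that is fitted, not the train/test procedure around it.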
ACL2 Decision Tree fails to execute the recursive call
For this problem, I'm assigned to implement a function that decides the output of a decision tree given the data. The logic: if the tree is a symbol, then that symbol is the value to output; otherwise, look up the value of varname in the memory, and if it is less than or equal to the threshold, look for the value in the left-tree; if it is greater than the threshold, look for the value in the right-tree.
A decision tree is either a symbol, e.g., 'versicolor, or [ varname threshold left-tree right-tree ].
Here is what I've already done,
(defun decision (tree memory)
  (if (not (equal (len tree) 0))
      (if (not (equal (first tree) (first memory)))
          (decision tree (rest memory))
          (if (<= (second tree) (second memory))
              (decision (third tree) memory)
              (decision (fourth tree) memory)))
      tree))
Here's one unit test:
(check-expect
  (decision *IRIS-DECISION-TREE*
            (search-list-to-tree '((petal-length 2) (petal-width 2) (sepal-length 5))))
  'setosa)
Here is the definition of the constant used
(defconst *IRIS-DECISION-TREE*
  '(petal-length 245/100
     setosa
     (petal-width 175/100
       (petal-length 495/100
         (petal-width 165/100 versicolor virginica)
         (petal-width 155/100
           virginica
           (sepal-length 695/100 versicolor virginica)))
       (petal-length 485/100
         (sepal-length 595/100 versicolor virginica)
         virginica))))
I keep getting errors when the function reaches the recursion call. It says "ACL2 Error in ( DEFUN DECISION ...): No :MEASURE was supplied with the definition of DECISION."
I tested every if statement to confirm that it works, and I have run the logic of the code through my head multiple times; the only problem I can imagine is an error in the syntax, but it all seems right.
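In ACL2 that error means the prover could not find a termination measure on its own: one recursive call shrinks memory while the other two shrink tree, and nothing rules out an empty memory, so an explicit :measure (for example on (+ (acl2-count tree) (acl2-count memory))) together with an empty-memory base case is typically needed. To separate the termination question from the tree-walking logic, here is the intended walk from the problem statement as a hedged Python sketch (the function name and the memory representation as a list of (name, value) pairs are illustrative, not the assignment's actual format):

```python
def decision(tree, memory):
    """Walk a decision tree of the form [varname, threshold, left, right],
    where a leaf is a bare symbol (here: a string).

    Termination is easy to see in this form: every recursive call is on a
    strict subtree, and the varname lookup is delegated to a dict so it
    cannot loop on an exhausted memory."""
    if not isinstance(tree, list):          # leaf: a bare symbol
        return tree
    varname, threshold, left, right = tree
    value = dict(memory)[varname]           # look up varname in memory
    if value <= threshold:                  # spec: value <= threshold -> left
        return decision(left, memory)
    else:
        return decision(right, memory)
```

Note that in this rendering the comparison is value <= threshold, per the problem statement; the posted ACL2 code compares (second tree), the threshold, against (second memory), which is worth double-checking alongside the measure.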