@Ddedalus
Created February 1, 2018 17:48
Decision-tree-learning algorithm pseudocode
def decisionTreeLearning(examples, attributes, parent_examples):
    if len(examples) == 0:
        # no training data left: return the most probable answer
        return pluralityValue(parent_examples)
    elif len(attributes) == 0:
        return pluralityValue(examples)
    elif all examples classify the same:
        return their classification
    # choose the most promising attribute to condition on
    A = max(attributes, key=lambda a: importance(a, examples))
    tree = new Tree(root=A)
    for value in A.values():
        exs = [e for e in examples if e.A == value]
        subtree = decisionTreeLearning(exs, attributes - {A}, examples)
        # note: an implementation should probably wrap the trivial-case
        # returns into trees for consistency
        tree.addSubtreeAsBranch(subtree, label=(A, value))
    return tree
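The pseudocode above can be turned into a small runnable sketch. This is one possible concretization, not the book's reference implementation: it assumes examples are dicts with a `"label"` key, represents the tree as nested dicts, and uses information gain as the `importance` measure (a common choice; AIMA leaves it abstract).

```python
from collections import Counter
from math import log2

def plurality_value(examples, target="label"):
    """Most common classification among the examples; ties broken arbitrarily."""
    return Counter(e[target] for e in examples).most_common(1)[0][0]

def entropy(examples, target="label"):
    """Shannon entropy of the class distribution, in bits."""
    counts = Counter(e[target] for e in examples)
    n = len(examples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def importance(attribute, examples, target="label"):
    """Information gain from splitting the examples on `attribute`."""
    remainder = sum(
        len(subset) / len(examples) * entropy(subset, target)
        for subset in (
            [e for e in examples if e[attribute] == v]
            for v in {e[attribute] for e in examples}
        )
    )
    return entropy(examples, target) - remainder

def decision_tree_learning(examples, attributes, parent_examples, target="label"):
    if not examples:                             # no data left: fall back to parent
        return plurality_value(parent_examples, target)
    if len({e[target] for e in examples}) == 1:  # all examples agree: return that label
        return examples[0][target]
    if not attributes:                           # nothing left to split on
        return plurality_value(examples, target)
    A = max(attributes, key=lambda a: importance(a, examples, target))
    tree = {A: {}}  # nested-dict tree: {attribute: {value: subtree-or-label}}
    for value in {e[A] for e in examples}:
        exs = [e for e in examples if e[A] == value]
        rest = [a for a in attributes if a != A]
        tree[A][value] = decision_tree_learning(exs, rest, examples, target)
    return tree

# Toy dataset (hypothetical): outlook separates the classes best,
# so it becomes the root; the sunny branch then splits on windy.
data = [
    {"outlook": "sunny", "windy": "yes", "label": "stay"},
    {"outlook": "sunny", "windy": "no", "label": "play"},
    {"outlook": "overcast", "windy": "no", "label": "play"},
    {"outlook": "overcast", "windy": "yes", "label": "play"},
    {"outlook": "rainy", "windy": "no", "label": "stay"},
    {"outlook": "rainy", "windy": "yes", "label": "stay"},
]
tree = decision_tree_learning(data, ["outlook", "windy"], data)
```

Here the trivial cases return bare labels rather than single-node trees, which is the inconsistency the comment in the pseudocode warns about; wrapping them as `{None: label}` (or a dedicated leaf type) would make the structure uniform.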
Ddedalus commented Feb 1, 2018

Fig. 18.5, page 702

Ddedalus commented Feb 1, 2018

pluralityValue(examples) should select the most probable answer, i.e. the most common classification among the given examples. It is used when there are no more attributes to split on, or when no examples are left at this branch of the tree.
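A minimal sketch of that helper, assuming examples are dicts with a `"label"` key (that representation is an assumption, not part of the pseudocode):

```python
from collections import Counter

def plurality_value(examples, target="label"):
    # Most common classification among the examples.
    # Ties are broken arbitrarily by Counter.most_common ordering;
    # AIMA suggests breaking ties randomly instead.
    return Counter(e[target] for e in examples).most_common(1)[0][0]
```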
