# Time Complexity of Training and Testing
## By number of training samples ($n$), number of features ($m$), and number of classes ($c$)
The following classifiers can achieve computational complexity less than or equal to $\mathcal{O}(mnc)$ in both the training and testing phases.
* k-nearest neighbors is linear (https://stats.stackexchange.com/questions/219655/k-nn-computational-complexity).
* Naive Bayes is linear for those PDFs that can be estimated in linear time (e.g. the Poisson and multinomial PDFs).
* Approximate SVM is linear (https://stats.stackexchange.com/questions/96995/machine-learning-classifiers-big-o-or-complexity).
* Logistic Regression can be linear (https://cstheory.stackexchange.com/questions/4278/computational-complexity-of-learning-classification-algorithms-fitting-the-p).
* AdaBoost and Gradient Boosting fit a weak learner at each boosting round, so they are linear only if their weak learner is; with $f$ rounds of tree-based weak learners the cost is $\mathcal{O}(f m n \log n)$ (https://stackoverflow.com/questions/22397485/what-is-the-o-runtime-complexity-of-adaboost).
* Neural Networks can be linear (https://ai.stackexchange.com/questions/5728/time-complexity-for-training-a-neural-network).
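As a concrete illustration of why Naive Bayes training is linear, here is a minimal multinomial Naive Bayes fitting sketch (the function name, the smoothing parameter `alpha`, and the loop structure are my own illustrative choices, not taken from any of the linked answers). Training is one pass over the $n \times m$ data matrix plus an $\mathcal{O}(cm)$ normalization, i.e. linear overall:

```python
import numpy as np

def fit_multinomial_nb(X, y, n_classes, alpha=1.0):
    """Sketch of multinomial Naive Bayes training in O(n*m + c*m)."""
    n, m = X.shape
    feature_counts = np.zeros((n_classes, m))
    class_counts = np.zeros(n_classes)
    for i in range(n):                    # n iterations ...
        feature_counts[y[i]] += X[i]      # ... O(m) work each -> O(n*m)
        class_counts[y[i]] += 1
    # Normalization touches each of the c*m entries once -> O(c*m)
    log_prior = np.log(class_counts / n)
    smoothed = feature_counts + alpha     # Laplace smoothing
    log_likelihood = np.log(smoothed / smoothed.sum(axis=1, keepdims=True))
    return log_prior, log_likelihood

X = np.array([[1, 0], [0, 2], [3, 0]])
y = np.array([0, 1, 0])
log_prior, log_likelihood = fit_multinomial_nb(X, y, n_classes=2)
```

Prediction is likewise linear: scoring one sample is a dot product against each class's log-likelihood vector, i.e. $\mathcal{O}(cm)$ per sample.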
The following classifiers are nonlinear in either the training or the testing phase:
* SVM training is polynomial with respect to the number of samples (https://stackoverflow.com/questions/16585465/training-complexity-of-linear-svm).
* Decision trees, and correspondingly Random Forests, are loglinear, $\mathcal{O}(m n \log n)$ (https://stackoverflow.com/questions/34212610/why-is-the-runtime-to-construct-a-decision-tree-mnlogn).
* Linear (and Quadratic) Discriminant Analysis (LDA and QDA) are polynomial, $\mathcal{O}(n m^2)$ (https://stats.stackexchange.com/questions/211177/comparison-of-lda-vs-knn-time-complexity).
* The Restricted Boltzmann Machine (RBM) is polynomial with respect to $m$ (http://scikit-learn.org/stable/modules/generated/sklearn.neural_network.BernoulliRBM.html).
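The loglinear bound for decision trees can be made concrete with a small back-of-the-envelope helper (a counting sketch, not a real tree implementation; the function name is my own). Assuming a balanced tree and an exhaustive split search, each of the $\mathcal{O}(\log n)$ depth levels examines every sample against every feature, costing $\mathcal{O}(mn)$ per level:

```python
import math

def tree_training_cost(n, m):
    """Rough work estimate for building a balanced decision tree.

    At each depth level, every one of the n samples is examined against
    every one of the m features while searching for the best split:
    O(n*m) work per level. A balanced tree has about log2(n) levels,
    giving O(m * n * log n) overall.
    """
    depth = math.ceil(math.log2(n))
    return n * m * depth

# e.g. n=1024 samples, m=10 features -> 1024 * 10 * 10 comparisons
cost = tree_training_cost(1024, 10)
```

This is why each tree in a Random Forest inherits the same $\mathcal{O}(m n \log n)$ bound; training $k$ trees simply multiplies it by $k$.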