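# Setup sketch (assumption) -->
# The gist does not show where X_train, X_test, y_train, y_test come from.
# The lines below are a minimal, hypothetical setup using scikit-learn's built-in
# breast-cancer dataset purely as a stand-in for whatever data was originally used.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)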
# imports needed by this snippet -->
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

# Logistic Regression -->
lgr = make_pipeline(StandardScaler(), LogisticRegression(random_state=0))
lgr.fit(X_train, y_train)

# Support Vector Classifier -->
svc = make_pipeline(StandardScaler(), SVC(random_state=0, gamma='auto'))
svc.fit(X_train, y_train)

# K-Nearest Neighbours -->
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
knn.fit(X_train, y_train)

# Random Forest -->
rft = make_pipeline(StandardScaler(), RandomForestClassifier(max_depth=2, random_state=0))
rft.fit(X_train, y_train)

# accuracy on the held-out test set -->
accuracy_lgr = lgr.score(X_test, y_test)
accuracy_svc = svc.score(X_test, y_test)
accuracy_knn = knn.score(X_test, y_test)
accuracy_rft = rft.score(X_test, y_test)

# print out results -->
print('Logistic Regression: {}'.format(accuracy_lgr))
print('Support Vector Classifier: {}'.format(accuracy_svc))
print('K-Nearest Neighbors: {}'.format(accuracy_knn))
print('Random Forest: {}'.format(accuracy_rft))
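
# Note (assumption): a single train/test split yields one accuracy number per model.
# A sketch of a less split-sensitive comparison -- cross_val_score averages accuracy
# over several folds of the training data; fold count and formatting are choices here.
from sklearn.model_selection import cross_val_score

for name, model in [('Logistic Regression', lgr), ('Support Vector Classifier', svc),
                    ('K-Nearest Neighbors', knn), ('Random Forest', rft)]:
    scores = cross_val_score(model, X_train, y_train, cv=5)
    print('{}: {:.3f} (+/- {:.3f})'.format(name, scores.mean(), scores.std()))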