# In this exercise we'll examine a learner that has high bias and is incapable of
# learning the patterns in the data.
# Use the learning curve function from sklearn.learning_curve to plot learning curves
# of both training and testing error. Use plt.plot() within the plot_curve function
# to create line graphs of the values.
from sklearn.linear_model import LinearRegression
from sklearn.learning_curve import learning_curve
import matplotlib.pyplot as plt
from sklearn.metrics import explained_variance_score, make_scorer
from sklearn.cross_validation import KFold
import numpy as np
size = 1000
cv = KFold(size, shuffle=True)
score = make_scorer(explained_variance_score)
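
# The target below is quadratic in x, so a plain linear model cannot represent it
# and will underfit (high bias) no matter how much data it sees.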
X = np.reshape(np.random.normal(scale=2, size=size), (-1, 1))
y = np.array([[1 - 2 * x[0] + x[0] ** 2] for x in X])
def plot_curve():
    reg = LinearRegression()
    reg.fit(X, y)
    print(reg.score(X, y))

    # Create the learning curve with the cv and score parameters defined above.
    train_sizes, train_scores, test_scores = learning_curve(
        reg, X, y, train_sizes=np.array([0.1, 0.33, 0.55, 0.78, 1.]),
        cv=cv, scoring=score, exploit_incremental_learning=False,
        n_jobs=1, pre_dispatch='all', verbose=0)

    # Plot the mean training and testing scores for each training-set size.
    plt.plot(train_sizes, np.mean(train_scores, axis=1), label='training score')
    plt.plot(train_sizes, np.mean(test_scores, axis=1), label='testing score')
    plt.legend(loc='lower right')

    # Size the window for readability and display the plot.
    plt.ylim(-.1, 1.1)
    plt.show()
plot_curve()
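

# Both curves above plateau well below 1 because the linear model underfits the
# quadratic target. As a check (a hypothetical extension, not part of the original
# exercise), the same learning curve can be drawn for a degree-2 pipeline that can
# represent the target; sklearn.pipeline and sklearn.preprocessing are assumed to
# be available alongside the imports above, and plot_quadratic_curve is a name
# introduced here for illustration.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures


def plot_quadratic_curve():
    # Same curve as above, but with quadratic features so the model can fit x**2.
    quad = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    train_sizes, train_scores, test_scores = learning_curve(
        quad, X, y, train_sizes=np.array([0.1, 0.33, 0.55, 0.78, 1.]),
        cv=cv, scoring=score)
    plt.plot(train_sizes, np.mean(train_scores, axis=1), label='training score (degree 2)')
    plt.plot(train_sizes, np.mean(test_scores, axis=1), label='testing score (degree 2)')
    plt.legend(loc='lower right')
    plt.ylim(-.1, 1.1)
    plt.show()

# plot_quadratic_curve()  # uncomment to compare against the linear fit above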