Example of 3D plots illustrating Linear Regression with 2 features and 1 target
import matplotlib.pyplot as plt
import numpy as np
import sklearn.linear_model
from mpl_toolkits.mplot3d import Axes3D  # registers the '3d' projection

# Generate training and test data: two features sampled uniformly from [0, 60),
# with the target given by y = x1^2 + x2^2
X_train = np.random.rand(2000).reshape(1000, 2)*60
y_train = (X_train[:, 0]**2)+(X_train[:, 1]**2)
X_test = np.random.rand(200).reshape(100, 2)*60
y_test = (X_test[:, 0]**2)+(X_test[:, 1]**2)

# Scatter the training data in 3D: features on the x/y axes, target on z
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(X_train[:, 0], X_train[:, 1], y_train, marker='.', color='red')
ax.set_xlabel("X1")
ax.set_ylabel("X2")
ax.set_zlabel("y")

# Fit a linear regression on the two features and evaluate it on the test set
model = sklearn.linear_model.LinearRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("MAE: {}".format(np.abs(y_test-y_pred).mean()))
print("RMSE: {}".format(np.sqrt(((y_test-y_pred)**2).mean())))

# Evaluate the fitted plane y = intercept + c1*x1 + c2*x2 on a 61x61 grid
# covering the feature range, and overlay it on the scatter plot
coefs = model.coef_
intercept = model.intercept_
xs = np.tile(np.arange(61), (61, 1))
ys = np.tile(np.arange(61), (61, 1)).T
zs = xs*coefs[0]+ys*coefs[1]+intercept
print("Equation: y = {:.2f} + {:.2f}x1 + {:.2f}x2".format(intercept, coefs[0], coefs[1]))
ax.plot_surface(xs, ys, zs, alpha=0.5)
plt.show()
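
As an aside: because the target is exactly quadratic in the features, the fitted plane above can only approximate it. A minimal sketch of a closer fit, assuming the X_train, y_train, X_test and y_test arrays defined above and adding squared and interaction terms with scikit-learn's PolynomialFeatures before the same LinearRegression:

from sklearn.preprocessing import PolynomialFeatures

# Expand [x1, x2] into [x1, x2, x1^2, x1*x2, x2^2] so a linear model can
# represent the generating function y = x1^2 + x2^2 exactly
poly = PolynomialFeatures(degree=2, include_bias=False)
X_train_poly = poly.fit_transform(X_train)
X_test_poly = poly.transform(X_test)

model_poly = sklearn.linear_model.LinearRegression()
model_poly.fit(X_train_poly, y_train)
y_pred_poly = model_poly.predict(X_test_poly)
print("MAE (quadratic features): {}".format(np.abs(y_test-y_pred_poly).mean()))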

aricooperdavis commented Nov 6, 2019

This is a static image taken from an interactive Matplotlib 3D plot illustrating the results of a linear regression trained on data generated using the function $y = x_{1}^{2}+x_{2}^{2}$. The regression has converged to $y = -1249.41 + 61.18x_{1} + 60.69x_{2}$.

[Static image: 3D scatter of the training data with the fitted regression plane]
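
As a quick sanity check on these coefficients (assuming the uniform sampling on $[0, 60)$ used in the snippet above): for $x$ uniform on $[0, 60]$, the least-squares line through $x^{2}$ has slope and intercept

$$\hat{\beta} = \frac{\mathrm{Cov}(x, x^{2})}{\mathrm{Var}(x)} = \frac{60^{3}/12}{60^{2}/12} = 60, \qquad \hat{\alpha} = \mathbb{E}[x^{2}] - 60\,\mathbb{E}[x] = 1200 - 1800 = -600,$$

and since the two features are independent the fits add, so the expected plane is roughly $y = -1200 + 60x_{1} + 60x_{2}$, consistent with the fitted coefficients up to sampling noise.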

And this is a gif of what you can play with if you run the visualisation yourself:

[Animated GIF of the interactive 3D plot]
