Linear-regression-txt
import sys
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
X = 2 * np.random.rand(100,1)
y = 4 + 3 * X + np.random.randn(100, 1)  # zero-mean Gaussian noise, so the true parameters are 4 and 3
plt.scatter(X,y)
# np.c_[np.ones(...), X] prepends a column of 1s (the bias term x0 = 1) to every instance
# np.ones returns an array of ones with the given shape
X_b = np.c_[np.ones((100, 1)), X]
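To see what the augmented matrix looks like (a small check added here, not in the original gist), print the first few rows; each one is now [1., x]:
print(X_b[:3])  # added check: every instance starts with the bias feature x0 = 1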
# .T is shorthand for .transpose(); this computes the Normal Equation:
# theta_best = (X_b^T . X_b)^-1 . X_b^T . y
theta_best = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y)
print(theta_best)
Ideally the answer would be (4, 3), but the noise makes it impossible to recover the exact parameters.
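As a sanity check (an addition, not part of the original gist), the same least-squares solution can be computed with np.linalg.lstsq, which avoids forming the matrix inverse explicitly and is numerically more stable:
# Added sketch: solve the same least-squares problem directly
theta_lstsq, residuals, rank, sv = np.linalg.lstsq(X_b, y, rcond=None)
print("lstsq solution:", theta_lstsq)  # should agree with theta_best to floating-point precision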
Now we can make predictions:
X_new = np.array([[0],[2]])
X_new_b = np.c_[np.ones((2, 1)), X_new]
y_predict = X_new_b.dot(theta_best)
print("Prediction:", y_predict)
plt.plot(X_new, y_predict, "r-")
plt.plot(X, y, "b.")
plt.axis([0, 2, 0, 15])
plt.show()
print(X_new_b)  # the augmented input used for the prediction above
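For comparison (an addition, not in the original gist, and assuming scikit-learn is installed), LinearRegression fits the same model without building X_b by hand and exposes the intercept and slope separately:
# Added sketch, assuming scikit-learn is available; not part of the original gist
from sklearn.linear_model import LinearRegression
lin_reg = LinearRegression()
lin_reg.fit(X, y)  # scikit-learn adds the intercept term itself
print("intercept:", lin_reg.intercept_, "slope:", lin_reg.coef_)
print("sklearn prediction:", lin_reg.predict(X_new))  # should match y_predict
The numbers should agree with the Normal Equation result up to floating-point error.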