@ggcr
Created August 5, 2023 10:40
Univariate Linear Regression using the solution of the Normal Equations for the Least Square problem.
import numpy as np

# Training data
x = np.array([-0.5, -0.43, -0.23, 0.12, 0.40, 0.71, 1]).reshape(-1, 1)
y = np.array([-2, -0.2, 0.1, 0.83, 1.4, 0.98, 2.2]).reshape(-1, 1)

# Add a column of ones to the input data for the intercept term
X = np.concatenate((np.ones_like(x), x), axis=1)

# Normal equations: w = (X^T X)^{-1} X^T y
XtX = X.T @ X
XtX_inv = np.linalg.inv(XtX)
Xty = X.T @ y
w = XtX_inv @ Xty

# Evaluate the fit on the training data
y_pred = X @ w
mse_train = np.mean((y - y_pred) ** 2)
print("Mean Squared Error (MSE) on train:", mse_train)
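As a side note, explicitly inverting `X.T @ X` is fine for a 2×2 system like this one, but solving the linear system directly with `np.linalg.solve` is generally preferred for numerical stability. A minimal sketch on the same training data (variable names `w_solve`/`w_inv` are illustrative):

```python
import numpy as np

x = np.array([-0.5, -0.43, -0.23, 0.12, 0.40, 0.71, 1]).reshape(-1, 1)
y = np.array([-2, -0.2, 0.1, 0.83, 1.4, 0.98, 2.2]).reshape(-1, 1)
X = np.concatenate((np.ones_like(x), x), axis=1)

# Solve the normal equations X^T X w = X^T y directly,
# avoiding the explicit matrix inverse.
w_solve = np.linalg.solve(X.T @ X, X.T @ y)

# Same solution as the inverse-based version, up to floating-point error.
w_inv = np.linalg.inv(X.T @ X) @ (X.T @ y)
print(np.allclose(w_solve, w_inv))  # True
```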
# Test data
x_test = np.array([-1.02, 0.56, 1.2]).reshape(-1, 1)
# Add a column of ones to the test data for the intercept term
X_test = np.concatenate((np.ones_like(x_test), x_test), axis=1)
# Predict with the weights fitted on the training data
y_pred_test = X_test @ w
print("Predictions of y in test: ", y_pred_test[:, 0])
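As a sanity check, the normal-equations solution can be compared against NumPy's built-in SVD-based least-squares solver; both should agree up to floating-point error (the `w_ne`/`w_ls` names are just for this sketch):

```python
import numpy as np

x = np.array([-0.5, -0.43, -0.23, 0.12, 0.40, 0.71, 1]).reshape(-1, 1)
y = np.array([-2, -0.2, 0.1, 0.83, 1.4, 0.98, 2.2]).reshape(-1, 1)
X = np.concatenate((np.ones_like(x), x), axis=1)

w_ne = np.linalg.inv(X.T @ X) @ (X.T @ y)      # normal equations
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)   # SVD-based least squares
print(np.allclose(w_ne, w_ls))  # True
```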