Created June 7, 2021 09:57
Manual gradient descent for given X and y, with starting m and c values, for the linear equation y = mX + c
import numpy as np
import pandas as pd

# Takes in X, y, current m and c (both initialised to 0), num_iterations and learning rate.
# Returns a DataFrame with the updated m, c and the cost at every iteration.
def gradient(X, y, m_current=0, c_current=0, iters=1000, learning_rate=0.01):
    N = float(len(y))
    gd_df = pd.DataFrame(columns=['m_current', 'c_current', 'cost'])
    for i in range(iters):
        # Predictions and mean squared error at the current m and c
        y_pred = (m_current * X) + c_current
        cost = np.sum((y - y_pred) ** 2) / N
        # Partial derivatives of the cost with respect to m and c
        m_gradient = -(2 / N) * np.sum(X * (y - y_pred))
        c_gradient = -(2 / N) * np.sum(y - y_pred)
        # Step both parameters against their gradients
        m_current = m_current - (learning_rate * m_gradient)
        c_current = c_current - (learning_rate * c_gradient)
        gd_df.loc[i] = [m_current, c_current, cost]
    return gd_df
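A minimal usage sketch, assuming X and y are NumPy arrays; the slope 3 and intercept 5 used to generate the synthetic data are illustrative values, not from the original gist:

# Synthetic data from y = 3x + 5 with a little Gaussian noise (illustrative only)
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 100)
y = 3 * X + 5 + rng.normal(0, 1, size=X.shape)

history = gradient(X, y, iters=1000, learning_rate=0.01)
print(history.tail(1))  # final m_current, c_current and cost

The returned DataFrame makes it easy to inspect or plot how the cost falls as m and c converge toward the generating values.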