@CasperCL
Last active August 11, 2019 19:26
Gradient Descent (Linear regression)
X_Y = [(2, 4), (3, 6), (4, 9), (5, 10), (6, 12), (7, 15), (8, 17), (9, 20)]
w = [1.0, 2.0, 3.0, 4.0]  # weights for the x**3, x**2, x, and constant terms
a = 1e-7                  # learning rate; the x**3 feature makes gradients large, so it must be tiny
iterations = 15000

h = lambda x: w[0] * x**3 + w[1] * x**2 + w[2] * x + w[3]  # cubic hypothesis
l = lambda x, y: (h(x) - y)**2                             # squared-error loss for one sample
# Partial derivative of l with respect to the weight whose basis term is g:
# d/dw (h(x) - y)**2 = 2 * (h(x) - y) * g, where g is x**3, x**2, x, or 1.
dl = lambda x, y, g: 2.0 * (h(x) - y) * g

def gradient_descent():
    for _ in range(iterations):
        # TODO: Use dot product to improve performance.
        w[0] -= a / 2 * sum(dl(x, y, x**3) for x, y in X_Y)
        w[1] -= a / 2 * sum(dl(x, y, x**2) for x, y in X_Y)
        w[2] -= a / 2 * sum(dl(x, y, x) for x, y in X_Y)
        w[3] -= a / 2 * sum(dl(x, y, 1) for x, y in X_Y)

gradient_descent()
print("weights", [round(w_, 2) for w_ in w])
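The TODO about using a dot product can be sketched with NumPy: stack the basis terms into a design matrix, and each update computes all four partial derivatives in one matrix-vector product. This is a minimal sketch, not part of the gist; NumPy and the helper name `descend` are my additions.

```python
import numpy as np

X_Y = [(2, 4), (3, 6), (4, 9), (5, 10), (6, 12), (7, 15), (8, 17), (9, 20)]
x = np.array([p[0] for p in X_Y], dtype=float)
y = np.array([p[1] for p in X_Y], dtype=float)

# Design matrix: one column per basis function (x**3, x**2, x, 1).
F = np.stack([x**3, x**2, x, np.ones_like(x)], axis=1)

def descend(w, a=1e-7, iterations=15000):
    for _ in range(iterations):
        residual = F @ w - y          # h(x_i) - y_i for every sample at once
        w = w - a * (F.T @ residual)  # all four partials in one dot product
    return w

w = descend(np.array([1.0, 2.0, 3.0, 4.0]))
```

The update `a * (F.T @ residual)` is the same quantity the loop above accumulates term by term, since `a / 2 * sum(2 * (h(x) - y) * g)` simplifies to `a * sum((h(x) - y) * g)`.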