@obengwilliam
Last active May 6, 2021 20:09
mbg.py
# Minibatch gradient descent for a logistic-regression model. This is a method
# body: the enclosing class is assumed to define x, y, theta, gradient,
# conditional_prob, and the hyperparameter constants, with random and
# numpy (as np) imported at module level.
self.init_plot(self.FEATURES)
has_converged = False
iteration = 1
while not has_converged:
    print(f"iter {iteration}")
    # Sample a minibatch of datapoint indices without replacement.
    minibatch = random.sample(range(self.DATAPOINTS), self.MINIBATCH_SIZE)
    for k in range(self.FEATURES):
        total = 0
        for i in minibatch:
            # Per-datapoint gradient of the cross-entropy loss:
            # x_ik * (P(y = 1 | x_i) - y_i).
            total += self.x[i][k] * (self.conditional_prob(1, i) - self.y[i])
        # Average over the minibatch, not over all datapoints.
        self.gradient[k] = total / self.MINIBATCH_SIZE
    self.update_plot(np.sum(np.square(self.gradient)))
    # Gradient-descent step on every weight.
    for k in range(self.FEATURES):
        self.theta[k] -= self.LEARNING_RATE * self.gradient[k]
    # Converged once every gradient component is within the margin.
    has_converged = all(abs(g) < self.CONVERGENCE_MARGIN for g in self.gradient)
    iteration += 1
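
For comparison, here is a minimal self-contained sketch of the same minibatch gradient-descent loop, assuming conditional_prob(1, i) is the logistic sigmoid of theta · x_i. The function names, hyperparameter defaults, and synthetic data below are illustrative stand-ins, not part of the gist's class:

import random
import numpy as np


def sigmoid(z):
    # Logistic function; plays the role of conditional_prob(1, i) above.
    return 1.0 / (1.0 + np.exp(-z))


def minibatch_gradient_descent(x, y, minibatch_size=16, learning_rate=0.1,
                               convergence_margin=1e-4, max_iter=10_000):
    # x: (n, d) feature matrix, y: (n,) array of 0/1 labels.
    n, d = x.shape
    theta = np.zeros(d)
    for iteration in range(1, max_iter + 1):
        batch = random.sample(range(n), minibatch_size)
        # Minibatch estimate of the cross-entropy gradient:
        # (1 / |B|) * sum_{i in B} x_i * (sigmoid(theta . x_i) - y_i)
        errors = sigmoid(x[batch] @ theta) - y[batch]
        gradient = x[batch].T @ errors / minibatch_size
        theta -= learning_rate * gradient
        if np.all(np.abs(gradient) < convergence_margin):
            print(f"converged after {iteration} iterations")
            break
    return theta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(1000, 3))
    true_theta = np.array([1.5, -2.0, 0.5])
    y = (rng.uniform(size=1000) < sigmoid(x @ true_theta)).astype(float)
    print(minibatch_gradient_descent(x, y))

Averaging the gradient over the minibatch keeps its scale independent of the batch size, so the same learning rate and convergence margin behave comparably whether the batch holds 16 points or the full dataset.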