@phil8192
Created April 3, 2020 21:03
import numpy as np


class B:
    def __init__(self, x, pub_key=None):
        self.x = x                  # Host B's feature matrix X.
        self.features = x.shape[1]
        self.pub_key = pub_key

    # Called by Host (A) with the current model Theta and A's
    # (encrypted) part of the gradient calculation.
    def gradients(self, theta, u):
        # B's slice of Theta (the last `features` weights).
        b_theta = theta[-self.features:]
        # B's part of the gradient (a 1d vector with one entry per sample,
        # i.e. the same length as A's y).
        v = 1/4 * np.dot(self.x, b_theta)
        # Add A's u to B's v.
        # Since u is encrypted, the resulting w is also encrypted.
        w = u + v
        # B's gradient (encrypted, since it is a dot-product with w).
        gradient_b = np.dot(w, self.x)
        # Send the (encrypted) w and B's gradient back to A.
        return w, gradient_b
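A minimal sketch of one gradient round. Here Host A's partial term `u` is passed as a plaintext stand-in; in the real protocol `u` would be encrypted under A's additively homomorphic public key (e.g. Paillier), so `w` and `gradient_b` would stay encrypted. The class body repeats the snippet above; the sizes and data are toy assumptions.

```python
import numpy as np


class B:
    def __init__(self, x, pub_key=None):
        self.x = x
        self.features = x.shape[1]
        self.pub_key = pub_key

    def gradients(self, theta, u):
        b_theta = theta[-self.features:]
        v = 1/4 * np.dot(self.x, b_theta)
        w = u + v
        gradient_b = np.dot(w, self.x)
        return w, gradient_b


rng = np.random.default_rng(0)
n, a_features, b_features = 8, 3, 2        # toy sizes (assumptions)
x_b = rng.normal(size=(n, b_features))     # Host B's vertical feature slice
theta = np.zeros(a_features + b_features)  # joint model; A's weights first
u = rng.normal(size=n)                     # plaintext stand-in for A's term

w, grad_b = B(x_b).gradients(theta, u)
print(w.shape, grad_b.shape)               # (8,) (2,)
```

With `theta` initialised to zeros, B's contribution `v` is zero, so `w` equals `u` on the first round; the shapes show that `w` has one entry per sample while `grad_b` has one entry per B-side feature.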