import time

import numpy as np
import ray

@ray.remote
def worker(ps):
    for _ in range(100):  # Arbitrary number (100) of updates
        # First, get the latest parameters. The following
        # method call is non-blocking; it returns a future
        # effectively immediately.
        params_id = ps.get_params.remote()
        # As before, this is a blocking call that waits for
        # the task to finish and then gets the value.
        params = ray.get(params_id)
        # Compute a gradient update. We make a fake update,
        # but in practice this would use an ML library like
        # TensorFlow and would also take in a batch of data.
        # We'll simulate an expensive calculation by adding
        # a sleep call.
        grad = np.ones(10)
        time.sleep(0.2)
        # Update all the parameters with the same gradient.
        ps.update_params.remote(grad)
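
# The worker above assumes a parameter-server actor handle `ps` that
# exposes `get_params` and `update_params` as remote methods. The gist
# itself does not include that actor, so the following is a minimal
# sketch of what it could look like. The class name `ParameterServer`,
# the `dim` argument, and the zero-initialized parameter vector are
# assumptions for illustration, not the gist's own definition.

@ray.remote
class ParameterServer:
    def __init__(self, dim):
        # Hold the model parameters as a plain NumPy vector.
        self.params = np.zeros(dim)

    def get_params(self):
        # Return a snapshot of the current parameters.
        return self.params

    def update_params(self, grad):
        # Apply a gradient update in place. Ray runs an actor's
        # methods one at a time, so concurrent workers cannot
        # corrupt this state.
        self.params += grad

# A possible driver (also an assumption): start Ray, create one
# parameter server, run a couple of workers against it concurrently,
# then read back the accumulated parameters.
if __name__ == "__main__":
    ray.init()
    ps = ParameterServer.remote(10)
    # Each worker.remote(...) call returns a future; ray.get blocks
    # until both workers have finished their 100 updates.
    ray.get([worker.remote(ps) for _ in range(2)])
    print(ray.get(ps.get_params.remote()))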