@erogol
Created February 2, 2014 22:59
from multiprocessing import Pool

# function to parallelize
def product(a, b):
    print(a * b)

# auxiliary function to unpack the argument tuple, since Pool.map
# passes a single argument to the mapped function
def product_helper(args):
    return product(*args)

def parallel_product(list_a, list_b):
    # spawn the given number of worker processes
    p = Pool(5)
    # pair each matching item into a tuple
    job_args = [(item_a, list_b[i]) for i, item_a in enumerate(list_a)]
    # map the helper over the pool, then shut the pool down cleanly
    p.map(product_helper, job_args)
    p.close()
    p.join()

# guard so worker processes don't re-execute this block on import
if __name__ == '__main__':
    exp_a = range(1000)
    exp_b = range(1000)
    parallel_product(exp_a, exp_b)
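On Python 3.3+, `Pool.starmap` unpacks each argument tuple into positional arguments, so the `product_helper` wrapper is unnecessary. A minimal sketch of the same pairwise multiplication, returning results instead of printing them:

from multiprocessing import Pool

def product(a, b):
    return a * b

if __name__ == '__main__':
    with Pool(5) as p:
        # starmap unpacks (a, b) tuples into product(a, b)
        results = p.starmap(product, zip(range(1000), range(1000)))
    print(results[:5])  # -> [0, 1, 4, 9, 16]

Using the pool as a context manager also handles `close()` automatically when the block exits.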