@santiagobasulto
Last active July 16, 2018 07:44
Despite multiple efforts, I couldn't figure out how to evaluate generators in parallel. Ideas are welcome.
import threading

import requests


def get_repo_stars(org, repo):
    url = 'https://api.github.com/repos/{org}/{repo}'.format(
        org=org, repo=repo)
    print("GET ", url)
    resp = requests.get(url)
    return resp.json()['stargazers_count']


params = [
    ('requests', 'requests'),
    ('requests', 'httpbin'),
    ('django', 'django'),
    ('Lukasa', 'hyper'),
]

# Each call to next() on this generator performs one HTTP request.
generator = (get_repo_stars(org, repo) for org, repo in params)


def parallel(generator):
    def _next():
        try:
            val = next(generator)
            print(val)
        except StopIteration:
            pass

    # Two threads, each pulling a single value from the shared generator.
    threads = [threading.Thread(target=_next) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()


parallel(generator)
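
For comparison, here is a minimal sketch (the names parallel_drain and _worker are mine, not from the gist) that drains the whole generator from two threads without tripping over generator thread-safety. Note that because the generator expression performs the HTTP request inside next(), the lock also serializes the requests, so the items are still evaluated one at a time.

import threading


def parallel_drain(generator, num_threads=2):
    lock = threading.Lock()

    def _worker():
        while True:
            # Generators are not thread-safe: concurrent next() calls can
            # raise "ValueError: generator already executing", so take a
            # lock before advancing the shared generator.
            with lock:
                try:
                    val = next(generator)
                except StopIteration:
                    return
            print(val)

    threads = [threading.Thread(target=_worker) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()


parallel_drain(generator)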
@knowsuchagency

knowsuchagency commented Jul 15, 2018

from concurrent import futures

import requests


def get_repo_stars(org, repo):
    url = 'https://api.github.com/repos/{org}/{repo}'.format(
        org=org, repo=repo)
    resp = requests.get(url)
    return (repo, resp.json()['stargazers_count'])


params = [
    ('requests', 'requests'),
    ('requests', 'httpbin'),
    ('django', 'django'),
    ('Lukasa', 'hyper'),
]

with futures.ThreadPoolExecutor() as ex:
    results = ex.map(lambda p: get_repo_stars(*p), params)
    for repo, stars in results:
        print(f'{repo} has {stars} stars')

@santiagobasulto
Author

Hey, thanks! Yes, I know about the map methods of concurrent.futures and multiprocessing.Pool. I was just trying to see whether it's possible to evaluate generators themselves in parallel.

@agoose77

Generators are always synchronous, but you can make the request async (and therefore do it concurrently).
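
A minimal sketch of that idea, assuming asyncio.to_thread (Python 3.9+) to push each blocking requests call onto a worker thread; the async generator star_counts and the other names are mine, not from the comment. On older Pythons the same thing can be done with loop.run_in_executor or an async HTTP client.

import asyncio

import requests


def get_repo_stars(org, repo):
    url = 'https://api.github.com/repos/{org}/{repo}'.format(org=org, repo=repo)
    resp = requests.get(url)
    return repo, resp.json()['stargazers_count']


async def star_counts(params):
    # An async generator: kick off every request at once, then yield
    # (repo, stars) pairs in whatever order the requests finish.
    tasks = [
        asyncio.create_task(asyncio.to_thread(get_repo_stars, org, repo))
        for org, repo in params
    ]
    for fut in asyncio.as_completed(tasks):
        yield await fut


async def main():
    params = [
        ('requests', 'requests'),
        ('requests', 'httpbin'),
        ('django', 'django'),
        ('Lukasa', 'hyper'),
    ]
    async for repo, stars in star_counts(params):
        print(f'{repo} has {stars} stars')


asyncio.run(main())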
