

@spencerkittleson
Created July 12, 2023 14:27
Parallel Executions (taken from somewhere else): fetch a batch of URLs concurrently with asyncio and aiohttp.
import asyncio
import time

import aiohttp


async def get(url, session):
    # Fetch a single URL and report how many bytes came back.
    try:
        async with session.get(url=url) as response:
            resp = await response.read()
            print("Successfully got url {} with resp of length {}.".format(url, len(resp)))
    except Exception as e:
        print("Unable to get url {} due to {}.".format(url, e.__class__))


async def main(urls):
    # One shared session for all requests; gather schedules them concurrently.
    async with aiohttp.ClientSession() as session:
        ret = await asyncio.gather(*[get(url, session) for url in urls])
    print("Finalized all. Return is a list of len {} outputs.".format(len(ret)))


urls = ["REPLACE"] * 50  # substitute the real URLs to fetch
start = time.time()
asyncio.run(main(urls))
end = time.time()
print("Took {} seconds to pull {} websites.".format(end - start, len(urls)))