@jonluca
Last active July 17, 2024 10:58
Fast asyncio HTTP requests
import asyncio
import json

import aiohttp

PARALLEL_REQUESTS = 100
results = []
urls = ['https://jsonplaceholder.typicode.com/todos/1' for i in range(10)]  # list of URLs to fetch

async def gather_with_concurrency(n):
    semaphore = asyncio.Semaphore(n)
    # Connection pool: no global connection limit, up to 100 connections
    # per host, and a 5-minute DNS cache
    conn = aiohttp.TCPConnector(limit_per_host=100, limit=0, ttl_dns_cache=300)

    # The session (and its connector) is closed automatically on exit
    async with aiohttp.ClientSession(connector=conn) as session:
        # Each request holds the semaphore, so at most n run concurrently
        async def get(url):
            async with semaphore:
                async with session.get(url, ssl=False) as response:
                    results.append(json.loads(await response.read()))

        await asyncio.gather(*(get(url) for url in urls))

asyncio.run(gather_with_concurrency(PARALLEL_REQUESTS))
print(f"Completed {len(urls)} requests with {len(results)} results")
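The core trick above — wrapping each task in a semaphore so at most n run at once — can be shown without any network access. This is a minimal sketch of the same pattern; the names (gather_with_concurrency, fake_get) are illustrative, and asyncio.sleep stands in for the HTTP request:

```python
import asyncio
import time

async def gather_with_concurrency(n, *coros):
    semaphore = asyncio.Semaphore(n)

    # Wrap each coroutine so it must acquire the semaphore first;
    # at most n wrapped coroutines run concurrently
    async def sem_coro(coro):
        async with semaphore:
            return await coro

    return await asyncio.gather(*(sem_coro(c) for c in coros))

async def fake_get(i):
    await asyncio.sleep(0.05)  # simulate a 50 ms request
    return i

async def main():
    start = time.perf_counter()
    # 20 tasks with at most 5 in flight: roughly 4 waves of 50 ms,
    # instead of 20 * 50 ms sequentially
    results = await gather_with_concurrency(5, *(fake_get(i) for i in range(20)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(len(results), round(elapsed, 2))
```

asyncio.gather preserves the order of its arguments, so the results come back in submission order even though the tasks finish in waves.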
@007-JB
007-JB commented Aug 21, 2022

Thanks - I think I did something funny with the threading/futures module to let me increase the max threads, because I'm running the exact same code on another machine and it works beautifully. I'll get there - thanks so much for the swift response. Your code is awesome! Thumbs up - take care, J

@eerbing
eerbing commented Jul 17, 2024

Why does this run slower than solutions 2 & 3 for me? This takes over 1 second, while solutions 2 & 3 take 0.8 s. It looks like async is always slower than multi-threading.
