
@daolf
Created August 20, 2020 17:14
import requests
from multiprocessing.dummy import Pool as ThreadPool


def request_scrapingbee(url):
    # Fetch the target URL through the ScrapingBee API.
    r = requests.get(
        url="https://app.scrapingbee.com/api/v1/",
        params={
            "api_key": "<YOUR_API_KEY>",
            "url": url,
        },
    )
    response = {
        "statusCode": r.status_code,
        "body": r.text,
        "url": url,
    }
    return response


# Number of requests to run in parallel.
concurrency = 2
pool = ThreadPool(concurrency)

urls = ["<URL_1>", "<URL_2>"]

# map() blocks until every URL has been fetched and returns
# the results in the same order as the input list.
results = pool.map(request_scrapingbee, urls)
pool.close()
pool.join()

for result in results:
    print(result)
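For comparison, the same fan-out pattern can be written with the standard-library concurrent.futures.ThreadPoolExecutor, which handles close/join automatically via a context manager. This is a minimal sketch, not part of the original gist: the fetch function below is a hypothetical stub standing in for request_scrapingbee so the example runs without network access or an API key.

```python
from concurrent.futures import ThreadPoolExecutor


def fetch(url):
    # Hypothetical stub standing in for request_scrapingbee;
    # a real version would issue the HTTP request shown above.
    return {"statusCode": 200, "body": "", "url": url}


urls = ["https://example.com/a", "https://example.com/b"]

# executor.map() preserves input order, mirroring pool.map above;
# leaving the with-block implicitly waits for all workers to finish.
with ThreadPoolExecutor(max_workers=2) as executor:
    results = list(executor.map(fetch, urls))

for result in results:
    print(result["url"], result["statusCode"])
```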