
@sergray
Last active December 6, 2022 22:32
Asynchronous requests in Flask with gevent
"""Asynchronous requests in Flask with gevent"""
from time import time
from flask import Flask, Response
from gevent.pywsgi import WSGIServer
from gevent import monkey
import requests
# need to patch sockets to make requests async
monkey.patch_all()
CHUNK_SIZE = 1024*1024 # bytes
app = Flask(__name__) # pylint: disable=invalid-name
app.debug = True
@app.route('/Seattle.jpg')
def seattle(requests_counter=[0]): # pylint: disable=dangerous-default-value
"""Asynchronous non-blocking streaming of relatively large (14.5MB) JPG
of Seattle from wikimedia commons.
"""
requests_counter[0] += 1
request_num = requests_counter[0]
url = 'http://upload.wikimedia.org/wikipedia/commons/3/39/Seattle_3.jpg'
app.logger.debug('started %d', request_num)
rsp = requests.get(url, stream=True)
def generator():
"streaming generator logging the end of request processing"
yield '' # to make greenlet switch
for data in rsp.iter_content(CHUNK_SIZE):
yield data
app.logger.debug('finished %d', request_num)
return Response(generator(), mimetype='image/jpeg')
def main():
"Start gevent WSGI server"
# use gevent WSGI server instead of the Flask
http = WSGIServer(('', 5000), app.wsgi_app)
# TODO gracefully handle shutdown
http.serve_forever()
if __name__ == '__main__':
main()
@smandyscom

Thanks for sharing, but I found this does not work asynchronously: one request starts to be served only after the previous one is done.

@wiwengweng

I found this only works for streaming reads. If you want real async, a queue and multiprocessing would be a good start.
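The queue/multiprocessing route suggested above could be sketched like this (a standalone sketch, not part of the gist; `fetch` is a hypothetical stand-in for a blocking ranged download, and a real worker would pull URLs or offsets from a queue instead of a fixed range):

from multiprocessing import Pool

CHUNK_SIZE = 1024 * 1024  # bytes, matching the gist

def fetch(offset):
    # stand-in for a blocking ranged download; returns the byte range
    # it would cover instead of actually hitting the network
    return (offset, offset + CHUNK_SIZE)

if __name__ == '__main__':
    # a pool of worker processes serves blocking jobs truly in parallel,
    # at the cost of per-process memory and serialization overhead
    with Pool(processes=4) as pool:
        ranges = pool.map(fetch, range(0, 4 * CHUNK_SIZE, CHUNK_SIZE))
    print(len(ranges))  # 4

Unlike greenlets, processes sidestep the GIL and any non-cooperative blocking call, which is why they help when a library refuses to yield.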

@sergray
Author

sergray commented Feb 7, 2018

Yes, there is actually a problem with the requests library working in blocking mode, so while Flask accepts several HTTP requests concurrently, responses are streamed with a concurrency of 1.
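A quick way to check whether monkey-patching actually yields concurrency greater than 1 is to time a few greenlets that block cooperatively (a standalone sketch, not part of the gist; `fake_download` simulates a blocking fetch with `time.sleep`, which `monkey.patch_all()` replaces with the cooperative `gevent.sleep`):

from gevent import monkey
monkey.patch_all()

import time
import gevent

def fake_download(i):
    # time.sleep is patched to gevent.sleep, so it yields to other
    # greenlets, just as a patched socket read inside requests would
    time.sleep(0.2)
    return i

start = time.time()
jobs = [gevent.spawn(fake_download, i) for i in range(5)]
gevent.joinall(jobs)
elapsed = time.time() - start

results = [job.value for job in jobs]
# five 0.2 s "downloads" overlap: total wall time stays near 0.2 s,
# not 1.0 s, when the greenlets really run concurrently

If a library does its blocking inside C code or was imported before the patch, the greenlets serialize and `elapsed` creeps toward the 1.0 s sequential total, which matches the concurrency-1 behavior reported above.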
