Asynchronous requests in Flask with gevent
"""Asynchronous requests in Flask with gevent"""
from time import time
from flask import Flask, Response
from gevent.pywsgi import WSGIServer
from gevent import monkey
import requests
# need to patch sockets to make requests async
monkey.patch_all()
CHUNK_SIZE = 1024*1024 # bytes
app = Flask(__name__) # pylint: disable=invalid-name
app.debug = True
@app.route('/Seattle.jpg')
def seattle(requests_counter=[0]): # pylint: disable=dangerous-default-value
"""Asynchronous non-blocking streaming of relatively large (14.5MB) JPG
of Seattle from wikimedia commons.
"""
requests_counter[0] += 1
request_num = requests_counter[0]
url = 'http://upload.wikimedia.org/wikipedia/commons/3/39/Seattle_3.jpg'
app.logger.debug('started %d', request_num)
rsp = requests.get(url, stream=True)
def generator():
"streaming generator logging the end of request processing"
yield '' # to make greenlet switch
for data in rsp.iter_content(CHUNK_SIZE):
yield data
app.logger.debug('finished %d', request_num)
return Response(generator(), mimetype='image/jpeg')
def main():
"Start gevent WSGI server"
# use gevent WSGI server instead of the Flask
http = WSGIServer(('', 5000), app.wsgi_app)
# TODO gracefully handle shutdown
http.serve_forever()
if __name__ == '__main__':
main()
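
To see whether the responses are actually streamed concurrently, a client can fire overlapping requests and compare their finish times. Below is a minimal test sketch, not part of the gist; it assumes the server above is already running on localhost:5000, and the script name and request count are illustrative.

"""Fire overlapping requests at the gist's server to check concurrency."""
from gevent import monkey
monkey.patch_all()  # patch sockets so the client requests run concurrently

import time
import gevent
import requests

URL = 'http://localhost:5000/Seattle.jpg'  # endpoint defined in the gist


def fetch(num):
    "Download the image and report how long the request took."
    started = time.time()
    rsp = requests.get(URL, stream=True)
    total = sum(len(chunk) for chunk in rsp.iter_content(1024 * 1024))
    print('request %d: %d bytes in %.1fs' % (num, total, time.time() - started))


if __name__ == '__main__':
    # spawn three greenlets; if the server streams concurrently,
    # their finish times should be close to each other
    jobs = [gevent.spawn(fetch, num) for num in range(3)]
    gevent.joinall(jobs)
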
wiwengweng Oct 20, 2016

Thanks for your demo. BTW, can I achieve the same non-blocking effect for long, time-consuming work? For example, my API /longtime would run 10 of the same background jobs in multi-threaded mode, and each job is time-consuming but returns nothing.

smandyscom Oct 30, 2016

Thanks for sharing, but I found that it does not work asynchronously: one request only starts being served after the previous one is done.

wiwengweng Nov 1, 2016

I found this only works for stream reading. If you want async, a queue and multiprocessing would be a good start.
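
The thread does not include code for this, but one way to read "queue and multiprocessing" is to hand the slow jobs to a pool of worker processes and return immediately. The following is a standalone sketch of that idea; the /longtime route, the slow_job function, the pool size and the port are illustrative assumptions, not part of the gist.

"""Sketch: offload long-running background jobs to a process pool."""
import multiprocessing
import time

from flask import Flask

app = Flask(__name__)
pool = None  # created under the __main__ guard so worker processes do not re-create it


def slow_job(job_num):
    "Stand-in for a time-consuming background job that returns nothing."
    time.sleep(10)
    print('job %d done' % job_num)


@app.route('/longtime')
def longtime():
    "Submit 10 background jobs to the pool and return without waiting."
    for job_num in range(10):
        pool.apply_async(slow_job, (job_num,))
    return 'submitted 10 jobs\n'


if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=4)
    app.run(port=5001)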

sergray (Owner) Feb 7, 2018

Yes, there is actually a problem with the requests library working in blocking mode, so while Flask accepts several HTTP requests, the responses are streamed with concurrency 1.
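
One thing worth trying here: gevent's documentation recommends applying the monkey patch as early as possible, before other modules are imported, whereas the gist imports flask and requests first and calls monkey.patch_all() afterwards. A sketch of the reordered module header follows; the thread does not confirm whether this resolves the concurrency-1 behaviour observed above.

"""Same gist header, with monkey patching moved ahead of the other imports."""
from gevent import monkey
monkey.patch_all()  # patch sockets before flask and requests are imported

from gevent.pywsgi import WSGIServer
from flask import Flask, Response
import requests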
