@omedhabib
Last active October 26, 2018 09:45
bjoern.wsgi
import bjoern
from app import application

# Serve the WSGI application on all interfaces; reuse_port lets several
# bjoern processes bind the same port so the kernel can spread incoming
# connections across them.
bjoern.run(
    wsgi_app=application,
    host='0.0.0.0',
    port=9808,
    reuse_port=True
)

@allComputableThings commented Mar 15, 2018

Hi Omed, does this really support 10,000 concurrent requests? I tried it: if I call an endpoint /sleep/1 that sleeps for one second, I get exactly one request per second. Only one request is processed at a time, so if your requests are lengthy or I/O bound, performance might not look so hot.
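
The kind of test app in question would look roughly like the sketch below; the Flask framework and the /sleep route are assumptions, since the gist only shows "from app import application".

import time

from flask import Flask

# Hypothetical app.py: a Flask app exposing /sleep/<n>. The framework and the
# route are illustrative, not taken from the gist.
application = Flask(__name__)

@application.route('/sleep/<int:seconds>')
def sleep_endpoint(seconds):
    # time.sleep blocks bjoern's single thread, so every other request queues
    # behind this one until it returns.
    time.sleep(seconds)
    return 'slept for %d second(s)\n' % seconds

With the single-process setup above, ten concurrent calls to /sleep/1 finish one per second, which matches the behaviour described here.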

@vzubin commented Jul 10, 2018

Hi @stuz5000, I had the same issue: a lengthy request made all the other requests wait. Is there a workaround for this?

I tried starting multiple bjoern processes, but some requests still get stalled.
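
The usual multi-process pattern is to fork several workers that each call bjoern.run with reuse_port=True on the same port, so the kernel spreads incoming connections across them. A minimal sketch, assuming the same app module as the gist and an illustrative worker count:

import multiprocessing

import bjoern
from app import application

NUM_WORKERS = 4  # illustrative; usually tuned to the number of CPU cores

def serve():
    # Every worker binds the same port; SO_REUSEPORT lets the kernel
    # distribute incoming connections between them.
    bjoern.run(
        wsgi_app=application,
        host='0.0.0.0',
        port=9808,
        reuse_port=True
    )

if __name__ == '__main__':
    workers = [multiprocessing.Process(target=serve) for _ in range(NUM_WORKERS)]
    for worker in workers:
        worker.start()
    for worker in workers:
        worker.join()

Even then, each worker is single-threaded, so a slow or blocking request still stalls whatever other requests happen to be routed to that worker, which would explain requests stalling despite running several processes.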

Thanks

@dimperati commented Oct 26, 2018

One can put the web server behind a reverse proxy, e.g. nginx, to buffer slow connections.
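
A minimal sketch of such a proxy configuration, assuming nginx sits in front of the bjoern port from the gist (addresses and ports are illustrative):

server {
    listen 80;

    location / {
        # Forward requests to the bjoern worker(s).
        proxy_pass http://127.0.0.1:9808;
        # Buffer the response so a slow client does not hold a bjoern worker
        # for the whole transfer.
        proxy_buffering on;
        # Likewise buffer the request body before handing it upstream.
        proxy_request_buffering on;
    }
}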
