WUSTL Server Load Testing 1/8/2015

1st Test – 40 users/1-4 sec/5min

40 concurrent users requesting every 1–4 seconds for 5 minutes; avg was 1,438 requests in 5 min (~17,256/hr, 287.6/min, or ~4.8/sec)

  • First (uncached) load is very slow: ~7–9s
  • Once the cache is warm, loads are very fast: ~11ms
  • Cache breaks happen on every publish.
  • PHP appears to be the bottleneck.
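The rate figures above follow directly from the raw count; a minimal sketch of that arithmetic (the 1,438-requests-in-5-minutes figure is from the test above):

```python
def rates(requests: int, window_minutes: float) -> dict:
    """Convert a request count over a time window into common rate units."""
    per_minute = requests / window_minutes
    return {
        "per_second": per_minute / 60,
        "per_minute": per_minute,
        "per_hour": per_minute * 60,
    }

# 1st test: 1,438 requests in 5 minutes
# -> ~4.8/sec, 287.6/min, ~17,256/hr
print(rates(1438, 5))
```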

2nd Test – 10 users/1-4 sec

10 concurrent users

  • 1–2s loads with no cache
  • The same cache-breaking issues apply

Commencement loads

~2,500 requests in the worst hour (~41.7 requests/minute)

WUSTL.edu

Top day (8/24): up to 4,200 requests/hour (70/minute)

Expected averages

~9,000 requests/day (non-weekends); 1.4 pages/session; 0:59 avg. session duration

Plan for ~600 requests/hour (10/minute)
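As a sanity check on the planning number, a quick sketch comparing the expected daily volume against the observed peak (the 9,000/day and 70/minute figures are from above; spreading the daily total flat over 24 hours is an assumption):

```python
DAILY_REQUESTS = 9_000       # expected non-weekend daily volume (from above)
OBSERVED_PEAK_PER_MIN = 70   # top hour on 8/24 (from above)

avg_per_hour = DAILY_REQUESTS / 24           # 375.0
avg_per_min = avg_per_hour / 60              # 6.25
peak_ratio = OBSERVED_PEAK_PER_MIN / avg_per_min  # ~11x the flat average
```

So the ~600/hour planning figure is roughly 1.6× the flat daily average, while the observed peak minute rate runs well above that; bursts, not averages, are the thing to provision against.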

Notes on our caching setup

Varnish

  • We’re using ban instead of purge. At first this seemed problematic, since we’d be racking up lots of bans where a full purge would not. After some more reading, though, bans are pretty powerful: they can conditionally invalidate content based on rules, such as matching certain HTTP headers. One idea to look into: list every post in a landing page’s main query in an X-Relies-On header, then ban any page whose header contains a banned post URL, so landing pages affected by a post edit are invalidated automatically. Much to think about here.
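  The X-Relies-On idea above can be sketched as a toy in-memory model (the header name and URLs are hypothetical; in real Varnish this would be a `ban` expression matching a response header, not Python):

```python
# Toy model of dependency-based invalidation: each cached page records the
# URLs it "relies on" (e.g. the posts rendered in its main query). Banning
# a post URL also invalidates every landing page that listed it.

class DependencyCache:
    def __init__(self):
        self.pages = {}  # url -> (content, set of dependency urls)

    def store(self, url, content, relies_on=()):
        self.pages[url] = (content, set(relies_on))

    def ban(self, banned_url):
        """Drop the banned URL and any page whose relies-on set contains it."""
        stale = [u for u, (_, deps) in self.pages.items()
                 if u == banned_url or banned_url in deps]
        for u in stale:
            del self.pages[u]
        return stale

cache = DependencyCache()
cache.store("/news/post-123", "<article>post body</article>")
cache.store("/", "<landing>front page</landing>", relies_on={"/news/post-123"})
cache.ban("/news/post-123")  # invalidates both the post and the landing page
```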

  • Static resources aren’t being cached very long (e.g. 6h vs. 10m)
