Latency Headers PoC
client.js

var unirest = require('unirest')

// This is the starting timestamp of when the request was sent
var requestSent = new Date().getTime()

unirest
  .get('http://localhost:1337/')
  .end(function (res) {
    if (res.status === 200) {
      // Save the timestamp of when the response was received
      var responseReceived = new Date().getTime()

      // Save the headers set by the server
      var requestReceived = res.headers['x-request-received']
      var responseSent = res.headers['x-response-sent']

      // The math to determine the latencies
      var outgoingLatency = requestReceived - requestSent
      var processingLatency = responseSent - requestReceived
      var incomingLatency = responseReceived - responseSent
      var roundtripLatency = outgoingLatency + processingLatency + incomingLatency

      // Print out the latencies in a human readable format
      console.log("Total outgoing network latency: " + outgoingLatency + "ms")
      console.log("Total processing time latency: " + processingLatency + "ms")
      console.log("Total incoming network latency: " + incomingLatency + "ms")
      console.log("Total round trip latency: " + roundtripLatency + "ms")
    }
  })
package.json

{
  "name": "latency-header-poc",
  "description": "Using HTTP headers to benchmark latency",
  "author": "Montana Flynn",
  "scripts": {
    "client": "node client.js",
    "prestart": "npm install",
    "start": "npm run-script test",
    "pretest": "node --harmony server.js &",
    "test": "npm run-script client",
    "posttest": "pkill latencyPoC"
  },
  "version": "0.0.1",
  "license": "ISC",
  "dependencies": {
    "koa": "^0.12.2",
    "unirest": "^0.2.7"
  }
}
server.js

var koa = require('koa')
var app = koa()

// Make it easy to kill from npm
process.title = "latencyPoC"

// Set network latency headers
app.use(function *(next) {
  var timestamp = new Date().getTime()
  this.set('x-request-received', timestamp)
  yield next
})

// Simulate processing time
app.use(function *(next) {
  function process() {
    var timeout = Math.floor((Math.random() * 50) + 10)
    return function (cb) {
      setTimeout(cb, timeout)
    }
  }
  yield process()
  yield next
})

// Send the response
app.use(function *() {
  var timestamp = new Date().getTime()
  this.set('x-powered-by', 'magic')
  this.set('x-response-sent', timestamp)
  this.body = 'Hello World'
})

app.listen(1337)
@montanaflynn (Author):

To run the proof of concept, download the files, change into the directory, and run npm start. Here's a one-liner:

git clone git@gist.github.com:/b8d23bef323ee697c74c.git; cd b8d23bef323ee697c74c; npm start

This proof of concept introduces two headers, both set by the server (a framework-agnostic sketch of setting them follows the list below):

  • x-request-received is set by the server with the timestamp of when the request was received
  • x-response-sent is set by the server with the timestamp of when the response was sent
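
For illustration, here is a minimal sketch of the server side using only Node's built-in http module instead of Koa. The 25ms simulated delay is an arbitrary assumption; the port and the two header names match the PoC above.

var http = require('http')

http.createServer(function (req, res) {
  // Timestamp taken as soon as the request handler fires
  res.setHeader('x-request-received', String(Date.now()))

  // Arbitrary simulated processing delay (assumption, not part of the PoC)
  setTimeout(function () {
    // Timestamp taken just before the response is written
    res.setHeader('x-response-sent', String(Date.now()))
    res.end('Hello World')
  }, 25)
}).listen(1337)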

With these headers in place, the client can determine the following (a worked example with hypothetical numbers follows the list):

  • outgoing network latency is the time from the client sending the request until it is received by the server
  • server processing latency is the time from the server receiving the request until it sends the response
  • incoming network latency is the time from the server sending the response until it is received by the client
  • total round trip latency is the time from the client sending the request until it receives the response from the server
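
As a concrete illustration, here is the arithmetic from client.js with purely hypothetical timestamps (milliseconds):

// Hypothetical timestamps, for illustration only
var requestSent      = 1000  // client sends the request
var requestReceived  = 1040  // server receives it (x-request-received)
var responseSent     = 1065  // server sends the response (x-response-sent)
var responseReceived = 1105  // client receives it

console.log(requestReceived - requestSent)    // outgoing latency: 40ms
console.log(responseSent - requestReceived)   // processing latency: 25ms
console.log(responseReceived - responseSent)  // incoming latency: 40ms
console.log(responseReceived - requestSent)   // round trip latency: 105ms

Note that the three components always sum to the round trip because the server timestamps cancel out, so the total does not depend on the server's clock; the individual outgoing and incoming figures, however, assume the client and server clocks are reasonably in sync.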
