You can apply functions to metric queries in the graph editor, as long as you use the JSON editor.
The general format is:
function(metric{scope} [by {group}])
For binary operators (+, -, /, *), the format is:
function(metric{scope}) operator function(metric{scope})
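For instance (the metric names and tags below are illustrative, not taken from any real account):

```
avg:system.load.1{host:web-1}
sum:web.requests{*} by {host}
sum:web.errors{*} / sum:web.requests{*}
```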
$(function() {
  $('div.content').live('showoff:show', function(evt) {
    var bg_img = $('img[alt=background]', evt.target);
    var old_bg = '';
    if (bg_img.size() > 0) {
      var src = bg_img.attr('src');
      bg_img.hide();
      // Set new background on body
      old_bg = $('body').css('background-image');
      $('body').css('background-image', 'url(' + src + ')');
    }
  });
});
<html>
<head>
<title>Animated Sparkline using SVG Path and d3.js</title>
<script src="http://mbostock.github.com/d3/d3.v2.js"></script>
<style>
/* tell the SVG path to be a thin blue line without any area fill */
path {
  stroke: steelblue;
  stroke-width: 1;
  fill: none;
}
</style>
</head>
<body>
</body>
</html>
#!/bin/sh
#
# /etc/rc.d/init.d/supervisord
#
# Supervisor is a client/server system that
# allows its users to monitor and control a
# number of processes on UNIX-like operating
# systems.
#
# chkconfig: - 64 36
#!/usr/bin/env python
"""Split a large file into multiple pieces for upload to S3.

S3 only supports 5GB files for uploading directly, so for larger CloudBioLinux
box images we need to use boto's multipart file support.

This parallelizes the task over available cores using multiprocessing.

Usage:
    s3_multipart_upload.py <file_to_transfer> <bucket_name> [<s3_key_name>]
"""
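The splitting step the docstring describes can be sketched with the standard library alone; the chunk size and the helper name `split_file` are illustrative, and the real script hands each piece to boto's multipart API rather than returning it:

```python
def split_file(path, chunk_size=50 * 1024 * 1024):
    """Yield (part_number, data) chunks of a file, in order.

    Each chunk can then be uploaded as one part of an S3
    multipart upload (e.g. by a pool of worker processes).
    """
    with open(path, "rb") as handle:
        part = 1
        while True:
            data = handle.read(chunk_size)
            if not data:
                break
            yield part, data
            part += 1
```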
BEGIN {
    skipped = 0;
    processed = 0;
}
/Skipped/ {
    skipped++;
}
/Saving/ {
    processed++;
}
END {
    printf("skipped: %d, processed: %d\n", skipped, processed);
}
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                         0.5 ns
Branch mispredict                            5 ns
L2 cache reference                           7 ns              14x L1 cache
Mutex lock/unlock                           25 ns
Main memory reference                      100 ns              20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000 ns      3 us
Send 1K bytes over 1 Gbps network       10,000 ns     10 us
Read 4K randomly from SSD*             150,000 ns    150 us    ~1GB/sec SSD
2012-05-31 15:17:40,719 - root - INFO - Logging to /tmp/dd-agent.log
2012-05-31 15:17:40,719 - root - WARNING - Pid file: /tmp/dd-agent.pid
2012-05-31 15:17:40,719 - root - INFO - Running in foreground
2012-05-31 15:17:40,719 - agent - DEBUG - Collecting basic system stats
2012-05-31 15:17:40,738 - agent - DEBUG - System: {'nixV': ('Ubuntu', '10.04', 'lucid'), 'cpuCores': 2, 'machine': 'x86_64', 'platform': 'linux2', 'pythonV': '2.6.5', 'processor': ''}
2012-05-31 15:17:40,738 - agent - DEBUG - Creating checks instance
2012-05-31 15:17:40,785 - agent - INFO - Running on EC2, instanceId: i-fdf2929e
2012-05-31 15:17:40,786 - checks - INFO - Dogstream parsers: []
2012-05-31 15:17:40,788 - checks - INFO - Starting checks
2012-05-31 15:17:40,788 - checks - DEBUG - SIZE: <function getApacheStatus at 0x2ca5a28> wrote 5 bytes uncompressed
root@arcee:/var/log# curl http://localhost/server-status?auto
BusyWorkers: 1
IdleWorkers: 9
Scoreboard: _W.....................................................................................................................................................................................................................................................
VS.
mlambie@blitzwing:~$ curl http://localhost/server-status?auto
Total Accesses: 557
Total kBytes: 17924
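The `?auto` output is simple `Key: value` lines, so a minimal parser (a sketch; the field names come from the samples above) is enough to compare the two hosts programmatically:

```python
def parse_server_status(text):
    """Parse Apache mod_status ?auto output into a dict of raw strings."""
    status = {}
    for line in text.splitlines():
        if ": " in line:
            key, value = line.split(": ", 1)
            status[key] = value
    return status
```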
#!/bin/bash -e
# Check for Python 2.6, 2.7, or 3.x
python -V 2>&1 | awk '$2 !~ /^2\.[67]|^3/ {exit 1}'
# Download virtualenv
curl https://raw.github.com/pypa/virtualenv/master/virtualenv.py > virtualenv.py
python virtualenv.py $HOME/datadog
source $HOME/datadog/bin/activate
# Install dogapi
pip install dogapi
# Set up the dog alias
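The awk version check above can be mirrored in Python; `version_ok` is a hypothetical helper, shown only to make the accepted versions explicit:

```python
import re

def version_ok(version_string):
    """True for Python 2.6, 2.7, or any 3.x; mirrors the awk pattern."""
    return re.match(r"^2\.[67]|^3", version_string) is not None
```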