Brute force memory monitor

profile.sh
#!/bin/sh

# Setup
datfile=$(mktemp)
echo "ElapsedTime MemUsed" > "$datfile"

starttime=$(date +%s.%N)

# Run the specified command in the background
"$@" &

# While the background process is still running
while [ -n "$(ps --no-headers $!)" ]
do
    # Sum the resident set size (KB) of every process matching the command name
    bytes=$(ps -o rss -C "$1" --no-headers | awk '{SUM += $1} END {print SUM}')
    elapsed=$(echo "$(date +%s.%N) - $starttime" | bc)
    if [ -n "$bytes" ]
    then
        echo "$elapsed $bytes" >> "$datfile"
    fi
    sleep 0.05
done
cat "$datfile"
 
# Plot up the results with matplotlib
cat <<EOF | python
import pylab, numpy
infile = open("$datfile")
infile.readline()  # skip the header line
data = numpy.loadtxt(infile)
time, mem = data[:, 0], data[:, 1] / 1024
pylab.plot(time, mem)
pylab.title('Profile of: "%s"' % "$@")
pylab.xlabel('Elapsed Time (s): Total %0.5f s' % time.max())
pylab.ylabel('Memory Used (MB): Peak %0.2f MB' % mem.max())
pylab.show()
EOF

rm "$datfile"
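The core sampling trick in the loop above can be tried on its own. The sketch below (assuming a Linux system with procps `ps`, which supports `-C` to select processes by command name) sums the resident set size of every matching process, exactly as the loop does for `"$1"`; `sleep` is used purely as a hypothetical stand-in for the monitored command.

```shell
#!/bin/sh
# Start a short-lived background process to sample (stand-in for "$@").
sleep 2 &

# Sum the RSS (resident set size, in KB) of all processes named "sleep" --
# the same one-liner the monitoring loop runs against "$1".
bytes=$(ps -o rss -C sleep --no-headers | awk '{SUM += $1} END {print SUM}')
echo "RSS KB: $bytes"

wait
```

To use the full script, one would presumably save it as profile.sh, make it executable with `chmod +x profile.sh`, and run `./profile.sh <command> [args...]`; when the command exits, the matplotlib window with the memory profile should appear.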

Hi,
I was reading your answer about reading large CSV files here: http://stackoverflow.com/questions/8956832/python-out-of-memory-on-large-csv-file-numpy and saw your plots there. Could you please explain how to use the above code to get those plots?
Thank you.
