Brute force memory monitor
#!/bin/sh
# Setup
datfile=$(mktemp)
echo "ElapsedTime MemUsed" > "$datfile"
starttime=$(date +%s.%N)

# Run the specified command in the background
"$@" &

# Sample memory use for as long as the backgrounded process is still running
while [ -n "$(ps --no-headers -p $!)" ]
do
    # Sum the resident set size (ps reports RSS in KB) of every
    # process whose command name matches the one we launched
    rss_kb=$(ps -o rss -C "$1" --no-headers | awk '{sum += $1} END {print sum}')
    elapsed=$(echo "$(date +%s.%N) - $starttime" | bc)
    if [ -n "$rss_kb" ]
    then
        echo "$elapsed $rss_kb" >> "$datfile"
    fi
    sleep 0.05
done
cat "$datfile"

# Plot up the results with matplotlib
cat <<EOF | python
import numpy, pylab
with open("$datfile") as infile:
    infile.readline()  # skip the header line
    data = numpy.loadtxt(infile)
time, mem = data[:, 0], data[:, 1] / 1024  # KB -> MB
pylab.plot(time, mem)
pylab.title('Profile of: "%s"' % "$@")
pylab.xlabel('Elapsed Time (s): Total %0.5f s' % time.max())
pylab.ylabel('Memory Used (MB): Peak %0.2f MB' % mem.max())
pylab.show()
EOF
rm "$datfile"
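
A minimal usage sketch, assuming the script is saved under the hypothetical name memprofile.sh on a Linux system with GNU ps, bc, numpy, and matplotlib available; the command to monitor is passed as the script's arguments:

# memprofile.sh is a hypothetical filename; any name works
chmod +x memprofile.sh
# Pass the command to profile as the arguments; memory is sampled
# every 0.05 s and a matplotlib window opens when the command exits
./memprofile.sh python my_script.py

Note that the sampler matches processes by command name ($1), so if several processes with that name are running (e.g. other python processes), their memory use is summed together.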
neeraj1928 commented on Jun 21, 2014:
Hi,
I was reading your answer about reading large CSV files here http://stackoverflow.com/questions/8956832/python-out-of-memory-on-large-csv-file-numpy and saw your plots there. Could you please explain how to use the above code to get those plots?
Thank you.

