
@joferkington
Created April 22, 2012 03:38
Brute force memory monitor
#!/bin/sh
# Brute-force memory monitor: samples the RSS of the given command every
# 50 ms while it runs, then plots memory use over time with matplotlib.

# Setup
datfile=$(mktemp)
echo "ElapsedTime MemUsed" > "$datfile"
starttime=$(date +%s.%N)

# Run the specified command in the background
"$@" &

# While the background process is still running, sample its memory use.
# Summing RSS over all processes matching the command name ($1) also
# captures child processes that share the name.
while [ -n "$(ps --no-headers $!)" ]
do
    bytes=$(ps -o rss -C "$1" --no-headers | awk '{SUM += $1} END {print SUM}')
    elapsed=$(echo "$(date +%s.%N) - $starttime" | bc)
    if [ -n "$bytes" ]
    then
        echo "$elapsed $bytes" >> "$datfile"
    fi
    sleep 0.05
done
cat "$datfile"

# Plot up the results with matplotlib
cat <<EOF | python
import numpy, pylab

with open("$datfile") as infile:
    infile.readline()  # skip the header line
    data = numpy.loadtxt(infile)

time, mem = data[:, 0], data[:, 1] / 1024  # ps reports RSS in KB; convert to MB
pylab.plot(time, mem)
pylab.title('Profile of: "%s"' % "$@")
pylab.xlabel('Elapsed Time (s): Total %0.5f s' % time.max())
pylab.ylabel('Memory Used (MB): Peak %0.2f MB' % mem.max())
pylab.show()
EOF

rm "$datfile"
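A minimal usage sketch (the filename `memory_monitor.sh` is an assumption; save the gist under any name you like). The script is invoked with the command to profile as its arguments, and the core per-sample measurement is just an awk column sum over the RSS values reported by `ps`:

```shell
# Save the gist as memory_monitor.sh (hypothetical name), then run it
# with the command you want to profile as the arguments:
#   sh memory_monitor.sh python my_script.py
#
# Each sample sums an RSS column with awk, exactly like this:
printf '100\n200\n300\n' | awk '{SUM += $1} END {print SUM}'
# prints 600
```

When the profiled command exits, the script dumps the collected samples to stdout and pops up the matplotlib plot of memory (MB) against elapsed time (s).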
@neeraj1928
Hi,
I was reading your answer about reading large CSV files here http://stackoverflow.com/questions/8956832/python-out-of-memory-on-large-csv-file-numpy and saw the plots there. Could you please explain how to use the above code to generate those plots?
Thank you.
