@joferkington /profile.sh
Brute force memory monitor
#!/bin/sh
# Set up a temporary data file and record the start time
datfile=$(mktemp)
echo "ElapsedTime MemUsed" > "$datfile"
starttime=$(date +%s.%N)
# Run the specified command in the background
"$@" &
# While the background process is still running, sample its memory usage
while [ -n "$(ps --no-headers $!)" ]
do
    # Sum the resident set size (RSS, reported in KiB) of every process
    # matching the command name, so child processes are counted too
    bytes=$(ps -o rss -C "$1" --no-headers | awk '{SUM += $1} END {print SUM}')
    elapsed=$(echo "$(date +%s.%N) - $starttime" | bc)
    if [ -n "$bytes" ]
    then
        echo "$elapsed $bytes" >> "$datfile"
    fi
    sleep 0.05
done
# Dump the raw samples to stdout
cat "$datfile"
# Plot up the results with matplotlib
cat <<EOF | python
import numpy, pylab
with open("$datfile") as infile:
    infile.readline()  # skip the header line
    data = numpy.loadtxt(infile)
time, mem = data[:, 0], data[:, 1] / 1024.0  # RSS is in KiB; convert to MB
pylab.plot(time, mem)
pylab.title('Profile of: "$@"')
pylab.xlabel('Elapsed Time (s): Total %0.5f s' % time.max())
pylab.ylabel('Memory Used (MB): Peak %0.2f MB' % mem.max())
pylab.show()
EOF
# Clean up the temporary data file
rm "$datfile"
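
Usage: a minimal sketch, assuming the script above is saved as profile.sh and made executable; "python my_script.py" is a placeholder for whatever command you want to profile.

# Make the script executable, then pass the command to profile as its arguments
chmod +x profile.sh
./profile.sh python my_script.py
# Prints the raw samples, then opens a matplotlib window plotting
# memory used (MB) against elapsed time (s)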
@neeraj1928

Hi,
I was reading your answer about reading large CSV files here: http://stackoverflow.com/questions/8956832/python-out-of-memory-on-large-csv-file-numpy and saw your plots there. Could you please explain how to use the above code to get those plots?
Thank you.
