@Sergi030
Created June 5, 2018 09:18
Python script that runs dd to read 1 KiB from /dev/random 10 times and reports the elapsed time
import os
import subprocess
import timeit

def read_dev_random():
    # Use dd to read a single 1 KiB block from /dev/random into a scratch
    # file, discarding dd's own output so only the read time is measured.
    with open(os.devnull, 'w') as devnull:
        subprocess.check_call(
            ['dd', 'if=/dev/random', 'of=mytmp', 'bs=1k', 'count=1'],
            stdout=devnull, stderr=devnull)

iterations = 10
iteration_times = []
for i in range(iterations):
    # Time one dd invocation per iteration.
    iteration_time = timeit.timeit(read_dev_random, number=1)
    print("Period for iteration %d is %.3f" % (i, iteration_time))
    iteration_times.append(iteration_time)

average = sum(iteration_times) / float(iterations)
print("Average time reading random bytes from /dev/random: %.3f seconds over %d iterations"
      % (average, iterations))