@jace · Created March 17, 2013 11:34
Just how much slower is it to add up decimals than to add up integers?
from random import randint
from decimal import Decimal
import timeit
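# Note: timeit.timeit() runs its statement 1,000,000 times by default,
# so every figure printed below is total seconds for a million runs.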
print "Creating 30 random numbers"
ints = [randint(0, 10000) for x in range(30)]
print "Converting to decimals"
numbers = [Decimal(x) for x in ints]
# This method shows sum(Decimal) taking ~1600x as long as sum(int)
print "Method 1: Adding ints"
print timeit.timeit("sum(ints)", setup="from __main__ import ints")
print "Method 1: Adding decimals"
print timeit.timeit("sum(numbers)", setup="from __main__ import numbers")
# This method shows sum(Decimal) as only ~12x slower than sum(int),
# because the cost of randint itself dominates both timings
print "Method 2: Adding random ints"
print timeit.timeit("sum([randint(0, 10000) for x in range(30)])", setup="from __main__ import randint, Decimal")
print "Method 2: Adding random decimals"
print timeit.timeit("sum([Decimal(randint(0, 10000)) for x in range(30)])", setup="from __main__ import randint, Decimal")