@ForceBru
Last active January 25, 2021 21:12
NumPy vs Julia
bench.ipy:

import numpy as np

def compute_const(x, sigma):
    # One row per sigma value, one column per x value, via broadcasting
    s = sigma.reshape(-1, 1)
    return np.exp(-x**2 / (2 * s)) / (s * np.sqrt(2 * np.pi))

def log_likelihood(p, C):
    return np.log(p @ C).sum()

Nx = 50_000
Np = 500

x = np.linspace(0, 1, Nx)
sigma = np.linspace(1, 5, Np)
p = np.linspace(0, 1, Np)

C = compute_const(x, sigma)

print(f"Python with Nx={Nx}, Np={Np}...")
# Using `.ipy` extension solely for the `%timeit` magic
%timeit log_likelihood(p, C)
bench.jl:

using BenchmarkTools

function compute_const(x::Array{Float64, 2}, sigma::Array{Float64, 2})::Array{Float64, 2}
    # Same layout as the NumPy version: rows index sigma, columns index x
    s = sigma'
    exp.(-x.^2.0 ./ (2 .* s)) ./ (s .* sqrt(2 * pi))
end

log_likelihood(p::Array{Float64, 2}, C::Array{Float64, 2})::Float64 = sum(log.(p * C))

Nx = 50_000
Np = 500

# `range(...)' |> collect` gives 1×N row matrices, so `p * C` is a (1×Np) × (Np×Nx) product
x = range(0, 1, length=Nx)' |> collect
sigma = range(1, 5, length=Np)' |> collect
p = range(0, 1, length=Np)' |> collect

C = compute_const(x, sigma)

# compile
log_likelihood(p, C)

println("Julia with Nx=$Nx, Np=$Np...")
println(@benchmark log_likelihood(p, C))
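
One thing worth ruling out before reading too much into the numbers: `p` and `C` are non-constant globals, and the BenchmarkTools manual recommends splicing such values into the benchmarked expression with `$` so that global-variable access isn't part of what gets measured. A minimal self-contained sketch of that variant (not part of the original gist; the `rand` matrix is a positive stand-in for `compute_const(x, sigma)`, not the original data):

# Hypothetical variant of bench.jl: interpolate the benchmarked values with `$`
using BenchmarkTools

log_likelihood(p, C) = sum(log.(p * C))    # same body as above, minus the type annotations

Np, Nx = 500, 50_000
p = collect(range(0, 1, length=Np)')       # 1×Np row matrix, as in bench.jl
C = rand(Np, Nx) .+ 0.5                    # positive stand-in for compute_const(x, sigma)

@btime log_likelihood($p, $C)              # `$` splices the values in; global lookups aren't timed

Global-variable overhead is tiny next to a ~30 ms call, so this shouldn't change the picture by much, but it removes one confounder.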

Results:

forcebru ~/t/julia_numpy_bench> ipython bench.ipy 
Python with Nx=50000, Np=500...
9.84 ms ± 73 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
forcebru ~/t/julia_numpy_bench> julia-1.6 bench.jl 
Julia with Nx=50000, Np=500...
Trial(32.285 ms)
forcebru ~/t/julia_numpy_bench> julia-1.6 -O3 bench.jl
Julia with Nx=50000, Np=500...
Trial(32.353 ms)
forcebru ~/t/julia_numpy_bench> 

So, Julia reports about 32 ms per call, while NumPy's mean is about 10 ms. Julia's linear algebra is somehow 3 times (!) slower than NumPy's?
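
For what it's worth, the timed call does three separate things: a (1×Np) × (Np×Nx) matrix-vector product (which both NumPy and Julia hand to a BLAS), an elementwise log over Nx values, and a sum. A rough way to see where the Julia time goes is to time the pieces separately; a sketch, again with a random stand-in for `C` (`setup` is a BenchmarkTools keyword that builds the input outside the timed region):

# Hypothetical breakdown of sum(log.(p * C)) into its three pieces
using BenchmarkTools

Np, Nx = 500, 50_000
p = collect(range(0, 1, length=Np)')             # 1×Np row matrix, as in bench.jl
C = rand(Np, Nx) .+ 0.5                          # positive stand-in for compute_const(x, sigma)

@btime $p * $C                                   # the BLAS matrix-vector product
@btime log.(r) setup=(r = $p * $C)               # broadcasted log over Nx values
@btime sum(l) setup=(l = log.($p * $C))          # the final reduction

If the product alone matches what NumPy spends on `p @ C`, the question narrows to the broadcasted `log`: some NumPy builds use SIMD-vectorized loops for `np.log`, while `log.` calls a scalar `log` once per element, so that is a plausible place for a constant-factor gap to hide.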


Setting Nx = 200_000:

forcebru ~/t/julia_numpy_bench> julia-1.6 -O3 bench.jl
Julia with Nx=200000, Np=500...
Trial(130.351 ms)
forcebru ~/t/julia_numpy_bench> julia-1.6 bench.jl
Julia with Nx=200000, Np=500...
Trial(130.097 ms)
forcebru ~/t/julia_numpy_bench> ipython bench.ipy
Python with Nx=200000, Np=500...
41 ms ± 167 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)
forcebru ~/t/julia_numpy_bench> 

Still 3 times slower...
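
A small thing to try on the Julia side regardless of the outcome above: `sum(log, x)` applies `log` during the reduction, so the Nx-element temporary from `log.(p * C)` is never allocated. A sketch (hypothetical name `log_likelihood_fused`; on its own this is unlikely to close a 3x gap):

# Hypothetical variant: fold `log` into the reduction instead of broadcasting it first
using BenchmarkTools

log_likelihood_fused(p, C) = sum(log, p * C)     # no Nx-length temporary for log.(...)

Np, Nx = 500, 50_000
p = collect(range(0, 1, length=Np)')
C = rand(Np, Nx) .+ 0.5                          # positive stand-in for compute_const(x, sigma)

@btime log_likelihood_fused($p, $C)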
