Julia numerical derivative following NumPy's gradient
"""
    numerical_derivative(y, x)

Compute the derivative of a function sampled at the points `x` (not necessarily
equally spaced) with values `y`. Interior points use the second-order centered
finite-difference formula from NumPy's `gradient`; the two boundary points use
one-sided differences. Only 1D arrays are handled, but the same scheme extends
to higher dimensions. See the usage sketch after the function.
"""
function numerical_derivative(y, x)
    # Create the target vector
    n = length(x)
    derivs = Vector{Float64}(undef, n)

    # Boundaries: one-sided differences (forward at the start, backward at the end)
    derivs[1] = (y[2] - y[1]) / (x[2] - x[1])
    derivs[end] = (y[end] - y[end-1]) / (x[end] - x[end-1])

    # Grid spacings, allowing non-equally spaced points
    xdiff = diff(x)
    dx1 = @views xdiff[1:end-1]   # spacing to the left of each interior point
    dx2 = @views xdiff[2:end]     # spacing to the right of each interior point

    # Coefficients of the left, centered and right samples in the second-order
    # formula for non-uniform spacing:  derivs[i] ≈ a*y[i-1] + b*y[i] + c*y[i+1]
    a = @. -dx2 / (dx1 * (dx1 + dx2))
    b = @. (dx2 - dx1) / (dx1 * dx2)
    c = @. dx1 / (dx2 * (dx1 + dx2))

    # Index ranges for the interior points and their left/center/right neighbours
    interior = 2:n-1
    left     = 1:n-2
    center   = 2:n-1
    right    = 3:n

    # Finish the computation and return the values
    @views @. derivs[interior] = a * y[left] + b * y[center] + c * y[right]
    return derivs
end
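
A minimal usage sketch, assuming the function above is already defined: it samples sin(x) on an irregular grid and compares the numerical derivative to cos(x). The grid size and interval are arbitrary choices for illustration.

# Usage sketch: derivative of sin on a non-uniform grid, compared to cos
x = sort(rand(200)) .* 2π     # irregularly spaced sample points in [0, 2π]
y = sin.(x)
dy = numerical_derivative(y, x)
# Maximum error at the interior points; should be small away from the boundaries
maximum(abs.(dy[2:end-1] .- cos.(x[2:end-1])))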