@terjehaukaas
Created June 12, 2019
Derivative of Multivariate Objective Function
# ------------------------------------------------------------------------
# The following function is implemented in Python by Professor Terje Haukaas
# at the University of British Columbia in Vancouver, Canada. It is made
# freely available online at terje.civil.ubc.ca together with notes,
# examples, and additional Python code. Please be cautious when using
# this code; it may contain bugs and comes without any form of warranty.
#
# The following notation applies:
# F(x) = objective function
# nablaF = dF/dx = gradient vector
# F0 = value of F(x) at the unperturbed x
# ------------------------------------------------------------------------
import numpy as np


def nablaF(x, F, F0):

    # Extract the number of design variables
    numDVs = len(x)

    # Initialize the gradient vector
    gradient = np.zeros(numDVs)

    # Loop over the design variables and compute forward finite differences
    for i in range(numDVs):

        # Back up the current design variable value
        backup = x[i]

        # Set the perturbation as a fraction of the x-value
        dx = 0.001
        if x[i] != 0.0:
            dx = dx * x[i]

        # Perturb the i-th design variable
        x[i] += dx

        # Evaluate the function at the perturbed point
        perturbedF = F(x)

        # Compute the sought derivative
        gradient[i] = (perturbedF - F0) / dx

        # Reset the design variable to its original value
        x[i] = backup

    # Return the gradient vector
    return gradient
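
A minimal usage sketch follows, assuming NumPy is installed. The quadratic objective myF and the starting point x0 are hypothetical examples chosen for illustration; they are not part of the original gist.

# Hypothetical example objective: F(x) = x1^2 + 2*x2^2,
# whose exact gradient is [2*x1, 4*x2]
def myF(x):
    return x[0]**2 + 2.0 * x[1]**2

# Evaluate the function once at the unperturbed point, then
# pass that value as F0 so nablaF only needs one extra
# function evaluation per design variable
x0 = np.array([1.0, -2.0])
F0 = myF(x0)
print(nablaF(x0, myF, F0))  # approximately [2.0, -8.0]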