require(autodiffr)
ad_setup() # make sure the Julia session behind autodiffr is initialized
ores <- function(x){
  x # residual function: here the residuals are just the parameters; ofn below forms their sum of squares
}
ores <- ad.variant(ores) # convert ores into an autodiffr-compatible variant
ofn <- function(x){
  res <- ores(x) # returns a vector of residual values
  # print(res)
  val <- as.vector(crossprod(res)) # sum of squared residuals, flattened from a 1x1 matrix
  # val <- crossprod(res)
  # as.numeric() can't be used here: it is equivalent to as.double(), but when
  # autodiff is applied to the function, the argument has a type that extends
  # the float numbers, so as.numeric()/as.double() causes problems.
  # I'm wondering whether ad.variant should treat as.numeric (as.double) as as.vector.
  # BTW, conversions like as.numeric() shouldn't be needed at all: autodiffr
  # tries to adapt to various kinds of input/output, at least for user-interface
  # functions like grad/jacobian/hessian, so if a 1x1 matrix doesn't work,
  # it should be counted as an issue.
  # (See the small base-R demo below for what as.vector() does here.)
  val
}
ofn <- ad.variant(ofn)
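# A minimal base-R illustration (nothing autodiffr-specific assumed) of why
# as.vector() is used above: crossprod() on a plain numeric vector returns a
# 1x1 matrix, and as.vector() collapses it to an ordinary length-one numeric.
r <- c(1, 2, 3)
print(crossprod(r))            # 1x1 matrix whose single entry is 14
print(as.vector(crossprod(r))) # plain numeric: 14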
## Now try to generate the gradient function
ogr <- grad(ofn, mode = "forward") ## there are some problems with the default reverse mode
print(ogr) # prints a more-or-less opaque reference to the underlying Julia function
x0 <- c(1, 2, 3)
print(ofn(x0)) # sum of squares: 1 + 4 + 9 = 14
print(ogr(x0)) # gradient of sum(x^2) is 2*x, so: 2, 4, 6
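## A hedged sketch, not part of the original gist: assuming jacobian() and
## hessian() are autodiffr user-interface functions (the comment in ofn above
## mentions them) that follow the same factory pattern as grad(), i.e. they
## return a derivative function when called without a point, the Jacobian of
## ores and the Hessian of ofn could be generated the same way.
ojac <- jacobian(ores, mode = "forward")
ohess <- hessian(ofn, mode = "forward")
print(ojac(x0))  # residuals are the identity map, so this should be the 3x3 identity matrix
print(ohess(x0)) # Hessian of sum(x^2) should be 2 * diag(3)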