@eduardszoecs
Created January 20, 2018 21:55
# plain sample: first index at which a +/-1 random walk of length n returns to 0
foo1 <- function(n) {
  s <- sample(c(-1, 1), n, replace = TRUE)
  res <- which(cumsum(s) == 0)
  if (length(res) > 0) min(res) else NA  # NA if no return within n steps
}
# adaptive sample length: draw the walk in chunks of `step` until it first
# returns to 0, carrying the walk's position (`pos`) across chunks
foo2 <- function(nmax, step = 10000, start = 0, pos = 0) {
  s <- sample(c(-1, 1), step, replace = TRUE)
  res <- which(pos + cumsum(s) == 0)
  if (length(res) > 0) {
    start + min(res)
  } else if (nmax < (start + step)) {
    NA  # no return to 0 within nmax steps
  } else {
    foo2(nmax, step = step, start = start + step, pos = pos + sum(s))
  }
}
require(microbenchmark)
n <- 1000000
b <- microbenchmark(foo1(n), foo2(n))
plot(b)
# x1 <- replicate(1000, foo1(n))
x2 <- replicate(100000, foo2(n))
table(x2)
hist(x2)
@eduardszoecs
Author

@hwborchers might adaptive sampling fit your needs? An additional speed-up could come from parallelisation...
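The parallelisation mentioned above could be sketched like this (an assumption, not part of the gist): since the replications are independent, they can be spread over cores with `parallel::mclapply`. The wrapper `par_replicate` is a hypothetical helper name; `mclapply` forks, so it parallelises on Linux/macOS only (on Windows, `mc.cores` must be 1, or use a PSOCK cluster with `parLapply`).

```r
library(parallel)

# Hypothetical helper: run nrep independent replications of f(...) in parallel
par_replicate <- function(nrep, f, ..., cores = 2L) {
  unlist(mclapply(seq_len(nrep), function(i) f(...),
                  mc.cores = max(1L, cores)))
}

# e.g. replace replicate(100000, foo2(n)) with:
# x2 <- par_replicate(100000, foo2, n)
```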
