Created April 27, 2017 14:58
How to successively combine years and months using purrr
Hi,

-----------------------------------
PROBLEM STATEMENT
-----------------------------------

I want to generate URLs for each month of each year, as follows:

[...]
https://s3.amazonaws.com/data/201611.csv.zip
https://s3.amazonaws.com/data/201612.csv.zip
https://s3.amazonaws.com/data/201701.csv.zip
https://s3.amazonaws.com/data/201702.csv.zip
[...]

-----------------------------------
CODE
-----------------------------------

Here is my attempt:

head.f.name = "https://s3.amazonaws.com/data/%sA"
tail.f.name = ".csv.zip"
v.i = str_pad(1:12, 2, pad = "0")

map_chr(2015:2017, ~ sprintf(head.f.name, .)) %>%
  map2_chr(v.i, ~ str_replace(., "[:upper:]$"))

-----------------------------------
ERROR
-----------------------------------

But I get an error:

Error: `.x` (3) and `.y` (12) are different lengths

Should not the smaller vector be recycled in this case?
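For what it's worth, this behaviour is by design: purrr's map2() deliberately refuses to recycle anything other than a length-1 input, unlike base R's vectorised functions, which recycle shorter vectors silently. A minimal sketch of the rule:

```r
library(purrr)

# A length-1 input is recycled to match the other:
map2_chr(c("a", "b"), "x", paste0)
#> [1] "ax" "bx"

# Any other length mismatch is an error, not a recycle:
# map2_chr(c("a", "b", "c"), c("x", "y"), paste0)
# Error: `.x` (3) and `.y` (2) are different lengths
```

So with 3 years and 12 months, map2 is the wrong tool; what's wanted is a cross product of the two vectors, not a pairwise walk.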
I think the base R solution from @jennybc is the best way to go. However, just for variety's sake, here's a possible approach using purrr:
library(purrr)
url <- "https://s3.amazonaws.com/data/%d%02d.csv.zip"
d <- cross2(2015:2016, 1:12)
map_chr(d, ~ sprintf(url, .[[1]], .[[2]]))
#> [1] "https://s3.amazonaws.com/data/201501.csv.zip"
#> [2] "https://s3.amazonaws.com/data/201601.csv.zip"
#> [3] "https://s3.amazonaws.com/data/201502.csv.zip"
#> [4] "https://s3.amazonaws.com/data/201602.csv.zip"
#> ...
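One wrinkle with cross2(): it varies its first argument fastest, which is why the output above interleaves the two years within each month. If chronological order matters, one option is to swap the arguments (and the sprintf() indices to match); a sketch:

```r
library(purrr)

# Months as the first (fastest-varying) argument, years second:
d <- cross2(1:12, 2015:2016)
urls <- map_chr(d, ~ sprintf("https://s3.amazonaws.com/data/%d%02d.csv.zip",
                             .[[2]], .[[1]]))
head(urls, 2)
#> [1] "https://s3.amazonaws.com/data/201501.csv.zip"
#> [2] "https://s3.amazonaws.com/data/201502.csv.zip"
```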
I would just do this:
I put it in a tweet reply, but Twitter sort of mangles it because of the URL.
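The tweeted code itself isn't reproduced here, but for context, a base R one-liner of the kind being referred to might look like the following (this is a guess at the approach, not the actual reply). paste0() and rep() recycle vectors natively, which sidesteps the length mismatch entirely:

```r
# Hypothetical base R sketch, not the tweeted code:
years  <- 2015:2017
months <- sprintf("%02d", 1:12)
urls   <- paste0("https://s3.amazonaws.com/data/",
                 rep(years, each = length(months)), months, ".csv.zip")
head(urls, 2)
#> [1] "https://s3.amazonaws.com/data/201501.csv.zip"
#> [2] "https://s3.amazonaws.com/data/201502.csv.zip"
```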