# extent [xmin, xmax, ymin, ymax] of a grid from its dimension (ncol, nrow)
# and GDAL geotransform (ulx, xres, xskew, uly, yskew, yres)
def extent(dimension, transform):
    return [transform[0], transform[0] + dimension[0] * transform[1],
            transform[3] + dimension[1] * transform[5], transform[3]]

# the same corners ordered as a bounding box [xmin, ymin, xmax, ymax]
def bbox(dimension, transform):
    ex = extent(dimension, transform)
    return [ex[0], ex[2], ex[1], ex[3]]
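As a quick sanity check (restating the two helpers so the snippet runs standalone), a global one-degree grid gives the familiar world bounds:

```python
def extent(dimension, transform):
    # [xmin, xmax, ymin, ymax] from (ncol, nrow) and a GDAL geotransform
    return [transform[0], transform[0] + dimension[0] * transform[1],
            transform[3] + dimension[1] * transform[5], transform[3]]

def bbox(dimension, transform):
    ex = extent(dimension, transform)
    return [ex[0], ex[2], ex[1], ex[3]]

# global 1-degree grid: 360 x 180 cells, upper-left corner at (-180, 90)
dim = (360, 180)
gt = (-180.0, 1.0, 0.0, 90.0, 0.0, -1.0)
print(extent(dim, gt))  # [-180.0, 180.0, -90.0, 90.0]
print(bbox(dim, gt))    # [-180.0, -90.0, 180.0, 90.0]
```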
library(duckdbfs)
library(dplyr)
#>
#> Attaching package: 'dplyr'
#> The following objects are masked from 'package:stats':
#>
#> filter, lag
#> The following objects are masked from 'package:base':
#>
#>     intersect, setdiff, setequal, union
library(sf)
#> Linking to GEOS 3.12.1, GDAL 3.9.0dev-cb4d30f56d, PROJ 9.3.1; sf_use_s2() is
#> TRUE
library(raster)
#> Loading required package: sp
## threshold is roughly side length in metres of the roi, i.e. sqrt(area)
dem_source <- function(threshold = 1) {
I don't entirely trust the timings here, but it's faster (maybe 2x faster) to plot.
dsn <- "WMTS:https://services.arcgisonline.com/arcgis/rest/services/World_Imagery/MapServer/WMTS/1.0.0/WMTSCapabilities.xml"
library(gdalraster)
#> GDAL 3.9.0dev-cb4d30f56d, released 2024/04/15 (debug build), GEOS 3.12.1, PROJ 9.3.1
ds_int <- new(GDALRaster, sprintf("vrt://%s?ot=Int32", dsn))
from osgeo import gdal
gdal.UseExceptions()
## we do something with vrt
dsn = "vrt:///vsicurl/https://github.com/cran/rgl/raw/master/inst/textures/worldsmall.png?a_ullr=-180,90,180,-90&expand=rgb"
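The anatomy of that "vrt://" connection string: a /vsicurl/ path to a plain PNG, plus open options that assign global bounds (a_ullr) and expand the palette to RGB. Composing it from its parts (pure string handling; opening the result still needs GDAL and network access):

```python
# the raw file, reached over HTTP via GDAL's /vsicurl/ virtual filesystem
url = "https://github.com/cran/rgl/raw/master/inst/textures/worldsmall.png"
# vrt:// open options: assign upper-left/lower-right bounds, expand palette to RGB
options = "a_ullr=-180,90,180,-90&expand=rgb"
dsn = f"vrt:///vsicurl/{url}?{options}"
print(dsn)
```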
docker pull ghcr.io/rocker-org/verse:4.3.3
docker run --rm -ti ghcr.io/rocker-org/verse:4.3.3 bash
quarto create project book abook
cd abook
## I turned off PDF rendering in _quarto.yml because lots of deps
quarto render
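For reference, a minimal _quarto.yml along those lines (hypothetical values; only the format block matters here), keeping HTML output and dropping the PDF target to avoid the TeX dependency stack:

```yaml
project:
  type: book

format:
  html:
    theme: cosmo
  # pdf target removed; rendering it pulls in a large TeX toolchain
```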
Fun stuff: compare two polygons from one dataset by plotting them together (a local equal-area projection ensures the scale is right).
library(terra)
x <- terra::vect("/vsizip//vsicurl/https://github.com/wmgeolab/geoBoundaries/raw/main/releaseData/CGAZ/geoBoundariesCGAZ_ADM0.zip")
crs_local <- function(x) {
  cc <- terra::geom(terra::centroids(x))[, c("x", "y"), drop = FALSE]
  sprintf("+proj=laea +lon_0=%i +lat_0=%i", round(cc[,1]), round(cc[,2]))
}
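A Python transliteration of crs_local() as a sketch (the centroid lon/lat are assumed to come from whatever geometry library is at hand), showing the proj string that results:

```python
def crs_local(lon, lat):
    # local Lambert azimuthal equal-area projection centred on (lon, lat),
    # mirroring the R helper above; rounding keeps the string tidy
    return f"+proj=laea +lon_0={round(lon)} +lat_0={round(lat)}"

print(crs_local(147.3, -42.9))  # +proj=laea +lon_0=147 +lat_0=-43
```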
WIP script here: https://raw.githubusercontent.com/mdsumner/rw-book/main/experiments/big_stac_read.R
It tries to emulate this Pangeo example: https://discourse.pangeo.io/t/comparing-odc-stac-load-and-stackstac-for-raster-composite-workflow/
Sys.setenv(VSI_CACHE="TRUE")
Sys.setenv(GDAL_CACHEMAX="30%")
Sys.setenv(VSI_CACHE_SIZE="10000000")
library(arrow)
#>
#> Attaching package: 'arrow'
#> The following object is masked from 'package:utils':
#>
#> timestamp
library(terra)
#> terra 1.7.71
#>
https://discourse.pangeo.io/t/example-which-highlights-the-limitations-of-netcdf-style-coordinates-for-large-geospatial-rasters/4140/12?u=michael_sumner

Possible response:
I just don't think xarray has any business here; it should read the arrays and represent them as they are. Resolving grids to regular from a sheared affine, or from GCPs/RPCs/geolocation arrays, is the job of the warper API. As soon as you're in xarray it's too late for that: you should go back to the sources and stream them through the warper first, where GDAL has already identified (or been told) which georeferencing scheme is active, and target either virtual or materialized datasets. That's why I baulked at these degenerate rectilinear coordinate arrays in NetCDF and then in xarray; it's a suboptimal situation when I can see that a four-value extent is enough (even in GDAL, the 1D coord arrays are tagged as geolocation arrays, ready for the warper, same with 2D coord arrays, GCPs, and RPCs, or the geotransform). I think you're sugge
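The argument above can be made concrete with the warper itself: resolve whatever georeferencing GDAL has identified (geotransform, GCPs, RPCs, or geolocation arrays) into a regular grid first, targeting a lightweight virtual dataset, and only then hand the result to an xarray-style reader. An untested command sketch ("input.nc" and the subdataset variable name are placeholders):

```sh
# warp to a regular lon/lat grid, materializing only a VRT; flags such as
# -geoloc, -rpc, or -order select the georeferencing model when GDAL
# cannot infer it from the source
gdalwarp -of VRT -t_srs EPSG:4326 NETCDF:"input.nc":variable warped.vrt
```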