Michael Sumner mdsumner

  • Integrated Digital East Antarctica, Australian Antarctic Division
  • Hobart, Australia
## `vsidir` (a GDAL /vsi path to a directory of subdataset-bearing files, e.g. NetCDF)
## and `vars` (a character vector of variable-name prefixes) are assumed to be defined
## before this snippet
files <- tibble::tibble(filename = gdalraster::vsi_read_dir(vsidir)) |>
  dplyr::mutate(source = glue::glue("{vsidir}/{filename}")) |>
  dplyr::mutate(longitude = glue::glue("vrt://{source}?sd_name=longitude"),
                latitude = glue::glue("vrt://{source}?sd_name=latitude"))

## keep only the files whose names start with one of the target variable prefixes
files <- do.call(rbind, lapply(vars, \(.x) dplyr::filter(files, stringr::str_detect(filename, glue::glue("^{.x}_")))))
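The resulting vrt:// strings are ordinary GDAL data source names; a minimal sketch of using one, assuming terra is installed and the sources really expose "longitude"/"latitude" subdatasets:

## open the longitude subdataset of the first file as a terra SpatRaster
lon <- terra::rast(files$longitude[1])
lon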

i <- 1
## we can probably save this for subsequent runs
pkgurl <- "https://cran.r-project.org/src/contrib/terra_1.8-50.tar.gz"
dsn0 <- sprintf("/vsitar//vsicurl/%s", pkgurl)
files <- gdalraster::vsi_read_dir(dsn0, recursive  = TRUE)
tif0 <- grep("\\.tif$", files, value = TRUE)[1]

tif <- sprintf("%s/%s", dsn0, tif0)

new(gdalraster::GDALRaster, tif)
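A quick sanity check on the opened dataset, as a sketch assuming current gdalraster where GDALRaster objects expose dim(), bbox() and close():

ds <- new(gdalraster::GDALRaster, tif)
ds$dim()    ## x size, y size, band count
ds$bbox()   ## xmin, ymin, xmax, ymax in the file's CRS
ds$close()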
library(bowerbird)
my_source <- bb_source(
  name = "Chlorophyll-a concentration in seawater (not log-transformed), generated by as a blended combination of OCI, OCI2, OC2 and OCx algorithms, depending on water class memberships",
  id = "ESACCI-OC-L3S-CHLOR_A-MERGED",
  description = "European Space Agency Climate Change Initiative composites of merged sensor (MERIS, MODIS Aqua, SeaWiFS LAC & GAC, VIIRS, OLCI) products.",
  doc_url = "http://esa-oceancolour-cci.org",
  source_url =
    c("https://www.oceancolour.org/thredds/catalog/cci/v6.0-release/geographic/monthly/chlor_a/catalog.html",
      "https://www.oceancolour.org/thredds/catalog/cci/v6.0-release/geographic/daily/chlor_a/catalog.html",

Just a bit of response to that list and the claim that no library aims to port more of GDAL in a lazy way: there is osgeo.gdal/ogr, which is very nearly GDAL-complete and includes the multidimensional mode for rasters, including the ability to cast from "classic" (2D) raster to mdim and vice versa.

It's a problem that sits occluded by the user-facing package rasterio. We've had a very similar problem in R for a long time (rgdal, now sf/terra, standing in for the underlying library), but now we have gdalraster, with much better API fidelity (it includes vector support, despite the name, and we've made good progress on multidim too).

I'm surprised this is still a languishing topic, but I'd recommend Python/xarray folks look hard at the built-in bindings that GDAL ships with, and especially at the new CLI (released in 3.11), which provides a lot more flexibility and consistency to craft abstracted pipelines without even rendering the data.
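For example, something like this, assuming GDAL >= 3.11 with the new unified `gdal` command on the PATH (shelled out from R purely for illustration; the dataset name is a placeholder):

## print metadata for a dataset via the new CLI's "raster info" subcommand
writeLines(system2("gdal", c("raster", "info", "example.tif"), stdout = TRUE))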

https://proj.org/en/stable/operations/projections/omerc.html

  prj <- "+proj=omerc +lonc=-119 +lat_0=37 +gamma=30"
m <- do.call(cbind, maps::map("state", "california", plot = F)[1:2])
library(terra)
#> terra 1.8.50
plot(terra::project(m, to = prj, from = "EPSG:4326"), pch = ".", asp = 1/cos(37 * pi/180), type = "b")
#> Warning: [project] 3 failed transformations

https://stat.ethz.ch/pipermail/r-sig-geo/2025-May/029541.html

Below we calculate individual distances along the polygon edges and sum them; this works here only because it is a single-ring polygon (a sketch of the calculation follows the numbers below).

The numbers in play are:

129.248935: the planar distance in the CRS

129.285384: the "true" (geodesic) distance, to some insanely accurate level
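A minimal sketch of that edge-wise calculation, assuming `poly` is the single-ring sf polygon in a projected CRS, and using sf plus geosphere (not necessarily the packages used in the thread):

library(sf)
## vertex coordinates in the projected CRS, and the same vertices in longlat
xy <- sf::st_coordinates(poly)[, c("X", "Y")]
ll <- sf::st_coordinates(sf::st_transform(poly, "EPSG:4326"))[, c("X", "Y")]

## planar edge lengths in CRS units, summed around the (closed) ring
planar <- sum(sqrt(rowSums(diff(xy)^2)))

## ellipsoidal (WGS84) edge lengths between consecutive vertices, summed
geodesic <- sum(geosphere::distGeo(ll[-nrow(ll), ], ll[-1, ]))

c(planar = planar, geodesic = geodesic)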

Raw history below; the relevant calls are the anglr:: and rgl functions (ignore the ggplot calls that were interleaved, I was checking against 2D plots).

library(anglr)
## `large_topo_spat` is a terra SpatRaster of topography created earlier in the session
as.contour(large_topo_spat)
as.contour(large_topo_spat, levels = c(-1500, -1100, -700, -300, 0))
plot3d(as.contour(large_topo_spat, levels = c(-1500, -1100, -700, -300, 0)))
plot3d(sf::st_as_sf(as.contour(large_topo_spat, levels = c(-1500, -1100, -700, -300, 0))))

Here are some examples of using the new NASA experimental JSON Zarr with xarray.

import xarray
## no -- the earthaccess login / S3 filesystem route below wasn't needed here
#import earthaccess
#earthaccess.login()  ## use env vars ??
#fs = earthaccess.get_s3_filesystem(daac="podaac")
longstrings <- c("analysis.\\\",\\\"title\\\":\\\"RSS CCMP V3.1 6-hourly surface winds (Level 4)\\\",\\\"coordinates\\\":\\\"latitude longitude t", 
                 "dsaoijdoijdd09d9de9e9e9e00\"RSS CCMPbalsljdojasdoiewwiow8us98hasdhasd")

findregions <- function(x, pattern, window = 10) {
  index <- gregexpr(pattern, x)
  outlist <- vector("list", length(index))
  for (i in seq_along(index)) {
    outlist[[i]] <- character(length(index[[i]]))
    ## keep each match plus `window` characters of context either side
    for (j in seq_along(index[[i]])) {
      end <- index[[i]][j] + attr(index[[i]], "match.length")[j] - 1L
      outlist[[i]][j] <- substr(x[i], max(1L, index[[i]][j] - window), end + window)
    }
  }
  outlist
}
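A quick usage check on the strings above (the pattern and window width are arbitrary; output not shown):

## each list element holds the matches for one input string, with 20 characters of context
findregions(longstrings, "RSS CCMP", window = 20)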