code to do a small extract in R
https://rstats.me/@mdsumner/114729203537529604
Global Ensemble Digital Terrain Model 30m (GEDTM30)
https://zenodo.org/records/15490367
https://github.com/openlandmap/GEDTM30?tab=readme-ov-file
gdalinfo /vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif
Driver: GTiff/GeoTIFF
Files: /vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif
Size is 1440010, 600010
Coordinate System is:
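The small extract mentioned above can be sketched with terra over GDAL's /vsicurl driver. This is a sketch, not the exact code from the linked post: it assumes the terra package and network access, and the window around Hobart is an arbitrary example.

```r
## sketch: small extract from the GEDTM30 COG over HTTP
## (assumes the terra package and network access; the Hobart window is arbitrary)
library(terra)
f <- "/vsicurl/https://s3.opengeohub.org/global/edtm/legendtm_rf_30m_m_s_20000101_20231231_go_epsg.4326_v20250130.tif"
r <- rast(f)                                   ## opens metadata only, no pixels yet
w <- crop(r, ext(147.2, 147.5, -43.0, -42.8))  ## reads just the overlapping blocks
```

Because the file is a cloud-optimized GeoTIFF, `crop()` only requests the byte ranges for the overlapping blocks rather than the whole 1440010 x 600010 grid.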
x_from_col <- function(dimension, bbox, col) {
  col[col < 1] <- NA
  col[col > dimension[1L]] <- NA
  xres <- diff(bbox[c(1, 3)]) / dimension[1]
  bbox[1] - xres/2 + col * xres
}
y_from_row <- function(dimension, bbox, row) {
  row[row < 1] <- NA
  row[row > dimension[2]] <- NA
  ## mirror of x_from_col: row 1 is the top of the grid
  yres <- diff(bbox[c(2, 4)]) / dimension[2]
  bbox[4] + yres/2 - row * yres
}
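A quick sanity check of the column-to-x mapping on a toy grid of 10 columns spanning x in [0, 10], so cell centers sit at 0.5, 1.5, ... (the definition is repeated here so the snippet stands alone):

```r
## cell-center x from column index; out-of-range columns give NA
x_from_col <- function(dimension, bbox, col) {
  col[col < 1] <- NA
  col[col > dimension[1L]] <- NA
  xres <- diff(bbox[c(1, 3)]) / dimension[1]
  bbox[1] - xres/2 + col * xres
}
x_from_col(c(10, 5), c(0, 0, 10, 5), c(1, 10, 11))
#> [1] 0.5 9.5  NA
```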
I have a couple of AI/ML projects related to mapping things, often conservation related, with remote sensing data. Some details and packages will vary, but the process below describes how I generally approach these types of problems. Some of these tools I have only touched briefly, but I like them, and this is more an outline of how I would like to approach a new project than a retrospective look at my previous work.
We use AWS, so it makes sense to use datasets and services that are already hosted on AWS. The data discovery and loading part of this process would look somewhat different if we were using Azure and Planetary Computer, and very different if we were using GCP and Earth Engine.
All of my analysis will be done using Python on an AWS VM in the same region as my data on S3, probably using VSCode on SageMaker or [JupyterLab](https://docs.aws.amazo
Here is some code to explore how computed areas vary across packages.
x <- rnaturalearth::ne_countries(country = c("Greenland", "Spain"), returnclass = "sf")
as.numeric(sf::st_area(x))/1e6
## Greenland 2.166e6 km² -41, 72
## Spain 505990 km² -2, 40
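As a cross-check on those package results, the area of a lon-lat rectangle on a sphere has a closed form, A = R² Δλ (sin φ₂ − sin φ₁). A base-R sketch, assuming a spherical Earth of radius 6371 km (so values will differ slightly from ellipsoidal areas like sf's):

```r
## exact spherical area of a lon-lat rectangle, in km²
## (assumes a sphere of radius 6371 km; ellipsoidal values differ slightly)
cell_area_km2 <- function(lon1, lon2, lat1, lat2, R = 6371) {
  d2r <- pi / 180
  R^2 * (lon2 - lon1) * d2r * (sin(lat2 * d2r) - sin(lat1 * d2r))
}
cell_area_km2(-180, 180, -90, 90) / (4 * pi * 6371^2)  ## whole sphere: 1
```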
# 12.5 Spatial CV (with spatialsample)
library(tidymodels)
library(spatialsample)
library(sf)
data("lsl", "study_mask", package = "spDataLarge")
lsl <- lsl |>
  st_as_sf(coords = c("x", "y"), crs = "EPSG:32717")
ta <- terra::rast(system.file("raster/ta.tif", package = "spDataLarge"))
library(dplyr)
library(sf)
if (!file.exists("rivers_gb.rds")) {
  download.file(
    "https://beaver-net-app.s3.eu-west-2.amazonaws.com/gb_rivers/rivers_gb.rds",
    "rivers_gb.rds"
  )
}
library(readr)
library(ggdist)
library(tidyverse)
library(magrittr)
library(cowplot)
library(ISOweek)
df <- readr::read_csv("https://github.com/nytimes/covid-19-data/raw/master/rolling-averages/us.csv")
df %<>% mutate(wday = lubridate::wday(lubridate::ymd(date)))
weekdays <- c('Sunday', 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday')
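lubridate::wday() defaults to Sunday = 1, which is why the weekdays vector above starts on Sunday. A base-R check of that mapping (the vector is repeated here so the snippet stands alone):

```r
## base-R check that index 1 corresponds to Sunday
## (as.POSIXlt()$wday uses 0 = Sunday, so add 1 to index the vector)
weekdays_vec <- c('Sunday', 'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday', 'Saturday')
wday_name <- function(d) weekdays_vec[as.POSIXlt(as.Date(d))$wday + 1L]
wday_name("2021-01-03")
#> [1] "Sunday"
```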
working on the full-affine approach to warping a zoom, from the rasterio quickstart
https://rasterio.readthedocs.io/en/latest/topics/reproject.html#reprojecting-a-geotiff-dataset
this is very rough, just a bit of exploration and reference for me to expand elsewhere ...
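For reference, the "full affine" here is GDAL's six-parameter geotransform mapping column/row offsets to x/y. A small base-R sketch of that mapping (col and row are zero-based pixel/line offsets as in GDAL, and `gt` is ordered as GDAL reports it; the 0.1-degree example raster is arbitrary):

```r
## GDAL geotransform: x = gt[1] + col*gt[2] + row*gt[3]
##                    y = gt[4] + col*gt[5] + row*gt[6]
## (col, row are zero-based pixel/line offsets, as in GDAL)
apply_geotransform <- function(gt, col, row) {
  cbind(x = gt[1] + col * gt[2] + row * gt[3],
        y = gt[4] + col * gt[5] + row * gt[6])
}
## north-up raster, 0.1-degree pixels, origin at (-180, 90)
apply_geotransform(c(-180, 0.1, 0, 90, 0, -0.1), col = 10, row = 20)
#>         x  y
#> [1,] -179 88
```

The rotation terms gt[3] and gt[5] are zero for a north-up raster, which is why the simpler x_from_col/y_from_row style helpers are usually enough.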
f <- '/vsicurl/https://github.com/rasterio/rasterio/raw/master/tests/data/RGB.byte.tif'
library(sf)
#> Linking to GEOS 3.9.1, GDAL 3.3.2, PROJ 7.2.1
edge0 <- function(x, y, ndiscr = 18) {
  dx <- if (length(x) > 1) diff(x) else diff(y)
  sf::st_set_crs(
    sf::st_segmentize(sf::st_sfc(sf::st_linestring(cbind(x, y))), dx / ndiscr),
    "OGC:CRS84"
  )
}
north <- function(x = c(-180, 180), y = 90, ndiscr = 18) {
  edge0(x, y, ndiscr)
}
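A quick use of these helpers, assuming the sf package is installed; the edge is densified so it stays on the parallel when warped to another projection (the definitions are repeated so this snippet stands alone):

```r
## repeated from above so this snippet stands alone (assumes the sf package)
edge0 <- function(x, y, ndiscr = 18) {
  dx <- if (length(x) > 1) diff(x) else diff(y)
  sf::st_set_crs(sf::st_segmentize(sf::st_sfc(sf::st_linestring(cbind(x, y))), dx / ndiscr),
                 "OGC:CRS84")
}
north <- function(x = c(-180, 180), y = 90, ndiscr = 18) {
  edge0(x, y, ndiscr)
}
n <- north()  ## densified line along the 90N parallel, vertices every <= 20 degrees
nrow(sf::st_coordinates(n))
```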