
Commit

Merge pull request #18 from espm-157/main
update for new redlining data
cboettig authored Dec 19, 2023
2 parents a3f8633 + d2b6c32 commit 83bb998
Showing 7 changed files with 40 additions and 11 deletions.
14 changes: 14 additions & 0 deletions _freeze/contents/earthdata/execute-results/html.json
@@ -0,0 +1,14 @@
{
"hash": "936e34da6da3eac5bbee72becc66c013",
"result": {
"markdown": "---\ntitle: \"NASA EarthData\"\nformat: html\n---\n\n\nThe NASA EarthData program provides access to an extensive collection of spatial data products from each of its 12 Distributed Active Archive Centers ('DAACs') on the high-performance S3 storage system of Amazon Web Services (AWS). We can take advantage of range requests with NASA EarthData URLs, but unlike the previous examples,\nNASA requires an authentication step. NASA offers several different mechanisms, including `netrc` authentication, token-based authentication, and S3 credentials, but only the first of these works equally well from locations both inside and outside of AWS-based compute, so there really is very little reason to learn the other two.\n\nThe [`earthdatalogin` package in R](https://boettiger-lab.github.io/earthdatalogin/) or the `earthaccess` package in Python handle the authentication. The R package sets up authentication behind the scenes using environmental variables.\n\n\n::: {.cell}\n\n```{.r .cell-code}\nearthdatalogin::edl_netrc()\n```\n:::\n\n\n(A default login is supplied though users are encouraged to [register](https://urs.earthdata.nasa.gov/home) for their own individual accounts.) Once this is in place, EarthData's protected URLs can be used like any other: \n\n\n::: {.cell}\n\n```{.r .cell-code}\nterra::rast(\"https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/HLSL30.020/HLS.L30.T56JKT.2023246T235950.v2.0/HLS.L30.T56JKT.2023246T235950.v2.0.SAA.tif\",\n vsi=TRUE)\n```\n\n::: {.cell-output .cell-output-stdout}\n```\nclass : SpatRaster \ndimensions : 3660, 3660, 1 (nrow, ncol, nlyr)\nresolution : 30, 30 (x, y)\nextent : 199980, 309780, 7190200, 7300000 (xmin, xmax, ymin, ymax)\ncoord. ref. : WGS 84 / UTM zone 56N (EPSG:32656) \nsource : HLS.L30.T56JKT.2023246T235950.v2.0.SAA.tif \nname : HLS.L30.T56JKT.2023246T235950.v2.0.SAA \n```\n:::\n:::\n",
"supporting": [],
"filters": [
"rmarkdown/pagebreak.lua"
],
"includes": {},
"engineDependencies": {},
"preserve": {},
"postProcess": true
}
}
4 changes: 2 additions & 2 deletions _freeze/contents/intro/execute-results/html.json

Large diffs are not rendered by default.

Binary file modified _freeze/contents/intro/figure-html/unnamed-chunk-10-1.png
Binary file modified _freeze/contents/intro/figure-html/unnamed-chunk-14-1.png
Binary file modified _freeze/contents/intro/figure-html/unnamed-chunk-15-1.png
19 changes: 17 additions & 2 deletions contents/earthdata.qmd
@@ -3,6 +3,21 @@ title: "NASA EarthData"
format: html
---

- The NASA EarthData program provides access to an extensive collection of spatial data products from each of it's 12 Distributed Active Archive Centers ('DAACs') on the high-performance S3 storage system of Amazon Web Services (AWS).
+ The NASA EarthData program provides access to an extensive collection of spatial data products from each of its 12 Distributed Active Archive Centers ('DAACs') on the high-performance S3 storage system of Amazon Web Services (AWS). We can take advantage of range requests with NASA EarthData URLs, but unlike the previous examples,
+ NASA requires an authentication step. NASA offers several different mechanisms, including `netrc` authentication, token-based authentication, and S3 credentials, but only the first of these works equally well from locations both inside and outside of AWS-based compute, so there really is very little reason to learn the other two.

The [`earthdatalogin` package in R](https://boettiger-lab.github.io/earthdatalogin/) or the `earthaccess` package in Python handle the authentication. The R package sets up authentication behind the scenes using environmental variables.

```{r}
earthdatalogin::edl_netrc()
```
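
Behind the scenes, `edl_netrc()` boils down to writing a standard `.netrc` entry for NASA's login server and pointing GDAL at it. A minimal sketch of the manual equivalent, assuming placeholder credentials and GDAL's `GDAL_HTTP_NETRC` configuration option (the package automates all of this, including cookie persistence):

```r
# Hypothetical manual equivalent of earthdatalogin::edl_netrc() --
# a sketch, not the package's actual implementation.
netrc <- file.path(Sys.getenv("HOME"), ".netrc")
writeLines(c("machine urs.earthdata.nasa.gov",
             "login YOUR_USERNAME",       # placeholder: register your own account
             "password YOUR_PASSWORD"),   # placeholder
           netrc)
Sys.setenv(GDAL_HTTP_NETRC = "YES",                    # let GDAL read ~/.netrc
           GDAL_HTTP_COOKIEFILE = tempfile("cookies"), # EarthData's login redirects
           GDAL_HTTP_COOKIEJAR  = tempfile("cookies")) # require cookie round-trips
```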

(A default login is supplied though users are encouraged to [register](https://urs.earthdata.nasa.gov/home) for their own individual accounts.) Once this is in place, EarthData's protected URLs can be used like any other:

```{r}
terra::rast("https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/HLSL30.020/HLS.L30.T56JKT.2023246T235950.v2.0/HLS.L30.T56JKT.2023246T235950.v2.0.SAA.tif",
vsi=TRUE)
```
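
The `vsi=TRUE` argument routes the request through GDAL's virtual filesystem; assuming the same URL, an equivalent call spells out the `/vsicurl/` prefix by hand:

```r
# Equivalent call using GDAL's /vsicurl/ prefix explicitly instead of vsi=TRUE
url <- "https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/HLSL30.020/HLS.L30.T56JKT.2023246T235950.v2.0/HLS.L30.T56JKT.2023246T235950.v2.0.SAA.tif"
r <- terra::rast(paste0("/vsicurl/", url))  # range requests fetch only the needed bytes
```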

...
14 changes: 7 additions & 7 deletions contents/intro.qmd
@@ -220,8 +220,8 @@ In addition to large scale raster data such as satellite imagery, the analysis o

```{r}
#| message = FALSE, results="hide"
sf <- st_read("/vsicurl/https://dsl.richmond.edu/panorama/redlining/static/downloads/geojson/CASanFrancisco1937.geojson") |>
st_make_valid()
sf <- st_read("/vsicurl/https://dsl.richmond.edu/panorama/redlining/static/citiesData/CASanFrancisco1937/geojson.json") |>
st_make_valid() |> select(-label_coords)
poly <- ndvi |> extract_geom(sf, FUN = mean, reduce_time = TRUE)
sf$NDVI <- poly$NDVI
```
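
The column renames in the remaining hunks follow from this new data source: the `citiesData` release uses `grade` where the old GeoJSON used `holc_grade` (and adds a `label_coords` column, dropped above). A quick sketch for confirming the schema, assuming only the fields visible in this diff:

```r
# Inspect the new citiesData schema; only `grade` and `label_coords`
# are confirmed by this diff -- other columns are whatever the source ships.
library(sf)
url <- "/vsicurl/https://dsl.richmond.edu/panorama/redlining/static/citiesData/CASanFrancisco1937/geojson.json"
redlining <- st_read(url, quiet = TRUE)
names(redlining)        # expect `grade` in place of the old `holc_grade`
table(redlining$grade)  # HOLC grades A-D
```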
@@ -231,7 +231,7 @@

```{python}
ndvi.rio.to_raster(raster_path="ndvi.tif", driver="COG")
- sf_url = "/vsicurl/https://dsl.richmond.edu/panorama/redlining/static/downloads/geojson/CASanFrancisco1937.geojson"
+ sf_url = "/vsicurl/https://dsl.richmond.edu/panorama/redlining/static/citiesData/CASanFrancisco1937/geojson.json"
mean_ndvi = zonal_stats(sf_url, "ndvi.tif", stats="mean")
```
@@ -251,7 +251,7 @@ We plot the underlying NDVI as well as the average NDVI of each polygon, along w
#| message = FALSE, warning = FALSE
tm_shape(ndvi_stars) + tm_raster(col.scale = mako) +
tm_shape(sf) + tm_polygons('NDVI', fill.scale = fill) +
tm_shape(sf) + tm_text("holc_grade", col="darkblue", size=0.6) +
tm_shape(sf) + tm_text("grade", col="darkblue", size=0.6) +
tm_legend_hide()
```

@@ -275,7 +275,7 @@ Are historically redlined areas still less green?
```{r}
sf |>
as_tibble() |>
- group_by(holc_grade) |>
+ group_by(grade) |>
summarise(ndvi = mean(NDVI),
sd = sd(NDVI)) |>
knitr::kable()
@@ -289,9 +289,9 @@ import polars as pl
(gpl.
from_geopandas(sf).
group_by("holc_grade").
group_by("grade").
agg(pl.col("ndvi").mean()).
sort("holc_grade")
sort("grade")
)
```


0 comments on commit 83bb998
