Cal/Val - Multiple validation reports: site reports, quarterly, and yearly #365

Merged Feb 3, 2025 (33 commits; changes shown from 10 commits)

Commits:
0d0096b: Q2 2024 added (awalshie, Dec 18, 2024)
3dbf76e: Fix typo (benji-glitsos-ga, Dec 18, 2024)
a682ae0: Minor punctuation (benji-glitsos-ga, Dec 18, 2024)
abd6df9: Minor formatting (benji-glitsos-ga, Dec 18, 2024)
85bc29f: Minor phrasing (benji-glitsos-ga, Dec 18, 2024)
7d548be: Repetition (benji-glitsos-ga, Dec 18, 2024)
2ba65e9: Some formatting in Introduction (benji-glitsos-ga, Dec 18, 2024)
47958b9: editing and removed obscure acronym (benji-glitsos-ga, Dec 18, 2024)
1ee69b0: Wording (benji-glitsos-ga, Dec 18, 2024)
b8ebfe9: Wording (benji-glitsos-ga, Dec 18, 2024)
1af7a1c: Wording and links (benji-glitsos-ga, Dec 18, 2024)
cb2a59e: Wording (benji-glitsos-ga, Dec 18, 2024)
294b90e: Using SR for Surface Reflectance within sections (benji-glitsos-ga, Dec 18, 2024)
3e65979: Reworded counts line (benji-glitsos-ga, Dec 18, 2024)
ccd783f: Used Calibration/Validation term (benji-glitsos-ga, Dec 18, 2024)
1bbdb64: Capitalising Figure and Table (benji-glitsos-ga, Dec 18, 2024)
af40606: q1 2024 report (awalshie, Dec 19, 2024)
6ecbf2e: 2023 q4 report (awalshie, Dec 20, 2024)
a76130f: 2023 Q3 Report (awalshie, Dec 20, 2024)
6edcf67: update with yearly reports (awalshie, Jan 28, 2025)
21848a5: update with yearly reports (awalshie, Jan 28, 2025)
b04cb30: Removed all .ipynb_checkpoints folders and added to gitignore (benji-glitsos-ga, Jan 28, 2025)
83fb249: Fixes (benji-glitsos-ga, Jan 28, 2025)
fd0a40e: Added Yearly reports to ToC (benji-glitsos-ga, Jan 28, 2025)
e8313c6: Removed hyphen in title date/site (benji-glitsos-ga, Jan 28, 2025)
05631ef: Improving user guides UI text (benji-glitsos-ga, Jan 29, 2025)
4864356: Changed heading (benji-glitsos-ga, Jan 29, 2025)
7160ada: Changed hyphens to spaces (benji-glitsos-ga, Jan 29, 2025)
d142866: Wording (benji-glitsos-ga, Jan 29, 2025)
acbc214: update numbers (awalshie, Feb 2, 2025)
3b11c47: Added yearly thumbnail (benji-glitsos-ga, Feb 3, 2025)
24b2108: update WLC for dual overpass (awalshie, Feb 3, 2025)
e282e7d: fix up WLC dual overpass (awalshie, Feb 3, 2025)

126 changes: 126 additions & 0 deletions docs/validation/quarterly-report/2024-q2/index.md
@@ -0,0 +1,126 @@
# 2024 Q2: DEA Quarterly Validation Report

:::{contents} In this report
:local:
:backlinks: none
:::

## Executive Summary

This Quarterly report summarises validation for DEA surface reflectance products for Quarter 2 (April-June) of 2024
and presents aggregate validation results to the end of this quarter.

* During this quarter, 2 field sites were measured 3 times and can be matched to 3 overpasses.
* Validation results for Landsat 8, Sentinel-2A and Sentinel-2B all improved in accuracy when taking into account the data from this quarter. There were no Landsat 9 overpasses matched during this quarter.
* On an averaged band-by-band basis, Landsat 8 is validated to 2.5%, Landsat 9 is validated to 15% (no new data), Sentinel-2A is validated to 2.4% and Sentinel-2B is validated to 2.5%.

## Introduction

This quarterly report presents a summary of results from Q2 2024 from the Digital Earth
Calibration and Validation team. The report is presented in the following sections:

* Background — this section outlines the context around this work, with particular attention on historical work leading up to this quarter.
* Summary of Validation Work — this section presents an overall picture of the field site measurements undertaken in a table and map.
* Comments on Individual Sites of Interest — this section focuses on any sites where some aspect of the site or measurement was atypical.
* Summary of Band-by-Band Matching — this section presents comparison data for this quarter’s results, in the context of all previous results.
* Comments on How This Quarter’s Work Has Affected Combined Validation Results — this section discusses how the average results for each sensor have changed with the introduction of new validation data from this quarter. This section combines all band data for each platform to show averaged validation results.

The Q2 2024 validation report includes field site measurements that were captured as part of the winter transect work
across South Australia. Note that only one field site measurement for SA1 is part of this report, with other sites appearing in the Q3 2024 validation report.

## Background

The Digital Earth branch within Geoscience Australia offers a suite of Earth observation products, based on data from
both Landsat and Sentinel platforms. The core products are Landsat 8 and 9 and Sentinel-2A and -2B surface reflectance.
To deliver these products with surety, the Calibration and Validation (Cal/Val) team perform vicarious validation
by measuring field sites with hand-held equipment or equipment mounted on an Unstaffed Aerial Vehicle (UAV, commonly known as a drone)
close to the time of an overpass. This work began with Phase 1, where measurements were performed by multiple groups
across continental Australia. Full details on the results and methodology can be found in the Phase 1 report.
Collaborator comment: Do we have a link to the Phase 1 report?


Data for both surface reflectance products and from field site measurements are made freely available. For surface reflectance products, you can visualise
the data at DEA Maps, or for a more in-depth understanding and direct access to data, please visit the DEA Data and
Products page. Field measurement data are made available through the National Spectral Database.
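
For readers who prefer programmatic access, the sketch below shows one way surface reflectance data could be loaded with the Open Data Cube Python API. It is a minimal sketch assuming access to a DEA-configured datacube environment (for example the DEA Sandbox); the product name, band names and site coordinates are illustrative assumptions only and should be checked against the DEA Data and Products page.

```python
# Minimal sketch: loading Landsat 8 surface reflectance over a small window with
# the Open Data Cube API. Assumes a DEA-configured environment; the product and
# measurement names below are assumptions for illustration.
import datacube

dc = datacube.Datacube(app="calval-sr-example")

ds = dc.load(
    product="ga_ls8c_ard_3",                 # assumed DEA Landsat 8 ARD product name
    x=(148.85, 148.88),                      # approximate window near the Mullion site
    y=(-35.13, -35.11),
    time=("2024-04-01", "2024-06-30"),       # Q2 2024
    measurements=["nbart_blue", "nbart_green", "nbart_red"],
    output_crs="EPSG:3577",
    resolution=(-30, 30),
)
print(ds)
```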

As more field sites are measured and as newer measurements are made over the same field sites, the overall validation of
surface reflectance products becomes more accurate. The purpose of this report is to provide an up-to-date status of validation accuracy,
based on the most recent measurements.

## Summary of Validation Work

2 sites were measured, with 3 individual field site captures. The table below summarises these captures.

:::{csv-table} Summary of field site captures
:header-rows: 1

"Site capture (Date, Field site, Overpasses)","Latitude, Longitude (WGS84)","Instrument","Comments"
"<a href='/validation/site-report/2024-04-16-MUL/'>2024-04-16 MUL: S-2B</a>","-35.12280, 148.86258","Hand-held ASD FR-4","Excellent matchup"
"<a href='/validation/site-report/2024-05-21-MUL/'>2024-05-21 MUL: S-2A</a>","-35.12389, 148.86283","Drone mounted SR-3500","Poor matchup in CA and blue bands."
"<a href='/validation/site-report/2024-06-30-SA1/'>2024-06-30 SA1: L8</a>","-31.81348, 140.64083","Drone mounted SR-3500","Nearby cloud noted."
:::

:::{figure} ./2024Q2_Locations.png

The Figure shows the locations of the field sites measured in this quarter.
:::

## Comments on Individual Sites of Interest

No sites of particular interest.

## Summary of Band-by-Band Matching

:::{figure} ./2024Q2-Matchup.png

The Figure shows comparison data for each platform. Black dots represent data that were collected prior to this quarter.
Coloured symbols represent data that were collected in this quarter. The diagonal line in each panel shows the
one-to-one correspondence between field and satellite data. Note that this diagonal line does NOT show the line of best
fit. It is plotted this way to highlight any trends where the data may be biased away from the line of one-to-one
correspondence. The statistics in the bottom-right corner of each panel provide details for the line of best fit
through all points up to and including this quarter’s data.
:::
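
To make the comparison above concrete, the following sketch plots field against satellite reflectance with both a one-to-one line and a line of best fit, in the same spirit as the panels in the Figure. The arrays, fit and scatter statistic are placeholders for illustration, not data from this report.

```python
# Illustrative sketch: field vs satellite reflectance matchups, with the
# one-to-one line (not the best fit) and a least-squares line of best fit.
# All values are placeholders.
import numpy as np
import matplotlib.pyplot as plt

field = np.array([0.05, 0.12, 0.21, 0.33, 0.41])      # field-measured reflectance
satellite = np.array([0.06, 0.11, 0.22, 0.31, 0.43])  # satellite-derived reflectance

slope, intercept = np.polyfit(field, satellite, 1)    # line of best fit
scatter = np.std(satellite - field, ddof=1)           # one simple measure of scatter (illustrative)

fig, ax = plt.subplots()
ax.scatter(field, satellite, label="matchups")
ax.plot([0, 0.5], [0, 0.5], "k--", label="one-to-one (not best fit)")
ax.plot(field, slope * field + intercept, label="line of best fit")
ax.set_xlabel("Field reflectance")
ax.set_ylabel("Satellite reflectance")
ax.legend()
ax.text(0.30, 0.05, f"y = {slope:.2f}x + {intercept:.3f}\nscatter = {scatter:.3f}")
plt.show()
```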

The table below lists overall validation results. These are based on the standard deviation of the scatter that we find
for each band of each sensor. This is when taking all the validation results together, up to and including this quarter’s
results. The band-by-band scatter is representative of the validation performance of each band. Rather than providing
values for each individual band, we characterise all results by looking at the mean and maximum scatter for each
platform.

:::{csv-table} Validation Results
:header-rows: 1

"Satellite platform","Mean band-by-band scatter","Maximum band-by-band scatter"
"Landsat 8","2.5%","3.1%"
"Landsat 9","15%","29%"
"Sentinel-2A","2.4%","2.9%"
"Sentinel-2B","2.5%","4.5%"
:::

The table indicates that, for example, each Landsat 8 band is typically validated to 2-3%, with the worst performance
of a band being 3.1%. Note that there is much larger scatter for Landsat 9, indicating higher uncertainty in validation.
This is because there have been fewer field site measurements to coincide with the relatively new Landsat 9 platform.
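
As a rough illustration of how the summary statistics in the Table could be derived, the sketch below computes the standard deviation of the field-minus-satellite differences for each band, expresses it in percent reflectance, and then reports the mean and maximum across bands for one platform. The band names and residuals are placeholders, assumed for illustration; they are not the measurements behind the Table.

```python
# Illustrative sketch: per-band scatter (standard deviation of field-minus-satellite
# differences, converted to percent reflectance), then the mean and maximum scatter
# across the bands of a single platform. All values are placeholders.
import numpy as np

# Hypothetical matchup residuals (field minus satellite) per band, in reflectance units.
residuals_by_band = {
    "blue":  np.array([0.010, -0.008, 0.012, -0.011]),
    "green": np.array([0.006, -0.009, 0.007, -0.005]),
    "red":   np.array([0.004, -0.006, 0.008, -0.007]),
    "nir":   np.array([0.015, -0.012, 0.018, -0.016]),
}

# Standard deviation of the scatter for each band, expressed as a percentage.
scatter_pct = {band: 100 * np.std(res, ddof=1) for band, res in residuals_by_band.items()}

print(f"Mean band-by-band scatter: {np.mean(list(scatter_pct.values())):.1f}%")
print(f"Maximum band-by-band scatter: {max(scatter_pct.values()):.1f}%")
```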

## Effect on Cumulative Validation Results

This section discusses the effect that this quarter’s validation results have made on the total all-time validation
results.

For Landsat 8, this quarter has seen a slight improvement in validation results. There was 1 field site comparison
measurement, at SA1 on 30 June, 2024. Overall, the field data for Landsat 8 overpasses continue to improve the
validation reliability.

For Landsat 9, this quarter has not seen any change in validation results: there were no field site comparison
measurements. The larger uncertainty for Landsat 9, when compared to Landsat 8 above, is most likely due to the small
number of field site comparisons made to date with the newer Landsat 9 OLI-2 sensor.

For Sentinel-2A, this quarter has seen a slight improvement in validation results. There was 1 field site comparison
measurement at Mullion on 21 May, 2024. This measurement shows an excellent match. Overall, the field data for Sentinel-2A
overpasses continue to improve the validation reliability.

For Sentinel-2B, this quarter has seen a slight improvement in validation results. There was 1 field site comparison
measurement at Mullion on 16 April, 2024.


## Acknowledgments

The field validation data were collected by Geoscience Australia.

43 changes: 21 additions & 22 deletions docs/validation/quarterly-report/2024-q3/index.md
@@ -7,7 +7,7 @@

## Executive Summary

This Quarterly report summarises validation for DEA surface reflectance products for quarter 3 (July-September), 2024
This Quarterly report summarises validation for DEA surface reflectance products for Quarter 3 (July-September) of 2024
and presents aggregate validation results to the end of this quarter.

* During this quarter, 6 field sites were measured 9 times and can be matched to 11 overpasses.
@@ -17,39 +17,39 @@ and presents aggregate validation results to the end of this quarter.

## Introduction

This quarterly report presents a summary of results from Q3 2024 (July-September) from the Digital Earth
Calibration/Validation team. The report is presented in the following sections:
This quarterly report presents a summary of results from Q3 2024 from the Digital Earth
Calibration and Validation team. The report is presented in the following sections:

* Background outlines the context around this work, with particular attention on historical work leading up to this quarter.
* Summary of Validation Work presents an overall picture of the field site measurements undertaken in a table and map.
* Comments on Individual Sites of Interest focuses on any sites where some aspect of the site or measurement was atypical.
* Summary of Band-by-Band Matching presents comparison data for this quarter’s results, in the context of all previous results.
* Comments on How This Quarter’s Work Has Affected Combined Validation Results discusses how the average results for each sensor have changed with the introduction of new validation data from this quarter. This section combines all band data for each platform to show averaged validation results.
* Background &mdash; this section outlines the context around this work, with particular attention on historical work leading up to this quarter.
* Summary of Validation Work &mdash; this section presents an overall picture of the field site measurements undertaken in a table and map.
* Comments on Individual Sites of Interest &mdash; this section focuses on any sites where some aspect of the site or measurement was atypical.
* Summary of Band-by-Band Matching &mdash; this section presents comparison data for this quarter’s results, in the context of all previous results.
* Comments on How This Quarter’s Work Has Affected Combined Validation Results &mdash; this section discusses how the average results for each sensor have changed with the introduction of new validation data from this quarter. This section combines all band data for each platform to show averaged validation results.

The Q3, 2024 validation report includes field site measurements that were captured as part of the winter transect work
The Q3 2024 validation report includes field site measurements that were captured as part of the winter transect work
across South Australia and New South Wales. Note that one field site measurement for SA1 is part of the Q2 2024
validation report and not shown here. No other field site measurements were conducted during this quarter.

## Background

The Digital Earth branch within Geoscience Australia offers a suite of Earth observation products, based on data from
both Landsat and Sentinel platforms. The core products are Landsat 8 and 9 and Sentinel-2A and -2B surface reflectance
(SR). To deliver these products with surety, the Calibration and Validation (Cal/Val) team perform vicarious validation
by measuring field sites with hand-held or Unstaffed Aerial Vehicle (UAV, commonly known as drones)-based equipment
both Landsat and Sentinel platforms. The core products are Landsat 8 and 9 and Sentinel-2A and -2B surface reflectance.
To deliver these products with surety, the Calibration and Validation (Cal/Val) team perform vicarious validation
by measuring field sites with hand-held equipment or equipment mounted on an Unstaffed Aerial Vehicle (UAV, commonly known as a drone)
close to the time of an overpass. This work began with Phase 1, where measurements were performed by multiple groups
across continental Australia. Full details on the results and methodology can be found in the Phase 1 report.

Data for both SR products and from field site measurements are made freely available. For SR products, you can visualise
Data for both surface reflectance products and from field site measurements are made freely available. For surface reflectance products, you can visualise
the data at DEA Maps, or for a more in-depth understanding and direct access to data, please visit the DEA Data and
Products page. Field measurement data are made available through the National Spectral Database.

As more field sites are measured and as newer measurements are made over the same field sites, the overall validation of
SR products becomes more accurate. The purpose of this report is to provide an up-to-date status of validation accuracy,
surface reflectance products becomes more accurate. The purpose of this report is to provide an up-to-date status of validation accuracy,
based on the most recent measurements.

## Summary of Validation Work

6 sites were measured, with 9 individual field site captures. The table below summarises these captures:
6 sites were measured, with 9 individual field site captures. The table below summarises these captures.

:::{csv-table} Summary of field site captures
:header-rows: 1
@@ -109,30 +109,29 @@ The Figure shows comparison data for each platform. Black dots represent data th
Coloured symbols represent data that were collected in this quarter. The diagonal line in each panel shows the
one-to-one correspondence between field and satellite data. Note that this diagonal line does NOT show the line of best
fit. It is plotted this way to highlight any trends where the data may be biased away from the line of one-to-one
correspondence. Statistics, given in the bottom right-hand corner of each panel, show details for the line of best fit
correspondence. The statistics in the bottom-right corner of each panel provide details for the line of best fit
through all points up to and including this quarter’s data.
:::

The table below lists overall validation results. These are based on the standard deviation of the scatter that we find
for each band, for each sensor, when taking all the validation results together, up to, and including, this quarter’s
for each band of each sensor. This is when taking all the validation results together, up to and including this quarter’s
results. The band-by-band scatter is representative of the validation performance of each band. Rather than providing
values for each individual band, we characterise all results by looking at the mean and maximum scatter for each
platform.

:::{csv-table} Validation Results
:header-rows: 1

"Satellite Platform","Mean band-by-band scatter","Maximum band-by-band scatter"
"Satellite platform","Mean band-by-band scatter","Maximum band-by-band scatter"
"Landsat 8","2.4%","3.1%"
"Landsat 9","13.4%","35.9%"
"Sentinel-2A","2.2%","2.7%"
"Sentinel-2B","2.6%","4.4%"
:::

The Table indicates that, for example, each Landsat 8 band is validated to typically 2-3%, with the worst band
performance being 3.1%. Note that there is much larger scatter (ie. uncertainty in validation) for Landsat 9. This is
because there have been fewer field site measurements to coincide with the relatively new Landsat 9 platform.

The table indicates that, for example, each Landsat 8 band is typically validated to 2-3%, with the worst performance
of a band being 3.1%. Note that there is much larger scatter for Landsat 9, indicating higher uncertainty in validation.
This is because there have been fewer field site measurements to coincide with the relatively new Landsat 9 platform.

## Effect on Cumulative Validation Results
