Add first iteration of tests to repo #42
Conversation
Welcome to Codecov 🎉 Once merged to your default branch, Codecov will compare your coverage reports and display the results in this comment. Thanks for integrating Codecov - we've got you covered ☂️
pytz==2023.3
rioxarray
Had to add back rioxarray - xarray used it under the hood after all
tests/data/satellite_ds.pickle
Pickle file of satellite data - to replace with database access later on
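A fixture like this can be written and read with the standard `pickle` module. A minimal self-contained sketch (a plain dict stands in for the repo's real satellite `xarray.Dataset`, and a temporary directory stands in for `tests/data/`):

```python
import pickle
import tempfile
from pathlib import Path

# In the repo the fixture lives at tests/data/satellite_ds.pickle; a
# temporary directory stands in for it here so the sketch runs anywhere.
fixture = Path(tempfile.mkdtemp()) / "satellite_ds.pickle"

# Save the satellite data once (a plain dict stands in for the real
# xarray.Dataset of satellite observations).
satellite_ds = {"time": [0, 1], "ndwi": [0.2, 0.4]}
with open(fixture, "wb") as f:
    pickle.dump(satellite_ds, f)

# Tests can then load the same object back without any database access.
with open(fixture, "rb") as f:
    loaded = pickle.load(f)
```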
    max_cloudcover=90,
    skip_broken_datasets=True,
)
if study_area == "testing":
Use pickle file instead of loading data from datacube if study area == "testing"
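That branching might look like the following sketch; `load_satellite_data`, `load_from_datacube` and the fixture path are illustrative names, not the repo's actual code:

```python
import pickle
import tempfile
from pathlib import Path

def load_satellite_data(study_area, fixture_path, load_from_datacube=None):
    """Return satellite data for a study area.

    For the special "testing" study area, read a pre-saved pickle fixture
    instead of querying the datacube, so tests need no database connection.
    (Illustrative sketch only, not the actual DEA Intertidal code.)
    """
    if study_area == "testing":
        with open(fixture_path, "rb") as f:
            return pickle.load(f)
    return load_from_datacube(study_area)

# Demo: write a stand-in fixture, then load it via the "testing" branch.
fixture = Path(tempfile.mkdtemp()) / "satellite_ds.pickle"
with open(fixture, "wb") as f:
    pickle.dump({"ndwi": [0.2, 0.4]}, f)

ds = load_satellite_data("testing", fixture_path=fixture)
```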
codecov.yaml
Ignore the validation and composites modules in code coverage for now
      - name: Build DEA Intertidal image
        timeout-minutes: 20
        shell: bash
        run: |
          docker-compose build

      - name: Run tests
This downloads tide data from Dropbox (to replace with S3 access to tide modelling files), then runs some tests using `docker compose run`, and finally uploads code coverage to Codecov.
@@ -30,13 +33,33 @@ jobs:
     steps:
       - name: Checkout code
         uses: actions/checkout@v3
+        with:
+          fetch-depth: 0
Maybe makes the git clone faster?
@@ -1,4 +1,4 @@
-name: DEA Intertidal Image Push
+name: Image build and test
Renamed so it shows up nicely on the readme badges
@@ -77,7 +77,7 @@ def extents(
     freq,
     dem,
     corr,
-    land_use_mask="/gdata1/data/land_use/ABARES_CLUM/geotiff_clum_50m1220m/clum_50m1220m.tif",
+    land_use_mask="https://dea-public-data-dev.s3-ap-southeast-2.amazonaws.com/abares_clum_2020/clum_50m1220m.tiff",
Load data from S3 so it works in the tests
@@ -14,8 +14,6 @@
 # from pyproj import Transformer
 # from scipy.signal import argrelmax, argrelmin

-from scipy.interpolate import interp1d
Not used below; removing for now
Looks great Robbi!
This adds some very rudimentary tests to the repo!

Currently they use a small pickle file of satellite data to test against the `elevation` function, and download a zip of tide modelling data. In the future we can improve it by:

Ignore the long list of commits - we can merge this with squash and merge. The ECR sync step will fail on this branch, but hopefully work on `main`.