[1pt] PR: Updating bathymetry-preprocessing script #1125

Merged · 9 commits · Aug 2, 2024
`data/bathymetry/preprocess_bathymetry.py` (15 changes: 14 additions & 1 deletion)
```diff
@@ -52,6 +52,7 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
     bathy_affine = bathy_ft.transform
     bathy_ft = bathy_ft.read(1)
     bathy_ft[np.where(bathy_ft == -9999.0)] = np.nan
+    bathy_ft[np.where(bathy_ft <= 0.0)] = 0.000001
     survey_min_depth = np.nanmin(bathy_ft)

     assert survey_min_depth < min_depth_threshold, (
```
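The added `bathy_ft <= 0.0` line is the change that lets SurveyJobs with negative depth values be preprocessed. A minimal sketch of the masking order, using a hypothetical 2x2 array in place of the survey raster band:

```python
import numpy as np

# Hypothetical stand-in for one band of an eHydro survey raster (feet).
bathy_ft = np.array([[-9999.0, 3.2], [-0.5, 7.8]])

# Replace the nodata sentinel with NaN so nanmin ignores it.
bathy_ft[np.where(bathy_ft == -9999.0)] = np.nan
# Clamp zero/negative depths to a tiny positive value so downstream
# depth calculations only ever see strictly positive depths.
bathy_ft[np.where(bathy_ft <= 0.0)] = 0.000001

survey_min_depth = np.nanmin(bathy_ft)
print(survey_min_depth)  # 1e-06 for this example
```

Note that the order matters: the nodata sentinel (-9999.0) must become NaN before the clamp, otherwise nodata cells would be clamped to 0.000001 and pollute `survey_min_depth`.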
```diff
@@ -81,6 +82,10 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
     zs_area = zs_area.set_crs(nwm_streams.crs)
     zs_area = zs_area.rename(columns={"sum": "missing_volume_m3"})

+    # print("------------------------------")
+    # print(zs_area.ID)
+    # print("------------------------------")
+
     # Derive slope tif
     output_slope_tif = os.path.join(os.path.dirname(tif), 'bathy_slope.tif')
     slope_tif = gdal.DEMProcessing(output_slope_tif, bathy_gdal, 'slope', format='GTiff')
```
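For context on the GDAL call in this hunk: `gdal.DEMProcessing` is the Python binding for the `gdaldem` utility, and the `'slope'` mode writes a slope raster (degrees by default) next to the input. A standalone sketch, assuming a hypothetical input path (`bathy_gdal` in the PR is the already-opened GDAL dataset):

```python
import os
from osgeo import gdal

tif = "/data/bathymetry/survey.tif"  # hypothetical input path
bathy_gdal = gdal.Open(tif)

# Derive a slope GeoTIFF alongside the input raster.
output_slope_tif = os.path.join(os.path.dirname(tif), 'bathy_slope.tif')
slope_tif = gdal.DEMProcessing(output_slope_tif, bathy_gdal, 'slope', format='GTiff')
slope_tif = None  # dereference to flush and close the output dataset
```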
```diff
@@ -115,14 +120,22 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
     )

     # Add survey meta data
-    bathy_nwm_streams['SurveyDateStamp'] = bathy_bounds.loc[0, 'SurveyDateStamp']
+    time_stamp = bathy_bounds.loc[0, 'SurveyDateStamp']
+    time_stamp_obj = str(time_stamp)
+
+    bathy_nwm_streams['SurveyDateStamp'] = time_stamp_obj  # bathy_bounds.loc[0, 'SurveyDateStamp']
     bathy_nwm_streams['SurveyId'] = bathy_bounds.loc[0, 'SurveyId']
     bathy_nwm_streams['Sheet_Name'] = bathy_bounds.loc[0, 'Sheet_Name']
     bathy_nwm_streams["Bathymetry_source"] = 'USACE eHydro'

     # Export geopackage with bathymetry
+    num_streams = len(bathy_nwm_streams)
     bathy_nwm_streams = bathy_nwm_streams.to_crs(epsg=5070)

+    # schema = gpd.io.file.infer_schema(bathy_nwm_streams)
+    # print(schema)
+    # print("---------------------------")
+
     if os.path.exists(output):
         print(f"{output} already exists. Concatinating now...")
         existing_bathy_file = gpd.read_file(output, engine="pyogrio", use_arrow=True)
```
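The `os.path.exists(output)` branch at the end of this hunk is what enables multiple SurveyJobs per NWM feature-id: new rows are appended to the existing geopackage rather than replacing it. A hedged sketch of that append logic as a hypothetical standalone helper (variable names mirror the PR):

```python
import os

import geopandas as gpd
import pandas as pd


def append_bathy_streams(bathy_nwm_streams: gpd.GeoDataFrame, output: str) -> None:
    """Append newly preprocessed survey streams to the output geopackage."""
    if os.path.exists(output):
        existing = gpd.read_file(output, engine="pyogrio", use_arrow=True)
        # Concatenating keeps rows from earlier surveys, so a single NWM
        # feature-id can carry more than one SurveyJob.
        bathy_nwm_streams = gpd.GeoDataFrame(
            pd.concat([existing, bathy_nwm_streams], ignore_index=True),
            crs=existing.crs,
        )
    bathy_nwm_streams.to_file(output, driver="GPKG", engine="pyogrio")
```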
`docs/CHANGELOG.md` (11 changes: 10 additions & 1 deletion)
@@ -1,11 +1,20 @@
All notable changes to this project will be documented in this file.
We follow the [Semantic Versioning 2.0.0](http://semver.org/) format.

## v4.5.4.2 - 2024-08-02 - [PR#1125](https://github.com/NOAA-OWP/inundation-mapping/pull/1125)

This PR updates `preprocess_bathymetry.py` to address three issues: 1) support for preprocessing SurveyJobs that have negative depth values, 2) a change to the SurveyDateStamp format, and 3) support for including multiple SurveyJobs for one NWM feature-id when needed.

### Changes
`data/bathymetry/preprocess_bathymetry.py`: Addresses the three issues above: preprocessing SurveyJobs that have negative depth values, changing the SurveyDateStamp format, and allowing multiple SurveyJobs for one NWM feature-id. A sketch of the SurveyDateStamp change follows.
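As a hedged illustration of the SurveyDateStamp change (the sample value below is hypothetical), the script now casts the raw timestamp to a plain string before writing it, presumably to avoid datetime-serialization issues in the geopackage schema:

```python
import pandas as pd

time_stamp = pd.Timestamp("2024-08-02 00:00:00")  # hypothetical sample value
time_stamp_obj = str(time_stamp)                  # '2024-08-02 00:00:00'
```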


<br/><br/>

## v4.5.4.1 - 2024-08-02 - [PR#1185](https://github.com/NOAA-OWP/inundation-mapping/pull/1185)

This PR brings back the `preprocess_ahps_nws.py` code to FIM4 and generates new AHPS benchmark datasets for sites SXRA2 and SKLA2 in Alaska. The new AHPS benchmark datasets are available on dev1 here: "/dev_fim_share/foss_fim/outputs/ali_ahps_alaska/AHPS_Results_Alaska/19020302/"

This PR closes issue #1130.

To process a new station, follow these steps:
