DJS is an open-source Python library for adjusting parameters and running concurrent WRF-Hydro/National Water Model simulations. It provides tools that simplify model setup and execution and that work across operating systems. The framework consists of two main pieces:
- Perturbation Engine for scaling and stochastically generating model parameter sets.
- Job Scheduler for spawning and managing concurrent model simulations.
Most users should use DJS's Click-based command line interface to access the perturbation engine and job scheduler. The CLI takes YAML input files that tell the perturbation engine and job scheduler which files to work on and how to work on them. See the example and docs for greater detail.
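The general pattern is to point each subcommand at its YAML setup file; both commands are shown in full in the example below:

```bash
# Point each subcommand at its YAML setup file
djs perturbation-engine pe_setup.yml
djs job-scheduler js_setup.yml
```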
Note: if you just want to vary model parameters, you can skip ahead to package installation.
Before installing the package, first install Docker. Windows and macOS users should install Docker Desktop; Linux users should install Docker Engine.
macOS users with Homebrew installed can install Docker like so:
brew install --cask docker
Pull the Docker images from Docker Hub so DJS can use them to run models. All available images can be viewed on Docker Hub.
# For NWM version 2.0
docker pull aaraney/nwm-djs:2.0
# For NWM version 1.2
docker pull aaraney/nwm-djs:1.2
# For WRF-Hydro
docker pull aaraney/wrf-hydro:5.1.1
docker pull aaraney/wrf-hydro:5.1.0
docker pull aaraney/wrf-hydro:5.0.3
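To confirm the pulls succeeded, list the local copies of the images:

```bash
docker images aaraney/nwm-djs
docker images aaraney/wrf-hydro
```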
DJS requires Python 3.5 or higher and is available via pip. It's recommended to install DJS into a virtual environment or a conda environment as a best practice. Personally, I think conda is a little easier, but admittedly a little more bloated.
To create a virtual environment, run the following in a terminal:
python3 -m venv djs-env
# If on Mac or Linux
source djs-env/bin/activate
# If on Windows
.\djs-env\Scripts\activate
To create an environment using conda, run the following in a terminal. You can install Miniconda here.
conda create --name djs python=3.8
conda activate djs
Install DJS using pip:
pip install --upgrade djs
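To verify the install, ask the CLI for help; since the CLI is built on Click, --help works on the top-level command as well as on each subcommand:

```bash
djs --help
djs perturbation-engine --help
djs job-scheduler --help
```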
Files for the NCAR Pocono, PA test case used in this example can be downloaded and extracted if you'd like to follow along.
In this example, a set of varied model parameter files will be created and used to run NWM version 2.0 simulations.
Let's take a look inside pe_setup.yml before varying the parameters. Documentation on the perturbation engine YAML setup file can be found by running djs perturbation-engine --help or in the docs.
# Perturbation Engine setup yml file
primary/DOMAIN/Route_Link.nc:
  - nCC:
      scalar: '* 1.3'
      output: 'nCC_route_link.nc'
primary/DOMAIN/soil_properties.nc:
  - mfsno:
      uniform: True
      output: 'mfsno_soil_properties.nc'
primary/DOMAIN/Fulldom_hires.nc:
  - OVROUGHRTFAC:
      scalar: '- 0.2'
      output: 'OVROUGHRTFAC_fulldom_hires.nc'
  - multi_group:
      output: 'RETDEPRTFAC_OVROUGHRTFAC_fulldom_hires.nc'
      OVROUGHRTFAC:
        norm: False
        scalar: '- 0.3'
      RETDEPRTFAC:
        scalar: '- 0.05'
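If you'd like to inspect a parameter before perturbing it, the ncdump utility that ships with the netCDF tools (installed separately, not part of DJS) can print a variable's values:

```bash
# Print the nCC values that the '* 1.3' scalar above will scale
ncdump -v nCC primary/DOMAIN/Route_Link.nc | tail
```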
For those following along, navigate to the location where you extracted pocono_test_case.zip using the terminal/command prompt. Creating a varied parameter set is as simple as running:
djs perturbation-engine pe_setup.yml
Varied parameter files should now exist in the pocono_test_case directory.
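A quick listing should show the new files named by the output keys from pe_setup.yml:

```bash
ls pocono_test_case
# Expect to see, among the original contents:
# nCC_route_link.nc  mfsno_soil_properties.nc
# OVROUGHRTFAC_fulldom_hires.nc  RETDEPRTFAC_OVROUGHRTFAC_fulldom_hires.nc
```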
First, ensure that Docker is running. Next, let's take a peek at the job scheduler setup file. Documentation on the job scheduler YAML setup file can be found by running djs job-scheduler --help or in the docs.
# Job Scheduler setup yml file
primary: 'primary'
alternative-files:
  -
    - 'nCC_route_link.nc'
    - 'mfsno_soil_properties.nc'
  - 'OVROUGHRTFAC_fulldom_hires.nc'
  - 'RETDEPRTFAC_OVROUGHRTFAC_fulldom_hires.nc'
image: 'aaraney/nwm-djs:2.0'
max-jobs: 2
cpus: '0-2'
mpi-np: 2
To start the simulations run:
djs job-scheduler js_setup.yml
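Since each simulation runs in its own Docker container, standard Docker tooling can be used to monitor progress while the jobs run:

```bash
docker ps      # list the running model containers
docker stats   # live CPU and memory usage per container
```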
Notice above that 'nCC_route_link.nc' and 'mfsno_soil_properties.nc' are nested within an inner list; this tells the job scheduler to use both files within the same simulation, whereas the other files each get their own simulation.
New directories starting with the prefix rep- should appear in the pocono_test_case directory. These directories hold each of the model simulations.
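For example (the names after the rep- prefix are illustrative and depend on the run):

```bash
ls -d pocono_test_case/rep-*
# one rep- directory per model simulation
```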
Contributors: Austin Raney, Iman Maghami, Yenchia Feng