Commit
Use "xsdba" in docs and option (#72)
### Pull Request Checklist:
- [ ] This PR addresses an already opened issue (for bug fixes / features)
  - This PR fixes #xyz
- [x] (If applicable) Documentation has been added / updated (for bug fixes / features).
- [ ] (If applicable) Tests have been added.
- [x] CHANGELOG.rst has been updated (with summary of main changes).
- [x] Link to issue (:issue:`number`) and pull request (:pull:`number`) has been added.

### What kind of change does this PR introduce?

* Updates docs and global options to reflect that `xsdba` is its own module

### Does this PR introduce a breaking change?
Yes:

* Removes the `SDBA_ENCODE_CF` option.
* Renames the `XSDBA_EXTRA_OUTPUT` option to `EXTRA_OUTPUT`.
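The option rename is the main migration concern for users. A minimal sketch of what changes at the call site — the `set_options` below is a simplified stand-in written for illustration, not xsdba's actual implementation; only the option names `extra_output` / `xsdba_extra_output` come from this PR:

```python
# Illustrative stand-in for an options registry with a context manager,
# mimicking the xsdba.set_options call pattern shown in this PR's diff.
from contextlib import contextmanager

OPTIONS = {"extra_output": False}  # key was "xsdba_extra_output" before this PR


@contextmanager
def set_options(**kwargs):
    """Temporarily override options, restoring the old values on exit."""
    old = {k: OPTIONS[k] for k in kwargs}
    OPTIONS.update(kwargs)
    try:
        yield
    finally:
        OPTIONS.update(old)


# After this PR, the shorter key name is used:
with set_options(extra_output=True):
    assert OPTIONS["extra_output"] is True
# Outside the context, the previous value is restored:
assert OPTIONS["extra_output"] is False
```

Code that still passes `xsdba_extra_output=` to `set_options` would need to switch to `extra_output=` after this change.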


### Other information:
coxipi authored Feb 18, 2025
2 parents 5ca6552 + 1574cb9 commit 29423d8
Showing 22 changed files with 510 additions and 344 deletions.
2 changes: 2 additions & 0 deletions CHANGELOG.rst
@@ -20,7 +20,9 @@ Changes
Fixes
^^^^^
* Gave credit to all previous contributors of ``xclim.sdba`` (:issue:`58`, :pull:`59`).
* Pin `sphinx-codeautolink` to fix ReadTheDocs and correct some docs errors (:pull:`40`).
* Removed reliance on the `netcdf4` package for testing purposes. The `h5netcdf` engine is now used for file IO operations (:pull:`71`).
* Updated documentation and options to reflect the new library name `xsdba` (:pull:`72`).

.. _changes_0.2.0:

4 changes: 2 additions & 2 deletions docs/notebooks/advanced_example.ipynb
@@ -314,7 +314,7 @@
"\n",
"<!-- TODO : check xsdba_extra_output works -->\n",
"\n",
- "To fully understand what is happening during the bias-adjustment process, `xsdba` can output _diagnostic_ variables, giving more visibility to what the adjustment is doing behind the scene. This behaviour, a `verbose` option, is controlled by the `xsdba_extra_output` option, set with `xsdba.set_options`. When `True`, `train` calls are instructed to include additional variables to the training datasets. In addition, the `adjust` calls will always output a dataset, with `scen` and, depending on the algorithm, other diagnostics variables. See the documentation of each `Adjustment` objects to see what extra variables are available.\n",
+ "To fully understand what is happening during the bias-adjustment process, `xsdba` can output _diagnostic_ variables, giving more visibility to what the adjustment is doing behind the scene. This behaviour, a `verbose` option, is controlled by the `extra_output` option, set with `xsdba.set_options`. When `True`, `train` calls are instructed to include additional variables to the training datasets. In addition, the `adjust` calls will always output a dataset, with `scen` and, depending on the algorithm, other diagnostics variables. See the documentation of each `Adjustment` objects to see what extra variables are available.\n",
"\n",
"For the moment, this feature is still under construction and only a few `Adjustment` actually provide these extra outputs. Please open issues on the GitHub repo if you have needs or ideas of interesting diagnostic variables.\n",
"\n",
@@ -329,7 +329,7 @@
"source": [
"from xsdba import set_options\n",
"\n",
- "with set_options(xsdba_extra_output=True):\n",
+ "with set_options(extra_output=True):\n",
" QDM = QuantileDeltaMapping.train(\n",
" ref, hist, nquantiles=15, kind=\"+\", group=\"time.dayofyear\"\n",
" )\n",
3 changes: 2 additions & 1 deletion pyproject.toml
@@ -324,7 +324,8 @@ line-ending = "auto"

[tool.ruff.lint]
extend-select = [
- "RUF022" # unsorted-dunder-all
+ "RUF022", # unsorted-dunder-all
+ "D213" # multi-line-summary-second-line
]
ignore = [
"COM", # commas
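Enabling the `D213` rule is what drives the many docstring edits elsewhere in this PR: the summary sentence moves from the opening `"""` line to the line after it. A minimal illustration of the two styles — the function names here are invented for the example:

```python
# D212 style (summary on the first line, the style used before this PR):
def before():
    """Train step on one group.

    Body text.
    """


# D213 style (summary on the second line, enforced after this PR):
def after():
    """
    Train step on one group.

    Body text.
    """


# The visible difference is where the docstring text begins:
assert before.__doc__.startswith("Train step")
assert after.__doc__.startswith("\n")
```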
65 changes: 43 additions & 22 deletions src/xsdba/_adjustment.py
@@ -1,5 +1,6 @@
# pylint: disable=no-value-for-parameter
- """# noqa: SS01
+ """
+ # noqa: SS01
Adjustment Algorithms
=====================
@@ -49,7 +50,8 @@ def dqm_train(
adapt_freq_thresh: str | None = None,
jitter_under_thresh_value: str | None = None,
) -> xr.Dataset:
- """Train step on one group.
+ """
+ Train step on one group.
Parameters
----------
@@ -111,7 +113,8 @@ def eqm_train(
adapt_freq_thresh: str | None = None,
jitter_under_thresh_value: str | None = None,
) -> xr.Dataset:
- """EQM: Train step on one group.
+ """
+ EQM: Train step on one group.
Parameters
----------
@@ -154,7 +157,8 @@ def eqm_train(


def _npdft_train(ref, hist, rots, quantiles, method, extrap, n_escore, standardize):
- r"""Npdf transform to correct a source `hist` into target `ref`.
+ r"""
+ Npdf transform to correct a source `hist` into target `ref`.
Perform a rotation, bias correct `hist` into `ref` with QuantileDeltaMapping, and rotate back.
Do this iteratively over all rotations `rots` and conserve adjustment factors `af_q` in each iteration.
@@ -212,7 +216,8 @@ def mbcn_train(
extrapolation: str,
n_escore: int,
) -> xr.Dataset:
- """Npdf transform training.
+ """
+ Npdf transform training.
Adjusting factors obtained for each rotation in the npdf transform and conserved to be applied in
the adjusting step in :py:func:`mcbn_adjust`.
@@ -299,7 +304,8 @@ def mbcn_train(


def _npdft_adjust(sim, af_q, rots, quantiles, method, extrap):
- """Npdf transform adjusting.
+ """
+ Npdf transform adjusting.
Adjusting factors `af_q` obtained in the training step are applied on the simulated data `sim` at each iterated
rotation, see :py:func:`_npdft_train`.
@@ -351,7 +357,8 @@ def mbcn_adjust(
adj_kws: dict,
period_dim: str | None,
) -> xr.DataArray:
- """Perform the adjustment portion MBCn multivariate bias correction technique.
+ """
+ Perform the adjustment portion MBCn multivariate bias correction technique.
The function :py:func:`mbcn_train` pre-computes the adjustment factors for each rotation
in the npdf portion of the MBCn algorithm. The rest of adjustment is performed here
@@ -425,7 +432,7 @@ def mbcn_adjust(
scen_block = xr.zeros_like(sim[{"time": ind_gw}])
for iv, v in enumerate(sim[pts_dims[0]].values):
sl = {"time": ind_gw, pts_dims[0]: iv}
- with set_options(xsdba_extra_output=False):
+ with set_options(extra_output=False):
ADJ = base.train(
ref[sl], hist[sl], **base_kws_vars[v], skip_input_checks=True
)
@@ -470,7 +477,8 @@ def mbcn_adjust(
def qm_adjust(
ds: xr.Dataset, *, group: Grouper, interp: str, extrapolation: str, kind: str
) -> xr.Dataset:
- """QM (DQM and EQM): Adjust step on one block.
+ """
+ QM (DQM and EQM): Adjust step on one block.
Parameters
----------
@@ -517,7 +525,8 @@ def dqm_adjust(
extrapolation: str,
detrend: int | PolyDetrend,
) -> xr.Dataset:
- """DQM adjustment on one block.
+ """
+ DQM adjustment on one block.
Parameters
----------
@@ -576,7 +585,8 @@ def dqm_adjust(

@map_blocks(reduces=[Grouper.PROP, "quantiles"], scen=[], sim_q=[])
def qdm_adjust(ds: xr.Dataset, *, group, interp, extrapolation, kind) -> xr.Dataset:
- """QDM: Adjust process on one block.
+ """
+ QDM: Adjust process on one block.
Parameters
----------
@@ -605,7 +615,8 @@ def qdm_adjust(ds: xr.Dataset, *, group, interp, extrapolation, kind) -> xr.Data
hist_thresh=[Grouper.PROP],
)
def loci_train(ds: xr.Dataset, *, group, thresh) -> xr.Dataset:
- """LOCI: Train on one block.
+ """
+ LOCI: Train on one block.
Parameters
----------
@@ -631,7 +642,8 @@ def loci_train(ds: xr.Dataset, *, group, thresh) -> xr.Dataset:

@map_blocks(reduces=[Grouper.PROP], scen=[])
def loci_adjust(ds: xr.Dataset, *, group, thresh, interp) -> xr.Dataset:
- """LOCI: Adjust on one block.
+ """
+ LOCI: Adjust on one block.
Parameters
----------
@@ -652,7 +664,8 @@ def loci_adjust(ds: xr.Dataset, *, group, thresh, interp) -> xr.Dataset:

@map_groups(af=[Grouper.PROP])
def scaling_train(ds: xr.Dataset, *, dim, kind) -> xr.Dataset:
- """Scaling: Train on one group.
+ """
+ Scaling: Train on one group.
Parameters
----------
@@ -670,7 +683,8 @@ def scaling_train(ds: xr.Dataset, *, dim, kind) -> xr.Dataset:

@map_blocks(reduces=[Grouper.PROP], scen=[])
def scaling_adjust(ds: xr.Dataset, *, group, interp, kind) -> xr.Dataset:
- """Scaling: Adjust on one block.
+ """
+ Scaling: Adjust on one block.
Parameters
----------
@@ -686,7 +700,8 @@ def scaling_adjust(ds: xr.Dataset, *, group, interp, kind) -> xr.Dataset:


def npdf_transform(ds: xr.Dataset, **kwargs) -> xr.Dataset:
- r"""N-pdf transform : Iterative univariate adjustment in random rotated spaces.
+ r"""
+ N-pdf transform : Iterative univariate adjustment in random rotated spaces.
Parameters
----------
@@ -836,7 +851,8 @@ def extremes_train(
dist,
quantiles: np.ndarray,
) -> xr.Dataset:
- """Train extremes for a given variable series.
+ """
+ Train extremes for a given variable series.
Parameters
----------
@@ -901,7 +917,8 @@ def extremes_adjust(
extrapolation: str,
cluster_thresh: float,
) -> xr.Dataset:
- """Adjust extremes to reflect many distribution factors.
+ """
+ Adjust extremes to reflect many distribution factors.
Parameters
----------
@@ -964,7 +981,8 @@ def _otc_adjust(
jitter_inside_bins: bool = True,
normalization: str | None = "max_distance",
):
- """Optimal Transport Correction of the bias of X with respect to Y.
+ """
+ Optimal Transport Correction of the bias of X with respect to Y.
Parameters
----------
@@ -1063,7 +1081,8 @@ def otc_adjust(
adapt_freq_thresh: dict | None = None,
normalization: str | None = "max_distance",
):
- """Optimal Transport Correction of the bias of `hist` with respect to `ref`.
+ """
+ Optimal Transport Correction of the bias of `hist` with respect to `ref`.
Parameters
----------
@@ -1162,7 +1181,8 @@ def _dotc_adjust(
kind: dict | None = None,
normalization: str | None = "max_distance",
):
- """Dynamical Optimal Transport Correction of the bias of X with respect to Y.
+ """
+ Dynamical Optimal Transport Correction of the bias of X with respect to Y.
Parameters
----------
@@ -1292,7 +1312,8 @@ def dotc_adjust(
adapt_freq_thresh: dict | None = None,
normalization: str | None = "max_distance",
):
- """Dynamical Optimal Transport Correction of the bias of X with respect to Y.
+ """
+ Dynamical Optimal Transport Correction of the bias of X with respect to Y.
Parameters
----------
9 changes: 6 additions & 3 deletions src/xsdba/_processing.py
@@ -1,4 +1,5 @@
- """# noqa: SS01
+ """
+ # noqa: SS01
Compute Functions Submodule
===========================
@@ -107,7 +108,8 @@ def _normalize(
dim: Sequence[str],
kind: str = ADDITIVE,
) -> xr.Dataset:
- """Normalize an array by removing its mean.
+ """
+ Normalize an array by removing its mean.
Parameters
----------
@@ -143,7 +145,8 @@ def _normalize(

@map_groups(reordered=[Grouper.DIM], main_only=False)
def _reordering(ds: xr.Dataset, *, dim: str) -> xr.Dataset:
- """Group-wise reordering.
+ """
+ Group-wise reordering.
Parameters
----------

