Add aggregate/downsample, upsample, extend and reduce methods to AreaDefinition and SwathDefinition #413
base: main
Conversation
There are 238 errors:

pyresample/geometry.py

    raise ValueError('x and y arguments must be positive integers.')
    if x >= np.floor(width / 2):
        max_x = int(np.floor(width / 2)) - 1
        raise ValueError("""You can at maximum reduce the along-track direction (x)

W291 trailing whitespace

pyresample/geometry.py

    def extend(self, x=0, y=0):
        """Extend the swath definition along x (along-track) and y (across-track) dimensions.

W293 blank line contains whitespace

pyresample/geometry.py

    def _convert_2D_array(arr, to, dims=None):
        """
        Convert a 2D array to a specific format.

W293 blank line contains whitespace

pyresample/geometry.py

    def _get_extended_lonlats(lon_start, lat_start, lon_end, lat_end, npts, transpose=True):
        """Utils employed by SwathDefinition.extend.

W293 blank line contains whitespace

pyresample/geometry.py

    else:
        raise NotImplementedError

    def _get_extended_lonlats(lon_start, lat_start, lon_end, lat_end, npts, transpose=True):

E302 expected 2 blank lines, found 1
pre-commit.ci autofix
for more information, see https://pre-commit.ci
Codecov Report

    @@            Coverage Diff             @@
    ##             main     #413      +/-   ##
    ==========================================
    + Coverage   93.86%   94.18%    +0.31%
    ==========================================
      Files          65       65
      Lines       11125    11706      +581
    ==========================================
    + Hits        10443    11025      +582
    + Misses        682      681        -1
Thank you very much for this PR. I can see that a lot of work went into this, and I am grateful for that.
In general, I think there is a lot of duplication of code or functionality that could be avoided, and we need to be careful about not breaking backwards compatibility. Pyresample is used by many, so we have to be very careful when deprecating functions, for example.
Some more detailed comments inline.
Thanks again for the hard work!
pyresample/geometry.py

    # Check input validity
    x = int(x)
    y = int(y)
    if x == 1 and y == 1:
        return self
    if x < 1 or y < 1:
        raise ValueError('x and y arguments must be positive integers larger or equal to 1.')
    # Downsample
This is duplicated code in the following function. However, why do we need to make sure these are integers? I think downsampling by a factor of 1.5 is a perfectly valid use case...
Not sure what you mean by splitting a pixel by 1.5. The idea of this function is to split each pixel into x*y pixels, so it requires integers.
I'm thinking of the case where I have an AreaDefinition that is 1500x1500 pixels and I want to downsample it to 1000x1000 pixels.
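For illustration, a minimal sketch of what such a fractional factor could look like, reusing the copy(height=..., width=...) override pattern this PR already uses in upsample. The area parameters below are placeholders, not from the PR:

    from pyresample.geometry import AreaDefinition

    # Placeholder 1500x1500 area; projection and extent are arbitrary.
    area_def = AreaDefinition('test', 'test', 'test',
                              {'proj': 'merc', 'lon_0': 0},
                              1500, 1500,
                              (-1500000.0, -1500000.0, 1500000.0, 1500000.0))

    # "Downsampling by 1.5" means fewer pixels over the same extent.
    factor = 1.5
    new_area = area_def.copy(height=int(area_def.height / factor),
                             width=int(area_def.width / factor))  # 1000x1000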
pyresample/geometry.py

    width = int(self.width * x)
    height = int(self.height * y)
    return self.copy(height=height, width=width)
This function is a close duplicate of the previous one. Could we instead have

    def upsample(self, x=1, y=1):
        return self.downsample(1 / x, 1 / y)

?
Done.
pyresample/geometry.py

    @@ -1423,10 +1901,119 @@ def copy(self, **override_kwargs):

    def aggregate(self, **dims):
        """Return an aggregated version of the area."""
        warnings.warn("'aggregate' is deprecated, use 'downsample' or 'upsample' instead.", PendingDeprecationWarning)
Why is this deprecated? Is there a problem with this method?
No, it was just to have method-name consistency with SwathDefinition. If the input arguments are -1 < x < 1, the method currently fails later on when calling AreaDef methods. downsample is the opposite of upsample, while the opposite of aggregate would be disaggregate. Thinking in image-processing terms, up/down sampling is more straightforward in my opinion. Should I remove the DeprecationWarning?
If aggregate is breaking stuff it should definitely be fixed. I'm not against having upsample/downsample, but if aggregate and downsample are synonyms, one should call the other.
Fixed
    @@ -1872,26 +1872,357 @@ def test_compute_optimal_bb(self):
            assert_np_dict_allclose(res.proj_dict, proj_dict)
            self.assertEqual(res.shape, (6, 3))

    -    def test_aggregation(self):
    +    def test_downsampling(self):
These test functions are way too long and have a lot of duplicated code. Please refactor these, and try to stick with the principle that one test function should assert one thing.
Can you elaborate a bit more? I am new to tests. Should I remove all the testing of different array types? The other tests were necessary to not reduce coverage ...
You don't need to remove anything; everything you test here is valid, of course. What I mean is that it's easier to read and debug if you keep the tests small. So this test function should be split into many smaller test functions, for example test_downsampling_keeps_ndarrays that would just do assert isinstance(res_np.lats, np.ndarray), then another function called e.g. test_downsampling_keeps_DataArray with assert isinstance(res_xr.lats, xr.DataArray), etc.
All the code that would be duplicated between the smaller tests should be put in a common setup method, so you might need to create more test cases.
I'm advising that each test should just test one behaviour of the function you are testing.
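A minimal pytest-style sketch of that layout. The class name matches the one mentioned below, but the fixture values are made up, and SwathDefinition.downsample is the method this PR proposes:

    import numpy as np
    import xarray as xr

    from pyresample.geometry import SwathDefinition


    class TestSwathDefinitionDownsampling:
        """One behaviour per test; shared construction lives in setup_method."""

        def setup_method(self):
            # Illustrative fixtures: small numpy- and xarray-backed swaths.
            lons = np.linspace(0, 10, 100).reshape(10, 10)
            lats = np.linspace(40, 50, 100).reshape(10, 10)
            self.swath_np = SwathDefinition(lons, lats)
            self.swath_xr = SwathDefinition(
                xr.DataArray(lons, dims=("y", "x")),
                xr.DataArray(lats, dims=("y", "x")))

        def test_downsampling_keeps_ndarrays(self):
            res_np = self.swath_np.downsample(x=2, y=2)
            assert isinstance(res_np.lats, np.ndarray)

        def test_downsampling_keeps_DataArray(self):
            res_xr = self.swath_xr.downsample(x=2, y=2)
            assert isinstance(res_xr.lats, xr.DataArray)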
I tried to follow your suggestions @mraspaud. Have a look at the TestSwathDefinitionDownsampling class in test_geometry and tell me if that is what you expect. I'll wait for your feedback ... and then I will adapt the rest of the tests too. It is not clear to me whether I need to create a test class for each method in the end ...
pyresample/geometry.py

    geod = pyproj.Geod(ellps='sphere')  # TODO: sphere or WGS84?
    # geod = pyproj.Geod(ellps='WGS84')
Why not have this as a kwarg to let the user choose?
Done
pyresample/geometry.py

    if self.is_geostationary:
        raise NotImplementedError("'extend' method is not implemented for GEO AreaDefinition.")
why can't the area extents of a geos area def be extended?
We can extend the shape of the area, but this might not affect the actual disc within. In the case of a full disc geo area, it just adds inf rows/columns around the original full disc ...
In the case where we want to shrink/reduce the full disc geo area, doing area_to_cover.shrink(x=1000, y=1000) or area_to_cover[1000:-1000, 1000:-1000] results in the attached image.
I am not sure this is the expected behaviour ...
I also replaced the x and y arguments with left, right, bottom, top arguments to be consistent with the SwathDefinition methods.
Also, left, right, bottom, top do not make a lot of sense with a geos disc area ...
Also, this review took me more than 2 hours; in the future, please consider making more incremental changes in each PR 😅
Better leave them open if it's not trivial, so we can have another look after it's fixed and approve.
@ghiggi You marked a ton of stuff as done, but I don't see any new commits. Did you maybe forget to push?
I reviewed what I could of the main code. I skipped over the tests as I see that they are pretty long, and I had larger-scale questions/comments on the main code. I think you have some really good ideas in this PR. I'm not sure I understand everything you're going for, but that's where I asked questions. I'm a little worried about performance in the different array cases and about error handling/logging when a user doesn't have xarray or dask installed but the method requires it.
Part of me thinks the upsample/downsample stuff should be put in a separate module as a generic geocentric (or geographic?) upsample/downsample pair of functions. The methods in SwathDefinition could then be simple calls to these functions and keep the amount of code in geometry.py down a bit (it is already way too long before this PR).
    Docs: https://pyproj4.github.io/pyproj/stable/advanced_examples.html#multithreading
    """
    if float(pyproj.__version__[0:3]) >= 3.1:
Is there a reason you have to convert this to a float? Won't string comparison work fine? Otherwise, it may be better to use the packaging library's version parsing and Version object: https://packaging.pypa.io/en/latest/version.html. If I remember correctly, it is a dependency of setuptools, which we also depend on, so it should be fine dependency-wise.
Lastly, @mraspaud, I'd be OK with just doing a hard require on pyproj >= 3.1 for pyresample. It is 6 months old and pyproj moves pretty fast. I don't think it is too much to ask for the user to use a new-ish version. Granted, we are currently only limiting to pyproj 2.2.
What if pyproj gets to version 3.10? In floats, that's 3.1...
@djhoese I could live with that too.
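A small sketch of the suggested packaging-based check; it avoids both the float pitfall pointed out above and plain-string comparison (which would also rank "3.10" below "3.2"). The version numbers are only for demonstration:

    from packaging.version import Version

    import pyproj

    # float("3.10"[0:3]) == 3.1, and "3.10" < "3.2" as plain strings;
    # Version objects compare release components numerically instead.
    assert Version("3.10") > Version("3.2")

    if Version(pyproj.__version__) >= Version("3.1"):
        from pyproj import Transformer  # the faster pyproj >= 3.1 path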
""" | ||
if float(pyproj.__version__[0:3]) >= 3.1: | ||
from pyproj import Transformer | ||
transformer = Transformer.from_crs(src.crs, dst.crs) |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
We may want to use the always_xy=True flag for either from_crs or transform. @mraspaud, since you most recently ran into this, what do you think?
Yes, for our use cases, always_xy=True is a must.
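For reference, a minimal example of the flag in question (the CRS choices and coordinates are arbitrary):

    from pyproj import Transformer

    # always_xy=True forces (lon, lat) in / (x, y) out regardless of the
    # axis order declared by each CRS.
    transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
    x, y = transformer.transform(8.5, 47.4)  # longitude, latitude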
    import dask.array as da
    import pyproj
    import xarray as xr
DataArray and da are already imported at the top of the module. Can we use those instead? And/or maybe have an if statement here: if the method requires xarray and/or dask, it can raise an ImportError saying that downsample requires those.
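A sketch of what that guard could look like; the helper name and error message wording are made up, not from the PR:

    def _require_xarray_and_dask(method_name):
        """Raise an informative ImportError if the optional deps are missing."""
        try:
            import dask.array  # noqa: F401
            import xarray  # noqa: F401
        except ImportError as err:
            raise ImportError(
                f"'{method_name}' requires the optional dependencies "
                "'xarray' and 'dask'.") from err


    # Called at the top of downsample/upsample before any conversion work:
    _require_xarray_and_dask("downsample")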
    to : TYPE
        The desired array output format.
        Accepted formats are: ['Numpy', 'Dask', 'DataArray_Numpy', 'DataArray_Dask']
I really don't like this custom sub-language for this parameter. I would rather have separate explicit functions that the user knows they have to call. The function can convert from what it was given to what is needed, but we can avoid all this complex string parsing.
To add to this and clarify a little: you could actually use class objects for this (np.ndarray, da.Array, xr.DataArray, etc.). Or even better (at least that's my current feeling), do the split functions like I mentioned and, now that I realize you return the input type, you could instead return the function needed to reverse the conversion.
Or... you could do something like a decorator on the methods that replaces your use of these functions, so that the internal implementation of the methods only deals with the object types it wants. All the conversion complexity is kept outside of the internals/implementation. I do realize that there are some edge cases where a conversion would be done when nothing was really modified in the data, but if done correctly that shouldn't be too big of a deal.
These are just ideas. I can definitely be convinced otherwise, but the new string names for these things feel like the wrong approach.
OK, I will take some time to think about your suggestions and try implementing them. Just to be sure @djhoese:

- You mean to pass as the function argument the class object (np.ndarray, da.Array, xr.DataArray) we want, instead of a string acronym, right?
- Should I add a chunked/lazy=True/False argument to decide whether to return an xr.DataArray backed by dask or by numpy? Or default to an xr.DataArray backed by numpy?
Passing the class object was one option. My separate-functions idea is still what I'd prefer: for example a to_data_array_dask, to_data_array_numpy, to_numpy, or to_dask. Then the return values of these would be (converted_obj, to_numpy), depending on what it detected was given.
Although I'm not sure how xarray coords work in a situation like this. If you gave it a DataArray, it converted to numpy, then converted back to a DataArray, you wouldn't know the dimension names or the coordinates or any attributes. Xarray typically loses attributes when you do some operation, so it's not a huge deal to mimic that, but coords and dims would be something the main function wouldn't be aware of. I guess that's another reason a decorator would be nice for this.
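A rough sketch of the (converted_obj, restore_func) idea. The function name and exact behavior are illustrative, and, as noted above, DataArray dims/coords/attrs are not preserved on the way back:

    import dask.array as da
    import numpy as np
    import xarray as xr


    def to_numpy(arr):
        """Return (numpy_array, restore), where restore() maps a numpy
        result back to the caller's original array type."""
        if isinstance(arr, xr.DataArray):
            # Dims, coords and attrs are lost on the round trip (see above).
            return arr.values, lambda res: xr.DataArray(res)
        if isinstance(arr, da.Array):
            return arr.compute(), lambda res: da.from_array(res)
        return np.asarray(arr), lambda res: res


    lons = xr.DataArray(np.zeros((2, 2)), dims=("y", "x"))
    arr, restore = to_numpy(lons)
    result = restore(arr + 1)  # back to a (bare) DataArray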
"""Downsample the SwathDefinition along x (columns) and y (lines) dimensions. | ||
|
||
Builds upon xarray.DataArray.coarsen averaging function. | ||
To downsample of a factor of 2, call swath_def.downsample(x=2, y=2) |
Suggested change:

    -    To downsample of a factor of 2, call swath_def.downsample(x=2, y=2)
    +    To downsample of a factor of 2, call ``swath_def.downsample(x=2, y=2)``.
    Builds upon xarray.DataArray.coarsen averaging function.
    To downsample of a factor of 2, call swath_def.downsample(x=2, y=2)
    swath_def.downsample(x=1, y=1) simply returns the current swath_def.
Suggested change:

    -    swath_def.downsample(x=1, y=1) simply returns the current swath_def.
    +    ``swath_def.downsample(x=1, y=1)`` simply returns the current swath_def.
warnings.warn("'aggregate' is deprecated, use 'downsample' instead.", PendingDeprecationWarning) | ||
return self.downsample(x=dims.get('x', 1), y=dims.get('y', 1)) | ||
|
||
def downsample(self, x=1, y=1, **kwargs): |
Could you add a google-style docstring and explain what kwargs is?
    res = xr.DataArray(res, dims=['y', 'x', 'xyz'], coords=src_lons.coords)

    # Aggregating
    res = res.coarsen(x=x, y=y, **kwargs).mean()
This should probably wait for a separate PR, but it'd be nice if mean were a keyword argument that the user could control.
I can also do this. There is a small set of possible functions that can be called on an xarray.core.rolling.DataArrayCoarsen object. I can add a default argument fun="mean" and then do:

    res = res.coarsen(x=x, y=y, **kwargs)
    func = getattr(res, fun)
    res = func()

Should I check for valid fun values in {"min", "max", "sum", "std", "var", "count", "mean", "median"}? Or defer this to the xarray function?
I'd say add an explicit keyword argument in the method signature, coarsen_func or something like that, and document in the docstring to look at the xarray docs (link to the exact page if possible; intersphinx may allow you to do something like

    See the :doc:`Xarray coarsen documentation <xarray:coarsen-docs-page>` for more information.

). And I think you could leave the error handling up to the getattr. You could shorten this to:

    coarsen_res = res.coarsen(...)
    res = getattr(coarsen_res, coarsen_func)()

This assumes the coarsen function doesn't take additional kwargs.
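Putting those pieces together, a runnable sketch of the suggested coarsen_func keyword, written as a standalone helper instead of the method so it is self-contained. The helper name and example data are made up:

    import numpy as np
    import xarray as xr


    def coarsen_reduce(arr, x=1, y=1, coarsen_func="mean", **kwargs):
        """Reduce x-by-y pixel blocks of a 2D DataArray.

        coarsen_func is the name of any DataArrayCoarsen reduction
        ("mean", "max", "median", ...); invalid names fail in getattr.
        """
        coarsened = arr.coarsen(x=x, y=y, **kwargs)
        return getattr(coarsened, coarsen_func)()


    data = xr.DataArray(np.arange(16.0).reshape(4, 4), dims=("y", "x"))
    means = coarsen_reduce(data, x=2, y=2)                      # 2x2 block means
    maxes = coarsen_reduce(data, x=2, y=2, coarsen_func="max")  # 2x2 block maxima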
    def extend(self, left=0, right=0, bottom=0, top=0):
        """Extend the SwathDefinition of n pixels on specific boundary sides.

        By default, it does not extend on any side.
        """
You may have had this discussion with Martin in previous comment sections, but what is this method used for? What is the real world use case?
Suggested change:

         def extend(self, left=0, right=0, bottom=0, top=0):
    -        """Extend the SwathDefinition of n pixels on specific boundary sides.
    +        """Extend the SwathDefinition by n pixels on specific boundary sides.

             By default, it does not extend on any side.
             """
I am currently implementing a pipeline for collocating and remapping GEO imagery onto a LEO sensor swath.
It's very useful to actually have GEO imagery outside the original LEO swath for downstream image processing/ML applications.
As an example, having a LEO sensor with a 5 km footprint resolution, I first extend the LEO SwathDefinition by N pixels on both sides along-track, then I upsample the SwathDefinition by a factor of 5 to match the 1 km resolution of the GEO imagery: swathdef.extend(top=N, bottom=N).upsample(x=5, y=5)
So the end result is a SwathDefinition that is the same relative resolution as the GEO data, and because of the along-track extend the SwathDefinition is now "longer". Is that useful because the resampled GEO data then has an overlap with multiple LEO swath definitions (multiple individual granules of the LEO data)? Or do you just want as much GEO data in the area of the LEO data? If so, then why not extend it cross-track too?
My non-ML-experienced instinct would be to resample the GEO data directly to the LEO swath, but I suppose that loses too much of the GEO data in the preparation for the ML stages rather than keeping that data for the ML stages to work with. Do you also resample the LEO data to the 1 km SwathDefinition? Or just upsample it?
I stupidly mentioned that I am extending along-track, while I am actually extending the SwathDefinition cross-track ;)
I essentially want to have neighbor information around the swath and upscale it:

- in order to not lose potential fine-scale information available from the GEO data (i.e. overshooting tops, inhomogeneity/non-uniform beam-filling degree, ...).
- and so that, if I apply a specific series of convolution/pooling operations without padding (i.e. with a CNN), which cause the extended and upscaled SwathDefinition grids to shrink, I can come up with an output that is aligned with the original LEO SwathDefinition.
Does the ML code handle the geolocation of the data at all? Or are you telling it (in some way) that the top-left 2x2 pixels of the resampled GEO pixels represent pixel [0, 0] of the LEO data? Or, I guess, in this extended case the extrapolated pixels of the GEO data don't have any corresponding LEO pixel.
If the ML has a concept of the location of the data, then why not simply slice the GEO data to the bounds of the LEO data instead of resampling?
Note: I'm not trying to argue against this extend method, I just really want to understand how it is used so I can argue for/against changes in the future.
    # Retrieve new centroids
    # TODO: make it dask compatible using _upsample_centroid_dask [HELP WANTED]
    # res1 = da.apply_along_axis(_upsample_centroid_dask,
    #                            2,
@djhoese, have a look at this portion of code when you have half an hour. I am not sure what the best way to use dask here is.
I do not fully understand the logic here, but I understand the overall goal. To summarize: we have an overall array of (M rows, N columns) and we want to produce a final array with a shape of (M * y, N * x), right? To do this we need to convert from lon/lat to x/y/z geocentric, do some math, then convert x/y/z back to lon/lat. In the "do some math" portion, do we need the values of all the pixels next to the current pixel? Or can we operate on groups of y by x pixels?
Given what I've learned about the number of tasks in a dask graph, I think it would be best to do this entire thing in a giant map_blocks call. By "entire thing" I mean lon/lat -> x/y/z, the math, x/y/z -> lon/lat. If you need to work on groups of y by x pixels, then you could make sure that the input chunks are divisible by those numbers (2x2 -> 1024x1024 would be ok). We don't want to make really small chunks. If you need overlap between the computations (in the case where each pixel needs to know about all the values around it), then map_overlap may be something to look into: https://docs.dask.org/en/stable/array-overlap.html
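A toy sketch of that single-map_blocks layout. The real centroid interpolation is replaced here by a plain repeat, and the helper name, chunk sizes, and the 2x factor are all made up; it only illustrates keeping lon/lat -> x/y/z, the math, and x/y/z -> lon/lat inside one dask task per block:

    import dask.array as da
    import numpy as np


    def _upsample_block(lons, lats, x=2, y=2):
        # lon/lat -> geocentric x/y/z on the unit sphere.
        lon_r, lat_r = np.deg2rad(lons), np.deg2rad(lats)
        xyz = np.stack([np.cos(lat_r) * np.cos(lon_r),
                        np.cos(lat_r) * np.sin(lon_r),
                        np.sin(lat_r)], axis=-1)
        # Stand-in for the real centroid interpolation math.
        xyz = xyz.repeat(y, axis=0).repeat(x, axis=1)
        # x/y/z -> lon/lat, still inside the same task.
        new_lons = np.rad2deg(np.arctan2(xyz[..., 1], xyz[..., 0]))
        new_lats = np.rad2deg(np.arcsin(xyz[..., 2]))
        return np.stack([new_lons, new_lats])


    lons = da.random.random((512, 512), chunks=256) * 360 - 180
    lats = da.random.random((512, 512), chunks=256) * 180 - 90
    res = da.map_blocks(_upsample_block, lons, lats, x=2, y=2,
                        new_axis=0, chunks=(2, 512, 512), dtype=float)
    new_lons, new_lats = res[0], res[1]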
    def upsample(self, x=1, y=1):
        """Upsample the SwathDefinition along x (columns) and y (lines) dimensions.

        To upsample of a factor of 2 (each pixel splitted in 2x2 pixels),
Suggested change:

    -        To upsample of a factor of 2 (each pixel splitted in 2x2 pixels),
    +        To upsample by a factor of 2 (each pixel split into 2x2 pixels),
This PR aims to provide methods to aggregate/downsample, upsample, extend and reduce AreaDefinition and SwathDefinition objects.
For SwathDefinition objects, it returns a new object with the same input lats/lons type (numpy/dask/xr.DataArray).

- Closes: SwathDef.aggregate bugs and enhancements #407
- Passes flake8: git diff origin/main **/*py | flake8 --diff

Yet to be done/defined:

- Keep the downsample method and the deprecation warning added to AreaDefinition.aggregate?
- Keep pyproj.transform instead of the more efficient pyproj.Transformer in SwathDefinition._do_transform?
- For SwathDefinition.extend, which pyproj.Geod do we define: pyproj.Geod(ellps='sphere') or pyproj.Geod(ellps='WGS84')?
- Make SwathDefinition.upsample dask-compatible. I need some help/guidance to finish out the work. See between L828-L850 and L870-L880 of geometry.py.
- SwathDefinition.upsample could also use geotiepoints.GeoInterpolator instead, but I am not sure I understood its correct usage for upsampling a swath grid. Additionally, it seems not dask compatible. See the PR comment for a code attempt ...

PS: This is my first PR to a package, so please be patient and don't hesitate to provide a lot of feedback/reproaches :)