
Commit b150dcb (parent 0158168)

    fixing things, deprecating python 3.9

16 files changed: +333 −199 lines

.github/workflows/unit_test.yml

Lines changed: 6 additions & 3 deletions
@@ -21,7 +21,7 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        python-version: ["3.9", "3.12"] #NOTE: min and max Python versions supported by icepyx
+        python-version: ["3.10", "3.12"] #NOTE: min and max Python versions supported by icepyx

     steps:
       - uses: "actions/checkout@v4"
@@ -33,9 +33,12 @@ jobs:
           python-version: "${{ matrix.python-version }}"

       - name: "Run tests"
+        env:
+          EARTHDATA_PASSWORD: "${{ secrets.EARTHDATA_PASSWORD }}"
+          EARTHDATA_USERNAME: ${{ secrets.EARTHDATA_USERNAME }}
+          NSIDC_LOGIN: "${{ secrets.EARTHDATA_PASSWORD }}" # remove this
         run: |
-          pytest icepyx/ --verbose --cov app \
-          --ignore=icepyx/tests/integration
+          pytest icepyx/unit --verbose --cov app

       - name: "Upload coverage report"
         uses: "codecov/[email protected]"
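The workflow change above passes the Earthdata secrets into the test step as environment variables. A minimal sketch of how a test helper could pick them up, assuming only the two variable names set in the `env:` block (the helper name `earthdata_credentials` is illustrative, not an icepyx API):

```python
import os

def earthdata_credentials():
    """Return (username, password) from the environment, or None when unset.

    Mirrors the env block in the workflow above; this helper is a
    hypothetical example, not part of icepyx.
    """
    username = os.environ.get("EARTHDATA_USERNAME")
    password = os.environ.get("EARTHDATA_PASSWORD")
    if username and password:
        return (username, password)
    return None

if __name__ == "__main__":
    creds = earthdata_credentials()
    print("credentials found" if creds else "no credentials; auth tests would be skipped")
```

In a test suite this is typically paired with a skip marker so authenticated tests only run when the secrets are available (e.g. on CI, not on a contributor's fork).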

doc/source/example_notebooks/IS2_cloud_data_access.ipynb

Lines changed: 2 additions & 2 deletions
@@ -131,7 +131,7 @@
 },
 "outputs": [],
 "source": [
-    "reg.order_vars.avail()"
+    "reg.order_vars"
 ]
 },
 {
@@ -384,7 +384,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.10"
+"version": "3.11.11"
 }
 },
 "nbformat": 4,

doc/source/example_notebooks/IS2_data_access.ipynb

Lines changed: 2 additions & 2 deletions
@@ -624,7 +624,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "icepyx",
+"display_name": "Python 3 (ipykernel)",
 "language": "python",
 "name": "python3"
 },
@@ -638,7 +638,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.10"
+"version": "3.11.11"
 }
 },
 "nbformat": 4,

doc/source/example_notebooks/IS2_data_access2-subsetting.ipynb

Lines changed: 41 additions & 82 deletions
@@ -105,7 +105,7 @@
 },
 "outputs": [],
 "source": [
-    "region_a.show_custom_options(dictview=True)"
+    "region_a.show_custom_options()"
 ]
 },
 {
@@ -153,73 +153,55 @@
 },
 {
 "cell_type": "markdown",
-"metadata": {
-    "user_expressions": []
-},
+"metadata": {},
 "source": [
-    "### Determine what variables are available for your data product\n",
-    "There are multiple ways to get a complete list of available variables.\n",
-    "To increase readability, some display options (2 and 3, below) show the 200+ variable + path combinations as a dictionary where the keys are variable names and the values are the paths to that variable.\n",
+    "## _Why not just download all the data and subset locally? What if I need more variables/granules?_\n",
     "\n",
-    "1. `region_a.order_vars.avail`, a list of all valid path+variable strings\n",
-    "2. `region_a.show_custom_options(dictview=True)`, all available subsetting options\n",
-    "3. `region_a.order_vars.parse_var_list(region_a.order_vars.avail)`, a dictionary of variable:paths key:value pairs"
+    "_Taking advantage of the NSIDC subsetter is a great way to reduce your download size and thus your download time and the amount of storage required, especially if you're storing your data locally during analysis. By downloading your data using icepyx, it is easy to go back and get additional data with the same, similar, or different parameters (e.g. you can keep the same spatial and temporal bounds but change the variable list). Related tools (e.g. [`captoolkit`](https://github.com/fspaolo/captoolkit)) will let you easily merge files if you're uncomfortable merging them during read-in for processing._"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
-"metadata": {
-    "tags": []
-},
+"metadata": {},
 "outputs": [],
 "source": [
-    "region_a.order_vars.avail()"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-    "user_expressions": []
-},
-"source": [
-    "By passing the boolean `options=True` to the `avail` method, you can obtain lists of unique possible variable inputs (var_list inputs) and path subdirectory inputs (keyword_list and beam_list inputs) for your data product. These can be helpful for building your wanted variable list."
+    "short_name = 'ATL06'\n",
+    "spatial_extent = './supporting_files/simple_test_poly.gpkg'\n",
+    "date_range = ['2019-10-01','2019-10-05']"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
-"metadata": {
-    "tags": []
-},
-"outputs": [],
-"source": [
-    "region_a.order_vars.avail(options=True)"
-]
-},
-{
-"cell_type": "markdown",
 "metadata": {},
+"outputs": [],
 "source": [
-    "## _Why not just download all the data and subset locally? What if I need more variables/granules?_\n",
+    "region_a = ipx.Query(short_name, spatial_extent\n",
+    ", \n",
+    "    cycles=['03','04','05','06'], tracks=['0849','0902'])\n",
     "\n",
-    "_Taking advantage of the NSIDC subsetter is a great way to reduce your download size and thus your download time and the amount of storage required, especially if you're storing your data locally during analysis. By downloading your data using icepyx, it is easy to go back and get additional data with the same, similar, or different parameters (e.g. you can keep the same spatial and temporal bounds but change the variable list). Related tools (e.g. [`captoolkit`](https://github.com/fspaolo/captoolkit)) will let you easily merge files if you're uncomfortable merging them during read-in for processing._"
+    "print(region_a.product)\n",
+    "print(region_a.product_version)\n",
+    "print(region_a.cycles)\n",
+    "print(region_a.tracks)\n",
+    "print(region_a.spatial_extent)"
 ]
 },
 {
-"cell_type": "markdown",
+"cell_type": "code",
+"execution_count": null,
 "metadata": {},
+"outputs": [],
 "source": [
-    "### Building the default wanted variable list"
+    "region_a.visualize_spatial_extent()"
 ]
 },
 {
-"cell_type": "code",
-"execution_count": null,
+"cell_type": "markdown",
 "metadata": {},
-"outputs": [],
 "source": [
-    "region_a.order_vars.wanted"
+    "We can still print a list of available granules for our query"
 ]
 },
 {
@@ -228,8 +210,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "region_a.order_vars.append(defaults=True)\n",
-    "pprint(region_a.order_vars.wanted)"
+    "region_a.avail_granules(cloud=True)"
 ]
 },
 {
@@ -247,17 +228,15 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "region_a.subsetparams(Coverage=region_a.order_vars.wanted)"
+    "order = region_a.order_granules(subset=True) \n",
+    "order"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-    "Or, you can put the `Coverage` parameter directly into `order_granules`:\n",
-    "`region_a.order_granules(Coverage=region_a.order_vars.wanted)`\n",
-    "\n",
-    "However, then you cannot view your subset parameters (`region_a.subsetparams`) prior to submitting your order."
+    "### Checking an order status"
 ]
 },
 {
@@ -266,8 +245,14 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "region_a.order_granules()# <-- you do not need to include the 'Coverage' kwarg to\n",
-    "                         # order if you have already included it in a call to subsetparams"
+    "order.status()"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+    "### Downloading subsetted granules"
 ]
 },
 {
@@ -276,8 +261,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "region_a.download_granules('/home/jovyan/icepyx/dev-notebooks/vardata') # <-- you do not need to include the 'Coverage' kwarg to\n",
-    "                         # download if you have already submitted it with your order"
+    "files = order.download_granules(\"./data\")"
 ]
 },
 {
@@ -300,22 +284,13 @@
 "Compare the available variables associated with the full product relative to those in your downloaded data file."
 ]
 },
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-    "# put the full filepath to a data file here. You can get this in JupyterHub by navigating to the file,\n",
-    "# right clicking, and selecting copy path. Then you can paste the path in the quotes below.\n",
-    "fn = ''"
-]
-},
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
     "## Check the downloaded data\n",
+    "\n",
+    "### Note: this needs to be updated\n",
     "Get all `latitude` variables in your downloaded file:"
 ]
 },
@@ -341,22 +316,6 @@
     " if vn==varname: print(tvar) "
 ]
 },
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-    "### Compare to the variable paths available in the original data"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-    "region_a.order_vars.parse_var_list(region_a.order_vars.avail)[0][varname]"
-]
-},
 {
 "cell_type": "markdown",
 "metadata": {},
@@ -369,9 +328,9 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "icepyx-dev",
+"display_name": "Python 3 (ipykernel)",
 "language": "python",
-"name": "icepyx-dev"
+"name": "python3"
 },
 "language_info": {
 "codemirror_mode": {
@@ -383,7 +342,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.11.4"
+"version": "3.11.11"
 }
 },
 "nbformat": 4,
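The rewritten notebook reflects an API shift: instead of calling `order_granules`/`download_granules` directly on the Query object, `order_granules(subset=True)` now returns an order handle with its own `status()` and `download_granules()` methods. A toy stand-in showing the shape of that pattern (this is not icepyx's implementation; the class, states, and granule IDs here are invented for illustration):

```python
import tempfile
from pathlib import Path

class Order:
    """Toy order handle; a real order would poll a remote service for status."""

    def __init__(self, granule_ids):
        self._granule_ids = granule_ids
        self._downloaded = False

    def status(self):
        # Placeholder states standing in for whatever the service reports.
        return "complete" if self._downloaded else "processing"

    def download_granules(self, path):
        # Create one file per granule; touch() stands in for a real download.
        target = Path(path)
        target.mkdir(parents=True, exist_ok=True)
        files = []
        for gid in self._granule_ids:
            f = target / f"{gid}.h5"
            f.touch()
            files.append(f)
        self._downloaded = True
        return files

order = Order(["ATL06_fake_granule"])  # hypothetical granule ID
print(order.status())                  # → processing
files = order.download_granules(tempfile.mkdtemp())
print(order.status())                  # → complete
```

Returning a handle keeps order state (status, downloaded files) with the order itself rather than on the Query, which makes it natural to place several orders from one query and track each independently.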

icepyx/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 from _icepyx_version import version as __version__
-from icepyx.core.base_query import GenQuery, BaseQuery, LegacyQuery
+from icepyx.core.base_query import BaseQuery, GenQuery, LegacyQuery
 from icepyx.core.query import Query
 from icepyx.core.read import Read
 from icepyx.core.variables import Variables

icepyx/core/base_query.py

Lines changed: 5 additions & 6 deletions
@@ -1,9 +1,8 @@
 from functools import cached_property
 import pprint
 from typing import Optional, Union, cast
-from deprecated import deprecated
-import earthaccess

+import earthaccess
 import geopandas as gpd
 import matplotlib.pyplot as plt
 from typing_extensions import Never
@@ -184,7 +183,7 @@ def temporal(self) -> Union[tp.Temporal, list[str]]:
     ['No temporal parameters set']
     """

-    if hasattr(self, "_temporal"):
+    if hasattr(self, "_temporal") and self._temporal is not None:
         return self._temporal
     else:
         return ["No temporal parameters set"]
@@ -275,7 +274,7 @@ def dates(self) -> list[str]:
     >>> reg_a.dates
     ['No temporal parameters set']
     """
-    if not hasattr(self, "_temporal"):
+    if not hasattr(self, "_temporal") or self._temporal is None:
         return ["No temporal parameters set"]
     else:
         return [
@@ -302,7 +301,7 @@ def start_time(self) -> Union[list[str], str]:
     >>> reg_a.start_time
     ['No temporal parameters set']
     """
-    if not hasattr(self, "_temporal"):
+    if not hasattr(self, "_temporal") or self._temporal is None:
         return ["No temporal parameters set"]
     else:
         return self._temporal._start.strftime("%H:%M:%S")
@@ -326,7 +325,7 @@ def end_time(self) -> Union[list[str], str]:
     >>> reg_a.end_time
     ['No temporal parameters set']
     """
-    if not hasattr(self, "_temporal"):
+    if not hasattr(self, "_temporal") or self._temporal is None:
         return ["No temporal parameters set"]
     else:
         return self._temporal._end.strftime("%H:%M:%S")
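The `base_query.py` hunks all harden the same check: `_temporal` may be missing entirely, or present but set to `None`, and both cases should fall back to the same default. A self-contained sketch of the guard pattern (the `Query` class below is a stand-in for illustration, not icepyx's):

```python
class Query:
    """Stand-in class illustrating the hasattr + is-not-None guard."""

    def __init__(self, temporal=None):
        # _temporal is always assigned here, so hasattr() alone is always
        # True; the value itself may still be None.
        self._temporal = temporal

    @property
    def temporal(self):
        # Both guards matter: hasattr() covers a never-assigned attribute,
        # "is not None" covers an attribute explicitly set to None.
        if hasattr(self, "_temporal") and self._temporal is not None:
            return self._temporal
        return ["No temporal parameters set"]

print(Query().temporal)              # → ['No temporal parameters set']
print(Query("2019-10-01").temporal)  # → 2019-10-01
```

Relying on `hasattr()` alone is fragile because any later code path that assigns `self._temporal = None` silently defeats the check; the combined guard makes the fallback robust to both initialization styles.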

icepyx/core/granules.py

Lines changed: 0 additions & 1 deletion
@@ -20,7 +20,6 @@
 from icepyx.core.types import (
     CMRParams,
     EGIRequiredParamsDownload,
-    EGIRequiredParamsSearch,
 )
 from icepyx.core.urls import DOWNLOAD_BASE_URL, GRANULE_SEARCH_BASE_URL, ORDER_BASE_URL