-
Hello! In my spare time, I've been learning about natural resource datasets and stumbled upon HyRiver - what an awesome project. Thanks to all the contributors for putting this together! I'm interested in inputting a list of coordinates representing various observation locations for a species and outputting the land use info (impervious, cover, canopy, and descriptor) from the NLCD database. The list for the species I'm using as an example has about 3800 entries, but ideally I would like to do the same thing for much longer lists too. I tried a couple of approaches, including iterating over the list. The output I did get from the second one before it errored out was exactly what I was looking for. Thanks in advance - I am a beginner with this package and appreciate any help.
-
Hi! The issue with large requests is that they hammer the web service, which can lead to the service blocking them. I recommend an approach similar to your for-loop, but submitting your requests in batches instead. For example, for a batch size of 100, you can do something like this:

```python
from pathlib import Path

import cytoolz as tlz
import pygeohydro as gh

# `lookup` is your list of (lon, lat) coordinate tuples
for i, c in enumerate(tlz.partition_all(100, lookup)):
    # Skip batches that have already been saved to disk
    if not Path(f"nlcd_{i}.parquet").exists():
        gh.nlcd_bycoords(coords=c).to_parquet(f"nlcd_{i}.parquet")
```

Whenever it fails, you can rerun it, and it will continue from where it failed. You can run this a couple of times until it finishes. Once it's done, you can just read and concat all the files into a single dataframe.
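A minimal sketch of that last read-and-concat step might look like the following, assuming the `nlcd_*.parquet` files were written by the loop above. Since `nlcd_bycoords` returns a GeoDataFrame, reading with `geopandas.read_parquet` keeps the geometry column intact:

```python
from pathlib import Path

import geopandas as gpd
import pandas as pd

# Assumes the batch files from the loop above are in the current directory.
# Glob order doesn't matter here, since each row carries its own coordinates.
files = Path(".").glob("nlcd_*.parquet")
nlcd = pd.concat([gpd.read_parquet(f) for f in files], ignore_index=True)
```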