
Missing scenes in output timeseries (Only 15/34 scenes showing) #104

pbrotoisworo opened this issue Jan 25, 2025 · 1 comment
pbrotoisworo commented Jan 25, 2025

Hi,

I've run an analysis using 34 Sentinel-1 images, but around 50% of the data is missing from the single-reference network.
I put in 34 SLC images, yet the output time series and network contain only 15 data points. Is that normal? The network is still nicely distributed, but the temporal resolution is lower than expected because of the missing data.

[Screenshot of the interferogram network]

I've checked the output of stackSentinel.py, and the missing dates are present in the output folders such as merged/interferograms, baselines, and coreg_secondarys.

I've inspected the slcStack.h5 file, and the "slc" dataset has shape (34, 1029, 5864), which I assume means all 34 Sentinel-1 acquisitions were ingested.
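For anyone checking the same thing, a minimal sketch of how to count the acquisitions in an slcStack.h5 with h5py. The dataset name "slc" is the one reported above; the helper name is mine:

```python
import h5py

def count_slcs(stack_path):
    """Return the number of acquisitions in an slcStack.h5 file.

    Assumes the stack stores SLCs in a 3-D dataset named "slc" with
    shape (n_dates, length, width), as in the shape reported above.
    """
    with h5py.File(stack_path, "r") as f:
        return f["slc"].shape[0]
```

If this returns 34 but the time series has 15 dates, the loss happened downstream of loading.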

Input stackSentinel command:

stackSentinel.py -s /mnt/e/data/insar-highways/demak \
--workflow interferogram \
--working_directory /mnt/e/data/insar-highways/demak_v5 \
-n 1 --bbox "-6.980585 -6.896600 110.435772 110.636444" \
-o /mnt/e/data/insar-highways/demak_v5/orbits \
-a /mnt/e/data/insar-highways/demak_v5/auxfiles \
-d /mnt/e/data/insar-highways/demak_v5/dem/dem.geo \
-V False \
-z 4 \
-r 20

Then I ran miaplpyApp.py demak.cfg --dir /mnt/e/data/insar-highways/demak_v5/miaplpy with the cfg below.

################
miaplpy.load.processor      = isce  #[isce,snap,gamma,roipac], auto for isceTops
miaplpy.load.updateMode     = no  #[yes / no], auto for yes, skip re-loading if HDF5 files are complete
miaplpy.load.compression    = auto  #[gzip / lzf / no], auto for no.
miaplpy.load.autoPath       = no    # [yes, no] auto for no

miaplpy.load.slcFile        = /mnt/e/data/insar-highways/demak_v5/merged/SLC/*/*.slc.full  #[path2slc_file]
##---------for ISCE only:
miaplpy.load.metaFile       = /mnt/e/data/insar-highways/demak_v5/reference/IW*.xml
miaplpy.load.baselineDir    = /mnt/e/data/insar-highways/demak_v5/baselines
##---------geometry datasets:
miaplpy.load.demFile          = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/hgt.rdr.full
miaplpy.load.lookupYFile      = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/lat.rdr.full
miaplpy.load.lookupXFile      = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/lon.rdr.full
miaplpy.load.incAngleFile     = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/los.rdr.full
miaplpy.load.azAngleFile      = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/los.rdr.full
miaplpy.load.shadowMaskFile   = /mnt/e/data/insar-highways/demak_v5/merged/geom_reference/shadowMask.rdr.full
##---------miaplpy.load.waterMaskFile    = /mnt/e/data/insar-highways/demak_v4/water_mask/swbdLat_S08_S06_Lon_E110_E111.wbd
##---------interferogram datasets:
miaplpy.load.unwFile        = /mnt/e/data/insar-highways/demak_v5/miaplpy/inverted/interferograms_single_reference/*/*fine*.unw
miaplpy.load.corFile        = /mnt/e/data/insar-highways/demak_v5/miaplpy/inverted/interferograms_single_reference/*/*fine*.cor
miaplpy.load.connCompFile   = /mnt/e/data/insar-highways/demak_v5/miaplpy/inverted/interferograms_single_reference/*/*.unw.conncomp

##---------subset (optional):
## if both yx and lalo are specified, use lalo option unless a) no lookup file AND b) dataset is in radar coord
miaplpy.subset.lalo         = -6.980585:-6.896600,110.435772:110.636444

# MiaplPy options 
miaplpy.multiprocessing.numProcessor   = 10
miaplpy.interferograms.type = single_reference

## Mintpy options
mintpy.compute.cluster     = local  # if dask is not available, set this option to no 
mintpy.compute.numWorker   = 4

mintpy.reference.lalo     = -6.9062397501293855, 110.62864532047873
mintpy.troposphericDelay.method = no

pbrotoisworo commented Jan 25, 2025

Just an update: I fixed the error.

In the output of step 5 (unwrap_ifgram) I saw "Killed" printed multiple times, which I think means my computer ran out of memory. Because of the many SNAPHU failures, the downstream processing then assumed there were only 15 datasets.

Checking my run_05_miaplpy_unwrap_ifgram file, I see it launches many run commands at the same time: 20 commands, then a wait, then the remaining 13 commands before the last wait. I rewrote the file so there is a wait after every 4 SNAPHU commands. I'm not sure which parameter in the cfg caused this, maybe miaplpy.compute.numCores; I set it to 20 because I have 20 CPU cores.
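The manual rewrite described above can be scripted. A sketch, assuming the run file is a list of shell commands separated by `wait` lines; the function name and batch size of 4 are my choices:

```python
from pathlib import Path

def rebatch_run_file(src, dst, batch_size=4):
    """Rewrite a MiaplPy run file so a `wait` follows every batch_size commands.

    This caps how many SNAPHU jobs the shell launches concurrently,
    which avoids the OOM kills described above. Existing `wait` lines
    and blanks are stripped before re-batching.
    """
    cmds = [ln for ln in Path(src).read_text().splitlines()
            if ln.strip() and ln.strip() != "wait"]
    out = []
    for i, cmd in enumerate(cmds, 1):
        out.append(cmd)
        if i % batch_size == 0:
            out.append("wait")
    # Make sure the file ends with a wait so the step blocks until done.
    if out and out[-1] != "wait":
        out.append("wait")
    Path(dst).write_text("\n".join(out) + "\n")
```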

I ran again with no problem. I then had to delete numInvIfgram.h5, timeseries.h5, and temporalCoherence.h5 because of a dataset-size mismatch in later steps, but the resulting output is good.

My thoughts on this for the project team:

  1. Could there be a dedicated parameter for the number of parallel SNAPHU jobs? I was tuning the cfg to maximize CPU cores for the phase-linking step, since it takes so long, but if I understand correctly the same parameter controls the number of SNAPHU jobs, which led to unsafe process terminations.
  2. There should be a way to safely catch the SNAPHU out-of-memory error in unwrap_ifgram. It doesn't raise an exception, and the rest of MiaplPy ran anyway, treating the output as valid even though 50% of the dataset was missing.
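On point 2, a sketch of the kind of check that would catch this: when the kernel OOM-killer terminates a child, subprocess reports returncode -SIGKILL (-9 on POSIX). The wrapper below is illustrative, not MiaplPy's actual SNAPHU call:

```python
import signal
import subprocess

def run_unwrap(cmd):
    """Run an unwrapping command and fail loudly instead of silently.

    A process killed by the OOM-killer exits via SIGKILL, which
    subprocess surfaces as a negative returncode. Raising here would
    stop downstream steps from treating killed interferograms as
    simply missing data.
    """
    proc = subprocess.run(cmd, shell=True)
    if proc.returncode == -signal.SIGKILL:
        raise MemoryError(f"command was killed (likely OOM): {cmd}")
    if proc.returncode != 0:
        raise RuntimeError(f"command failed with code {proc.returncode}: {cmd}")
    return proc.returncode
```

(POSIX-only: `signal.SIGKILL` and negative return codes don't exist on Windows.)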

[Screenshot of the completed time series]
