[Help]: Yamchi Dam SBAS-PSInSAR - Trend Correction: Addressing Long Processing Times #108
Comments
Hi, how large is your area? It's better to start with a small area and enlarge it later, once everything is working well. Also, see issue #98 regarding speeding up trend computation.
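As a minimal sketch of such a small AOI (a point with a buffer, as mentioned below in this thread; the coordinates are hypothetical ones near Yamchi Dam):

from shapely.geometry import Point
# hypothetical point near Yamchi Dam, buffered by ~0.1 degrees (~10 km)
AOI = Point(48.07, 38.08).buffer(0.1)
|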
Thanks for the reply. Maybe it is large, given that you use a point and then a buffer around it, while I used a bounding box covering the whole area of interest. I'll attempt the process again with a smaller area to see if that improves performance. Best, |
Remote download deactivated - is it sometimes possible or unusual?
# define the area of interest (AOI) to speed up the processing
sbas.download_dem(AOI)
DEM Downloading: 0% 0/1 [00:10<?, ?it/s]
grdcut [ERROR]: Remote download is currently deactivated
grdcut [ERROR]: Unable to obtain remote file @earth_relief_01s_g
GMTCLibError Traceback (most recent call last)
8 frames
GMTCLibError: Module 'grdcut' failed with status code 72: |
It might be a temporary SRTM DEM downloading issue; you can try the 90m product instead: sbas.download_dem(AOI, product='3s') |
The issue persists: GMTCLibError: Module 'grdcut' failed with status code 72: |
Hmm, it looks like an incorrect maximum download size limit; try to increase it for the 3s (~90m, smaller) or 1s (~30m, ~10 times larger) SRTM datasets:
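A minimal sketch of raising the limit; the exact snippet from the original comment was not preserved, and the use of pygmt.config with GMT's GMT_DATA_SERVER_LIMIT default is my assumption:

import pygmt
# raise/disable the server download size check (assumed parameter name)
pygmt.config(GMT_DATA_SERVER_LIMIT='unlimited')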
|
Alexey, so the DEM download is unavailable at the moment? Here is the error presented:
|
The error message explains that this tile cannot be downloaded:
You can load an external DEM instead.
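For example, with the same call that appears later in this thread (the filename is illustrative):

sbas.load_dem('dem.tif', geometry=AOI)
|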
Previously I ran analysis over this area with no problems. I'm now running it for the second time, with more scenes and a smaller AOI, when this error appeared. Anyway, I've downloaded the COP-30 but:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[17], line 1
----> 1 sbas.load_dem('SaoRomao_COP30.tif',geometry=AOI)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:294, in Stack_dem.load_dem(self, filename, geometry)
291 return
293 if os.path.splitext(filename)[-1] in ['.tiff', '.tif', '.TIF']:
--> 294 ortho = rio.open_rasterio(filename, chunks=self.chunksize).squeeze(drop=True)\
295 .rename({'y': 'lat', 'x': 'lon'})\
296 .drop('spatial_ref')
297 if ortho.lat.diff('lat')[0].item() < 0:
298 ortho = ortho.reindex(lat=ortho.lat[::-1])
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:1264, in open_rasterio(filename, parse_coordinates, chunks, cache, lock, masked, mask_and_scale, variable, group, default_name, decode_times, decode_timedelta, band_as_variable, **open_kwargs)
1261 result.rio.write_gcps(*riods.gcps, inplace=True)
1263 if chunks is not None:
-> 1264 result = _prepare_dask(result, riods, filename, chunks)
1265 else:
1266 result.encoding["preferred_chunks"] = {
1267 result.rio.y_dim: riods.block_shapes[0][0],
1268 result.rio.x_dim: riods.block_shapes[0][1],
1269 coord_name: 1,
1270 }
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:929, in _prepare_dask(result, riods, filename, chunks)
927 token = tokenize(filename, mtime, chunks)
928 name_prefix = f"open_rasterio-{token}"
--> 929 return result.chunk(chunks, name_prefix=name_prefix, token=token)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/util/deprecation_helpers.py:115, in _deprecate_positional_args.<locals>._decorator.<locals>.inner(*args, **kwargs)
111 kwargs.update({name: arg for name, arg in zip_args})
113 return func(*args[:-n_extra_args], **kwargs)
--> 115 return func(*args, **kwargs)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataarray.py:1390, in DataArray.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
1387 else:
1388 chunks = either_dict_or_kwargs(chunks, chunks_kwargs, "chunk")
-> 1390 ds = self._to_temp_dataset().chunk(
1391 chunks,
1392 name_prefix=name_prefix,
1393 token=token,
1394 lock=lock,
1395 inline_array=inline_array,
1396 chunked_array_type=chunked_array_type,
1397 from_array_kwargs=from_array_kwargs,
1398 )
1399 return self._from_temp_dataset(ds)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2699, in Dataset.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
2696 if from_array_kwargs is None:
2697 from_array_kwargs = {}
-> 2699 variables = {
2700 k: _maybe_chunk(
2701 k,
2702 v,
2703 chunks_mapping,
2704 token,
2705 lock,
2706 name_prefix,
2707 inline_array=inline_array,
2708 chunked_array_type=chunkmanager,
2709 from_array_kwargs=from_array_kwargs.copy(),
2710 )
2711 for k, v in self.variables.items()
2712 }
2713 return self._replace(variables)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2700, in <dictcomp>(.0)
2696 if from_array_kwargs is None:
2697 from_array_kwargs = {}
2699 variables = {
-> 2700 k: _maybe_chunk(
2701 k,
2702 v,
2703 chunks_mapping,
2704 token,
2705 lock,
2706 name_prefix,
2707 inline_array=inline_array,
2708 chunked_array_type=chunkmanager,
2709 from_array_kwargs=from_array_kwargs.copy(),
2710 )
2711 for k, v in self.variables.items()
2712 }
2713 return self._replace(variables)
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299, in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
296 chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
298 if var.ndim:
--> 299 chunked_array_type = guess_chunkmanager(
300 chunked_array_type
301 ) # coerce string to ChunkManagerEntrypoint type
302 if isinstance(chunked_array_type, DaskManager):
303 from dask.base import tokenize
File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119, in guess_chunkmanager(manager)
117 return manager
118 else:
--> 119 raise TypeError(
120 f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
121 )
TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>
|
Here is the solution: https://www.patreon.com/posts/pygmtsar-new-99185442 The recent PyGMTSAR version supports Copernicus DEM downloading. |
Hey Alexey, I saw your post earlier and updated my notebook when I could, but when I run it I'm still getting errors. To clarify, I've deleted the raw directory from previous runs, so it's a fresh start. First attempt, with filename = DEM:
Second attempt, without filename = DEM:
Third, using GMT:
After all these attempts, one DEM was downloaded into the data directory. I tried to load it:
Tried to load it with another function:
Also tried loading an external one (a COP-30 I downloaded myself):
I'm sorry to bother you with these comments, but I just don't know what to do. It was functional before all this DEM download issue :( |
Your DEM file is located in the data directory instead of the raw directory. You can remove it or use the suggested function argument to overwrite it. |
I adjusted the directory from data to raw and deleted the previous DEM file from data. Looks like the COP DEM from AWS is not downloading (same error as in my comment above), but the GMT function gives an .nc file.
|
You shouldn't do that because the 'raw' directory is recreated for every run, and your DEM file is dropped before you initialize the main stack object. Please refer to the example notebooks for the correct pipeline. |
The dem.nc was in data, as I saw in the notebooks:
I've restarted everything and ran a new notebook from scratch, following the updated pipeline (using Imperial_Valley as a reference), but got the same errors when loading the DEM. The notebook is attached. |
If you need assistance, feel free to load it on Google Colab. |
It's OK on Google Colab: https://colab.research.google.com/drive/1JFMNVCDRRDScN240YKJarjRuXNfNV7Z0?usp=sharing Edit: Solved by downgrading Python to 3.10.13!!! |
What Python version were you using previously? By the way, Google Colab currently uses Python 3.10.12 and locally I use 3.11.6. |
I was using 3.11.8 and ran some analyses in the meantime. But later it didn't work anymore until I downgraded to 3.10.13, and this solved the issues.
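A quick diagnostic sketch (my suggestion, not from the thread): the TypeError above suggests a mismatch between the libraries involved in the chunking call, so printing their versions in the failing environment may help pinpoint it:

import xarray, rioxarray, dask
# report the versions involved in the failing DataArray.chunk() call
print(xarray.__version__, rioxarray.__version__, dask.__version__)
|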
Hi Alexey,
I wanted to reach out to discuss some challenges I've encountered while running the SBAS-PSInSAR script on Google Colab. Specifically, I've been working with a dataset of 15 Sentinel-1 scenes covering a region in Florida.
My issue arises during the trend correction step, where the script takes an exceptionally long time to run. Initially, I attempted running the script on a standard Google Colab runtime, but it stopped unexpectedly. To address this, I upgraded to Google Colab Pro. However, despite the upgrade, the script continued to run for over 6 hours without completing, prompting me to stop it manually.
I'm reaching out to seek guidance on what would be considered a reasonable processing time for this step. Additionally, I would greatly appreciate any insights or recommendations you may have for optimizing the script's performance on Google Colab.
Thank you very much for your assistance and support.
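As a general note on cost (a sketch of mine, not PyGMTSAR's built-in trend correction; see issue #98 for the project's own advice): a planar trend can be fitted on a decimated grid and then evaluated on the full grid, cutting the fit cost roughly by the square of the decimation factor. This assumes phase is an unwrapped-phase xarray DataArray with hypothetical lat/lon dimensions:

import numpy as np

# decimate 4x per dimension before fitting (boundary='trim' drops ragged edges)
coarse = phase.coarsen(lat=4, lon=4, boundary='trim').mean()
# design matrix for a planar trend a + b*lon + c*lat
lon2d, lat2d = np.meshgrid(coarse.lon, coarse.lat)
z = coarse.values.ravel()
A = np.column_stack([np.ones(z.size), lon2d.ravel(), lat2d.ravel()])
mask = np.isfinite(z)
a, b, c = np.linalg.lstsq(A[mask], z[mask], rcond=None)[0]
# evaluate on the full-resolution grid; xarray broadcasts lat x lon
trend = a + b * phase.lon + c * phase.lat
detrended = phase - trend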