
[Help]: Yamchi Dam SBAS-PSInSAR Trend Correction: Addressing Long Processing Times #108

Closed
junaidlonez11 opened this issue Feb 22, 2024 · 20 comments


@junaidlonez11

junaidlonez11 commented Feb 22, 2024

Hi Alexey,

I wanted to reach out to discuss some challenges I've encountered while running the SBAS-PSINSAR script on Google Colab. Specifically, I've been working with a dataset consisting of 15 Sentinel-1 scenes covering a region in Florida.

My issue arises during the step of trend correction, where the script seems to take an exceptionally long time to run. Initially, I attempted running the script on a standard Google Colab runtime, but it stopped unexpectedly. In an effort to address this, I upgraded to Google Colab Pro. However, despite the upgrade, the script continued to run for over 6 hours without completion, prompting me to stop it manually.

I'm reaching out to seek guidance on what would be considered a reasonable processing time for this step. Additionally, I would greatly appreciate any insights or recommendations you may have for optimizing the script's performance on Google Colab.

Thank you very much for your assistance and support.

@AlexeyPechnikov
Owner

Hi, how large is your area? It's better to start with a small area and enlarge it later if needed when everything is working well. Also, see issue #98 regarding speeding up trend computation.
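For reference, a compact way to define a small test area is a point with a buffer around it, expressed as a W/E/S/N extent. This is only a sketch: the center point and buffer size below are illustrative values, not taken from the original notebooks.

```python
# A minimal sketch: a small AOI as a point plus a buffer in degrees.
# The center point and buffer size here are hypothetical values.
lon, lat = -81.4, 28.7   # illustrative point inside the scene
buffer_deg = 0.1         # ~11 km at this latitude

# GMT-style W/E/S/N extent of the buffered point
W, E = lon - buffer_deg, lon + buffer_deg
S, N = lat - buffer_deg, lat + buffer_deg
print(W, E, S, N)
```

Enlarging the area later is then just a matter of increasing `buffer_deg`.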

@junaidlonez11
Author

Thanks for the reply,

It may well be large, given that your examples use a point and then a buffer around it.

I used the bounding box covering
(-81.7187, 28.1698),
(-81.0744, 28.1698),
(-81.0744, 29.2631),
(-81.7187, 29.2631),
(-81.7187, 28.1698)

I'll attempt the process again with a smaller area to see if that improves performance.

Best,
Junaid
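For scale, that bounding box spans about 0.64° of longitude and 1.09° of latitude, roughly 60 km x 120 km, which is a fairly large InSAR processing area. A quick back-of-the-envelope check:

```python
import math

# corners of the bounding box quoted above
lons = (-81.7187, -81.0744)
lats = (28.1698, 29.2631)

dlon = max(lons) - min(lons)   # ~0.64 degrees
dlat = max(lats) - min(lats)   # ~1.09 degrees

# approximate metric size: 1 degree of latitude is ~111 km,
# and a degree of longitude shrinks by cos(latitude)
mid_lat = sum(lats) / 2
width_km = dlon * 111 * math.cos(math.radians(mid_lat))
height_km = dlat * 111
print(f"~{width_km:.0f} km x {height_km:.0f} km")
```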

@junaidlonez11
Author

junaidlonez11 commented Feb 22, 2024

"Remote download deactivated" - does this happen occasionally, or is it unusual?

# define the area of interest (AOI) to speed up the processing
sbas.download_dem(AOI)

DEM Downloading: 0% | 0/1 [00:10<?, ?it/s]

grdcut [ERROR]: Remote download is currently deactivated
grdcut [ERROR]: Unable to obtain remote file @earth_relief_01s_g

GMTCLibError Traceback (most recent call last)
in <cell line: 2>()
1 # define the area of interest (AOI) to speed up the processing
----> 2 sbas.download_dem(AOI)

8 frames
/usr/local/lib/python3.10/dist-packages/pygmt/clib/session.py in call_module(self, module, args)
622 )
623 if status != 0:
--> 624 raise GMTCLibError(
625 f"Module '{module}' failed with status code {status}:\n{self._error_message}"
626 )

GMTCLibError: Module 'grdcut' failed with status code 72:
grdcut [ERROR]: Remote download is currently deactivated
grdcut [ERROR]: Unable to obtain remote file @earth_relief_01s_g

@AlexeyPechnikov
Owner

It might be a temporary SRTM DEM downloading issue; you can try the 90m product instead:

sbas.download_dem(AOI, product='3s')

@junaidlonez11
Author

The issue persists:

GMTCLibError: Module 'grdcut' failed with status code 72:
grdcut [ERROR]: Remote download is currently deactivated
grdcut [ERROR]: Unable to obtain remote file @earth_relief_03s_g

@AlexeyPechnikov
Owner

Hmm, it looks like an incorrect maximum download size limit. Try increasing it for the 3s SRTM dataset (~90m, smaller) or the 1s one (~30m, roughly 10 times larger):

import pygmt
# Set the GMT data server limit to N Mb to allow for remote downloads
pygmt.config(GMT_DATA_SERVER_LIMIT=1e6)
sbas.download_dem(AOI, product='3s')
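If you prefer not to change the limit for the whole session, `pygmt.config` also works as a context manager, so the raised limit applies only inside the block. This is a sketch, assuming `sbas` and `AOI` are already defined as above:

```python
import pygmt

# raise the GMT data-server download limit only for this download
with pygmt.config(GMT_DATA_SERVER_LIMIT=1e6):
    sbas.download_dem(AOI, product='3s')
```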

@teagamrs

Alexey, is the DEM download unavailable at the moment? I'm also getting errors here:

  1. With the 1s product:
--------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[22], line 2
      1 # 90m DEM downloads faster than the default 30m DEM
----> 2 sbas.download_dem(AOI)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:246](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py#line=245), in Stack_dem.download_dem(self, geometry, product)
    244 ortho = pygmt.datasets.load_earth_relief(resolution=resolution, region=[minx, maxx, miny, maxy])
    245 # heights correction
--> 246 geoid = self.get_geoid(ortho)
    247 if os.path.exists(dem_filename):
    248     os.remove(dem_filename)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:69](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py#line=68), in Stack_dem.get_geoid(self, grid)
     67 gmtsar_sharedir = PRM().gmtsar_sharedir()
     68 geoid_filename = os.path.join(gmtsar_sharedir, 'geoid_egm96_icgem.grd')
---> 69 geoid = xr.open_dataarray(geoid_filename, engine=self.netcdf_engine, chunks=self.netcdf_chunksize).rename({'y': 'lat', 'x': 'lon'})
     70 if grid is not None:
     71     geoid = geoid.interp_like(grid, method='cubic')

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:749](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=748), in open_dataarray(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    596 def open_dataarray(
    597     filename_or_obj: str | os.PathLike[Any] | BufferedIOBase | AbstractDataStore,
    598     *,
   (...)
    614     **kwargs,
    615 ) -> DataArray:
    616     """Open an DataArray from a file or file-like object containing a single
    617     data variable.
    618 
   (...)
    746     open_dataset
    747     """
--> 749     dataset = open_dataset(
    750         filename_or_obj,
    751         decode_cf=decode_cf,
    752         mask_and_scale=mask_and_scale,
    753         decode_times=decode_times,
    754         concat_characters=concat_characters,
    755         decode_coords=decode_coords,
    756         engine=engine,
    757         chunks=chunks,
    758         cache=cache,
    759         drop_variables=drop_variables,
    760         inline_array=inline_array,
    761         chunked_array_type=chunked_array_type,
    762         from_array_kwargs=from_array_kwargs,
    763         backend_kwargs=backend_kwargs,
    764         use_cftime=use_cftime,
    765         decode_timedelta=decode_timedelta,
    766         **kwargs,
    767     )
    769     if len(dataset.data_vars) != 1:
    770         raise ValueError(
    771             "Given file dataset contains more than one data "
    772             "variable. Please read with xarray.open_dataset and "
    773             "then select the variable you want."
    774         )

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:579](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=578), in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    572 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
    573 backend_ds = backend.open_dataset(
    574     filename_or_obj,
    575     drop_variables=drop_variables,
    576     **decoders,
    577     **kwargs,
    578 )
--> 579 ds = _dataset_from_backend_dataset(
    580     backend_ds,
    581     filename_or_obj,
    582     engine,
    583     chunks,
    584     cache,
    585     overwrite_encoded_chunks,
    586     inline_array,
    587     chunked_array_type,
    588     from_array_kwargs,
    589     drop_variables=drop_variables,
    590     **decoders,
    591     **kwargs,
    592 )
    593 return ds

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:372](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=371), in _dataset_from_backend_dataset(backend_ds, filename_or_obj, engine, chunks, cache, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    370     ds = backend_ds
    371 else:
--> 372     ds = _chunk_ds(
    373         backend_ds,
    374         filename_or_obj,
    375         engine,
    376         chunks,
    377         overwrite_encoded_chunks,
    378         inline_array,
    379         chunked_array_type,
    380         from_array_kwargs,
    381         **extra_tokens,
    382     )
    384 ds.set_close(backend_ds._close)
    386 # Ensure source filename always stored in dataset object

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:337](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=336), in _chunk_ds(backend_ds, filename_or_obj, engine, chunks, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    335 for name, var in backend_ds.variables.items():
    336     var_chunks = _get_chunk(var, chunks, chunkmanager)
--> 337     variables[name] = _maybe_chunk(
    338         name,
    339         var,
    340         var_chunks,
    341         overwrite_encoded_chunks=overwrite_encoded_chunks,
    342         name_prefix=name_prefix,
    343         token=token,
    344         inline_array=inline_array,
    345         chunked_array_type=chunkmanager,
    346         from_array_kwargs=from_array_kwargs.copy(),
    347     )
    348 return backend_ds._replace(variables)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py#line=298), in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
    296     chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
    298 if var.ndim:
--> 299     chunked_array_type = guess_chunkmanager(
    300         chunked_array_type
    301     )  # coerce string to ChunkManagerEntrypoint type
    302     if isinstance(chunked_array_type, DaskManager):
    303         from dask.base import tokenize

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py#line=118), in guess_chunkmanager(manager)
    117     return manager
    118 else:
--> 119     raise TypeError(
    120         f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
    121     )

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>
  2. With the alternative (AOI, product='3s'):
    grdblend [NOTICE]: Remote data courtesy of GMT data server oceania [http://oceania.generic-mapping-tools.org]
    grdblend [NOTICE]: Earth Relief at 3x3 arc seconds tiles provided by SRTMGL3 (land only) [NASA/USGS].
    grdblend [NOTICE]: -> Download 1x1 degree grid tile (earth_relief_03s_g): S17W046
    grdblend [ERROR]: Libcurl Error: HTTP response code said error
    grdblend [ERROR]: Probably means @S17W046.earth_relief_03s_g.nc does not exist on the remote server
    grdblend [ERROR]: Unable to obtain remote file @S17W046.earth_relief_03s_g.nc
    grdblend [NOTICE]: -> Download 1x1 degree grid tile (earth_relief_03s_g): S17W046
    grdblend [ERROR]: Libcurl Error: HTTP response code said error
    grdblend [ERROR]: Probably means @S17W046.earth_relief_03s_g.nc does not exist on the remote server
    grdblend [ERROR]: Unable to obtain remote file @S17W046.earth_relief_03s_g.nc
    grdblend [NOTICE]: -> Download 1x1 degree grid tile (earth_relief_03s_g): S17W046
    grdblend [ERROR]: Libcurl Error: HTTP response code said error
    grdblend [ERROR]: Probably means @S17W046.earth_relief_03s_g.nc does not exist on the remote server
    grdblend [ERROR]: Unable to obtain remote file @S17W046.earth_relief_03s_g.nc
    grdblend [ERROR]: File @S17W046.earth_relief_03s_g.nc not found
    [Session pygmt-session (14)]: Error returned from GMT API: GMT_FILE_NOT_FOUND (16)
    grdcut [ERROR]: ERROR - Unable to produce blended grid from /tmp/=tiled_330_GO_ohOFN6
    [Session pygmt-session (14)]: Error returned from GMT API: GMT_GRID_READ_ERROR (18)
    [Session pygmt-session (14)]: Error returned from GMT API: GMT_GRID_READ_ERROR (18)
    [Session pygmt-session (14)]: Error returned from GMT API: GMT_GRID_READ_ERROR (18)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[21], line 2
      1 # 90m DEM downloads faster than the default 30m DEM
----> 2 sbas.download_dem(AOI, product='3s')

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:244](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py#line=243), in Stack_dem.download_dem(self, geometry, product)
    240 #print ('minx, miny, maxx, maxy', minx, miny, maxx, maxy)
    242 with tqdm(desc='DEM Downloading', total=1) as pbar:
    243     # download DEM using GMT extent W E S N
--> 244     ortho = pygmt.datasets.load_earth_relief(resolution=resolution, region=[minx, maxx, miny, maxy])
    245     # heights correction
    246     geoid = self.get_geoid(ortho)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py:776](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py#line=775), in kwargs_to_strings.<locals>.converter.<locals>.new_module(*args, **kwargs)
    773             bound.arguments["kwargs"][arg] = newvalue
    775 # Execute the original function and return its output
--> 776 return module_func(*bound.args, **bound.kwargs)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/datasets/earth_relief.py:164](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/datasets/earth_relief.py#line=163), in load_earth_relief(resolution, region, registration, data_source, use_srtm)
    161 else:
    162     dataset_prefix = earth_relief_sources[data_source]
--> 164 grid = _load_remote_dataset(
    165     dataset_name="earth_relief",
    166     dataset_prefix=dataset_prefix,
    167     resolution=resolution,
    168     region=region,
    169     registration=registration,
    170 )
    171 return grid

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py:776](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py#line=775), in kwargs_to_strings.<locals>.converter.<locals>.new_module(*args, **kwargs)
    773             bound.arguments["kwargs"][arg] = newvalue
    775 # Execute the original function and return its output
--> 776 return module_func(*bound.args, **bound.kwargs)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/datasets/load_remote_dataset.py:419](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/datasets/load_remote_dataset.py#line=418), in _load_remote_dataset(dataset_name, dataset_prefix, resolution, region, registration)
    417     grid = load_dataarray(fname, engine="netcdf4")
    418 else:
--> 419     grid = grdcut(f"@{dataset_prefix}{resolution}{reg}", region=region)
    421 # Add some metadata to the grid
    422 grid.name = dataset.name

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py:603](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py#line=602), in use_alias.<locals>.alias_decorator.<locals>.new_module(*args, **kwargs)
    596     msg = (
    597         "Parameters 'Y' and 'yshift' are deprecated since v0.8.0. "
    598         "and will be removed in v0.12.0. "
    599         "Use Figure.shift_origin(yshift=...) instead."
    600     )
    601     warnings.warn(msg, category=SyntaxWarning, stacklevel=2)
--> 603 return module_func(*args, **kwargs)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py:776](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/helpers/decorators.py#line=775), in kwargs_to_strings.<locals>.converter.<locals>.new_module(*args, **kwargs)
    773             bound.arguments["kwargs"][arg] = newvalue
    775 # Execute the original function and return its output
--> 776 return module_func(*bound.args, **bound.kwargs)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/src/grdcut.py:112](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/src/grdcut.py#line=111), in grdcut(grid, **kwargs)
    107             kwargs["G"] = outgrid = tmpfile.name  # output to tmpfile
    108         lib.call_module(
    109             module="grdcut", args=build_arg_string(kwargs, infile=infile)
    110         )
--> 112 return load_dataarray(outgrid) if outgrid == tmpfile.name else None

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/io.py:42](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmt/io.py#line=41), in load_dataarray(filename_or_obj, **kwargs)
     39 if "cache" in kwargs:
     40     raise TypeError("cache has no effect in this context")
---> 42 with xr.open_dataarray(filename_or_obj, **kwargs) as dataarray:
     43     result = dataarray.load()
     44     _ = result.gmt  # load GMTDataArray accessor information

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:749](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=748), in open_dataarray(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    596 def open_dataarray(
    597     filename_or_obj: str | os.PathLike[Any] | BufferedIOBase | AbstractDataStore,
    598     *,
   (...)
    614     **kwargs,
    615 ) -> DataArray:
    616     """Open an DataArray from a file or file-like object containing a single
    617     data variable.
    618 
   (...)
    746     open_dataset
    747     """
--> 749     dataset = open_dataset(
    750         filename_or_obj,
    751         decode_cf=decode_cf,
    752         mask_and_scale=mask_and_scale,
    753         decode_times=decode_times,
    754         concat_characters=concat_characters,
    755         decode_coords=decode_coords,
    756         engine=engine,
    757         chunks=chunks,
    758         cache=cache,
    759         drop_variables=drop_variables,
    760         inline_array=inline_array,
    761         chunked_array_type=chunked_array_type,
    762         from_array_kwargs=from_array_kwargs,
    763         backend_kwargs=backend_kwargs,
    764         use_cftime=use_cftime,
    765         decode_timedelta=decode_timedelta,
    766         **kwargs,
    767     )
    769     if len(dataset.data_vars) != 1:
    770         raise ValueError(
    771             "Given file dataset contains more than one data "
    772             "variable. Please read with xarray.open_dataset and "
    773             "then select the variable you want."
    774         )

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:554](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py#line=553), in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    551     kwargs.update(backend_kwargs)
    553 if engine is None:
--> 554     engine = plugins.guess_engine(filename_or_obj)
    556 if from_array_kwargs is None:
    557     from_array_kwargs = {}

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/plugins.py:197](http://localhost:8864/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/plugins.py#line=196), in guess_engine(store_spec)
    189 else:
    190     error_msg = (
    191         "found the following matches with the input file in xarray's IO "
    192         f"backends: {compatible_engines}. But their dependencies may not be installed, see:\n"
    193         "https://docs.xarray.dev/en/stable/user-guide/io.html \n"
    194         "https://docs.xarray.dev/en/stable/getting-started-guide/installing.html"
    195     )
--> 197 raise ValueError(error_msg)

ValueError: did not find a match in any of xarray's currently installed IO backends ['netcdf4', 'h5netcdf', 'scipy', 'rasterio']. Consider explicitly selecting one of the installed engines via the ``engine`` parameter, or installing additional IO dependencies, see:
https://docs.xarray.dev/en/stable/getting-started-guide/installing.html
https://docs.xarray.dev/en/stable/user-guide/io.html

@AlexeyPechnikov
Owner

The error message explains that this tile cannot be downloaded:

grdblend [ERROR]: Probably means @S17W046.earth_relief_03s_g.nc does not exist on the remote server

You can load an external DEM instead.

@teagamrs

Previously I ran an analysis over this area with no problems. I'm now running it a second time with more scenes and a smaller AOI, and this error appeared. Anyway, I've downloaded the COP-30 DEM, but:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[17], line 1
----> 1 sbas.load_dem('SaoRomao_COP30.tif',geometry=AOI)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:294](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py#line=293), in Stack_dem.load_dem(self, filename, geometry)
    291     return
    293 if os.path.splitext(filename)[-1] in ['.tiff', '.tif', '.TIF']:
--> 294     ortho = rio.open_rasterio(filename, chunks=self.chunksize).squeeze(drop=True)\
    295         .rename({'y': 'lat', 'x': 'lon'})\
    296         .drop('spatial_ref')
    297     if ortho.lat.diff('lat')[0].item() < 0:
    298         ortho = ortho.reindex(lat=ortho.lat[::-1])

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:1264](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py#line=1263), in open_rasterio(filename, parse_coordinates, chunks, cache, lock, masked, mask_and_scale, variable, group, default_name, decode_times, decode_timedelta, band_as_variable, **open_kwargs)
   1261     result.rio.write_gcps(*riods.gcps, inplace=True)
   1263 if chunks is not None:
-> 1264     result = _prepare_dask(result, riods, filename, chunks)
   1265 else:
   1266     result.encoding["preferred_chunks"] = {
   1267         result.rio.y_dim: riods.block_shapes[0][0],
   1268         result.rio.x_dim: riods.block_shapes[0][1],
   1269         coord_name: 1,
   1270     }

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:929](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py#line=928), in _prepare_dask(result, riods, filename, chunks)
    927 token = tokenize(filename, mtime, chunks)
    928 name_prefix = f"open_rasterio-{token}"
--> 929 return result.chunk(chunks, name_prefix=name_prefix, token=token)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/util/deprecation_helpers.py:115](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/util/deprecation_helpers.py#line=114), in _deprecate_positional_args.<locals>._decorator.<locals>.inner(*args, **kwargs)
    111     kwargs.update({name: arg for name, arg in zip_args})
    113     return func(*args[:-n_extra_args], **kwargs)
--> 115 return func(*args, **kwargs)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataarray.py:1390](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataarray.py#line=1389), in DataArray.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
   1387 else:
   1388     chunks = either_dict_or_kwargs(chunks, chunks_kwargs, "chunk")
-> 1390 ds = self._to_temp_dataset().chunk(
   1391     chunks,
   1392     name_prefix=name_prefix,
   1393     token=token,
   1394     lock=lock,
   1395     inline_array=inline_array,
   1396     chunked_array_type=chunked_array_type,
   1397     from_array_kwargs=from_array_kwargs,
   1398 )
   1399 return self._from_temp_dataset(ds)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2699](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py#line=2698), in Dataset.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
   2696 if from_array_kwargs is None:
   2697     from_array_kwargs = {}
-> 2699 variables = {
   2700     k: _maybe_chunk(
   2701         k,
   2702         v,
   2703         chunks_mapping,
   2704         token,
   2705         lock,
   2706         name_prefix,
   2707         inline_array=inline_array,
   2708         chunked_array_type=chunkmanager,
   2709         from_array_kwargs=from_array_kwargs.copy(),
   2710     )
   2711     for k, v in self.variables.items()
   2712 }
   2713 return self._replace(variables)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2700](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py#line=2699), in <dictcomp>(.0)
   2696 if from_array_kwargs is None:
   2697     from_array_kwargs = {}
   2699 variables = {
-> 2700     k: _maybe_chunk(
   2701         k,
   2702         v,
   2703         chunks_mapping,
   2704         token,
   2705         lock,
   2706         name_prefix,
   2707         inline_array=inline_array,
   2708         chunked_array_type=chunkmanager,
   2709         from_array_kwargs=from_array_kwargs.copy(),
   2710     )
   2711     for k, v in self.variables.items()
   2712 }
   2713 return self._replace(variables)

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py#line=298), in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
    296     chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
    298 if var.ndim:
--> 299     chunked_array_type = guess_chunkmanager(
    300         chunked_array_type
    301     )  # coerce string to ChunkManagerEntrypoint type
    302     if isinstance(chunked_array_type, DaskManager):
    303         from dask.base import tokenize

File [~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119](http://localhost:8930/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py#line=118), in guess_chunkmanager(manager)
    117     return manager
    118 else:
--> 119     raise TypeError(
    120         f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
    121     )

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>

@AlexeyPechnikov
Owner

Here is the solution: https://www.patreon.com/posts/pygmtsar-new-99185442 The recent PyGMTSAR version supports Copernicus DEM downloading.

@AlexeyPechnikov
Owner

AlexeyPechnikov commented Feb 27, 2024 via email

@teagamrs

teagamrs commented Feb 27, 2024

Hey Alexey,

I saw your post earlier, and when I could I updated my notebook, but I'm still getting errors when I run it. To clarify: I've deleted the raw directory from previous runs, so it's a fresh start.

First attempt, with filename=DEM:

AWS().download_dem(AOI)
NOTE: DEM file exists, ignore the command. Use "skip_exist=False" or omit the filename to allow new downloading

Second attempt, without filename=DEM:

---------------------------------------------------------------------------
_RemoteTraceback                          Traceback (most recent call last)
_RemoteTraceback: 
"""
Traceback (most recent call last):
  File "[/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/externals/loky/process_executor.py", line 463](http://localhost:8891/lab/tree/Documents/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/externals/loky/process_executor.py#line=462), in _process_worker
    r = call_item()
        ^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/externals/loky/process_executor.py", line 291, in __call__
    return self.fn(*self.args, **self.kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py", line 589, in __call__
    return [func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py", line 589, in <listcomp>
    return [func(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/AWS.py", line 58, in job_tile
    tile = rio.open_rasterio(f, chunks=self.chunksize)\
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py", line 1264, in open_rasterio
    result = _prepare_dask(result, riods, filename, chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py", line 929, in _prepare_dask
    return result.chunk(chunks, name_prefix=name_prefix, token=token)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/util/deprecation_helpers.py", line 115, in inner
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataarray.py", line 1390, in chunk
    ds = self._to_temp_dataset().chunk(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py", line 2699, in chunk
    variables = {
                ^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py", line 2700, in <dictcomp>
    k: _maybe_chunk(
       ^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py", line 299, in _maybe_chunk
    chunked_array_type = guess_chunkmanager(
                         ^^^^^^^^^^^^^^^^^^^
  File "/home/thamires/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py", line 119, in guess_chunkmanager
    raise TypeError(
TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>
"""

The above exception was the direct cause of the following exception:

TypeError                                 Traceback (most recent call last)
Cell In[21], line 15
      1 # previously, PyGMTSAR internally applied 0.1° buffer
      2 
      3 # define AOI as the whole scenes area
   (...)
     13 # if DEM missed, download Copernicus DEM from open AWS datastore
     14 # get complete 1°x1° tiles covering the AOI, crop them later using AOI
---> 15 AWS().download_dem(AOI)
     16 # don't worry about messages 'ERROR 3: /vsipythonfilelike/ ... : I/O error'

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/AWS.py:71, in AWS.download_dem(self, geometry, filename, n_jobs, product, skip_exist)
     69 #print ('left, right', left, right, 'lower, upper', lower, upper)
     70 with self.tqdm_joblib(tqdm(desc='DEM Tile Downloading', total=(right-left+1)*(upper-lower+1))) as progress_bar:
---> 71     tile_xarrays = joblib.Parallel(n_jobs=n_jobs)(joblib.delayed(job_tile)(product, x, y)\
     72                         for x in range(left, right + 1) for y in range(lower, upper + 1))
     74 dem = xr.combine_by_coords([tile for tile in tile_xarrays if tile is not None])
     75 bounds = self.get_bounds(geometry)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:1952, in Parallel.__call__(self, iterable)
   1946 # The first item from the output is blank, but it makes the interpreter
   1947 # progress until it enters the Try/Except block of the generator and
   1948 # reach the first `yield` statement. This starts the aynchronous
   1949 # dispatch of the tasks to the workers.
   1950 next(output)
-> 1952 return output if self.return_generator else list(output)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:1595, in Parallel._get_outputs(self, iterator, pre_dispatch)
   1592     yield
   1594     with self._backend.retrieval_context():
-> 1595         yield from self._retrieve()
   1597 except GeneratorExit:
   1598     # The generator has been garbage collected before being fully
   1599     # consumed. This aborts the remaining tasks if possible and warn
   1600     # the user if necessary.
   1601     self._exception = True

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:1699, in Parallel._retrieve(self)
   1692 while self._wait_retrieval():
   1693 
   1694     # If the callback thread of a worker has signaled that its task
   1695     # triggered an exception, or if the retrieval loop has raised an
   1696     # exception (e.g. `GeneratorExit`), exit the loop and surface the
   1697     # worker traceback.
   1698     if self._aborting:
-> 1699         self._raise_error_fast()
   1700         break
   1702     # If the next job is not ready for retrieval yet, we just wait for
   1703     # async callbacks to progress.

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:1734, in Parallel._raise_error_fast(self)
   1730 # If this error job exists, immediatly raise the error by
   1731 # calling get_result. This job might not exists if abort has been
   1732 # called directly or if the generator is gc'ed.
   1733 if error_job is not None:
-> 1734     error_job.get_result(self.timeout)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:736, in BatchCompletionCallBack.get_result(self, timeout)
    730 backend = self.parallel._backend
    732 if backend.supports_retrieve_callback:
    733     # We assume that the result has already been retrieved by the
    734     # callback thread, and is stored internally. It's just waiting to
    735     # be returned.
--> 736     return self._return_or_raise()
    738 # For other backends, the main thread needs to run the retrieval step.
    739 try:

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/joblib/parallel.py:754, in BatchCompletionCallBack._return_or_raise(self)
    752 try:
    753     if self.status == TASK_ERROR:
--> 754         raise self._result
    755     return self._result
    756 finally:

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>

Third, using GMT:

GMT().download_dem(AOI,product='1s')
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Cell In[23], line 1
----> 1 GMT().download_dem(AOI,product='1s')

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/GMT.py:201, in GMT.download_dem(self, geometry, filename, product, skip_exist)
    198     pbar.update(1)
    200 if filename is None:
--> 201     da = xr.open_dataarray(grdname, engine=self.netcdf_engine, chunks=self.netcdf_chunksize)
    202     os.remove(grdname)
    203     return da

NameError: name 'xr' is not defined

After all these tries, one DEM was downloaded into the data directory. I tried to load it:

'data_hidrogenio_desc/dem.nc'
Stack.load_dem(DEM)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[27], line 1
----> 1 Stack.load_dem(DEM)

TypeError: Stack_dem.load_dem() missing 1 required positional argument: 'data'

I also tried to load it with another function:

sbas.load_dem(DEM)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[34], line 1
----> 1 sbas.load_dem(DEM)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:220, in Stack_dem.load_dem(self, data, geometry)
    218         ortho = ortho.reindex(lat=ortho.lat[::-1])
    219 elif isinstance(data, str) and os.path.splitext(data)[-1] in ['.nc', '.netcdf', '.grd']:
--> 220     ortho = xr.open_dataarray(data, engine=self.netcdf_engine, chunks=self.chunksize)
    221 elif isinstance(data, str):
    222     print ('ERROR: filename extension is not recognized. Should be one from .tiff, .tif, .TIF, .nc, .netcdf, .grd')

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:749, in open_dataarray(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    596 def open_dataarray(
    597     filename_or_obj: str | os.PathLike[Any] | BufferedIOBase | AbstractDataStore,
    598     *,
   (...)
    614     **kwargs,
    615 ) -> DataArray:
    616     """Open an DataArray from a file or file-like object containing a single
    617     data variable.
    618 
   (...)
    746     open_dataset
    747     """
--> 749     dataset = open_dataset(
    750         filename_or_obj,
    751         decode_cf=decode_cf,
    752         mask_and_scale=mask_and_scale,
    753         decode_times=decode_times,
    754         concat_characters=concat_characters,
    755         decode_coords=decode_coords,
    756         engine=engine,
    757         chunks=chunks,
    758         cache=cache,
    759         drop_variables=drop_variables,
    760         inline_array=inline_array,
    761         chunked_array_type=chunked_array_type,
    762         from_array_kwargs=from_array_kwargs,
    763         backend_kwargs=backend_kwargs,
    764         use_cftime=use_cftime,
    765         decode_timedelta=decode_timedelta,
    766         **kwargs,
    767     )
    769     if len(dataset.data_vars) != 1:
    770         raise ValueError(
    771             "Given file dataset contains more than one data "
    772             "variable. Please read with xarray.open_dataset and "
    773             "then select the variable you want."
    774         )

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:579, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    572 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
    573 backend_ds = backend.open_dataset(
    574     filename_or_obj,
    575     drop_variables=drop_variables,
    576     **decoders,
    577     **kwargs,
    578 )
--> 579 ds = _dataset_from_backend_dataset(
    580     backend_ds,
    581     filename_or_obj,
    582     engine,
    583     chunks,
    584     cache,
    585     overwrite_encoded_chunks,
    586     inline_array,
    587     chunked_array_type,
    588     from_array_kwargs,
    589     drop_variables=drop_variables,
    590     **decoders,
    591     **kwargs,
    592 )
    593 return ds

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:372, in _dataset_from_backend_dataset(backend_ds, filename_or_obj, engine, chunks, cache, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    370     ds = backend_ds
    371 else:
--> 372     ds = _chunk_ds(
    373         backend_ds,
    374         filename_or_obj,
    375         engine,
    376         chunks,
    377         overwrite_encoded_chunks,
    378         inline_array,
    379         chunked_array_type,
    380         from_array_kwargs,
    381         **extra_tokens,
    382     )
    384 ds.set_close(backend_ds._close)
    386 # Ensure source filename always stored in dataset object

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:337, in _chunk_ds(backend_ds, filename_or_obj, engine, chunks, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    335 for name, var in backend_ds.variables.items():
    336     var_chunks = _get_chunk(var, chunks, chunkmanager)
--> 337     variables[name] = _maybe_chunk(
    338         name,
    339         var,
    340         var_chunks,
    341         overwrite_encoded_chunks=overwrite_encoded_chunks,
    342         name_prefix=name_prefix,
    343         token=token,
    344         inline_array=inline_array,
    345         chunked_array_type=chunkmanager,
    346         from_array_kwargs=from_array_kwargs.copy(),
    347     )
    348 return backend_ds._replace(variables)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299, in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
    296     chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
    298 if var.ndim:
--> 299     chunked_array_type = guess_chunkmanager(
    300         chunked_array_type
    301     )  # coerce string to ChunkManagerEntrypoint type
    302     if isinstance(chunked_array_type, DaskManager):
    303         from dask.base import tokenize

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119, in guess_chunkmanager(manager)
    117     return manager
    118 else:
--> 119     raise TypeError(
    120         f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
    121     )

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>

I also tried to load an external DEM (a COP-30 I downloaded myself):

sbas.load_dem('SaoRomao_COP30.tif')
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[33], line 1
----> 1 sbas.load_dem('SaoRomao_COP30.tif')

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:214, in Stack_dem.load_dem(self, data, geometry)
    212     ortho = data
    213 elif isinstance(data, str) and os.path.splitext(data)[-1] in ['.tiff', '.tif', '.TIF']:
--> 214     ortho = rio.open_rasterio(data, chunks=self.chunksize).squeeze(drop=True)\
    215         .rename({'y': 'lat', 'x': 'lon'})\
    216         .drop('spatial_ref')
    217     if ortho.lat.diff('lat')[0].item() < 0:
    218         ortho = ortho.reindex(lat=ortho.lat[::-1])

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:1264, in open_rasterio(filename, parse_coordinates, chunks, cache, lock, masked, mask_and_scale, variable, group, default_name, decode_times, decode_timedelta, band_as_variable, **open_kwargs)
   1261     result.rio.write_gcps(*riods.gcps, inplace=True)
   1263 if chunks is not None:
-> 1264     result = _prepare_dask(result, riods, filename, chunks)
   1265 else:
   1266     result.encoding["preferred_chunks"] = {
   1267         result.rio.y_dim: riods.block_shapes[0][0],
   1268         result.rio.x_dim: riods.block_shapes[0][1],
   1269         coord_name: 1,
   1270     }

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/rioxarray/_io.py:929, in _prepare_dask(result, riods, filename, chunks)
    927 token = tokenize(filename, mtime, chunks)
    928 name_prefix = f"open_rasterio-{token}"
--> 929 return result.chunk(chunks, name_prefix=name_prefix, token=token)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/util/deprecation_helpers.py:115, in _deprecate_positional_args.<locals>._decorator.<locals>.inner(*args, **kwargs)
    111     kwargs.update({name: arg for name, arg in zip_args})
    113     return func(*args[:-n_extra_args], **kwargs)
--> 115 return func(*args, **kwargs)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataarray.py:1390, in DataArray.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
   1387 else:
   1388     chunks = either_dict_or_kwargs(chunks, chunks_kwargs, "chunk")
-> 1390 ds = self._to_temp_dataset().chunk(
   1391     chunks,
   1392     name_prefix=name_prefix,
   1393     token=token,
   1394     lock=lock,
   1395     inline_array=inline_array,
   1396     chunked_array_type=chunked_array_type,
   1397     from_array_kwargs=from_array_kwargs,
   1398 )
   1399 return self._from_temp_dataset(ds)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2699, in Dataset.chunk(self, chunks, name_prefix, token, lock, inline_array, chunked_array_type, from_array_kwargs, **chunks_kwargs)
   2696 if from_array_kwargs is None:
   2697     from_array_kwargs = {}
-> 2699 variables = {
   2700     k: _maybe_chunk(
   2701         k,
   2702         v,
   2703         chunks_mapping,
   2704         token,
   2705         lock,
   2706         name_prefix,
   2707         inline_array=inline_array,
   2708         chunked_array_type=chunkmanager,
   2709         from_array_kwargs=from_array_kwargs.copy(),
   2710     )
   2711     for k, v in self.variables.items()
   2712 }
   2713 return self._replace(variables)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:2700, in <dictcomp>(.0)
   2696 if from_array_kwargs is None:
   2697     from_array_kwargs = {}
   2699 variables = {
-> 2700     k: _maybe_chunk(
   2701         k,
   2702         v,
   2703         chunks_mapping,
   2704         token,
   2705         lock,
   2706         name_prefix,
   2707         inline_array=inline_array,
   2708         chunked_array_type=chunkmanager,
   2709         from_array_kwargs=from_array_kwargs.copy(),
   2710     )
   2711     for k, v in self.variables.items()
   2712 }
   2713 return self._replace(variables)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299, in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
    296     chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
    298 if var.ndim:
--> 299     chunked_array_type = guess_chunkmanager(
    300         chunked_array_type
    301     )  # coerce string to ChunkManagerEntrypoint type
    302     if isinstance(chunked_array_type, DaskManager):
    303         from dask.base import tokenize

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119, in guess_chunkmanager(manager)
    117     return manager
    118 else:
--> 119     raise TypeError(
    120         f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
    121     )

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>
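
For reference, every one of these attempts ends in the same chunkmanager TypeError, which suggests the xarray, dask, and rioxarray versions in this environment are out of sync. A minimal diagnostic sketch (standard library only; the package list is my guess at the relevant ones):

```python
# Check which versions of the suspect packages are actually installed;
# the chunkmanager TypeError typically comes from mismatched xarray/dask copies.
from importlib.metadata import version, PackageNotFoundError

def installed_versions(packages):
    """Return {package: version string, or None if the package is missing}."""
    out = {}
    for pkg in packages:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None
    return out

print(installed_versions(('xarray', 'dask', 'rioxarray')))
```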

I'm sorry to bother you with all these comments, but I just don't know what to do. Everything was working before this DEM download issue appeared :(

@AlexeyPechnikov
Owner

Your DEM file is located in the data directory instead of the raw directory. You can remove it or use the suggested function argument to rewrite it.

@teagamrs

I adjusted the directory from data to raw and deleted the previous DEM file from data. It looks like the Copernicus DEM from AWS is still not downloading (same error as in my comment above), but the GMT function does produce an .nc file.
This error appears when loading it:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[24], line 2
      1 # define the area of interest (AOI) to speedup the processing
----> 2 sbas.load_dem(DEM)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/pygmtsar/Stack_dem.py:220, in Stack_dem.load_dem(self, data, geometry)
    218         ortho = ortho.reindex(lat=ortho.lat[::-1])
    219 elif isinstance(data, str) and os.path.splitext(data)[-1] in ['.nc', '.netcdf', '.grd']:
--> 220     ortho = xr.open_dataarray(data, engine=self.netcdf_engine, chunks=self.chunksize)
    221 elif isinstance(data, str):
    222     print ('ERROR: filename extension is not recognized. Should be one from .tiff, .tif, .TIF, .nc, .netcdf, .grd')

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:749, in open_dataarray(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    596 def open_dataarray(
    597     filename_or_obj: str | os.PathLike[Any] | BufferedIOBase | AbstractDataStore,
    598     *,
   (...)
    614     **kwargs,
    615 ) -> DataArray:
    616     """Open an DataArray from a file or file-like object containing a single
    617     data variable.
    618 
   (...)
    746     open_dataset
    747     """
--> 749     dataset = open_dataset(
    750         filename_or_obj,
    751         decode_cf=decode_cf,
    752         mask_and_scale=mask_and_scale,
    753         decode_times=decode_times,
    754         concat_characters=concat_characters,
    755         decode_coords=decode_coords,
    756         engine=engine,
    757         chunks=chunks,
    758         cache=cache,
    759         drop_variables=drop_variables,
    760         inline_array=inline_array,
    761         chunked_array_type=chunked_array_type,
    762         from_array_kwargs=from_array_kwargs,
    763         backend_kwargs=backend_kwargs,
    764         use_cftime=use_cftime,
    765         decode_timedelta=decode_timedelta,
    766         **kwargs,
    767     )
    769     if len(dataset.data_vars) != 1:
    770         raise ValueError(
    771             "Given file dataset contains more than one data "
    772             "variable. Please read with xarray.open_dataset and "
    773             "then select the variable you want."
    774         )

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:579, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
    572 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
    573 backend_ds = backend.open_dataset(
    574     filename_or_obj,
    575     drop_variables=drop_variables,
    576     **decoders,
    577     **kwargs,
    578 )
--> 579 ds = _dataset_from_backend_dataset(
    580     backend_ds,
    581     filename_or_obj,
    582     engine,
    583     chunks,
    584     cache,
    585     overwrite_encoded_chunks,
    586     inline_array,
    587     chunked_array_type,
    588     from_array_kwargs,
    589     drop_variables=drop_variables,
    590     **decoders,
    591     **kwargs,
    592 )
    593 return ds

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:372, in _dataset_from_backend_dataset(backend_ds, filename_or_obj, engine, chunks, cache, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    370     ds = backend_ds
    371 else:
--> 372     ds = _chunk_ds(
    373         backend_ds,
    374         filename_or_obj,
    375         engine,
    376         chunks,
    377         overwrite_encoded_chunks,
    378         inline_array,
    379         chunked_array_type,
    380         from_array_kwargs,
    381         **extra_tokens,
    382     )
    384 ds.set_close(backend_ds._close)
    386 # Ensure source filename always stored in dataset object

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/backends/api.py:337, in _chunk_ds(backend_ds, filename_or_obj, engine, chunks, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs, **extra_tokens)
    335 for name, var in backend_ds.variables.items():
    336     var_chunks = _get_chunk(var, chunks, chunkmanager)
--> 337     variables[name] = _maybe_chunk(
    338         name,
    339         var,
    340         var_chunks,
    341         overwrite_encoded_chunks=overwrite_encoded_chunks,
    342         name_prefix=name_prefix,
    343         token=token,
    344         inline_array=inline_array,
    345         chunked_array_type=chunkmanager,
    346         from_array_kwargs=from_array_kwargs.copy(),
    347     )
    348 return backend_ds._replace(variables)

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/core/dataset.py:299, in _maybe_chunk(name, var, chunks, token, lock, name_prefix, overwrite_encoded_chunks, inline_array, chunked_array_type, from_array_kwargs)
    296     chunks = {dim: chunks[dim] for dim in var.dims if dim in chunks}
    298 if var.ndim:
--> 299     chunked_array_type = guess_chunkmanager(
    300         chunked_array_type
    301     )  # coerce string to ChunkManagerEntrypoint type
    302     if isinstance(chunked_array_type, DaskManager):
    303         from dask.base import tokenize

File ~/anaconda3/envs/interferometry/lib/python3.11/site-packages/xarray/namedarray/parallelcompat.py:119, in guess_chunkmanager(manager)
    117     return manager
    118 else:
--> 119     raise TypeError(
    120         f"manager must be a string or instance of ChunkManagerEntrypoint, but received type {type(manager)}"
    121     )

TypeError: manager must be a string or instance of ChunkManagerEntrypoint, but received type <class 'xarray.core.daskmanager.DaskManager'>

@AlexeyPechnikov
Owner

I adjusted the directory from data to raw ...

You shouldn't do that because the 'raw' directory is recreated for every run, and your DEM file is dropped before you initialize the main stack object. Please refer to the example notebooks for the correct pipeline.
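
As a minimal sketch of that convention (the directory names are the defaults from the example notebooks; the Stack initialization and the download calls themselves are omitted):

```python
import os

# Persistent inputs (downloaded scenes, dem.nc) live in the data directory;
# the raw working directory is recreated on every run, so nothing kept there survives.
DATADIR = 'data'
WORKDIR = 'raw'
DEM = os.path.join(DATADIR, 'dem.nc')

os.makedirs(DATADIR, exist_ok=True)
# download the DEM to the DEM path *before* initializing the main stack object,
# so it is not dropped when WORKDIR is recreated
```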

@teagamrs

teagamrs commented Feb 27, 2024

The dem.nc was in data, as I saw in the notebooks:

# define DEM filename inside data directory
DEM = f'{DATADIR}/dem.nc'

I've restarted everything and run a new notebook from scratch, following the updated pipeline (using Imperial_Valley as a reference), but got the same errors when loading the DEM. The notebook is attached.

NewInterferometryFlow (1).zip

@AlexeyPechnikov
Owner

If you need assistance, feel free to load it on Google Colab.

@teagamrs

teagamrs commented Feb 27, 2024

It works fine on Google Colab: https://colab.research.google.com/drive/1JFMNVCDRRDScN240YKJarjRuXNfNV7Z0?usp=sharing
I'll keep trying with my local notebook, because Colab ran out of memory in my past attempts. Thanks for your support!

Edit: Solved downgrading the Python to 3.10.13!!!

@AlexeyPechnikov
Owner

Edit: Solved downgrading the Python to 3.10.13!!!

What Python version were you using previously? By the way, Google Colab currently uses Python 3.10.12 and locally I use 3.11.6.

@teagamrs

Edit: Solved downgrading the Python to 3.10.13!!!

What Python version were you using previously? By the way, Google Colab currently uses Python 3.10.12 and locally I use 3.11.6.

I was using 3.11.8 and ran some analyses in the meantime. But later it stopped working until I downgraded to 3.10.13, which solved the issues.
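
For anyone landing here with the same errors, here is a sketch of recreating the environment pinned to the Python version that worked in this thread (3.10.13); the environment name follows the tracebacks above, and installing pygmtsar last is an assumption so pip resolves compatible xarray/dask/rioxarray builds:

```shell
# recreate the conda environment with the Python version that worked here
conda create -n interferometry python=3.10.13 -y
conda activate interferometry
# let pip pull xarray/dask/rioxarray versions compatible with this Python
pip install pygmtsar
```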
