Gradient descending coreg #346
Conversation
adding gradientdescending coreg
NuthKaab also supports point-to-DEM coregistration. This is an implementation by Zhihao.
Combine xdem pts and add nuthkaab pts
Edit bands into indexes of DEM class (GlacioHack#340)
Hi @liuh886, this is great stuff! I have comments but I think we can fix this together and merge it soon. This is a fantastic addition to xdem!
Before merging, we should have a proper description here of the functionality that is introduced. Technically, we should also add documentation, but that can come later too.
- geoutils==0.0.10
- noisyopt
We need to make sure this propagates to pypi and conda-forge (I'll help! This is more a reminder to me)
xdem/coreg.py
Outdated
@@ -431,7 +597,7 @@ def __init__(self, meta: CoregDict | None = None, matrix: NDArrayf | None = None

    def fit(
        self: CoregType,
-       reference_dem: NDArrayf | MArrayf | RasterType,
+       reference_dem: NDArrayf | MArrayf | RasterType | pd.DataFrame,
Let's have a discussion on dataframes. Is that not just a nicely wrapped ndarray? In other parts of xdem and geoutils, we use point clouds as ndarrays of shape=(N, 3), where the second dimension is (X, Y, Z). Unless there's a good use of dataframes, I really think we should make it simpler by only accepting ndarrays.
A compromise is to have the fit_pts function take dataframes, but have them converted to an ndarray that the internal function (_fit_pts_func) uses.
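A minimal sketch of that compromise, assuming the point cloud arrives as a pandas DataFrame with hypothetical column names 'E', 'N' and 'z' (the helper name pts_to_ndarray is made up for illustration, not part of this PR):

import numpy as np
import pandas as pd

def pts_to_ndarray(df: pd.DataFrame) -> np.ndarray:
    # Convert a point-cloud DataFrame to the (N, 3) layout (X, Y, Z)
    # used elsewhere in xdem/geoutils. Column names are assumed here.
    return df[["E", "N", "z"]].to_numpy()

pts = pts_to_ndarray(pd.DataFrame({"E": [0.0, 10.0], "N": [0.0, 10.0], "z": [100.0, 101.0]}))
assert pts.shape == (2, 3)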
As with below, this signature should only be for fit_pts, not fit. fit is for rasters while fit_pts is for points (and thus dataframes)!
So far, we use fit_pts to call point coregistration, instead of fit, for example:

gds = xdem.coreg.GradientDescending(downsampling=6000)
gds.fit_pts(shifted_ref, self.tba, inlier_mask=self.inlier_mask, verbose=verbose)

nkp = xdem.coreg.NuthKaab()
nkp.fit_pts(shifted_ref, self.tba, inlier_mask=self.inlier_mask, verbose=verbose)

NDArray will replace DataFrame soon.
My take here is that we should accept any type of point data (Vector, GeoDataFrame, NDArray), but this is more a problem for GeoUtils to tackle, to make things more easily consistent. For xdem.Coreg, I agree it's good to keep a standard input for now (+1 on NDArray).
Great addition @liuh886! 😄:
What do you think?
Thanks a lot for your input, @rhugonnet! I wholeheartedly agree on the need to abstract the coreg classes and potentially make it into a submodule. I certainly think we should consider this in the very near future and I would be more than happy to contribute. For now, however, I ask that we consider one step at a time.

Perhaps the context of the timing of this PR will shed some light on the situation: Marco Mazzolini and Désirée Treichler have both been working with different forks of xdem for months. Most notably, it's the

I'm contributing to GlaMBIE together with Désirée, which involves some heavy ICESat-2 processing. As you may know, the deadline is quite close (late May)... I need ICP and NuthKaab with points for it to work, or else we have to rescind our contribution. My options were either to fork @liuh886's branch, which is a fork of your branch, which is a fork of xdem main, or we could finally converge the functionality developed here in Oslo into mainline xdem. After this PR, I will jump straight into a

Because of these circumstances, I ask that we focus on the functionality that will be added with this PR, and hold off on your suggestions (which I very much like the sound of) until after the GlaMBIE deadline!

Finally, while @liuh886 is the expert on his own work, I can at least say that
Again, for clarification @rhugonnet, @liuh886, please let me know if you can explain it better! Or we'll just sit tight for your thesis ;)
That is more than fine for me too! Your blog post is also great, but I seem to have lost the link!
Thanks, your explanation is very nice. And 'Gradient Descent' is the correct way to call it!

This PR aims to support 'usable' point co-registration. We will also update the pts version of NuthKaab (there are several improvements to make it run faster). Sure, we need to rethink how to organize the Coreg class...

I have used 'gradient descent coregistration (GDC)' heavily in recent months, and have done coreg over DTM1 and ICESat-2 on a national scale. So far I am more concerned about the limitation of GDC: it is fast, but needs as many points as possible (5,000, for example) to overcome the noise, because:
(1) NMAD is still a 'statistical' metric.
(2) A half-pixel shift for a 10 m DEM is 5 meters, but just 0.5 m for a 1 m DEM, where the gradient is not steep on the scale of the shift.
(3) GDC does not give a fixed result, because there are several routes to reach 'the same minimum'; in most cases, the differences between results are negligible.

There are several ways to overcome the noise: increasing the number of sample points, and assigning weights to 'high-quality measurements'. I have a post about that.
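For illustration, a minimal sketch of the kind of noise-robust cost such a coregistration minimizes; the (N, 3) point layout and the dem_sample interpolator are assumptions for this sketch, not the internals of this PR:

import numpy as np

def nmad(dh: np.ndarray) -> float:
    # Normalized median absolute deviation of the elevation differences:
    # a robust spread estimate, less sensitive to outliers than the RMSE.
    return 1.4826 * float(np.nanmedian(np.abs(dh - np.nanmedian(dh))))

def shift_cost(shift, pts, dem_sample):
    # pts: (N, 3) array of (X, Y, Z); dem_sample: callable returning DEM heights at (x, y).
    # Shift the points east/north, recompute dh, and score it with NMAD.
    east, north = shift
    dh = pts[:, 2] - dem_sample(pts[:, 0] + east, pts[:, 1] + north)
    return nmad(dh)

# Tiny demo: points 0.3 m above a flat reference surface at 100 m.
pts = np.array([[0.0, 0.0, 100.3], [10.0, 5.0, 100.3]])
print(shift_cost((0.0, 0.0), pts, lambda x, y: np.full_like(x, 100.0)))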
Thanks for the details @liuh886 and the amazing work, can't wait to dive into it! 😊 And also thanks @erikmannerfelt for explaining the background behind all this, it's good to understand all of this to best move forward :).
For 3/, we'll need to figure out the most appropriate time. I'll be working on the

The study will be a good occasion to test all methods on the same benchmark! I will follow up by email on this in the next weeks @liuh886 @erikmannerfelt! 😉
…ing constant.
- Added fit() (Raster) functionality for GradientDescending.
- Parameterized test to debug and extend it more easily.
Half-pixel shift fix and other improvements
@erikmannerfelt Yes, I do need help. I found that the test environment does not seem to run under the latest version of geoutils, v0.0.10. In geoutils v0.0.10, 'area_or_point' has been replaced with 'shift_area_or_point' to solve the 'upper left corner or center' problem, which is actually the origin of the half-pixel issue. I saw that in your PR 'area_or_point' was addressed again to fix the half-pixel issue. However, it cannot run under the latest geoutils v0.0.10 on my computer. So, can you double-check the test environment?
Hi @liuh886. @rhugonnet seems to have figured out the problem with the build. Try his hack from 8defc14; move
It's not the first time that richdem has been an issue...
@liuh886 If you have nothing specific to your method left to add, I can help merge this into the package and solve the conflicts! 😉
Yes, please. I don't have any new functions to add 🙂
Adding support for point co-registration
To check it, run the following:
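A minimal sketch of such a check, assuming a DEM to be aligned and a reference point cloud in a CSV; the file names and column layout are hypothetical placeholders, and the calls follow the fit_pts usage shown earlier in this thread:

import pandas as pd
import xdem

# Hypothetical inputs: a DEM to be aligned and a reference point cloud with
# easting/northing/height columns (paths and column names assumed for illustration).
tba_dem = xdem.DEM("dem_to_be_aligned.tif")
ref_pts = pd.read_csv("reference_points.csv")

# Point-to-DEM coregistration with the new GradientDescending class.
gds = xdem.coreg.GradientDescending(downsampling=6000)
gds.fit_pts(ref_pts, tba_dem)
aligned_dem = gds.apply(tba_dem)

# The pts version of NuthKaab can be called the same way.
nk = xdem.coreg.NuthKaab()
nk.fit_pts(ref_pts, tba_dem)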
Changes in environment.yml, dev-environment.yml
This package (noisyopt) provides a pattern-search-enhanced gradient descent algorithm, which copes with noisy cost functions better than scipy.optimize.minimize.
Changes in coreg.py
Add the following functions/classes required by point coregistration:
- df_sampling_from_dem(). Samples a DEM into a data frame if a point cloud is not provided.
- residuals_df(). The dh between the point cloud and the DEM is calculated by this function.
- GradientDescending. Solves the coregistration, using residuals_df as the cost function.
- apply_xyz_shift_df(). In both GDS and NuthKaab_pts, the point cloud is shifted instead of the DEM.

To make it happen, updated:
- _fit_pts_func(). Now nuth_kaab.fit_pts works.
- fit_pts(). GDC and NuthKaab can call fit_pts. It will sample ref_dem into a point cloud (dataframe) by df_sampling_from_dem if the dataframe is not provided.

Known issues:
The following parameters normally do not need to change:
- x0: tuple = (0, 0). The initial shift.
- bounds: tuple = (-5, 5). The bounds to search within.
- deltainit: int = 2. The initial pattern size for the search.
- deltatol: float = 0.004. The target pattern size, i.e. the precision you want to achieve.
- feps: float = 0.0001. The smallest difference in function value to resolve.
df_sampling_from_dem can be replaced by functions from GeoUtils.
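For context, a rough sketch of how the search parameters listed above could be passed to noisyopt's minimizeCompass pattern search; the dummy cost function and the per-axis expansion of the bounds are assumptions for illustration, not the exact code of this PR:

import numpy as np
from noisyopt import minimizeCompass

def dummy_cost(shift: np.ndarray) -> float:
    # Stand-in for residuals_df(): in the PR, the cost is a robust statistic (NMAD)
    # of dh for a given (east, north) shift. Here, a smooth surface with a minimum at (1.5, -0.5).
    east, north = shift
    return float(np.hypot(east - 1.5, north + 0.5))

res = minimizeCompass(
    dummy_cost,
    x0=np.array([0.0, 0.0]),             # initial shift
    bounds=np.array([[-5.0, 5.0]] * 2),  # search bounds, expanded per axis
    deltainit=2,                         # initial pattern size
    deltatol=0.004,                      # target pattern size (precision)
    feps=0.0001,                         # smallest resolvable change in the cost
    errorcontrol=False,                  # deterministic cost in this toy example
)
print(res.x)  # estimated (east, north) shift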