
Mavic 3M Support #1771

Merged: 11 commits into OpenDroneMap:master on Jun 29, 2024

Conversation

@pierotofy (Member) commented Jun 28, 2024

Continuing the work from #1740

I finally got good results on a test dataset. The misalignment issue on the Mavic 3M was mainly caused by a fault in how the homography was computed: the images are larger than 1280 pixels, which was throwing the calculations way off. The homography needed to be rescaled for the results to make sense.

This should then fix alignment for other sensors as well.
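For context, a minimal sketch of the general idea (not the exact code in this PR): a homography estimated between images downscaled to a 1280 px maximum dimension maps downscaled coordinates to downscaled coordinates, so it has to be conjugated by the scale factor before being applied at full resolution.

```python
import numpy as np

def rescale_homography(h, scale):
    """Sketch only: rescale a 3x3 homography estimated at a downscaled
    resolution so it applies at full resolution.

    scale = full_dimension / downscaled_dimension (e.g. full_width / 1280.0).
    """
    s = np.array([[scale, 0, 0],
                  [0, scale, 0],
                  [0, 0, 1]], dtype=np.float64)
    # x_full = S x_down, so H_full = S H S^-1
    return s @ h @ np.linalg.inv(s)
```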

Furthermore, the RGB images were dewarped, but didn't match the size of the multispectral bands. This was causing issues during alignment.

It should now be possible to process RGB+multispectral, or multispectral data alone from the Mavic 3M.

RGB: [orthophoto screenshot]

RNirRe: [orthophoto screenshot]

@pierotofy (Member, Author) commented Jun 28, 2024

One issue with this approach, however, is that the RGB sensor and the MS sensor have different sensitivities / store different tags. I don't know how to reconcile these differences when radiometric calibration is turned on (often the use case with MS images).

I'm going to leave this RGB+MS code in the https://github.com/pierotofy/ODM/tree/mavic3mrgbms branch. I think it's fundamentally difficult to get accurate measurements using different sensors. I think the workflow with this drone is to process the MS and RGB data separately and get two orthophotos.

This PR will then add support for processing MS datasets only on the M3M, which is still a nice improvement.

@smathermather (Contributor)

Informed of the consequences or not, it's a very common request to be able to process RGB and multispectral images together.

Does it make sense to allow processing together in main branch ODM if no radiometric calibration is applied?

@smathermather (Contributor)

Related: what kind of testing do you want? I've got a small pile of datasets now, although I may need to go back to some folks for different sharing links or smaller datasets.

@pierotofy (Member, Author)

Does it make sense to allow processing together in main branch ODM if no radiometric calibration is applied?

Does other software do it?

Aside from the additional complexity that leaving this code in the main branch would add, I'd argue there's little value in having a combined orthophoto with substantially different units. You can't make meaningful measurements from it.

@pierotofy (Member, Author)

what kind of testing do you want?

It would be nice to test a few Mavic 3M datasets, or MS datasets from other drones that have had misalignment issues (especially if the images were > 1280 px). This PR should improve processing of those.

@pierotofy (Member, Author) commented Jun 28, 2024

I would add: if someone can point to a way to get normalized reflectance values between MS and RGB sensors, I'd happily bring the feature into main.

@smathermather (Contributor) commented Jun 28, 2024

Does other software do it?

Aside from the additional complexity that leaving this code in the main branch would add, I'd argue there's little value in having a combined orthophoto with substantially different units. You can't make meaningful measurements from it.

Probably the jetlag, but I'm kind of unclear on what the issue is between RGB and MS: is this 8-bit vs. 16-bit, or DN vs. reflectance, or mixing RGB + R + G + RE + NIR (or R + G + (B from RGB) + RE + NIR) in a mixture of DN and reflectance?

It's not uncommon in the remote sensing world to mix different DN depths in the same image stack; MODIS was the first major sensor to do this. From a downstream science perspective, alignment at the sensor level would be much preferable to two orthophotos (short of aligning at the sensor and adding the complexity of delivering two payloads), which is why folks are asking for everything together.

Not sure what other products do. Don't really care (much).

In short, I don't see any issue in mixing bit depths (if that's the issue); that's normal and useful. But mixing DN and reflectance probably doesn't make sense.

Edit: from a science applications perspective, the best possible output is:

RGB + R + G + RE + NIR, with no band swapping. This preserves RGB resolution, doesn't mix bit depths within the RGB, and allows for a fully aligned set of products and optional post-processing of MS bands.

@pierotofy (Member, Author)

DN is not a problem. (That's what you can get with the code now in https://github.com/pierotofy/ODM/tree/mavic3mrgbms)

Reflectance is. You're going to get very different scales between RGB and MS. Problem: people will upload the images in WebODM, select "Multispectral", and will get wrong index calculations in the plant health tab.

@smathermather (Contributor) commented Jun 29, 2024

Ah, gotcha on mixing the RGB and MS bands. Point of clarification: there's calibration happening for both sets of bands, but the ranges don't match? That seems (broadly) solvable (not sure how to handle the blue, but I have a hunch or two).

But I'm still seeing alignment issues, maybe because this is over water:
[screenshot]

https://crankyserver.com/public/task/c2d619f3-1502-403c-b014-01ce2cef0207/map/?t=orthophoto

This is a quick to run subset, so I'll see what happens on the full dataset.

@pierotofy (Member, Author) commented Jun 29, 2024

there's calibration happening for both sets of bands, but the ranges don't match?

Raw image values (0-255 for RGB, 0-65535 for MS) need to go through dn_to_radiance (see multispectral.py). At the end of that function, values need to be in a consistent range to be meaningful. They don't seem to be, at least with the current logic. For example, gain and exposure time seem to differ between the two sensors (leaving aside other corrections).
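To illustrate the shape of the problem (a simplified sketch, not the actual multispectral.py logic): if the two sensors report different gains and exposure times, a normalization of roughly this form lands the two outputs on different scales.

```python
def dn_to_radiance_simplified(dn, bit_depth, gain, exposure_time):
    # Simplified sketch only: scale raw DNs (0-255 RGB, 0-65535 MS) to [0, 1]
    # by bit depth, then compensate for sensor gain and exposure time.
    # The real dn_to_radiance applies additional corrections on top of this.
    normalized = dn / float(2 ** bit_depth - 1)
    return normalized / (gain * exposure_time)
```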

Maybe there's some way to reconcile the two, but even DJI's own manual seems to hint at the fact that the MS bands are to be treated separately from RGB for processing. https://dl.djicdn.com/downloads/DJI_Mavic_3_Enterprise/20230829/Mavic_3M_Image_Processing_Guide_EN.pdf

@pierotofy merged commit 9acbaf5 into OpenDroneMap:master on Jun 29, 2024 (2 checks passed)
@smathermather (Contributor) commented Jun 29, 2024

I'm still seeing alignment issues. Did you want more testing? I've been gathering datasets for a few weeks for that purpose.

The above example is the first failure. It will take a couple days to get the rest tested, but I'll post those results as they complete.

Edit: the test above was run incorrectly for this branch. Nevertheless, this got merged while I was in the midst of testing.

@smathermather (Contributor) commented Jun 29, 2024

Maybe there's some way to reconcile the two, but even DJI's own manual seems to hint at the fact that the MS bands are to be treated separately from RGB for processing. https://dl.djicdn.com/downloads/DJI_Mavic_3_Enterprise/20230829/Mavic_3M_Image_Processing_Guide_EN.pdf

Why publish sensor-specific homography data and embed it in the EXIF if the bands aren't meant to be used together? Should they be used in band math together? Probably not. Are they useful together in aligned datasets? Yes.
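(To be concrete, a minimal sketch, not ODM's actual code: applying such a per-band homography is just a perspective warp, assuming the 3×3 matrix has already been parsed out of the EXIF/XMP.)

```python
import cv2
import numpy as np

def warp_band_to_reference(band, h, ref_shape):
    # h: 3x3 homography for this band (parsing from metadata not shown)
    # ref_shape: (height, width) of the reference image to align against
    height, width = ref_shape[:2]
    return cv2.warpPerspective(band, np.asarray(h, dtype=np.float64), (width, height))
```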

If I had to guess, the values differ because the sensors are different. They shouldn't have the same reflectance values, because the R in the RGB doesn't have the same spectral response function as the R from the MS bands. With different spectral response curves, the reflectance values will differ between channels that share a name. This is normal. Where it might become a problem is in indices, as you highlight.

In the absence of spectral response curves published by DJI and a spectral resampling approach to apply those curves to ensure the bands match between RGB and MS, there is no robust, non-heuristic way to make them have the same range of values. That doesn't mean the RGB values are wrong, simply incommensurate. There are ways to work around this heuristically, but that's another conversation.

So: no (robust, non-heuristic) way to make them match, but also IMO no reason to make them match -- they aren't the same band, they are just named similarly.
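(For anyone unfamiliar with spectral resampling, here is a minimal sketch of what applying such curves would mean; the response arrays are hypothetical, since DJI doesn't publish them.)

```python
import numpy as np

def band_average_reflectance(wavelengths_nm, surface_reflectance, srf):
    # SRF-weighted average of a surface reflectance spectrum. Two bands that
    # are both called "red" but have different SRFs will generally give
    # different values for the same surface -- the incommensurability above.
    return np.trapz(surface_reflectance * srf, wavelengths_nm) / np.trapz(srf, wavelengths_nm)
```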

Unless I'm missing something (likely), I'm still not clear why we aren't:

  1. Aligning RGB and MS bands together
  2. Running the indices on the correct bands (not mixing MS and RGB in an index)

I could see an argument for not building an RGB-R-RE-G-NIR output to avoid adding logic to WebODM to handle the indices correctly, but other than support issues in WebODM, IMO, RGB-R-RE-G-NIR would be appropriate and what is expected from this sensor. Sans spectral response sampling, R(GB) will never match R, nor should it. And users are right to expect that they should be able to process all bands together. That's the implicit promise of the sensor.

Example use case: use MS data to train a CNN on RGB data, which requires good band alignment between RGB and MS bands. I'm doing this now and will use https://github.com/pierotofy/ODM/tree/mavic3mrgbms unless / until the functionality gets merged into master (and thanks for putting that branch together!). But I do worry that if it sits in a feature branch, the functionality will go untested for long periods and become unmaintained.

@smathermather (Contributor) commented Jun 29, 2024

First test complete (disregard above):
[screenshot]

Messiness is due to subsetting (and resultant overlap issues), but band alignment looks good. Will run on some of the others I have next.

Alignment is (at least) on par with using the EXIF alignment homography (though I don't have an A/B comparison at this time).

Now also testing mavic3mrgbms

@pierotofy (Member, Author) commented Jun 29, 2024

Thanks for testing! Glad the alignment worked.

From a technical point of view, generating and handling a RGB-R-RE-G-NIR orthophoto is non-trivial, both in ODM and WebODM.

@smathermather (Contributor) commented Jun 29, 2024

DN is not a problem. (That's what you can get with the code now in https://github.com/pierotofy/ODM/tree/mavic3mrgbms)

For clarification: is this branch supposed to align RGB and MS? Tested both with radiometric-calibration: camera and without:
[screenshot]

From a technical point of view, generating and handling a RGB-R-RE-G-NIR orthophoto is non-trivial, both in ODM and WebODM.

That's my hunch on WebODM. In what way for ODM?

Thanks for testing! Glad the alignment worked.

Always. Testing on one more full-size dataset and then I'll call it.

@pierotofy (Member, Author) commented Jun 29, 2024

is this branch supposed to align RGB and MS?

Edit: the mavic3mrgbms branch is supposed to, yes. (Can't guarantee it will).

In what way for ODM?

Mostly the modifications to https://github.com/OpenDroneMap/odm_orthophoto to accommodate the mix of multi-band textures (RGB) and single-band images (MS) with overlapping band names (R, R_MS, etc.). I wouldn't look forward to making those.

@ehallein (Contributor)

Works well; however, there is still a slight offset in the red band compared to the others in my test.
[screenshot]

@smathermather (Contributor)

Edit: the mavic3mrgbms branch is supposed to, yes. (Can't guarantee it will).

Cool. I'll poke at it. It may just make sense for me to run a pre-alignment of the bands and maybe add a script to contrib, but let's see if I can figure out what's happening with the branch first.

In what way for ODM?

Mostly the modifications to https://github.com/OpenDroneMap/odm_orthophoto to accommodate the mix of multi-band textures (RGB) and single-band images (MS) with overlapping band names (R, R_MS, etc.). I wouldn't look forward to making those.

Ah, yup. Makes sense. There's no built-in expectation of multiples of the "same" bands, and the bands need to be tracked and correctly identified for the orthophoto output, which requires a fair amount of custom logic. It'll be interesting to see if other sensors show up with similar characteristics, but that does seem like a high LOE for a single sensor. Add to that the headache it introduces for WebODM... and yup.

Thanks for the extensive explainer (and of course the homography fix!).

I'll ping the community thread now that things are merged so folks can test.

@smathermather (Contributor) commented Jun 30, 2024

Works well; however, there is still a slight offset in the red band compared to the others in my test.

Interesting. Just to confirm @ehallein: the red band is offset but the rest align with each other?

@ehallein (Contributor)

Works well; however, there is still a slight offset in the red band compared to the others in my test.

Interesting. Just to confirm @ehallein: the red band is offset but the rest align with each other?

Correct. I'll do some more tests on other datasets tomorrow and see how they go.

@InnovPlantProtect

Hi all, I am a user of WebODM and the company I am working for has purchased a Mavic 3M. The drone has been used every week for the last month or so, and we have a good time-series database for 6 areas of study.
I have followed all the issues related to processing the images captured by the Mavic 3M, and I thank you in advance for the effort you guys are making to solve them.
I have a simple question: after running the script for aligning the 4 bands, https://gitlab.com/Yario/image_registration_dji_mavic_3m/-/tree/main?ref_type=heads, we import the images into WebODM and process them with the following parameters:

auto-boundary: true, optimize-disk-space: true, pc-quality: ultra, radiometric-calibration: camera+sun, texturing-skip-global-seam-leveling: true

Note that I tried the same as above but with radiometric-calibration: camera and I obtained similar reflectance values.

The processed images show very low reflectance values, as you can see from the screenshot below (I split the bands in QGIS just to show the reflectance value for each band). Is anyone able to tell me if these values are trustworthy? For instance, if I calculate NDVI I get values in the expected range (-1 to 1). So, apparently the NDVI comes out OK, but I wanted to be reassured that such low reflectance values are "correct". Thanks a lot in advance.

[screenshot: mavic3M_bands_after_radiometric_calibration]
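(For reference, the NDVI I computed is just the standard normalized ratio; an illustrative sketch with numpy arrays, not my actual QGIS workflow.)

```python
import numpy as np

def ndvi(nir, red):
    # Normalized ratio, so values fall in [-1, 1] by construction,
    # regardless of the absolute reflectance scale of the inputs.
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero
```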

@smathermather (Contributor)

Correct. I'll do some more tests on other datasets tomorrow and see how they go.

@ehallein, have you been able to replicate the red band alignment issue with other datasets?

@ehallein (Contributor)

So far it's fixed in my testing, and it does better than DJI's own software. I suspect it will never be pixel-perfect due to minor differences in lenses, etc.

@smathermather (Contributor)

Cool, thanks. Looks like you already commented on the issue, so thanks for the double comment!

@ehallein (Contributor)

@pierotofy is this expected to work on the GPU build? I tried running it with the multispectral TIFFs on GPU and it came out as a mess (the RGB JPEGs work fine).

@pierotofy (Member, Author) commented Jul 18, 2024

It should, it's the same code.

@ehallein (Contributor) commented Jul 18, 2024

Never mind, I was doing something silly. Works fine!

@InnovPlantProtect

Hello all, thank you for the effort you are making to support the Mavic 3M. Is there any news on the radiometric calibration of the multispectral images? In my previous comment I mentioned that I managed the band alignment, but radiometric calibration is still an issue.
Many thanks!
