
antsAI unexpected behavior #1529

Closed
akvilonBrown opened this issue Apr 27, 2023 · 15 comments

@akvilonBrown

Describe the problem

Greetings! I'm trying to apply ANTs for pairwise registration. My objects are 3D NMR images of plant seeds; they can have different orientations, and I understand that antsRegistration is not designed to handle drastic affine differences on its own.
So I tried antsAI to produce a rough affine transformation, but it doesn't seem to work as expected.
I created a test pair in which the second object is rotated 60 degrees about one of the three axes.
antsRegistration manages to find a rigid transformation (for larger rotations it fails), but antsAI gives a weird transformation.
When I apply it with antsApplyTransforms, the rotation is wrong. (It is fine with the matrix from antsRegistration when the rotation is small.)
I am not sure how to debug this or what I am missing.
Please advise. I have spent a lot of time combing through the documentation, but I am still a beginner with complex registration techniques and terminology, so I tried to follow the existing examples scattered across discussions and manuals.

To Reproduce

There are two files in the attached nmr.zip: the source file and the rotated file (rotated with monai.transforms.Rotate in Python). Two commands to produce the transformation matrices:

antsAI --dimensionality 3 \
        --output Rigid_antsAI.mat \
        --transform Rigid[0.1] \
        --metric MI[seed_zero.nii.gz,seed_60.nii.gz,32,Regular,0.25] \
        --convergence 1000 \
        --search-factor 90 \
        --verbose
antsRegistration --dimensionality 3 \
        --output [antsReg, antsRegWarped.nii.gz] \
        --interpolation Linear \
        --winsorize-image-intensities [0.005,0.995] \
        --initial-moving-transform [seed_zero.nii.gz,seed_60.nii.gz,1] \
        --transform Rigid[0.1] \
        --metric MI[seed_zero.nii.gz,seed_60.nii.gz,1,32,Regular,0.25] \
        --convergence 1000x500x250x100 \
        --shrink-factors 8x4x2x1 \
        --smoothing-sigmas 3x2x1x0

The resulting matrix Rigid_antsAI.mat seems to be wrong, but antsReg0GenericAffine.mat looks fine.
The two matrices are quite different according to antsTransformInfo.
When I apply the transformations, only the object registered with antsReg0GenericAffine.mat rotates back correctly:

antsApplyTransforms -d 3 -r seed_zero.nii.gz -t antsReg0GenericAffine.mat  -i seed_60.nii.gz -o seed_60_to_zero.nii.gz
antsApplyTransforms -d 3 -r seed_zero.nii.gz -t Rigid_antsAI.mat  -i seed_60.nii.gz -o seed_60_to_zero_antsai.nii.gz

System information (please complete the following information)

  • OS: CentOS Linux 7 (Core)
  • Type of system: HPC cluster


@gdevenyi
Contributor

A few comments as to what might be causing issues:

  1. You're using the full-resolution images for antsAI.
    In the only place I've seen it used, the images are downsampled before use:

    logCmd ${ANTSPATH}/ResampleImageBySpacing ${DIMENSION} ${EXTRACTION_TEMPLATE} ${EXTRACTION_INITIAL_AFFINE_FIXED} 4 4 4 1
    logCmd ${ANTSPATH}/ResampleImageBySpacing ${DIMENSION} ${N4_CORRECTED_IMAGES[0]} ${EXTRACTION_INITIAL_AFFINE_MOVING} 4 4 4 1
    exe_initial_align="${ANTSPATH}/antsAI -d ${DIMENSION} -v 1"
    exe_initial_align="${exe_initial_align} -m Mattes[ ${EXTRACTION_INITIAL_AFFINE_FIXED},${EXTRACTION_INITIAL_AFFINE_MOVING},32,Regular,0.2 ]"
    exe_initial_align="${exe_initial_align} -t Affine[ 0.1 ]"
    exe_initial_align="${exe_initial_align} -s [ 20,0.12 ]"
    exe_initial_align="${exe_initial_align} -g [ 40,0x40x40 ]"
    exe_initial_align="${exe_initial_align} -p 0"
    exe_initial_align="${exe_initial_align} -c 10"
    exe_initial_align="${exe_initial_align} -o ${EXTRACTION_INITIAL_AFFINE}"

  2. Your search-factor is way too large (90).

From the docs:

     -s, --search-factor searchFactor
                         [searchFactor=20,<arcFraction=1.0>]
          Incremental search factor (in degrees) which will sample the arc fraction around 
          the principal axis or default axis. 

You're only sampling every 90 degrees of rotation around the three axes, which is way too coarse.

Your convergence setting of -c 1000 is probably way too large as well. Spend the time on more starting points with a smaller search factor, then feed the best result into a proper antsRegistration run as the initialization.
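For illustration only, a rough sketch of the original antsAI call with those two settings tightened (filenames from the report; the exact values are assumptions and not verified on this data):

antsAI --dimensionality 3 \
        --output Rigid_antsAI.mat \
        --transform Rigid[0.1] \
        --metric MI[seed_zero.nii.gz,seed_60.nii.gz,32,Regular,0.25] \
        --convergence 10 \
        --search-factor 20 \
        --verbose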

@cookpa
Member

cookpa commented Apr 27, 2023

Agreed, -s 90 will search rotations of -180, 0, +90 only. Reduce this to -s 20.

I usually use very few iterations and a lot of start points; try -c 10. You can refine the result with antsRegistration.

We normally downsample brain images before doing this, but given that this data is not very large and less detailed than a brain, I think it's fine to run at full resolution. You could maybe downsample both by a factor of 2 with ResampleImageBySpacing to speed things up if needed. You would still do the actual registration with the original images.

You can also try -t Affine[ 0.1 ] in antsAI, unless you only want a rigid solution. This sometimes goes wrong but often finds a better solution because it can account for scale. Similarity is another option (allows global scale only) - I've not found this helps for brains, but might work for you.

In your antsRegistration command, you probably don't want to go down to a shrink factor of 8, or use so much smoothing. Internally, it won't downsample by that much anyway, because there would be hardly any voxels left.
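Sketching that last point against the antsRegistration call from the report, a shallower pyramid might look like this (the level choices are assumptions, not tested on this data):

antsRegistration --dimensionality 3 \
        --output [antsReg, antsRegWarped.nii.gz] \
        --interpolation Linear \
        --winsorize-image-intensities [0.005,0.995] \
        --initial-moving-transform [seed_zero.nii.gz,seed_60.nii.gz,1] \
        --transform Rigid[0.1] \
        --metric MI[seed_zero.nii.gz,seed_60.nii.gz,1,32,Regular,0.25] \
        --convergence 500x250x100 \
        --shrink-factors 4x2x1 \
        --smoothing-sigmas 2x1x0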

@akvilonBrown
Author

Valid points; I will try them.
However, I had already tried a search factor of 20 with the same result. (A search factor of 90 also didn't help with a sample rotated by 90 degrees.)
I have a few more theoretical questions.
With downsampled images, how does it work?
I mean, once the transformation is obtained, it should be applied to the full-resolution images.
That would work for rotation matrices, since angles are scale-invariant, but in which units is the offset (translation) recorded?
If it's in pixel/voxel units, then the displacement in the full-resolution images will be smaller.
Please clarify how you perform this, because when I once applied an affine transformation from a downsampled registration to full-resolution images, the result was weird.

I also wonder: what is the difference between antsAI and antsAffineInitializer?
I am considering using the Python implementation of ANTs, where affine_initializer is a wrapper for antsAffineInitializer.

@cookpa
Member

cookpa commented Apr 27, 2023

With downsampled images, how does it work?

The transforms are defined in physical space. The voxel to physical space transform of the input image is in the image header. So downsampling changes how a voxel index is mapped to physical space, but the same point in physical space will be transformed the same way. Example:

antsApplyTransforms -d 3 -i image.nii.gz -t transform.mat -r reference.nii.gz -o deformed.nii.gz
ResampleImageBySpacing 3 image.nii.gz downsample.nii.gz 2 2 2 0
antsApplyTransforms -d 3 -i downsample.nii.gz -t transform.mat -r reference.nii.gz -o deformedDownsample.nii.gz

downsample.nii.gz should overlap in physical space with image.nii.gz, and so should the two deformed images.

So you can use your downsampled images in antsAI for speed. But if it's running acceptably fast, you can skip this.

You can also smooth the images to improve robustness, either before downsampling or as a standalone step.
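For example, a light Gaussian pass with the SmoothImage tool might look like this (the sigma of 1 and the output names are placeholder choices; tune the sigma to your data):

SmoothImage 3 seed_zero.nii.gz 1 seed_zero_smooth.nii.gz
SmoothImage 3 seed_60.nii.gz 1 seed_60_smooth.nii.gz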

I also wonder what is the difference between antsAI and antsAffineInitializer?

antsAI is the newer version. I don't remember all the details, but it has features that the older code does not, like searching translations as well as rotations.

@ntustison
Member

antsAI is the newer version. I don't remember all the details, but it has features that the older code does not, like searching translations as well as rotations.

Exactly. It was originally an attempt to make the interface a bit nicer, in part, by using the ants command line options.

@cookpa
Member

cookpa commented Apr 27, 2023

ResampleImage 3 seed_zero.nii.gz seed_zero_downsample.nii.gz 2x2x2 0 2
ResampleImage 3 seed_60.nii.gz seed_60_downsample.nii.gz 2x2x2 0 2
antsAI -d 3 -m Mattes[ seed_zero_downsample.nii.gz , seed_60_downsample.nii.gz , 32, Regular, 0.25 ] -t Rigid[0.2] -s [ 25, 1.0 ] -c 5 -o initialTransform.mat -v
antsApplyTransforms -d 3 -i seed_60.nii.gz -r seed_zero.nii.gz -t initialTransform.mat -o seed_60_deformed.nii.gz

itksnap -g seed_zero.nii.gz -o seed_60.nii.gz seed_60_deformed.nii.gz

[screenshot: ITK-SNAP overlay of seed_zero.nii.gz with seed_60.nii.gz and the aligned seed_60_deformed.nii.gz]
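As a possible follow-up (a sketch, not part of the original commands), the initialTransform.mat produced above could be handed to antsRegistration as the initialization; the output names here are placeholders:

antsRegistration --dimensionality 3 \
        --output [antsRegRefined, antsRegRefinedWarped.nii.gz] \
        --interpolation Linear \
        --initial-moving-transform initialTransform.mat \
        --transform Rigid[0.1] \
        --metric Mattes[seed_zero.nii.gz,seed_60.nii.gz,1,32,Regular,0.25] \
        --convergence 500x250x100 \
        --shrink-factors 4x2x1 \
        --smoothing-sigmas 2x1x0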

@cookpa
Member

cookpa commented Apr 27, 2023

I find Mattes works better than MI. Maybe we should deprecate MI, @ntustison, because it's synonymous with Mattes in antsMotionCorr and antsRegistration, and for me Mattes has always worked better.

@ntustison
Member

Sure. Fine with me.

@akvilonBrown
Author

So it works! I will reproduce it tomorrow with these parameters.

An off-topic, general question: I see ANTs is capable of many things, including registration of images of different modalities. But I have come across numerous deep learning solutions for registration. Why are they developed, and what specific tasks do they solve that classic algorithms like ANTs can't?

@ntustison
Member

I don't think this format lends itself to answering this type of question with sufficient clarity, especially since it involves ongoing research. I would recommend looking at the various review articles that have been written on the topic. If I were forced to answer, I would respond briefly with:

Why are they developed?

Because of the significant potential for speed-up and/or accuracy.

what specific tasks do they solve that classic algorithms like ANTs can't?

Off the top of my head? Not many. However, this case where there are significant angular differences is a definite possibility.

@akvilonBrown
Author

I've reproduced it, and it works like a charm!
Many thanks. I analyzed my previous attempts and realized my mistakes.

May I clarify a bit more to build on this success?
@cookpa suggested using an Affine transformation, and I see it may provide a better fit.
But is it possible to decompose the Affine transform to extract only the rigid part (translation + rotation)?
My goal is to create a 50% deformation - an average object.
I found that the geodesic interpolation script works great, but only when objects are aligned.
So I need to align them first. However, scaling in the initial transformation is not welcome because it already deforms the shape, and subsequent interpolation has a different starting point.

Ultimately I need something like the templateCommand.. script to build an average template from a collection of samples, but currently it is too complicated for me and doesn't run smoothly, so I will file another issue later to resolve those errors.
For now, I would like to create custom steps that morph objects pairwise, tournament-style, with tractable intermediate results.

@cookpa
Member

cookpa commented Apr 28, 2023

My understanding is that extracting the rotation from a general affine matrix is complicated. If there's no shear component it gets easier, because SVD can represent the rotation and scaling in its component matrices.
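As a small illustration of the no-shear case, here is a NumPy sketch (a hypothetical helper, not an ANTs function) that pulls the closest rotation out of the 3x3 matrix part via SVD, i.e. the polar decomposition, and leaves the translation untouched. Reading the matrix and offset out of an ITK .mat file is a separate step, as in the script linked below.

import numpy as np

def split_rigid(matrix, translation):
    # matrix: 3x3 affine part (rotation times scale, no shear assumed)
    # translation: length-3 offset; the rigid part keeps it unchanged
    U, S, Vt = np.linalg.svd(matrix)
    R = U @ Vt                        # polar rotation factor, closest rotation to matrix
    if np.linalg.det(R) < 0:          # avoid returning a reflection
        U[:, -1] *= -1
        R = U @ Vt
    return R, np.asarray(translation), S  # S holds the scale factors

# quick check: a 20-degree rotation about z, scaled by 1.3, should give back the rotation
theta = np.deg2rad(20)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
R, t, scales = split_rigid(1.3 * Rz, [5.0, 0.0, 0.0])
print(np.allclose(R, Rz), scales)     # expect: True, scales close to [1.3 1.3 1.3]

Note that with anisotropic scaling the returned factors are along the SVD's principal axes rather than the image axes.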

The ANTs template scripts are not designed to handle large rotations between the input images. Such variations, if they exist, would need to be dealt with in preprocessing.

Even if the template could be constructed from the space of randomly oriented images, knowing the average rotation of an image with respect to some coordinate frame doesn't seem that useful. You'd still need to test a wide range of initializations to do the pairwise registrations.

@gdevenyi
Contributor

But is it possible to decompose the Affine transform to extract only the rigid part (translation + rotation)?

Yes, see my implementation (with references) used for another application

https://github.com/CoBrALab/optimized_antsMultivariateTemplateConstruction/blob/master/average_transform.py

@akvilonBrown
Author

Dear fellows,
Thank you for all the valuable comments and great software!
The original issue has been wholly addressed, so it would be fair to close it.
I will continue to explore ANTs and its applicability to our plant domain.
References to the ANTs algorithms will definitely be cited in the results of my current work.
If I run into other problems, I will raise questions about the particular script(s).
As for general/theoretical questions and consulting, may I post them here, in the closed issue, or in some other thread?

Some remarks on my last comments.
Perhaps there was a misunderstanding about "average rotation" in the template scripts: I used aligned objects but still got other errors from running out of RAM. I'm not ready to debug those now; I'll focus on stepwise morphing for the time being.
Indeed, if the shear component isn't present (as I understand it, the Affine transformation adds only scaling on top of the Rigid transformation), then eliminating it should be simple.
I also guess @gdevenyi refers to homogenous_matrix_to_rotation_scaleshear; I'm going to test that as well.

@cookpa
Member

cookpa commented May 1, 2023

I'm glad it worked.

You can open new issues if there are problems with the tools specifically, or start discussions for more general topics.
