@animal_warper

Hi,

I have been using the @animal_warper function to align a mouse brain template to my mouse brain images, and it has been working very well. The problem is that applying the same linear and non-linear warps derived by this function (using -ok_to_exist) to the annotations (or even to the same input image) does not take me to the same final result as the function itself. Would you please let me know which of the following outputs needs to be used, and in what order?
The warp outputs of the function are:
*_shft.1D
*_shft_WARP.nii.gz
*_shft_al2std_mat.aff12.1D
*_composite_linear_to_template.1D
*_shft_osh2base_WARP.nii.gz

Best,
Momo

Hi, Momo-

Glad the program is useful for the mouse applications, too.

There are a lot of intermediate warps that you wouldn’t need to use. To go from “input” to the base dset with warps that have been created, you should be able to use the same warp information that is provided to afni_proc.py:


*_composite_linear_to_template.1D  
*_shft_WARP.nii.gz

… and you should be able to apply these as follows to your original input dset INPUT_DSET, to send it to the same space as your reference dset REF_DSET:


3dNwarpApply \
    -master REF_DSET \
    -source INPUT_DSET \
    -nwarp  "*_shft_WARP.nii.gz   *_composite_linear_to_template.1D" \
    -prefix INPUT_IN_REF.nii.gz

(Note: I wouldn’t actually use the asterisks in the above command, but rather the real file names.)
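For instance, if the input dset’s prefix were “anat” (a hypothetical name, just following the output naming pattern above), the command might look like:


3dNwarpApply \
    -master REF_DSET \
    -source anat.nii.gz \
    -nwarp  "anat_shft_WARP.nii.gz   anat_composite_linear_to_template.1D" \
    -prefix anat_in_ref.nii.gz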

Does that work for you?

And just to check, as well-- what version of AFNI are you using (i.e., the output of “afni -ver”)? The @animal_warper program has changed a bit over time-- I would just like to verify that you have a modern one.

Also, it’s not clear to me what you mean by “annotations”. Note that when you run @animal_warper, you can include several other datasets to “follow” the warping, either atlases/segmentations or float-valued dsets; and those can be either in the template space (and hence get warped to the original input dset space) or in the original input space (and hence get warped to the template).
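For example (and just as a rough sketch, since the exact option names have changed a bit across @animal_warper versions-- please check “@animal_warper -help” for your build), an atlas sitting in the template space could be passed along as a follower roughly like this:


@animal_warper                        \
    -input            ANAT_DSET      \
    -base             TEMPLATE_DSET  \
    -atlas_followers  TEMPLATE_ATLAS \
    -outdir           aw_results

… and a copy of that atlas warped into the input dset’s space should then appear among the outputs.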

–pt

Dear Paul,

Thank you for your response.
I am using AFNI version 20.1.11, and I tried the same command you provided with the 3dNwarpApply function. However, it did not seem to take the .1D file as input, and the error I received was that the process was “Killed”. Instead, I tried running 3dAllineate and 3dNwarpApply separately, with the .1D file as the input for 3dAllineate and the _WARP.nii.gz file for 3dNwarpApply. Applying them in series didn’t lead to the same result as the @animal_warper function itself; the attached image warp.gif shows that.
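The two separate commands looked roughly like this (just a sketch with placeholder file names, not my exact calls):


# first, the affine (composite linear) part
3dAllineate \
    -master REF_DSET \
    -source INPUT_DSET \
    -1Dmatrix_apply *_composite_linear_to_template.1D \
    -prefix INPUT_lin.nii.gz

# then, the nonlinear warp applied to that result
3dNwarpApply \
    -master REF_DSET \
    -source INPUT_lin.nii.gz \
    -nwarp  *_shft_WARP.nii.gz \
    -prefix INPUT_lin_nlin.nii.gz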
I also tried applying the _composite_linear_to_template.1D to my input and then applying its inverse (_composite_linear_to_template_inv.1D); that should take me back to the input space, but it doesn’t! Weird. This is shown in the attached image inv.gif.
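That test was along these lines (placeholder names again):


# forward: apply the composite linear transform
3dAllineate \
    -master REF_DSET \
    -source INPUT_DSET \
    -1Dmatrix_apply *_composite_linear_to_template.1D \
    -prefix INPUT_fwd.nii.gz

# then apply the provided inverse, which I expected to undo it
3dAllineate \
    -master INPUT_DSET \
    -source INPUT_fwd.nii.gz \
    -1Dmatrix_apply *_composite_linear_to_template_inv.1D \
    -prefix INPUT_fwd_inv.nii.gz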

Best,
Momo

inv.gif

warp.gif

I suspect that the fact that my input and base are in different spaces might have caused this (the input is in Talairach/tlrc space and the base is in orig space).
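The space of each dataset can be double-checked with something like the following (dataset names here are placeholders):


3dinfo -space -prefix INPUT_DSET REF_DSET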
Would you please let me know how this issue can be addressed?

Momo

Hi, Momo-

Hmm, OK. Just to note-- my above-posted command worked fine on a practice dset I tested. I think the difference is about how “centered” the coordinates in your dataset are to start with (explained more below).

The “Killed” message occurs because your computer ran out of memory during the warp generation process; at present, this can happen when the initial distance between the input and reference dset is quite large (unfortunately). The warping program tries to make a grid to cover both the input and the reference simultaneously, and if they are far apart, the grid is veeery big, potentially to the point of using up all your computer’s memory, which I guess happened here.

To check this hypothesis, what is the output of:


cat *_composite_linear_to_template.1D

I want to see how large the translation part of the affine fit is (that part will contain the bulk of the center-of-mass movement in space).

-pt

Hi,

Thank you for your response!
I ran the command you suggested and the output was:
1.12356 0.00500563 0.0739877 6.93706 0.035566 1.07698 -0.170503 12.3146 -0.0403194 0.129099 1.12266 -6.46164

I have also been trying to warp step by step, by first aligning the center of my image (in orig space) to the desired template (the Allen CCF3, in tlrc space), but the *.1D file I get out of that does not reproduce the same movement.

Momo

Hi, Momo-

Thanks. The translation part of that is (Δx, Δy, Δz) = (6.93706, 12.3146, -6.46164), which doesn’t look large to my human-dataset-centric eyes, but I’m guessing it might be quite large for a mouse dataset.
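For reference, those 12 numbers are just the three rows of the 3x4 affine matrix written out one after the other, so the shift is the last entry of each row:


   [  1.12356     0.00500563   0.0739877  |   6.93706 ]
   [  0.035566    1.07698     -0.170503   |  12.3146  ]
   [ -0.0403194   0.129099     1.12266    |  -6.46164 ]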

What is the output of:


3dinfo -ad3 -n4 -prefix DSET_SOURCE DSET_TEMPLATE

where the DSET_* files are your mouse anatomical being input to @animal_warper and the reference template you are using there?

To check the initial overlap and relative alignment, could you please run this script, filling in the particular file names for dset_src and dset_ref, and then attach the output “img_overlap_FINAL.jpg”:


#!/bin/tcsh

set dset_src  =  [input mouse source dataset here]
set dset_ref  =  [input mouse reference template here]

set opref_img = img_overlap

# ------------------------------------------------------------------------

set dset_cp_ref = __TMP_COPY_REF.nii.gz

# copy the ref and make sure the ref and src have the same space, to
# overlay for visualizing
3dcopy                                     \
    -overwrite                             \
    "${dset_ref}"                          \
    ${dset_cp_ref}

3drefit -space `3dinfo -space "${dset_src}"` "${dset_cp_ref}"

# ------- make image along each viewplane
@chauffeur_afni                        \
    -ulay  "${dset_src}"               \
    -olay  "${dset_cp_ref}"            \
    -pbar_posonly                      \
    -cbar Reds_and_Blues               \
    -set_subbricks 0 0 0               \
    -opacity 5                         \
    -prefix   "${opref_img}"           \
    -montx 1 -monty 1                  \
    -set_xhairs ON                     \
    -label_mode 1 -label_size 3        \
    -do_clean 

# glue separate images together
2dcat                                  \
    -gap 5                             \
    -gap_col 128 128 128               \
    -nx 3 -ny 1                        \
    -prefix "${opref_img}_FINAL.jpg"   \
    "${opref_img}".*.png

echo ""
echo "++ DONE.  Please check out this image for relative alignment/overlap:"
echo "     ${opref_img}_FINAL.jpg"
echo ""

# ... and you can delete the __TMP_COPY_REF.nii.gz file, if you want

exit 0

This will make an image like the attached (run on human datasets). The underlay (grayscale) dset is your subject/source, and the overlay (colorized) dset is the reference template. In the attached case, there is not a very large difference between the two dsets, which is good for starting alignment.

–pt