2 images in the same space

Hi AFNI expert,

I have one anatomical image and one functional image that were acquired with two different FOV orientations.
When I open them in different viewers (AFNI, MRIcron), the images superpose perfectly, because the orientation of the FOV is saved somewhere in the files (I don’t know where).

I have to do some manipulation on the images (obliquity, changing the center, etc.), and this orientation is lost.
I am looking for the command that will bring the functional image into the anatomical image’s orientation.
I thought that the command was:

3dresample -prefix func_resample -master anat -input func

but apparently it doesn’t match…

thank you for your help!!

Hi, Clément -

The headers of datasets contain things like voxel sizes and the location of the origin, which allow the data to have meaningful (x, y, z) locations in space. AFNI (and I guess MRIcron) use this information when displaying datasets-- they don’t just match corner locations of voxels and hope for the best, for example. The voxel size, origin, and other metadata can be viewed with AFNI’s 3dinfo command.

“Orientation” typically refers to the order of arranging the data on the disk-- see “3dinfo -orient DSET”, for example. That is independent of how the datasets are displayed, except for the fact that it determines what the “origin” voxel is.

3drefit is a program in AFNI that allows you to change header information without adjusting the actual data (or “BRIK”) information-- this is mostly used when one assumes there is incorrect header info present, and one wants to reset it. Using this program, one could easily notice changes in how one dataset overlays another.

3dresample is a program to change both header/HEAD and dataset/BRIK info together, keeping them consistent. Typically, one might not see large changes in how one dataset overlays another when using this program (except for regridding/smoothing in some cases, depending on the operation).
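To make the 3drefit-vs-3dresample distinction concrete, here is a toy 1-D sketch in plain numpy (illustrative only, not AFNI code): a header-only edit moves the data in space, while a consistent data+header change leaves every value at the same world location.

```python
import numpy as np

# Toy 1-D "dataset": 4 voxels with 2 mm spacing, first voxel at x = 0 mm.
data = np.array([10.0, 20.0, 30.0, 40.0])
spacing, origin = 2.0, 0.0
world_x = origin + spacing * np.arange(data.size)   # voxel world coordinates

# "3drefit-like" edit: change only the header (origin). The data are
# untouched, but every voxel's world coordinate moves by the same amount,
# so the dataset would overlay differently on another dataset.
origin_refit = origin + 5.0
world_refit = origin_refit + spacing * np.arange(data.size)
assert np.allclose(world_refit - world_x, 5.0)

# "3dresample-like" edit: reverse the storage order AND update the header
# consistently, so each value keeps the same world coordinate.
data_flip = data[::-1]
origin_flip = world_x[-1]      # new first voxel sits where the old last one was
spacing_flip = -spacing
world_flip = origin_flip + spacing_flip * np.arange(data.size)
# same (location, value) pairs as before, just stored in the other order
assert sorted(zip(world_flip, data_flip)) == sorted(zip(world_x, data))
```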

… which is all to say, what commands are you specifically running on your dataset, causing you to see changes? And what are you wanting to do with your refitting/resampling?


Hi --pt,
If we go back to what we have done here:

# --------------------- for obliquity and center of mass of EPI

# copy the EPI so the original header is untouched
3dcalc \
    -a       ${dset_bold}        \
    -expr    'a'                 \
    -prefix  ${dset_bold_deob}

# purge obliquity info, and apply shifts so BOLD dset overlays anat dset well
3drefit                               \
    -deoblique                        \
    ${dset_bold_deob}

# recenter the dataset so its center of mass sits at (0, 0, 0)
3dCM                                  \
    -set 0 0 0                        \
    ${dset_bold_deob}

It worked fine with the full dataset of one cohort of monkeys! Thank you!

However, for another group of monkeys, the FOV orientation differs between the anat and the func, due to a different coil.

This orientation difference is lost from the header when I change the centers of my images beforehand, and they are no longer aligned when I run the following (they were aligned for the other monkeys):

# also, move the EPI ('-dsets ..') dset(s) to be centered (approximately)
# on that new spot, so the WARP dset doesn't have to stretch too far
@Align_Centers                       \
    -cm                              \
    -base a00_deob_ns_RECEN.nii.gz   \
    -dset LONGER_e00_deob.nii        \
    -prefix LONGER_e00_deob_RECEN.nii
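(For intuition: what `@Align_Centers -cm` does, conceptually, is compute each dataset's center of mass in world coordinates and shift one dataset's origin so the two centers coincide -- a header-only change. A toy 1-D numpy sketch, with made-up data:)

```python
import numpy as np

def center_of_mass(data, origin, spacing):
    """Intensity-weighted center of mass in world coordinates (1-D toy)."""
    coords = origin + spacing * np.arange(data.size)
    return float((coords * data).sum() / data.sum())

# Two toy datasets whose bright "blobs" sit in different places.
epi  = np.array([0.0, 0.0, 1.0, 1.0, 0.0])
anat = np.array([1.0, 1.0, 0.0, 0.0, 0.0])

cm_epi  = center_of_mass(epi,  origin=0.0, spacing=2.0)
cm_anat = center_of_mass(anat, origin=0.0, spacing=2.0)

# Shift the EPI's origin so the two centers of mass coincide --
# a header-only change, analogous to what @Align_Centers -cm applies.
shift = cm_anat - cm_epi
assert np.isclose(center_of_mass(epi, origin=0.0 + shift, spacing=2.0), cm_anat)
```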

That’s why I am trying to refit/resample these images first.
I also tried
3drefit -duporigin anat func
and it didn’t work =(

thanks again

Hi, Clément -

Just to be clear, the code snippet above will just “recenter” one dataset (the EPI, I guess), after which one would not expect the anat to overlay it.

If you want two datasets to have the same orientation before you start, then you can do something like this:


set ori_val = `3dinfo -orient DSET1`

3dresample \
   -orient ${ori_val} \
   -prefix DSET2_${ori_val}.nii \
   -input DSET2

Does doing that remove issues with center of mass/aligning?
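(For intuition about what `3dinfo -orient` reports and `3dresample -orient` changes: the orientation string names, for each data axis, the end of the scanner axis where storage starts. A rough numpy sketch -- my own helper, not an AFNI routine, and assuming a near-cardinal NIfTI RAS+ affine:)

```python
import numpy as np

def afni_orient(affine):
    """Sketch (not an AFNI function): derive an AFNI-style orientation
    string from a NIfTI RAS+ affine. Each letter names the end of the
    scanner axis where that data axis *starts* ('RAI' data run
    Right->Left, Anterior->Posterior, Inferior->Superior). Assumes the
    grid is near-cardinal (no strong obliquity)."""
    R = affine[:3, :3]
    pos_start = ("L", "P", "I")   # column points toward +x/+y/+z (R/A/S)
    neg_start = ("R", "A", "S")   # column points toward -x/-y/-z (L/P/I)
    code = ""
    for j in range(3):
        col = R[:, j]
        w = int(np.argmax(np.abs(col)))          # dominant world axis
        code += pos_start[w] if col[w] > 0 else neg_start[w]
    return code

# A NIfTI RAS+ identity affine corresponds to AFNI orientation 'LPI' ...
assert afni_orient(np.eye(4)) == "LPI"
# ... and flipping x and y gives the DICOM-style 'RAI' that 3dinfo reports.
assert afni_orient(np.diag([-1.0, -1.0, 1.0, 1.0])) == "RAI"
```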


Hi --pt,

Thank you; I don’t think that is going to work, because earlier in the pipeline we already align the centers of these two images.
I think the problem comes from the fact that the two images do not overlay.
Probably, when we apply the transformation parameters, they can no longer be accurate because the orientations of the two brains are different.
The non-overlay of the two images is the only difference from the previous cohort, so I think it might come from that?
Is there a way to change the orientation so they overlay, like in the viewer?
Sorry if I am not precise enough,

Hi, Clément-

Sorry, I guess I am not understanding.

What is the output of:

3dinfo -orient -extent -prefix [EPI dset] [anat dset]



Hi again,
sorry, I am probably not being clear at all, and I hope that I am not looking in the wrong direction.
The result from your previous command is:
RAI -61.685852 65.814148 -30.202454 97.297546 7.625845 87.125847 anatT1.nii.gz
RAI -62.152084 63.847916 -21.139194 104.861847 -6.759480 55.241032 BOLD_restingstate.nii.gz

after transformation:
RAI -62.152084 63.847916 -21.139194 104.861847 -6.759480 55.241032 BOLD_restingstate_anatorig.nii.gz
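(As a sanity check on those numbers: the two grids differ, but their bounding boxes from `3dinfo -extent` overlap on every axis, which is why viewers that honor the header can still show both datasets in the same space. Transcribing the extents above into numpy:)

```python
import numpy as np

# Extents reported by `3dinfo -extent` above (min/max along R-L, A-P, I-S, mm)
anat = np.array([[-61.685852, 65.814148],
                 [-30.202454, 97.297546],
                 [  7.625845, 87.125847]])
epi  = np.array([[-62.152084, 63.847916],
                 [-21.139194, 104.861847],
                 [ -6.759480, 55.241032]])

# The grids are different ...
assert not np.allclose(anat, epi)

# ... but the bounding boxes overlap on every axis, so the two datasets
# still cover a common region of space.
overlap = np.minimum(anat[:, 1], epi[:, 1]) - np.maximum(anat[:, 0], epi[:, 0])
assert (overlap > 0).all()
```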

To illustrate:
[image: anat_and func]

I am wondering if the difference in FOV will make the two images be treated as being in different spaces after we change their center coordinates. Could that be the reason why afni_proc.py fails especially on these animals?
Thank you!

Hi, Clément-

Sorry, that doesn’t clarify things for me; I’m also not sure what viewer the images you are showing come from, so I don’t know if it is doing something funny. Actually, in the images you sent, are you happy or sad about the alignment (and is that at the start or end of processing)? And I don’t know what “transform” you have applied; the 3dinfo output looks identical for the rest dset, and I don’t know what it is for the anat dset “afterwards”. I don’t know how afni_proc.py is “failing” for these dsets, either.

Sooo, could you please enumerate/clarify the steps.


Hi --pt,

So sorry for the confusion,

  1. The first group of monkeys (including the images that I sent you) has successfully finished. All the images are well aligned with the template (thank you again!) using the protocol that I described here:

  2. I have a second group of monkeys that was scanned with another coil. For almost all of these 8 monkeys, the coregistration between the functional images and the template is not very good, even though I used exactly the same protocol as described above.

  3. My guess for why it is not working (probably wrong):

For some reason, I had to use a different FOV orientation for the anatomical and functional images of the same animal. I thought it would not be an issue, since the orientation of the FOV is stored in the header. The images in my previous message are the raw functional and anatomical images opened with MRIcron. You can see that the brain has a different orientation in each. However, any viewer I use (AFNI, ITK, MRIcron) is able to superpose these two images perfectly (probably by reading the FOV orientation). In the third image, you see the result when I open an anatomical image and add the functional image as an overlay.

Why do I think this is the issue that led to bad coregistration?

If I understood correctly:

For the first group of monkeys (1.), we helped the coregistration with the template by deobliquing the raw images (anat and func) and centering them together on 0.

First group of monkeys (1.): LONGER_e00_deob_RECEN.nii and a00_deob_ns_RECEN.nii.gz
If you look at the two images (deobliqued and recentered) overlaid using afni, the result is:

Then we applied @animal_warper and afni_proc.py on the two images, and the result is very good.

For the second group:
If I apply the same steps to help the coregistration with the template (deobliquing and centering the raw images, anat and func, together on 0) and view their overlay using afni:

Second group of monkeys (2.): LONGER_e00_deob_RECEN.nii and a00_deob_ns_RECEN.nii.gz

Then I applied @animal_warper and afni_proc.py on the two images, and the images are not well aligned.

afni_proc.py                                                              \
    -subj_id ${subj}                                                      \
    -script proc.${subj}                                                  \
    -scr_overwrite                                                        \
    -out_dir ${subj}.results                                              \
    -blocks tshift align tlrc volreg                                      \
    -dsets      LONGER_e00_deob_RECEN.nii                                 \
    -copy_anat  a00_deob_ns_RECEN.nii.gz                                  \
    -anat_has_skull no                                                    \
    -volreg_align_to   MIN_OUTLIER                                        \
    -volreg_align_e2a                                                     \
    -volreg_tlrc_warp                                                     \
    -align_opts_aea                    \
        -epi_strip 3dAutomask          \
        -cost lpc+ZZ                   \
        -giant_move                    \
        -check_flip                    \
    -tlrc_base  ${refvol}              \
    -tlrc_NL_warp                      \
    -tlrc_NL_warped_dsets                                   \
        ${aw_dir}/a00_deob_warp2std_nsu.nii.gz              \
        ${aw_dir}/a00_deob_shft_al2std_mat.aff12.1D         \
        ${aw_dir}/a00_deob_shft_WARP.nii.gz                 \
    -html_review_style pythonic

So my guess was that the FOV orientation is lost in the centering process. So, when I apply afni_proc.py to these two images, LONGER_e00_deob_RECEN.nii and a00_deob_ns_RECEN.nii.gz (the ones seen in afni via the previous link), the movement parameters are wrong because the orientations of the two brains are different.


So I was looking to:
First, reorient the grid? image? orientation? (I don’t know the appropriate word for it) of the functional images to match the anatomical, to keep the same brain orientation.

Then, I could safely recenter and deoblique the images before @animal_warper and afni_proc.py, as for the monkeys of the first group.
Again, I am probably wrong about the origin of my problem, but this is the explanation that has made sense to me.

Sorry again for the mess in my previous explanation;
I really hope that my explanations are clear,

Thank you so much again!

Hi AFNI expert,

I finally found the solution, and I can probably now be more clear.
Two things were impacting the results of this analysis, and fixing them has tremendously improved the quality of the coregistration:

  1. Deobliquing the data prior to afni_proc.py can be dangerous if, as in my dataset, the EPI and the anatomical images have different FOV orientations.

3drefit                               \
    -deoblique                        \
    ${dset_bold_deob}

What I did was first read the oblique transformation matrix from the anatomical image and make the cardinal EPI dataset oblique to match it.
(This is the answer to the question of this topic (“2 images in the same space”), explained with the proper vocabulary; we can probably change the title now =)):

3dWarp -card2oblique anat_image -prefix BOLD_anat_obliq_orig ${dset_bold_deob}

and then:

3drefit                               \
    -deoblique                        \
    BOLD_anat_obliq_orig+orig

However, do not forget that EPI time series data should be time-shifted with 3dTshift before the volumes are rotated to a cardinal direction (https://afni.nimh.nih.gov/pub/dist/doc/program_help/3dWarp.html). Consequently, I removed the tshift block from afni_proc.py and applied 3dTshift directly to the raw EPI images prior to 3dWarp.
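(Conceptually, `3dWarp -card2oblique` copies the anat's oblique rotation into the EPI's header so both grids share the same tilt, and `3drefit -deoblique` then purges that tilt from the header. The "obliquity" that 3dinfo reports can be sketched as the largest angle between a grid axis and its nearest cardinal axis -- my own numpy approximation of that idea, not AFNI code:)

```python
import numpy as np

def obliquity_deg(affine):
    """Sketch of an obliquity measure in the spirit of `3dinfo -obliquity`:
    the largest angle (degrees) between a dataset grid axis and its
    nearest cardinal (scanner) axis. My own approximation, not AFNI's
    actual implementation."""
    R = affine[:3, :3]
    cols = R / np.linalg.norm(R, axis=0)       # unit direction cosines
    worst = np.abs(cols).max(axis=0).min()     # most-tilted grid axis
    return float(np.degrees(np.arccos(worst)))

# A cardinal ("plumb") affine has zero obliquity ...
assert np.isclose(obliquity_deg(np.eye(4)), 0.0)

# ... while rotating the grid 10 degrees about x makes it oblique by 10.
t = np.radians(10.0)
rot = np.eye(4)
rot[1:3, 1:3] = [[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]]
assert np.isclose(obliquity_deg(rot), 10.0)
```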

  2. The second thing is that the quality of the images was worse than for the previous group of monkeys. Changing the “-cost” in afni_proc.py from “lpc+ZZ” to “nmi” greatly improved the coregistration.

I hope that this will help someone one day =),

Dear AFNI experts, sorry for the bad explanation at the beginning; I was clearly missing some “AFNI vocabulary” to explain my problem.
I hope that I didn’t do anything wrong to the EPI images with these changes.

Thanks a lot to the AFNI community!