I am attempting to include a voxel-wise covariate (a NIfTI file containing the grey matter probability maps from a VBM analysis) in my 3dMVM script (see below), but I keep getting this error and I am not sure what it pertains to. Any help or suggestions are much appreciated!
Error in dim(vQV) <- c(dimx, dimy, dimz, length(unique(lop$dataStr[, lop$vQV[1]]))) :
dims [product 42024960] do not match the length of object [351351000]
Execution halted
Klad, do those voxel-wise covariate files have the same dimensions (and resolution) as the input files? If not, that would be the cause of the error message.
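A quick way to check is to print the grid dimensions, voxel sizes, and orientation of both datasets side by side with 3dinfo (filenames here are placeholders for your own data):

```shell
# Compare grid dimensions (-n4), voxel sizes (-ad3), and orientation (-orient)
# of the stats dataset and one grey matter covariate file:
3dinfo -n4 -ad3 -orient stats+tlrc GM_prob.nii.gz
```

If the rows differ in any of those columns, 3dMVM cannot combine the datasets voxel by voxel.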
I completed VBM using SPM and specified a voxel size and bounding box to match the MNI template I used for registration in AFNI (SSwarper). I also set the smoothing to match what I used in my AFNI pre-processing script. Should I have set the voxel size to match my fMRI data instead?
Do you suggest I alter the VBM input so that it aligns with a 2x2x2 voxel size to match my fmri outputs?
Essentially, I ran VBM using SPM and I would like to enter the grey matter maps as a covariate in my analysis to account for differences in atrophy between my groups. During one of the VBM steps (alignment to an MNI template), I specified the voxel size of the MNI template I used in AFNI (SSwarper). Do you recommend I change this to the voxel size of my fMRI data?
Since the fMRI data was registered using the MNI template in SSwarper, shouldn't they have the same voxel size?
Thank you very much!
My VBM files:
Data Axes Orientation:
first (x) = Right-to-Left
second (y) = Posterior-to-Anterior
third (z) = Inferior-to-Superior [-orient RPI]
R-to-L extent: -97.000 [R] -to- 97.000 [L] -step- 1.000 mm [195 voxels]
A-to-P extent: -115.000 [A] -to- 115.000 [P] -step- 1.000 mm [231 voxels]
I-to-S extent: -97.000 [I] -to- 97.000 [S] -step- 1.000 mm [195 voxels]
Stats file:
Data Axes Orientation:
first (x) = Left-to-Right
second (y) = Posterior-to-Anterior
third (z) = Inferior-to-Superior [-orient LPI]
R-to-L extent: -95.000 [R] -to- 95.000 [L] -step- 2.000 mm [ 96 voxels]
A-to-P extent: -95.000 [A] -to- 131.000 [P] -step- 2.000 mm [114 voxels]
I-to-S extent: -77.000 [I] -to- 113.000 [S] -step- 2.000 mm [ 96 voxels]
Do you suggest I alter the VBM input so that it aligns with a 2x2x2 voxel size to match my fmri outputs?
Use 3dresample to change the covariate files to the same resolution as the FMRI data. Note that the two products in the error message are consistent with this mismatch: your stats grid is 96 × 114 × 96 voxels and the covariate grid is 195 × 231 × 195, and each product is one of those grids multiplied by the same number of input datasets. If your original FMRI data resolution is something like 3.75 x 3.75 x 4, you may consider changing all the files (both FMRI and covariates) to a resolution of, for example, 3 x 3 x 3. There is nothing to gain from a finer voxel size of 2 x 2 x 2.
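A minimal 3dresample call might look like the following (filenames are placeholders; substitute your own datasets):

```shell
# Resample the grey matter probability map onto the grid of the stats dataset.
# -rmode Li (linear interpolation) is a reasonable choice for probability maps;
# -master copies the target grid, voxel size, and orientation from the stats file.
3dresample -master stats+tlrc -rmode Li \
           -prefix GM_prob_resamp.nii.gz \
           -input GM_prob.nii.gz
```

Because -master also copies the orientation of the target dataset, this takes care of the RPI/LPI header difference between the two files at the same time.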
In addition to using 3dresample, I noticed that the orientations of my stats file and my VBM images may be different, though I am not sure (one is right-to-left, the other left-to-right). Do I need to flip my VBM images to match my stats file?
Thank you again.
The VBM output orientation is determined by SPM. While it is likely okay, mistakes do happen, and left-right flipping problems occur more often than we would like (see link to preprint paper below). We have a tool for this, align_epi_anat.py with the "-check_flip" option, but it works well only for anatomical and EPI datasets; it is not very useful for statistical or computed results. One way to test is to intentionally left-right flip a dataset that goes into VBM, either with 3dLRflip (which flips the voxel values) or with 3drefit -orient (which flips the orientation info in the header). Either way creates a deliberately incorrect dataset. If the VBM processing then gives the same output as before, there is likely a problem; if it gives a mirror image, that is the good and expected result.
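The two flipping approaches described above might be sketched as follows (filenames and the orientation code are placeholders; check your own dataset's orientation with 3dinfo -orient first):

```shell
# Option 1: flip the voxel values left-right, leaving the header alone:
3dLRflip -prefix anat_flipped.nii.gz anat.nii.gz

# Option 2: leave the voxel values alone and flip only the orientation
# code in the header (e.g. from LPI to RPI) on a copy of the dataset:
3dcopy anat.nii.gz anat_hdrflip.nii.gz
3drefit -orient RPI anat_hdrflip.nii.gz
```

Feed the deliberately flipped dataset through the same VBM pipeline: a mirror-image result is what you want to see, while an unchanged result suggests the pipeline is ignoring or silently "fixing" the orientation.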