I am running into an issue with AlphaSim correction.
I have been analyzing a structural dataset that was preprocessed with VBM in SPM. I then fed these data into AFNI and ran 3dLME to look at group differences with covariates. The LME ran fine, I double-checked that the model was set up correctly, and my main findings made a lot of sense. Since I was looking at the findings at whole-brain uncorrected p<0.005, I then tried to run 3dClustSim for correction. The mask I used was generated from my own sample. Somehow, I keep getting weirdly large cluster sizes (see below). I'd greatly appreciate any help on this.
++ Authored by: The Bob
++ Number of voxels in mask = 559909
*+ WARNING: removed 12734 voxels from mask because they are constant in time
++ start ACF calculations out to radius = 27.65 mm
ACF done (0.00 CPU s thus far)
0 0 0 0
0.783367 8.58099 31.2275 21.8855
++ 3dClustSim: AFNI version=AFNI_18.0.09 (Jul 7 2015) [64-bit]
++ Authored by: RW Cox and BD Ward
++ 559909 voxels in mask (26.37% of total)
++ Kernel function radius = 74.39 mm
++ ACF(0.78,8.58,31.22) => FWHM=21.89 => 121x145x121 pads to 240x256x240
Kernel image dimensions 33 x 35 x 33
++ Startup clock time = 1.9 s
++ Using 15 OpenMP threads
Simulating:0123456789.0123456789.0123456789.0123456789.01234567!
++ Clock time now = 1244.7 s
bi-sided thresholding
Grid: 121x145x121 1.50x1.50x1.50 mm^3 (559909 voxels in mask)
Typically, one inputs the residual time series from single-subject processing into 3dFWHMx to estimate the spatial smoothness of the noise. (The output ACF parameters describe a function that fits this shape well; for a long time, it was just a Gaussian shape, but a paper made the useful point that that doesn't appear to be a good approximation in practice, so AFNI has changed the shape used for fitting the noise.)

Then, for a given study of N subjects, one typically has N sets of ACF parameters, one from each subject's residual time series (errts*, in afni_proc.py outputs). Typically, we average each ACF parameter across the group, because in practice they are quite similar for subjects acquired on the same scanner. Then we take that final, single set of ACF parameters and run 3dClustSim with it, as in the sketch below.
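Here is a minimal sketch of that pipeline, assuming residual datasets named errts.${subj}+tlrc and a mask named group_mask.nii.gz (both hypothetical names; substitute your own files):

# 1) Estimate ACF parameters per subject. With -acf, 3dFWHMx prints two
#    numeric lines; the second one is: a  b  c  combined_FWHM.
for subj in sub01 sub02 sub03 ; do
    3dFWHMx -mask group_mask.nii.gz -acf NULL errts.${subj}+tlrc \
        | tail -n 1 >> acf_params.1D
done

# 2) Average each ACF parameter across the group.
awk '{a+=$1; b+=$2; c+=$3} END {print a/NR, b/NR, c/NR}' acf_params.1D

# 3) Run 3dClustSim once with the averaged (a, b, c); the numbers here
#    are placeholders for your group-averaged values.
3dClustSim -mask group_mask.nii.gz -acf 0.7 3.5 12.0 -pthr 0.005 -athr 0.05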
In your case, you are using just the residuals from 3dLME. I am not sure what those look like; I suspect that those might not be what you want to use here.
What is the output of:
3dinfo -n4 LME_3Grp_covar_resid.nii.gz
?
Also, it appears that your voxel sizes are 1.50x1.50x1.50 mm^3. That is pretty small for FMRI, in general. A typical FMRI voxel would be, say, 3.0 mm isotropic, which is a factor of (3.0/1.5)^3 = 8 difference in voxel volume; so, you would expect clusters with about 8x as many voxels in your case as if you were processing at 3.0 mm. Even the difference between 1.5 mm isotropic and 2.0 mm isotropic is notable, with the latter's voxels being about 2.4x bigger in volume.
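If you want to verify the grid spacing directly, 3dinfo can report it (using the dataset name from the command above):

# print the voxel dimensions and the grid/time dimensions
3dinfo -ad3 -n4 LME_3Grp_covar_resid.nii.gz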
But the first issue may be a bigger consideration here.
Regarding the second issue you pointed out: these are not functional data. They are structural data, so the resolution is higher, at 1.5 mm isotropic.
One possibility is that the LME model might not be able to account for some potential confounding effects in your data. As a result, the "residuals" could contain strongly correlated structure across space, leading to extremely high (and unrealistic) ACF parameters. One solution is to use empirical ACF parameters from a similar dataset for 3dClustSim. Alternatively, take your best guess at the FWHM and feed it into 3dClustSim -fwhm …, as in the sketch below.
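For example, a hypothetical call with a guessed smoothness (the 8 mm value and the mask name are placeholders, not recommendations):

# apply a single guessed FWHM (in mm), isotropically, within the mask
3dClustSim -mask group_mask.nii.gz -fwhm 8 -pthr 0.005 -athr 0.05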