I’ve run 3dGroupInCorr on resting-state data comparing two groups, and I want to use 3dClustSim to estimate the cluster-size threshold needed to control the false-positive rate. I’ve written a script that essentially does this:
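Roughly the following (subject IDs, dataset names, and the group mask are placeholders; this is just a sketch of the idea):

    #!/bin/bash
    # For each subject, estimate the ACF parameters from the mefc (denoised)
    # dataset; the last output line of 3dFWHMx -acf holds a, b, c plus a
    # combined FWHM estimate.
    for subj in sub01 sub02 sub03; do
        3dFWHMx -mask group_mask+tlrc -detrend -acf NULL \
                ${subj}.mefc+tlrc | tail -1 >> acf_params.1D
    done
    # Average a, b, c across all subjects (both groups combined);
    # the fourth column (combined FWHM) is not needed here.
    awk '{a+=$1; b+=$2; c+=$3} END {printf "%g %g %g\n", a/NR, b/NR, c/NR}' acf_params.1D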
Is it appropriate to use the mefc (denoised) data to estimate the ACF parameters? This is the dataset that 3dGroupInCorr uses.
Is -detrend the right thing to do here?
I’m then going to use the overall means of the ACF parameters (combining both groups) in the 3dClustSim run. Is this right? I’ll use a command like:

    3dClustSim -acf A B C -pthr 0.01 -athr 0.05 -iter 10000 -nodec -quiet
In the case of resting-state data, it would be better to estimate the blur from the more original data, before projecting out the bad components. The blur estimates will probably be a little larger that way, but at least that is conservative.

Use of -detrend is more optional here, assuming the basic trends are already gone.

Yes, average the parameters across subjects and apply them in 3dClustSim. Note that -pthr 0.01 might not be considered strict enough.

Consider running 3dClustSim with the group mask. The -pthr/-athr/-iter options are not needed, as it outputs a table at '-iter 10000' by default.
I too am analyzing resting-state data and usually use 3dttest++ with the -Clustsim option to do a seed-based group analysis. It looks like this option uses the input to 3dttest++, which is the final seed-based result obtained from 3dDeconvolve. I recently ran a 3-level 3dANOVA and was a little confused about whether the stats are FDR corrected. The output from the anova_bucket dataset shows significance (at least I think that is what the asterisks mean):
  -- At sub-brick #0 'group_f:Inten' datum type is short:   0 to 32767 [internal]
                                     [*  1.48214e-05]       0 to 0.485652 [scaled]
  -- At sub-brick #1 'group_f:F-stat' datum type is short:  0 to 1035 [internal]
                                     [*  0.01]              0 to 10.35 [scaled]
     statcode = fift;  statpar = 2 26
  -- At sub-brick #2 'AI_m:Mean' datum type is short:       -5718 to 32767 [internal]
                                     [*  2.33344e-05]       -0.133426 to 0.764599 [scaled]
  -- At sub-brick #3 'AI_m:t-stat' datum type is short:     -5440 to 32700 [internal]
                                     [*  0.001]             -5.44 to 32.7 [scaled]
When I open the dataset in the GUI, there is an associated q value. (The p and q values change automatically depending on which sub-brick is set as overlay and threshold; when I look at t-test results, only the q value changes and the p value stays constant unless I change it manually. Why is that?) I wasn't sure how to show only the clusters that survive FDR correction. I tried to run 3dClustSim, and based on these threads I thought the less-processed file, as you stated here, should be used to determine the ACF values. But now I am confused because of how 3dttest++ runs it. Should I use the file generated immediately prior to regressing out the WM and CSF noise, or should the ACF values come from a dataset that is not blurred?
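For reference, my 3dttest++ call looks roughly like this (dataset names are placeholders; the inputs are the per-subject seed-based maps from 3dDeconvolve):

    3dttest++ -setA grpA.*.seed+tlrc.HEAD  \
              -setB grpB.*.seed+tlrc.HEAD  \
              -mask group_mask+tlrc        \
              -prefix ttest_seed -Clustsim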
Thanks for your help!