For an experiment with 12 runs and 3 tasks (A, B, C) arranged in an interleaved manner (A-B-C-A-B-C-A-B-C-A-B-C), is it possible to construct a single afni_proc.py command that preprocesses all 12 runs together (so that, e.g., we get a better-quality mean EPI volume) and then runs 3 separate 3dDeconvolve analyses, one per task?
If not, is it possible to use afni_proc.py in two steps: one for preprocessing and another for regression? How would I specify the inputs for the regression afni_proc.py?
Or should I forget about afni_proc.py and simply go back to constructing plain 3dDeconvolve commands manually?
The only thing that comes to mind here is using a
specific volume as the base image across all 3 executions
of afni_proc.py, which would be simple to do.
Note that this is what the MIN_OUTLIER method does for
choosing a volreg base. You could run 3dToutcount, maybe
even just on the first run, and pick the volume with the
lowest outlier fraction. Then use that as the volreg base
for all versions via: afni_proc.py -volreg_base_dset MIN_OUT+orig
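As a hypothetical sketch of those steps (dataset names are placeholders), the AFNI commands are shown as comments so the index-picking logic can be illustrated on a small fake outlier file:

```shell
# Step 1 (AFNI, shown as a comment): outlier fraction per volume of run 1
#   3dToutcount -automask -fraction epi_run01+orig > outcount.r01.1D

# fake outcount file standing in for 3dToutcount output
printf '0.08\n0.03\n0.12\n0.05\n' > outcount.r01.1D

# Step 2: 0-based index of the volume with the lowest outlier fraction
minidx=$(awk 'NR==1 || $1 < m {m=$1; i=NR-1} END {print i}' outcount.r01.1D)
echo "min-outlier volume index: $minidx"   # -> 1 for the fake data above

# Step 3 (AFNI, shown as comments): extract that volume, reuse it everywhere
#   3dbucket -prefix MIN_OUT epi_run01+orig"[$minidx]"
#   afni_proc.py ... -volreg_base_dset MIN_OUT+orig ...
```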
Would that accomplish what you want?
However, to back up, I do not yet see any reason this
cannot be done in a single regression. What are you
picturing as the problem with having the 3 types of timing
in one regression?
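A single-regression setup might look like the following sketch (dataset and timing-file names are placeholders; the options shown are standard afni_proc.py flags, but the values are made up):

```shell
# Hypothetical sketch: one afni_proc.py command covering all 12 runs,
# with all 3 tasks entering a single 3dDeconvolve as separate regressors.
afni_proc.py                                             \
    -subj_id subj01                                      \
    -dsets epi_run*.HEAD                                 \
    -blocks tshift align volreg blur mask scale regress  \
    -volreg_base_dset MIN_OUT+orig                       \
    -regress_stim_times taskA.txt taskB.txt taskC.txt    \
    -regress_stim_labels A B C                           \
    -regress_basis 'GAM'                                 \
    -regress_motion_per_run
```

Here -regress_motion_per_run gives each run its own set of motion regressors, while the single regression still yields separately labeled betas for A, B, and C.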
Since I also need blip correction, alignment, etc., which are shared by all 12 runs, a single preprocessing batch up to the regression step would be great.
I don’t really understand what you mean by “done in a single regression”. If you’re suggesting an omnibus design matrix like the following:
[A … …
… B …
… … C]
My main concern is: would it be a waste of degrees of freedom to include irrelevant regressors, given that A, B, and C are actually independent tasks?
If this is for a group analysis, degrees of freedom
do not matter, and note that in some sense you
would actually have more here. But I am assuming
the first level statistics are not what you care about,
just the betas.
But if you are careful with the motion regressors,
for example by doing per-run motion regression, then the
beta weights should be unchanged between the two approaches.
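For what it's worth, that equivalence can be checked with a small synthetic example (plain NumPy, not AFNI): when the combined design is block-diagonal, as in the matrix sketched above, least squares decouples across runs, so the task betas match those from separate per-run fits. All names and numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build three "runs", each with one task regressor plus a per-run baseline.
nt = 50
X_runs, y_runs = [], []
for k in range(3):
    task = rng.random(nt)                      # stand-in for a convolved stimulus
    X = np.column_stack([task, np.ones(nt)])   # columns: [task, run baseline]
    y = X @ np.array([2.0 + k, 5.0]) + 0.1 * rng.standard_normal(nt)
    X_runs.append(X)
    y_runs.append(y)

# Separate regressions: one least-squares fit per run, keep the task beta.
sep_betas = [np.linalg.lstsq(X, y, rcond=None)[0][0]
             for X, y in zip(X_runs, y_runs)]

# Omnibus regression: stack runs into a block-diagonal design.
def block_diag(mats):
    out = np.zeros((sum(m.shape[0] for m in mats),
                    sum(m.shape[1] for m in mats)))
    r = c = 0
    for m in mats:
        out[r:r + m.shape[0], c:c + m.shape[1]] = m
        r += m.shape[0]
        c += m.shape[1]
    return out

X_big = block_diag(X_runs)
y_big = np.concatenate(y_runs)
big_betas = np.linalg.lstsq(X_big, y_big, rcond=None)[0]

# Task betas sit at columns 0, 2, 4 of the combined design and agree
# with the per-run fits to numerical precision.
assert np.allclose(sep_betas, big_betas[[0, 2, 4]])
```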