AFNI version info (afni -ver): Version AFNI_24.2.01 'Macrinus'
Hello,
I'd like to move my previous pre-processing protocol to afni_proc.py, if possible. Below are my current commands. Ideally, I do not want it to do any auto-masking, as I've found that our application requires us to manually segment our areas of interest.
Let me know any other information I can provide to help clarify.
3dDespike \
-overwrite \
-nomask \
-localedit \
-cut 2.5 5 \
-ignore 2 \
-prefix $AnimalID/$FolderID/${AnimalID}_${FolderID}.DS.Box \
$AnimalID/$FolderID/${AnimalID}_${FolderID}.Box+orig
align_epi_anat.py \
-overwrite \
-epi $AnimalID/$FolderID/${AnimalID}_${FolderID}.DS.Box+orig \
-anat $AnimalID/$FolderID/${AnimalID}_${FolderID}.Box.Mean+orig \
-epi2anat \
-epi_base "$NBASE0" \
-epi_strip None \
-volreg_method 3dvolreg \
-volreg_base median \
-volreg_opts "-heptic -weight COPY_tempab2D+orig'[0]'" \
-anat_has_skull no \
-output_dir "$AnimalID/$FolderID/" \
-cost ls
# I have not validated that the command below works. It is just to give an idea that I'd also like to include this kind of regression protocol
#
afni_proc.py \
-subj_id $AnimalID \
-dsets $AnimalID/$FolderID/${AnimalID}_${FolderID}.DS.Box_al+orig \
-blocks regress \
-anat_follower_ROI mMask epi $AnimalID/$FolderID/MMASK_${AnimalID}_${FolderID}.Box.Mean+orig \
-regress_ROI mMask
While the option order does not matter, let me set up a command that specifies the EPI (and any related info), then the anat (and related), then the -blocks and block options. Changes that will be included:
include a scale block
assume 2 pre-steady state TRs (-ignore 2 suggests it)
align EPI to MIN_OUTLIER rather than median or $NBASE0
I do not currently think there is a way to apply 3dvolreg -weight
(such a thing could possibly be added)
regression will include motion (you could consider censoring)
some additional QC will be added
This is tcsh syntax, so you may need to translate.
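Pulling those notes together, a minimal sketch of such a command might look like this (it reuses the dataset variables from the commands above; the exact option list is an assumption on my part and should be checked against `afni_proc.py -help`):

```shell
# SKETCH ONLY -- reuses the thread's variables; the option set is an assumption
afni_proc.py                                                               \
    -subj_id $AnimalID                                                     \
    -dsets $AnimalID/$FolderID/${AnimalID}_${FolderID}.Box+orig.HEAD       \
    -tcat_remove_first_trs 2                                               \
    -copy_anat $AnimalID/$FolderID/${AnimalID}_${FolderID}.Box.Mean+orig   \
    -anat_has_skull no                                                     \
    -anat_follower_ROI mMask epi                                           \
        $AnimalID/$FolderID/MMASK_${AnimalID}_${FolderID}.Box.Mean+orig    \
    -blocks despike align volreg scale regress                             \
    -align_opts_aea -cost ls                                               \
    -volreg_align_to MIN_OUTLIER                                           \
    -regress_ROI mMask                                                     \
    -regress_apply_mot_types demean
```

The despike/align/volreg/scale/regress blocks mirror the separate 3dDespike and align_epi_anat.py steps above, with MIN_OUTLIER as the registration base as described.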
Thanks so much for this Rick. I will give it a try.
An initial ponder leads me to the question:
If we realize our application does need the weight for optimal co-registration, is there a way to break up the afni_proc.py command to do essentially the same thing overall, while keeping the use of align_epi_anat.py instead?
EDIT: And using a 1D file of motion parameters for the regress_motion part?
Currently I convert the motion parameters as follows:
Because you are leaving the EPI in original space, there is no harm in breaking the process into pieces, one up through volreg, one after.
Or, you could pass the needed files (.BRIK, .HEAD or .nii.gz) via -copy_files, and edit the proc script to use that file for a volreg weight.
Additionally, you could get me to add such an option to the program. I won't guarantee doing that right away, but it seems like it would be reasonable to add. But note that using an external weight volume probably means you should be using an external registration base (which aligns with the weight) via -volreg_base_dset.
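As a concrete (hypothetical) illustration of that suggestion — the weight and base file names below are placeholders, not files from this thread:

```shell
# SKETCH: carry an external weight volume into the results directory and
# register to an external base that matches it (placeholder file names)
afni_proc.py                                            \
    -subj_id $AnimalID                                  \
    -copy_files my_weight+orig.HEAD my_weight+orig.BRIK \
    -volreg_base_dset my_base+orig                      \
    ...                 # remaining options as in the main command

# afterwards, hand-edit the generated proc script so that the 3dvolreg
# command includes something like:  -weight my_weight+orig'[0]'
```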
Moving forward with breaking up the process into pieces after doing volreg independently: I can't figure out how to appropriately use my motion parameter files with afni_proc.py. Currently I output .1D files from 3dvolreg, and I also convert them using the following function:
I've gone through the documentation, and I see '-regress_stim_times' and '-regress_make_ideal_sum' as options that allow text or 1D inputs, but they don't seem like the correct options. I also see '-regress_apply_mot_types', but again I'm not sure how to input a file with it, nor whether it would do the regression correctly. What do you recommend?
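For what it's worth, `afni_proc.py -help` lists a `-regress_motion_file` option for supplying an externally computed motion parameter file; a minimal sketch (the 1D file name is a placeholder, and whether this fits the conversion step here is worth verifying):

```shell
# SKETCH: regress using a pre-computed 6-column 3dvolreg motion file
# (motion_params.1D is a placeholder name, not a file from this thread)
afni_proc.py                                                          \
    -subj_id $AnimalID                                                \
    -dsets $AnimalID/$FolderID/${AnimalID}_${FolderID}.DS.Box_al+orig \
    -blocks regress                                                   \
    -regress_motion_file motion_params.1D
```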
I tried out your exact command, but with datasets from our class data, and it worked fine. That makes me wonder if some input NIFTI dataset is showing up as being in standard space, while the rest of the data is in original space.
What is the output from (fill in the variables):
3dinfo -av_space -prefix $epis $anat $ROI
It seems most likely that the $ROI dataset might not be orig.
Thanks so much for your time into this. I tried your command and each are listed as '+orig'.
Because this isn't typical brain data, I did have to interpret what to put in for anat, epis, and ROI. The epis are standard, so I imagine no issue there. But for anat, I use an average 3D image computed from the epis; I imagine that's different from normal use, so I'm not sure whether it would cause an issue. And for ROI, I use the 'Draw Dataset' plugin to make the mask.
Also curious if my input 1D file is an issue. It only has 6 columns. Not sure if it needs more (like all 12 potential 3dvolreg degrees of freedom)
Not sure if that information is helpful, but I'm trying to provide anything that may be useful, since you said it worked with your class datasets.
Yes, the errts*.tproject dataset should be the one with everything projected out.
I am a little surprised about it failing to find a motion enorm dataset, given that you pass the motion file. I will try to test that out...
No worries. I imagine it takes a lot to go through all the threads. I'm super appreciative for your time and help.
Also, can you confirm a few things for me, please:
1. Which file would be the final dataset to use moving forward? It looks like, if I use other blocks like scale, the 'errts*' dataset doesn't change, but the 'all_runs*' one does. And a 'pb01*scale' file shows up when scale is included. What is that?
2. Do I still need to detrend the final dataset determined in (1)?
Just trying to confirm necessary steps before calculating power spectra. Thank you again!
Sorry, but would you please post the afni_proc.py command that this refers to, or if it is already above, say which one? Is this the "Current code is:" version?
Note that all_runs is the concatenation of the datasets input to 3dDeconvolve (though maybe you have only one run), while the errts is the residual time series output by 3dDeconvolve. If you are using the regression to remove signals of no interest, then the errts dataset will be used going forward.
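A quick way to see this relationship is to compare time-point counts (the prefixes below follow afni_proc.py's usual naming and may differ in your results directory):

```shell
# both should report the same number of time points, since the errts is
# the residual of the regression run on the concatenated all_runs data
3dinfo -nt all_runs.$AnimalID+orig errts.$AnimalID.tproject+orig
```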
Using -blocks mask scale regress, the scale block would be the only "preprocessing" of the data before the linear regression, leaving pb01*scale datasets as being input to 3dDeconvolve (review this in the proc script).
The 3dDeconvolve command in the proc script should show you what polynomial is being used to model the slow drift/trend (-polort VALUE).
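Since the proc script is plain text, that value can be read straight out of it; a small sketch, assuming afni_proc.py's default `proc.$subj` script name:

```shell
# print every line of the generated proc script that sets the drift
# polynomial order (proc.$subj is afni_proc.py's default script name)
subj=YOUR_SUBJECT_ID                      # placeholder id
grep -n -- '-polort' "proc.$subj" || echo "proc.$subj not found"
```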
Thank you very much Rick! I'm really grateful for all your help!