Hello. I used afni_proc.py to generate the script below.
I have a few questions:
(1) I wanted to remove the first 6 TRs, so under the auto block: tcat you see '[6..179]'.
Moving a little further down, you see the 3dbucket command:
3dbucket -prefix vr_base pb02.$subj.r01.tshift+orig"[0]"
Question: should the sub-brick index be [0] (as generated by afni_proc) or [6]?
(2) Same question for the -epi_base option in the align_epi_anat.py command: should it be -epi_base 0 or -epi_base 6?
(3) In the 3dvolreg command, is the option -base vr_base+orig correct as is? Is a sub-brick index required, e.g., -base vr_base+orig[6]? Is there any difference between the two, with and without an index?
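(For what it's worth, my assumption is that vr_base holds only the single volume that 3dbucket extracted; I imagine the sub-brick count could be confirmed with something like
    3dinfo -nv vr_base+orig
but I am not certain whether 3dvolreg still expects an explicit index in that case.)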
(4) Instead of running the e2a registration (align_epi_anat.py with -volreg off) first and then the realignment with 3dvolreg, can I run 3dvolreg first? If so, how should I change the script? A rough sketch of what I have in mind is below.
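This is only a sketch of the reordering I am imagining, built from the same two commands that already appear in the script below (no options changed), so please correct whatever would need to change:

# first, volume registration to the extracted base, per run
foreach run ( $runs )
    3dvolreg -verbose -zpad 1 -base vr_base+orig                \
             -1Dfile dfile.r$run.1D -prefix rm.epi.volreg.r$run \
             -cubic                                             \
             -1Dmatrix_save mat.r$run.vr.aff12.1D               \
             pb02.$subj.r$run.tshift+orig
end
# then, e2a: align the anat to the same EPI registration base
align_epi_anat.py -anat2epi -anat anat.r1+orig \
       -save_skullstrip -suffix _al_junk       \
       -epi vr_base+orig -epi_base 0           \
       -epi_strip 3dAutomask                   \
       -giant_move                             \
       -volreg off -tshift off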
Many thanks,
Duong
================= script ==================================
#!/bin/tcsh -xef
echo "auto-generated by afni_proc.py, Fri May 5 16:10:02 2017"
echo "(version 5.13, March 27, 2017)"
echo "execution started: `date`"
# execute via :
#   tcsh -xef ./proc_subj1.sh |& tee ./output.proc_subj1.sh
# =========================== auto block: setup ============================
# script setup
# take note of the AFNI version
afni -ver
# check that the current AFNI version is recent enough
afni_history -check_date 23 Sep 2016
if ( $status ) then
echo "** this script requires newer AFNI binaries (than 23 Sep 2016)"
echo " (consider: @update.afni.binaries -defaults)"
exit
endif
# the user may specify a single subject to run with
if ( $#argv > 0 ) then
set subj = $argv[1]
else
set subj = 20170309.pt001_rtfMRI_1Trn3Tst_GD_030917.pt001_rtfMRI_1Trn3Tst_GD_030917
endif
# assign output directory name
set output_dir = 20170309.pt001_rtfMRI_1Trn3Tst_GD_030917.pt001_rtfMRI_1Trn3Tst_GD_030917.results
# verify that the results directory does not yet exist
if ( -d $output_dir ) then
echo output dir "$subj.results" already exists
exit
endif
# set list of runs
set runs = (`count -digits 2 1 2`)
# create results and stimuli directories
mkdir $output_dir
mkdir $output_dir/stimuli
# copy stim files into stimulus directory
cp ./stim_times/left.txt ./stim_times/right.txt ./stim_times/up.txt \
   ./stim_times/down.txt $output_dir/stimuli
# copy anatomy to results dir
3dcopy \
    20170309.pt001_rtfMRI_1Trn3Tst_GD_030917.pt001_rtfMRI_1Trn3Tst_GD_030917/anatHeadBrik/anat.r1+orig \
    $output_dir/anat.r1
# ============================ auto block: tcat ============================
# apply 3dTcat to copy input dsets to results dir, while
# removing the first 6 TRs
3dTcat -prefix $output_dir/pb00.$subj.r01.tcat \
    20170309.pt001_rtfMRI_1Trn3Tst_GD_030917.pt001_rtfMRI_1Trn3Tst_GD_030917/funcHeadBrik/epi.r1+orig'[6..179]'
3dTcat -prefix $output_dir/pb00.$subj.r02.tcat \
    20170309.pt001_rtfMRI_1Trn3Tst_GD_030917.pt001_rtfMRI_1Trn3Tst_GD_030917/funcHeadBrik/epi.r2+orig'[6..179]'
# and make note of repetitions (TRs) per run
set tr_counts = ( 174 174 )
# -------------------------------------------------------
# enter the results directory (can begin processing data)
cd $output_dir
# ========================== auto block: outcount ==========================
# data check: compute outlier fraction for each volume
touch out.pre_ss_warn.txt
foreach run ( $runs )
3dToutcount -automask -fraction -polort 3 -legendre \
            pb00.$subj.r$run.tcat+orig > outcount.r$run.1D
# censor outlier TRs per run, ignoring the first 0 TRs
# - censor when more than 0.1 of automask voxels are outliers
# - step() defines which TRs to remove via censoring
1deval -a outcount.r$run.1D -expr "1-step(a-0.1)" > rm.out.cen.r$run.1D
# outliers at TR 0 might suggest pre-steady state TRs
if ( `1deval -a outcount.r$run.1D"{0}" -expr "step(a-0.4)"` ) then
echo "** TR #0 outliers: possible pre-steady state TRs in run $run" \
>> out.pre_ss_warn.txt
endif
end
# catenate outlier counts into a single time series
cat outcount.r*.1D > outcount_rall.1D
# catenate outlier censor files into a single time series
cat rm.out.cen.r*.1D > outcount_${subj}_censor.1D
# ================================ despike =================================
# apply 3dDespike to each run
foreach run ( $runs )
3dDespike -NEW -nomask -prefix pb01.$subj.r$run.despike \
          pb00.$subj.r$run.tcat+orig
end
# ================================= tshift =================================
# time shift data so all slice timing is the same
foreach run ( $runs )
3dTshift -slice 17 -quintic -prefix pb02.$subj.r$run.tshift \
         -tpattern alt+z                                    \
         pb01.$subj.r$run.despike+orig
end
# --------------------------------
# extract volreg registration base
3dbucket -prefix vr_base pb02.$subj.r01.tshift+orig"[0]"
# ================================= align ==================================
# for e2a: compute anat alignment transformation to EPI registration base
# (new anat will be intermediate, stripped, anat.r1_ns+orig)
align_epi_anat.py -anat2epi -anat anat.r1+orig \
       -save_skullstrip -suffix _al_junk       \
       -epi vr_base+orig -epi_base 0           \
       -epi_strip 3dAutomask                   \
       -giant_move                             \
       -volreg off -tshift off
# ================================= volreg =================================
# align each dset to base volume, align to anat
# register and warp
foreach run ( $runs )
# register each volume to the base
3dvolreg -verbose -zpad 1 -base vr_base+orig                \
         -1Dfile dfile.r$run.1D -prefix rm.epi.volreg.r$run \
         -cubic                                             \
         -1Dmatrix_save mat.r$run.vr.aff12.1D               \
         pb02.$subj.r$run.tshift+orig
# create an all-1 dataset to mask the extents of the warp
3dcalc -overwrite -a pb02.$subj.r$run.tshift+orig -expr 1 \
-prefix rm.epi.all1
# catenate volreg/epi2anat xforms
cat_matvec -ONELINE \
anat.r1_al_junk_mat.aff12.1D -I \
mat.r$run.vr.aff12.1D > mat.r$run.warp.aff12.1D
# apply catenated xform: volreg/epi2anat
3dAllineate -base anat.r1_ns+orig \
-input pb02.$subj.r$run.tshift+orig \
-1Dmatrix_apply mat.r$run.warp.aff12.1D \
-mast_dxyz 3 \
-prefix rm.epi.nomask.r$run
# warp the all-1 dataset for extents masking
3dAllineate -base anat.r1_ns+orig \
-input rm.epi.all1+orig \
-1Dmatrix_apply mat.r$run.warp.aff12.1D \
-mast_dxyz 3 -final NN -quiet \
-prefix rm.epi.1.r$run
# make an extents intersection mask of this run
3dTstat -min -prefix rm.epi.min.r$run rm.epi.1.r$run+orig
end
# make a single file of registration params
cat dfile.r*.1D > dfile_rall.1D
# ----------------------------------------
# create the extents mask: mask_epi_extents+orig
# (this is a mask of voxels that have valid data at every TR)
3dMean -datum short -prefix rm.epi.mean rm.epi.min.r*.HEAD
3dcalc -a rm.epi.mean+orig -expr 'step(a-0.999)' -prefix mask_epi_extents
# and apply the extents mask to the EPI data
# (delete any time series with missing data)
foreach run ( $runs )
3dcalc -a rm.epi.nomask.r$run+orig -b mask_epi_extents+orig \
       -expr 'a*b' -prefix pb03.$subj.r$run.volreg
end
# warp the volreg base EPI dataset to make a final version
cat_matvec -ONELINE anat.r1_al_junk_mat.aff12.1D -I > mat.basewarp.aff12.1D
3dAllineate -base anat.r1_ns+orig                 \
            -input vr_base+orig                   \
            -1Dmatrix_apply mat.basewarp.aff12.1D \
            -mast_dxyz 3                          \
            -prefix final_epi_vr_base
# create an anat_final dataset, aligned with stats
3dcopy anat.r1_ns+orig anat_final.$subj
# record final registration costs
3dAllineate -base final_epi_vr_base+orig -allcostX \
            -input anat_final.$subj+orig |& tee out.allcostX.txt
# -----------------------------------------
# warp anat follower datasets (identity: resample)
# ================================== tlrc ==================================
# warp anatomy to standard space
@auto_tlrc -base TT_icbm452+tlrc -input anat.r1_ns+orig -no_ss \
           -init_xform AUTO_CENTER
# store forward transformation matrix in a text file
cat_matvec anat.r1_ns+tlrc::WARP_DATA -I > warp.anat.Xat.1D
# ================================== blur ==================================
# blur each volume of each run
foreach run ( $runs )
3dmerge -1blur_fwhm 7 -doall -prefix pb04.$subj.r$run.blur \
        pb03.$subj.r$run.volreg+orig
end
# ================================== mask ==================================
# create 'full_mask' dataset (union mask)
foreach run ( $runs )
3dAutomask -dilate 1 -prefix rm.mask_r$run pb04.$subj.r$run.blur+orig
end
# create union of inputs, output type is byte
3dmask_tool -inputs rm.mask_r*+orig.HEAD -union -prefix full_mask.$subj
# ---- create subject anatomy mask, mask_anat.$subj+orig ----
#      (resampled from aligned anat)
3dresample -master full_mask.$subj+orig -input anat.r1_ns+orig \
           -prefix rm.resam.anat
# convert to binary anat mask; fill gaps and holes
3dmask_tool -dilate_input 5 -5 -fill_holes -input rm.resam.anat+orig \
            -prefix mask_anat.$subj
# compute overlaps between anat and EPI masks
3dABoverlap -no_automask full_mask.$subj+orig mask_anat.$subj+orig \
            |& tee out.mask_ae_overlap.txt
# note Dice coefficient of masks, as well
3ddot -dodice full_mask.$subj+orig mask_anat.$subj+orig \
      |& tee out.mask_ae_dice.txt
# ================================= scale ==================================
# scale each voxel time series to have a mean of 100
# (be sure no negatives creep in)
# (subject to a range of [0,200])
foreach run ( $runs )
3dTstat -prefix rm.mean_r$run pb04.$subj.r$run.blur+orig
3dcalc -a pb04.$subj.r$run.blur+orig -b rm.mean_r$run+orig \
       -c mask_epi_extents+orig                            \
       -expr 'c * min(200, a/b*100)*step(a)*step(b)'       \
       -prefix pb05.$subj.r$run.scale
end
# ================================ regress =================================
# compute de-meaned motion parameters (for use in regression)
1d_tool.py -infile dfile_rall.1D -set_nruns 2 \
           -demean -write motion_demean.1D
# compute motion parameter derivatives (just to have)
1d_tool.py -infile dfile_rall.1D -set_nruns 2 \
           -derivative -demean -write motion_deriv.1D
# create censor file motion_${subj}_censor.1D, for censoring motion
1d_tool.py -infile dfile_rall.1D -set_nruns 2 \
           -show_censor_count -censor_prev_TR \
           -censor_motion 0.3 motion_${subj}
# combine multiple censor files
1deval -a motion_${subj}_censor.1D -b outcount_${subj}_censor.1D \
       -expr "a*b" > censor_${subj}_combined_2.1D
# note TRs that were not censored
set ktrs = `1d_tool.py -infile censor_${subj}_combined_2.1D \
            -show_trs_uncensored encoded`
# ------------------------------
# run the regression analysis
3dDeconvolve -input pb05.$subj.r*.scale+orig.HEAD \
    -censor censor_${subj}_combined_2.1D          \
    -polort 3                                     \
    -local_times                                  \
    -num_stimts 10                                \
    -stim_times 1 stimuli/left.txt 'BLOCK(30,1)'  \
    -stim_label 1 L                               \
    -stim_times 2 stimuli/right.txt 'BLOCK(30,1)' \
    -stim_label 2 R                               \
    -stim_times 3 stimuli/up.txt 'BLOCK(30,1)'    \
    -stim_label 3 U                               \
    -stim_times 4 stimuli/down.txt 'BLOCK(30,1)'  \
    -stim_label 4 D                               \
    -stim_file 5 motion_demean.1D'[0]' -stim_base 5 -stim_label 5 roll   \
    -stim_file 6 motion_demean.1D'[1]' -stim_base 6 -stim_label 6 pitch  \
    -stim_file 7 motion_demean.1D'[2]' -stim_base 7 -stim_label 7 yaw    \
    -stim_file 8 motion_demean.1D'[3]' -stim_base 8 -stim_label 8 dS     \
    -stim_file 9 motion_demean.1D'[4]' -stim_base 9 -stim_label 9 dL     \
    -stim_file 10 motion_demean.1D'[5]' -stim_base 10 -stim_label 10 dP  \
    -num_glt 4                                    \
    -gltsym 'SYM: +L'                             \
    -glt_label 1 L-b                              \
    -gltsym 'SYM: +R'                             \
    -glt_label 2 R-b                              \
    -gltsym 'SYM: +L -R'                          \
    -glt_label 3 L-R                              \
    -gltsym 'SYM: +U -D'                          \
    -glt_label 4 U-D                              \
    -jobs 4                                       \
    -fout -tout -x1D X.xmat.1D -xjpeg X.jpg       \
    -x1D_uncensored X.nocensor.xmat.1D            \
    -fitts fitts.$subj                            \
    -errts errts.${subj}                          \
    -bucket stats.$subj
# if 3dDeconvolve fails, terminate the script
if ( $status != 0 ) then
echo '---------------------------------------'
echo '** 3dDeconvolve error, failing...'
echo ' (consider the file 3dDeconvolve.err)'
exit
endif
# display any large pairwise correlations from the X-matrix
1d_tool.py -show_cormat_warnings -infile X.xmat.1D |& tee out.cormat_warn.txt
# -- execute the 3dREMLfit script, written by 3dDeconvolve --
tcsh -x stats.REML_cmd
# if 3dREMLfit fails, terminate the script
if ( $status != 0 ) then
echo '---------------------------------------'
echo '** 3dREMLfit error, failing...'
exit
endif
# create an all_runs dataset to match the fitts, errts, etc.
3dTcat -prefix all_runs.$subj pb05.$subj.r*.scale+orig.HEAD
# --------------------------------------------------
# create a temporal signal to noise ratio dataset
#    signal: if 'scale' block, mean should be 100
#    noise : compute standard deviation of errts
3dTstat -mean -prefix rm.signal.all all_runs.$subj+orig"[$ktrs]"
3dTstat -stdev -prefix rm.noise.all errts.${subj}_REML+orig"[$ktrs]"
3dcalc -a rm.signal.all+orig   \
       -b rm.noise.all+orig    \
       -c full_mask.$subj+orig \
       -expr 'c*a/b' -prefix TSNR.$subj
# ---------------------------------------------------
# compute and store GCOR (global correlation average)
# (sum of squares of global mean of unit errts)
3dTnorm -norm2 -prefix rm.errts.unit errts.${subj}_REML+orig
3dmaskave -quiet -mask full_mask.$subj+orig rm.errts.unit+orig \
          > gmean.errts.unit.1D
3dTstat -sos -prefix - gmean.errts.unit.1D\' > out.gcor.1D
echo "-- GCOR = `cat out.gcor.1D`"
# ---------------------------------------------------
# compute correlation volume
# (per voxel: average correlation across masked brain)
# (now just dot product with average unit time series)
3dcalc -a rm.errts.unit+orig -b gmean.errts.unit.1D -expr 'a*b' -prefix rm.DP
3dTstat -sum -prefix corr_brain rm.DP+orig
# create ideal files for fixed response stim types
1dcat X.nocensor.xmat.1D'[8]' > ideal_L.1D
1dcat X.nocensor.xmat.1D'[9]' > ideal_R.1D
1dcat X.nocensor.xmat.1D'[10]' > ideal_U.1D
1dcat X.nocensor.xmat.1D'[11]' > ideal_D.1D
# --------------------------------------------------------
# compute sum of non-baseline regressors from the X-matrix
# (use 1d_tool.py to get list of regressor columns)
set reg_cols = `1d_tool.py -infile X.nocensor.xmat.1D -show_indices_interest`
3dTstat -sum -prefix sum_ideal.1D X.nocensor.xmat.1D"[$reg_cols]"
# also, create a stimulus-only X-matrix, for easy review
1dcat X.nocensor.xmat.1D"[$reg_cols]" > X.stim.xmat.1D
# ============================ blur estimation =============================
# compute blur estimates
touch blur_est.$subj.1D # start with empty file
# create directory for ACF curve files
mkdir files_ACF
# -- estimate blur for each run in epits --
touch blur.epits.1D
# restrict to uncensored TRs, per run
foreach run ( $runs )
set trs = `1d_tool.py -infile X.xmat.1D -show_trs_uncensored encoded \
                      -show_trs_run $run`
if ( $trs == "" ) continue
3dFWHMx -detrend -mask full_mask.$subj+orig           \
        -ACF files_ACF/out.3dFWHMx.ACF.epits.r$run.1D \
        all_runs.$subj+orig"[$trs]" >> blur.epits.1D
end
# compute average FWHM blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.epits.1D'{0..$(2)}'\'` )
echo average epits FWHM blurs: $blurs
echo "$blurs # epits FWHM blur estimates" >> blur_est.$subj.1D
# compute average ACF blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.epits.1D'{1..$(2)}'\'` )
echo average epits ACF blurs: $blurs
echo "$blurs # epits ACF blur estimates" >> blur_est.$subj.1D
# -- estimate blur for each run in errts --
touch blur.errts.1D
# restrict to uncensored TRs, per run
foreach run ( $runs )
set trs = `1d_tool.py -infile X.xmat.1D -show_trs_uncensored encoded \
                      -show_trs_run $run`
if ( $trs == "" ) continue
3dFWHMx -detrend -mask full_mask.$subj+orig           \
        -ACF files_ACF/out.3dFWHMx.ACF.errts.r$run.1D \
        errts.${subj}+orig"[$trs]" >> blur.errts.1D
end
# compute average FWHM blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.errts.1D'{0..$(2)}'\'` )
echo average errts FWHM blurs: $blurs
echo "$blurs # errts FWHM blur estimates" >> blur_est.$subj.1D
# compute average ACF blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.errts.1D'{1..$(2)}'\'` )
echo average errts ACF blurs: $blurs
echo "$blurs # errts ACF blur estimates" >> blur_est.$subj.1D
# -- estimate blur for each run in err_reml --
touch blur.err_reml.1D
# restrict to uncensored TRs, per run
foreach run ( $runs )
set trs = `1d_tool.py -infile X.xmat.1D -show_trs_uncensored encoded \
                      -show_trs_run $run`
if ( $trs == "" ) continue
3dFWHMx -detrend -mask full_mask.$subj+orig              \
        -ACF files_ACF/out.3dFWHMx.ACF.err_reml.r$run.1D \
        errts.${subj}_REML+orig"[$trs]" >> blur.err_reml.1D
end
# compute average FWHM blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.err_reml.1D'{0..$(2)}'\'` )
echo average err_reml FWHM blurs: $blurs
echo "$blurs # err_reml FWHM blur estimates" >> blur_est.$subj.1D
# compute average ACF blur (from every other row) and append
set blurs = ( `3dTstat -mean -prefix - blur.err_reml.1D'{1..$(2)}'\'` )
echo average err_reml ACF blurs: $blurs
echo "$blurs # err_reml ACF blur estimates" >> blur_est.$subj.1D
# add 3dClustSim results as attributes to any stats dset
mkdir files_ClustSim
# run Monte Carlo simulations using method 'ACF'
set params = ( `grep ACF blur_est.$subj.1D | tail -n 1` )
3dClustSim -both -mask full_mask.$subj+orig -acf $params[1-3] \
           -cmd 3dClustSim.ACF.cmd -prefix files_ClustSim/ClustSim.ACF
# run 3drefit to attach 3dClustSim results to stats
set cmd = ( `cat 3dClustSim.ACF.cmd` )
$cmd stats.$subj+orig stats.${subj}_REML+orig
# ================== auto block: generate review scripts ===================
# generate a review script for the unprocessed EPI data
gen_epi_review.py -script @epi_review.$subj \
    -dsets pb00.$subj.r*.tcat+orig.HEAD
# generate scripts to review single subject results
# (try with defaults, but do not allow bad exit status)
gen_ss_review_scripts.py -mot_limit 0.3 -out_limit 0.1 -exit0
# ========================== auto block: finalize ==========================
# remove temporary files
\rm -f rm.*
# if the basic subject review script is here, run it
# (want this to be the last text output)
if ( -e @ss_review_basic ) ./@ss_review_basic |& tee out.ss_review.$subj.txt
# return to parent directory
cd ..
echo "execution finished: `date`"