AFNI version info (afni -ver): AFNI_24.0.03
Hi AFNI Experts!
I'm running into a couple of errors that I can't figure out when trying to run my afni_proc.py script.
Here is the end of the output displayed in my terminal right before my script fails:
3dABoverlap -no_automask full_mask.GTLC102+tlrc mask_anat.GTLC102+tlrc
tee out.mask_ae_overlap.txt
++ 3dABoverlap: AFNI version=AFNI_24.0.03 (Jan 29 2024) [64-bit]
#A=./full_mask.GTLC102+tlrc.BRIK B=./mask_anat.GTLC102+tlrc.BRIK
#A #B #(A uni B) #(A int B) #(A \ B) #(B \ A) %(A \ B) %(B \ A) Rx(B/A) Ry(B/A) Rz(B/A)
236247 251321 257072 230496 5751 20825 2.4343 8.2862 1.0436 1.0150 1.0374
3ddot -dodice full_mask.GTLC102+tlrc mask_anat.GTLC102+tlrc
tee out.mask_ae_dice.txt
0.945493
Badly placed ()'s.
-- template = 'MNI152_2009_template_SSW.nii.gz', exists = 1
-- will use min outlier volume as motion base
-- including default: -find_var_line_blocks tcat
-- tcat: reps is now 124
++ updating polort to 3, from run len 310.0 s
-- importing NL-warp datasets
-- volreg: using base dset vr_base_min_outlier+orig
++ volreg: applying volreg/epi2anat/tlrc xforms to isotropic 2 mm tlrc voxels
-- applying anat warps to 1 dataset(s): MPRAGE
++ mask: using epi_anat mask in place of EPI one
-- masking: group anat = 'MNI152_2009_template_SSW.nii.gz', exists = 1
** 3dinfo -nt failure: message is:
["/bin/sh: -c: line 0: syntax error near unexpected token `('"][]
** 3dinfo -tr failure: message is:
["/bin/sh: -c: line 0: syntax error near unexpected token `('"][]
** failed to find the number of TRs from dset '/Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI/MNI152_2009_template_SSW.nii.gz'
-- have 1 ROI dict entries ...
++ applying 2 stim types: ['AM1', 'AM1']
-- using default: will not apply EPI Automask
(see 'MASKING NOTE' from the -help for details)
--> script is file: proc.GTLC102
to execute via tcsh:
tcsh -xef proc.GTLC102 |& tee output.proc.GTLC102
to execute via bash:
tcsh -xef proc.GTLC102 2>&1 | tee output.proc.GTLC102
Since I can't find any misplaced "(" in my script, I have a feeling the issue is the directory path to the data, which contains both a space and parentheses. Here is the current directory path:
/Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI
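As a sanity check, both shell complaints can be reproduced in a terminal using nothing but that path (a minimal illustration, assuming AFNI is on the PATH; this is not from my proc output):

# sh chokes on the unquoted parentheses, just like the 3dinfo -nt/-tr failures above
sh -c "3dinfo -nt /Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI/MNI152_2009_template_SSW.nii.gz"
# /bin/sh: -c: line 0: syntax error near unexpected token `('

# tcsh gives its own version of the complaint, matching the "Badly placed ()'s." line
tcsh -c "ls /Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI"
# Badly placed ()'s.

# quoting the path makes the command parse
sh -c "3dinfo -nt '/Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI/MNI152_2009_template_SSW.nii.gz'"

So it seems to be the path itself that the shells are rejecting, not anything in my script.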
Attached below is my afni_proc.py script:
#!/usr/bin/env tcsh

# set data directories
set subj        = GTLC102
set scan_num    = scan1
set nifti_dir   = ./data/${subj}/mri/${scan_num}
set outputs_dir = ./data/${subj}/SSwarper_outputs/${scan_num}
set stim_dir    = ./ParticipantFiles/${subj}/Oddball/${scan_num}

# set subject and group identifiers
set group_id = LC_PROJECT

# run afni_proc.py to create a single subject processing script
afni_proc.py -subj_id $subj \
    -copy_anat $outputs_dir/anatSS.${subj}.nii \
    -anat_has_skull no \
    -anat_follower anat_w_skull anat $nifti_dir/MPRAGE+orig.BRIK \
    -dsets \
        $nifti_dir/oddball_run1+orig.BRIK \
        $nifti_dir/oddball_run2+orig.BRIK \
        $nifti_dir/oddball_run3+orig.BRIK \
        $nifti_dir/oddball_run4+orig.BRIK \
    -blocks tshift align tlrc volreg mask blur scale regress \
    -radial_correlate_blocks tcat volreg regress \
    -align_opts_aea -cost lpc+ZZ -giant_move -check_flip \
    -tlrc_base "MNI152_2009_template_SSW.nii.gz" \
    -tlrc_NL_warp \
    -tlrc_NL_warped_dsets $outputs_dir/anatQQ.${subj}.nii \
        $outputs_dir/anatQQ.${subj}.aff12.1D \
        $outputs_dir/anatQQ.${subj}_WARP.nii \
    -volreg_align_to MIN_OUTLIER \
    -volreg_align_e2a \
    -volreg_tlrc_warp \
    -volreg_compute_tsnr yes \
    -mask_epi_anat yes \
    -blur_size 3.0 \
    -regress_stim_times \
        $stim_dir/oddball.1D \
        $stim_dir/default.1D \
    -regress_stim_types AM1 AM1 \
    -regress_stim_labels oddball default \
    -regress_basis 'dmBLOCK(1)' \
    -regress_opts_3dD -jobs 8 \
        -gltsym 'SYM: oddball -default' \
        -glt_label 1 oddball-default \
        -gltsym 'SYM: oddball default' \
        -glt_label 2 oddball+default \
    -regress_motion_per_run \
    -regress_censor_motion 0.3 \
    -regress_censor_outliers 0.05 \
    -regress_3dD_stop \
    -regress_reml_exec \
    -regress_compute_fitts \
    -regress_make_ideal_sum "sum_ideal.1D" \
    -regress_est_blur_epits \
    -regress_est_blur_errts \
    -regress_run_clustsim no \
    -html_review_style pythonic \
    -execute
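If the path really is the culprit, the workaround I'm considering (untested) is to run everything from a symlink that avoids the spaces and parentheses:

# give the project a parenthesis-free alias and work from there
ln -s "/Users/ajcoope9/Dropbox (ASU)/2023-LC_fMRI" ~/LC_fMRI
cd ~/LC_fMRI
tcsh run_ap.GTLC102.tcsh   # hypothetical name for the script above

Does that seem like the right fix, or is there a way to make afni_proc.py tolerate spaces and parentheses in paths?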
Thank you!