AFNI version info (afni -ver): Version AFNI_24.1.08 'Publius Septimius Geta'
I am new to scripting with afni_proc.py and I cannot figure out why I am getting "command not found" for the regress_stim_times and regress_stim_labels lines... I have confirmed all file paths and names. Does anyone have any ideas? Thank you in advance for the help and the opportunity to learn!
#!/usr/bin/env tcsh
# created by uber_subject.py: version 1.2 (April 5, 2018)
# creation date: Tue Jan 14 12:36:17 2020
# Updated 8/18/24 by CJS
# set subject and group identifiers
if ( $#argv > 0 ) then
set subj = $argv[1]
else
set subj = $subj
endif
#Subj needs to indicate _1 or _2
# set data directories
set anat_dir = ../SS_Warper/{$subj}
set epi_dir = ../Subjects/{$subj}/original_data
set stim_dir = ../StimTimes/VGS
# run afni_proc.py to create a single subject processing script
afni_proc.py -subj_id {$subj} \
-script proc.{$subj} -scr_overwrite \
-blocks tshift align tlrc volreg blur mask scale regress \
-copy_anat $anat_dir/anatSS.{$subj}.nii \
-dsets \
$epi_dir/pb00.{$subj}.fMRI.VGS.1.nii \
$epi_dir/pb00.{$subj}.fMRI.VGS.2.nii \
-tcat_remove_first_trs 0 \
-align_opts_aea -ginormous_move -deoblique on -cost lpc+ZZ \
-volreg_align_to MIN_OUTLIER \
-volreg_opts_vr -heptic \
-volreg_align_e2a \
-volreg_tlrc_warp -tlrc_base /usr/bin/abin/MNI152_2009_template_SSW.nii.gz \
-tlrc_NL_warp \
-tlrc_NL_warped_dsets \
$anat_dir/anatQQ.{$subj}.nii \
$anat_dir/anatQQ.{$subj}.aff12.1D \
$anat_dir/anatQQ.{$subj}_WARP.nii \
-blur_size 4.0 \
#FAILURE AT REGRESS STIM TIMES BLOCK BELOW - "Command not found"
-regress_stim_times \
$stim_dir/stimes.VGS_2runs_30trials_01_cue.1D \
$stim_dir/stimes.VGS_2runs_30trials_02_task.1D \
$stim_dir/stimes.VGS_2runs_30trials_03_catch.1D \
#FAILURE AT REGRESS STIM LABELS BLOCK BELOW - "Command not found"
-regress_stim_labels \
Cue Task Catch \
-regress_basis 'GAM' \
-regress_censor_motion 0.5 \
-regress_compute_fitts \
-regress_apply_mot_types demean deriv \
-regress_motion_per_run \
-regress_3dD_stop \
-regress_reml_exec \
-regress_make_ideal_sum sum_ideal.1D \
-regress_est_blur_epits \
-regress_est_blur_errts \
-regress_run_clustsim yes \
-html_review_style basic
end
Scripts for many shells, including tcsh and bash, can use the backslash \ as a line continuation character. That tells the shell the command is split over multiple lines, and we very often place these backslashes at the ends of lines to make a long command more readable. A common error is to put spaces after the backslash; the following line is then treated as a separate command instead of as options for the previous one. Use the backslashes to format your script, and check for spaces after each backslash. We include the program file_tool to check for these kinds of problems.
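Purely as an illustration of that failure mode, here is a toy tcsh example (not from your script) showing both cases:

# works: the backslash is the very last character on the line,
# so both lines are read as one command
echo one \
     two

# fails: there is a (hard to see) space after the backslash on the
# first line, so the command ends there and "two" is then run as
# its own command, giving "two: Command not found."
echo one \ 
     two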
Indeed, as Daniel notes, there are spaces after one of your "continuation of line" backslash characters \.
The first thing I did was edit your post to make use of the fancy backtick functionality, to put code blocks in code formatting. Please see here about that.
Then I highlighted your AP code text, which shows where the spaces are. Look at line 45 of the code block, and you will see spaces to the right of the \.
This is a case of the error message not really telling you exactly what is happening, which is annoying. But seeing command not found in a situation like this is not uncommon (it happens to me, too!). Note that the non-found command involved regress_stim_labels, among others---that is the first option after the offending spaces. Having a space after the \ breaks the continued reading of the command as a single line, so the shell thinks that line 46 starts a new command, and then it finds it doesn't recognize that command.
Rick's program file_tool can help point these things out in scripts, as well as subtle non-ASCII characters (like a dash - or quote ' that looks like ASCII but actually is not). For example, running:
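A typical invocation (the script name here is just a placeholder; point it at your own proc script) would be:

file_tool -show_bad_all -infiles my_proc_script.tcsh

which reports non-ASCII characters and bad backslash usage (such as whitespace after a trailing \) in the named file.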
Thank you both so much for the replies and for pointing me to the file_tool documentation. I have fixed the offending spaces and run the file_tool checks, and I am still running into the same problem, with "0 bad characters" reported and a correct Unix file type. Any other ideas?
Hmm, OK. One thing I notice is that instead of the usual syntax for inserting a variable, ${variable_name}, your code has {$variable_name}. Is that intentional? I'm not sure whether that effectively does the same thing, but it would probably be better to avoid it.
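If you want to compare what the two forms produce, you could check in an interactive tcsh session (the value here is just made up):

set subj = sub01_1
echo ${subj}
echo {$subj}

In any case, ${subj} is the standard way to delimit the variable name, so that is the form I would use in the script.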
I have made that change in your code snippet from above, also removing the spaces to the right of the \ and vertically aligning the options, because I prefer reading it that way. So, if you try putting this back into your script, can you please copy+paste the full output?
Note that I reserve the right to ask some processing-related questions when this is done, if you don't mind:
- why turn on only the "basic" HTML review style? The "pythonic" one is so much nicer to read!
- I have never needed to use -deoblique on within -align_opts_aea for anatomical-EPI alignment---is that necessary?
... and maybe a couple of other small items. But first things first: getting the command to run.
Also, I didn't scroll down far enough in your initial post.
One issue to clean up in the script is the end (originally at line 62) that is hanging out there; it shouldn't be, because there is no accompanying foreach. That is a tcsh issue.
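For reference, end belongs at the close of a loop block in tcsh, as in this toy snippet:

foreach ss ( sub01 sub02 )
    echo $ss
end

A stray end with no matching foreach (or while) above it is an error.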
Make sure to remove those comment lines in the middle of your command that start with #FAILURE ... or put a line continuation character at the end of those lines too. file_tool won't detect that kind of error because there are no bad line continuation characters; the lines just don't continue automatically across comments.
Ah, indeed, Daniel's point probably resolves the second failure. In tcsh, this is ok:
afni_proc.py \
-subj_id ${subj} \
# comment, with a continuation-of-line character at the end \
-script proc.${subj} \
...
but this is not:
afni_proc.py \
-subj_id ${subj} \
# comment, with *no* continuation-of-line character at the end
-script proc.${subj} \
...
--pt