FATAL ERROR: has 1 auxiliary values per time point [nopt=16]


I am currently receiving the following error and I am unsure of the cause or how to fix it:
** FATAL ERROR: '-stim_times 1' file 'final/stimuli/0_Stay.txt' has 1 auxiliary values per time point [nopt=16]

The timing file in question (0_Stay.txt) looks like this (the same error occurs for all of my event files, regardless of participant or condition, including files that contain exactly two runs of data):

55.0*2.00257205963 97.5*1.41961884499 110.0*2.65587878227 127.5*2.08984804153 145.0*2.47264695168 155.0*1.80546998978 167.5*2.10321688652 190.0*1.86910700798 195.0*1.88350510597 207.5*1.32351708412 215.0*1.520327489 235.0*1.68914914131 290.0*1.47039198875 305.0*2.05382609367
30.0*1.27251815796 65.0*1.65395784378 77.5*1.62255883217 100.0*2.17324495316 127.5*1.45616602898 137.5*1.29013109207 160.0*1.26953887939 172.5*1.28880596161 177.5*1.40289998055 210.0*1.63748216629 267.5*1.65583705902 285.0*1.520788908

The relevant part of the tcsh script that produces this error, and in particular the 3dDeconvolve block, is the following:

# execute via :
#   tcsh -xef new_glm_draft.sh |& tee output.proc.final

# =========================== auto block: setup ============================
# script setup

# take note of the AFNI version
afni -ver

# check that the current AFNI version is recent enough
afni_history -check_date 28 Oct 2015
if ( $status ) then
    echo "** this script requires newer AFNI binaries (than 28 Oct 2015)"
    echo "   (consider: @update.afni.binaries -defaults)"
    exit
endif

# the user may specify a single subject to run with
if ( $#argv > 0 ) then
    set subj = $argv[1]
else
    set subj = vc$VCnumber
endif

# label of GLM - and directory name
set glm_dir = 'final'

# assign output directory name
set subj_dir = $subj.results

# copy glm.ALLS file into directory
cp glm.$glm_dir GLMS/glm.$glm_dir

# verify that the results directory does not yet exist
if ( -d $subj_dir/$glm_dir ) then
    echo output dir "$glm_dir" already exists
    exit
endif

# set list of runs
set runs = (`count -digits 2 1 6`)

# create results and stimuli directories
mkdir $subj_dir/$glm_dir
mkdir $subj_dir/$glm_dir/stimuli

# copy stim files into stimulus directory

# copy anatomy to results dir
3dcopy $default_dir/NIfTI/vc$VCnumber/Mpragevc$VCnumber.nii

# enter the results directory (can begin processing data)
cd $subj_dir

set rundata = "pb04.$subj.r*.scale+tlrc.HEAD"
cat dfile.r*.1D > $glm_dir/dfile_rall.1D   # make a single file of registration params

# catenate outlier counts into a single time series
cat outcount.r*.1D > outcount_rall.1D

# create 'full_mask' dataset (union mask)
cp full_mask* $glm_dir

# create an anat_final dataset, aligned with stats
3dcopy Mpragevc$VCnumber+orig anat_final.$subj+tlrc.HEAD

# ================================ regress =================================

cd $glm_dir

# compute de-meaned motion parameters (for use in regression)
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
           -demean -write motion_demean.1D

# compute motion parameter derivatives (for use in regression)
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
           -derivative -demean -write motion_deriv.1D

# create censor file motion_${subj}_censor.1D, for censoring motion
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
           -show_censor_count -censor_prev_TR \
           -censor_motion 0.3 motion_${subj}

# note TRs that were not censored
set ktrs = `1d_tool.py -infile motion_${subj}_censor.1D \
                       -show_trs_uncensored encoded`

cd ..

# run the regression analysis
3dDeconvolve -input pb04.$subj.r*.scale+tlrc.HEAD \
    -censor $glm_dir/motion_${subj}_censor.1D \
    -polort 3 \
    -num_stimts 8 \
    -stim_times 1 $glm_dir/stimuli/0_Stay.txt 'GAM' \
    -stim_label 1 Stay \
    -stim_times 2 $glm_dir/stimuli/1_Switch.txt 'GAM' \
    -stim_label 2 Switch \
    -stim_file 3 $glm_dir/motion_demean.1D'[0]' -stim_base 3 -stim_label 3 roll \
    -stim_file 4 $glm_dir/motion_demean.1D'[1]' -stim_base 4 -stim_label 4 pitch \
    -stim_file 5 $glm_dir/motion_demean.1D'[2]' -stim_base 5 -stim_label 5 yaw \
    -stim_file 6 $glm_dir/motion_demean.1D'[3]' -stim_base 6 -stim_label 6 dS \
    -stim_file 7 $glm_dir/motion_demean.1D'[4]' -stim_base 7 -stim_label 7 dL \
    -stim_file 8 $glm_dir/motion_demean.1D'[5]' -stim_base 8 -stim_label 8 dP \
    -gltsym 'SYM: Stay -Switch' \
    -glt_label 1 S-S \
    -gltsym 'SYM: 0.5*Stay +0.5*Switch' \
    -glt_label 2 mean.SS \
    -fout -tout -x1D $glm_dir/X.xmat.1D -xjpeg X.jpg \
    -x1D_uncensored $glm_dir/X.nocensor.xmat.1D \
    -fitts $glm_dir/fitts.$subj \
    -errts $glm_dir/errts.${subj} \
    -bucket $glm_dir/stats.$subj

# if 3dDeconvolve fails, terminate the script
if ( $status != 0 ) then
    echo '---------------------------------------'
    echo '** 3dDeconvolve error, failing...'
    echo '   (consider the file 3dDeconvolve.err)'
    exit
endif

cd $glm_dir

# display any large pairwise correlations from the X-matrix
1d_tool.py -show_cormat_warnings -infile X.xmat.1D |& tee out.cormat_warn.txt

# create an all_runs dataset to match the fitts, errts, etc.
3dTcat -prefix all_runs.$subj $glm_dir/pb04.$subj.r*.scale+tlrc.HEAD


# create a temporal signal to noise ratio dataset
#    signal: if 'scale' block, mean should be 100
#    noise : compute standard deviation of errts
3dTstat -mean -prefix rm.signal.all all_runs.$subj+orig"[$ktrs]"
3dTstat -stdev -prefix rm.noise.all errts.${subj}+orig"[$ktrs]"
3dcalc -a rm.signal.all+orig \
       -b rm.noise.all+orig \
       -c full_mask.$subj+orig \
       -expr 'c*a/b' -prefix TSNR.$subj


# compute and store GCOR (global correlation average)
# (sum of squares of global mean of unit errts)
3dTnorm -norm2 -prefix rm.errts.unit errts.${subj}+orig
3dmaskave -quiet -mask full_mask.$subj+orig rm.errts.unit+orig \
          > gmean.errts.unit.1D
3dTstat -sos -prefix - gmean.errts.unit.1D\' > out.gcor.1D
echo "-- GCOR = `cat out.gcor.1D`"


# compute correlation volume
# (per voxel: average correlation across masked brain)
# (now just dot product with average unit time series)
3dcalc -a rm.errts.unit+orig -b gmean.errts.unit.1D -expr 'a*b' -prefix rm.DP
3dTstat -sum -prefix corr_brain rm.DP+orig

# create ideal files for fixed response stim types
1dcat X.nocensor.xmat.1D'[24]' > ideal_Stay.1D
1dcat X.nocensor.xmat.1D'[25]' > ideal_Switch.1D

# compute sum of non-baseline regressors from the X-matrix
# (use 1d_tool.py to get list of regressor columns)
set reg_cols = `1d_tool.py -infile X.nocensor.xmat.1D -show_indices_interest`
3dTstat -sum -prefix sum_ideal.1D X.nocensor.xmat.1D"[$reg_cols]"

# also, create a stimulus-only X-matrix, for easy review
1dcat X.nocensor.xmat.1D"[$reg_cols]" > X.stim.xmat.1D

It is important to note that this is part 2 of two scripts: part 1 runs all of the necessary preprocessing steps, and part 2 runs the GLMs.
Finally, I have read through some previous forums on this site and adjusted my timing files accordingly, so that each onset time and its value are married with a colon instead of an asterisk, and so that the asterisks that represented no data for a run are replaced with "-1:0" (as suggested here: https://afni.nimh.nih.gov/afni/community/board/read.php?1,84354,85947#msg-85947). Neither of these changes solved the Fatal Error, although it is possible that I implemented them incorrectly. An illustration of what the adjusted files look like is below.
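
For illustration, a married timing file in this adjusted format, for a condition with events only in the first of two runs, would look something like this (the onsets and values here are arbitrary examples, not my real data):

    55.0:2.003 97.5:1.420 110.0:2.656
    -1:0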

Thank you in advance for your help!


Timing with auxiliary values should be given via either
-stim_times_AM1 or -stim_times_AM2, and likely the latter, so that the
auxiliary values are actually used to modulate the response amplitude.
Plain -stim_times does not expect any married values, which is why it
reports this fatal error.
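
Applied to the script above, that would presumably mean changing only the two -stim_times options in the 3dDeconvolve call, roughly like this (a sketch only, assuming the married values are meant as amplitude modulators; everything else in the command stays the same):

    # replace the plain -stim_times options with amplitude-modulated versions
    -stim_times_AM2 1 $glm_dir/stimuli/0_Stay.txt 'GAM' \
    -stim_label 1 Stay \
    -stim_times_AM2 2 $glm_dir/stimuli/1_Switch.txt 'GAM' \
    -stim_label 2 Switch \

If the auxiliary values are not actually wanted in the model, the alternative would be to remove them from the timing files and keep -stim_times as is.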

  • rick

Great, that worked! Thank you so much for your help and your speedy reply.