AFNI version info (afni -ver): Version AFNI_25.2.06 'Gordian I'
Hello there!
I am analyzing hybrid sparse-sampling data: I convolved my HRF with the actual timing of the experiment and then deleted the time points at which no data were acquired. The idea is to run the regression on convolutions that reflect the true experimental timing, while carefully controlling for nuisance regressors.
I am now trying to run afni_proc.py without any additional convolution, but I have not been successful, given that I use 3dDeconvolve to obtain my contrasts. I of course still need all the other preprocessing steps; the only difference is that I must not convolve twice, and I need to feed AFNI what it expects. I have created a .1D matrix of all regressors across all runs (columns: constant, poly1, poly2, condition1, condition2, and 4 nuisance-regressor columns; 1008 rows, reflecting 6 runs of 168 time points each). My EPIs, however, have 172 volumes per run, from which I am removing the first 4 TRs.
I am also trying to avoid having the polynomials regenerated, since I already include them in my initial convolution built on the real acquisition times.
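To double-check the bookkeeping, this is roughly how I verify the counts (just a sketch; all_regressors.1D is a stand-in name for my combined matrix file):
# each EPI run has 172 volumes before the first 4 TRs are dropped: 172 - 4 = 168 retained
3dinfo -nt SPARSE1+orig.HEAD
# the combined regressor matrix should then have 6 x 168 = 1008 rows
1d_tool.py -infile all_regressors.1D -show_rows_cols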
My question is: How do I properly set up afni_proc.py to use my pre-convolved regressors without additional convolution? I'm using -regress_no_stim_times but still encountering issues with the regressor setup.
In short: I need to perform the standard preprocessing but skip the HRF convolution.
Any guidance on the proper afni_proc.py syntax for this sparse-sampling scenario would be greatly appreciated! Please find my current script below; it invariably fails at pb03.
Thank you again for your useful reply. I have applied, I believe, what you suggested. However, I am now getting an error message I did not get before building the matrix with all these zeros. I am not sure where the error actually comes from, but it suggests using -GOFORIT (4 in this case, though I suspect it would end up needing to be more).
Here is the error:
*+ WARNING: GLT setup: inversion of C[1/(X'X)]C' fails; trying SVD.
[This happens when some regressor columns are all ]
[zero, or when the GLT has linearly-dependent rows]
[********* EXAMINE YOUR RESULTS WITH CARE ********]
++ Matrix setup time = 1.25 s
** ERROR: !! 3dDeconvolve: Can't run past 4 matrix warnings without '-GOFORIT 4'
** ERROR: !! Currently at -GOFORIT 0
** ERROR: !! See file 3dDeconvolve.err for all WARNING and ERROR messages !!
** ERROR: !! Be sure you understand what you are doing before using -GOFORIT !!
** ERROR: !! If in doubt, consult with someone or with the AFNI message board !!
** FATAL ERROR: !! 3dDeconvolve (regretfully) shuts itself down !!
** Program compile date = Jul 30 2025
if ( 1 ) then
exit 0
I added -regress_motion_per_run and hope it will add to the ortvec. I saw discussions of GOFORIT but have not fully understood how or where to put it in my script.
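From those discussions, my best guess (untested) is that -GOFORIT belongs with the other 3dDeconvolve options, i.e. inside -regress_opts_3dD, roughly like this:
-regress_opts_3dD \
    -jobs 2 \
    -GOFORIT 4 \
    ...
Is that the right place for it?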
Thank you very much! Your help is so appreciated.
Kind regards,
Aude.
Here is my script, which I believe I have adapted to your suggestions but which still gives the error:
##
# Copy individual regressor files from stim_dir
cp $stim_dir/const.1D .
cp $stim_dir/poly1.1D .
cp $stim_dir/poly2.1D .
cp $stim_dir/sing.1D .
cp $stim_dir/nosing.1D .
cp $stim_dir/mrsnd.1D .
cp $stim_dir/visualcue.1D .
cp $stim_dir/pitchcue.1D .
cp $stim_dir/metrocue.1D .
# Extract single-run polynomial templates from the first block of concatenated files (assuming repeated across runs)
1d_tool.py -infile const.1D -select_rows '0..167' -overwrite -write const_template.1D
1d_tool.py -infile poly1.1D -select_rows '0..167' -overwrite -write poly1_template.1D
1d_tool.py -infile poly2.1D -select_rows '0..167' -overwrite -write poly2_template.1D
# Create padded per-run polynomials using the single-run templates (assumes they are similar across runs)
foreach deg ( const poly1 poly2 )
    foreach run ( `seq 1 6` )
        1d_tool.py -infile ${deg}_template.1D -pad_into_many_runs ${run} 6 -set_run_lengths 168 168 168 168 168 168 -overwrite -write ${deg}_r${run}.1D
    end
end
# Combine into one multi-column file
1dcat const_r*.1D poly1_r*.1D poly2_r*.1D > my_polorts.1D
# Combine nuisance regressors into one file
1dcat mrsnd.1D visualcue.1D pitchcue.1D metrocue.1D > nuisance.1D
# Combine polys and nuisances into one ortvec file
1dcat my_polorts.1D nuisance.1D > all_ort.1D
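# (sanity-check sketch, not part of the original pipeline: confirm the ortvec has
#  1008 rows, i.e. 6 runs x 168 post-tcat time points, before handing it to regression)
1d_tool.py -infile all_ort.1D -show_rows_cols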
# Run afni_proc.py
afni_proc.py -subj_id ${subj}_${sess}.sparse_GLM \
-script proc.sparse_GLM.${subj}_${sess} -scr_overwrite \
-blocks align tlrc volreg blur mask scale regress \
-copy_anat $anat_dir/brainmask.nii.gz -anat_has_skull no \
-anat_follower_ROI anat_w_skull anat $anat_dir/FS_${subj}_Combined_SurfVol.nii \
-anat_follower_ROI aaseg anat $anat_dir/aparc+aseg.nii \
-anat_follower_ROI aeseg epi $anat_dir/aparc+aseg.nii \
-tlrc_NL_warp \
-tlrc_base /Users/name/abin/MNI152_2009_template_SSW.nii.gz \
-dsets \
$epi_dir/SPARSE1+orig.HEAD \
$epi_dir/SPARSE2+orig.HEAD \
$epi_dir/SPARSE3+orig.HEAD \
$epi_dir/SPARSE4+orig.HEAD \
$epi_dir/SPARSE5+orig.HEAD \
$epi_dir/SPARSE6+orig.HEAD \
-tcat_remove_first_trs 4 \
-volreg_align_to MIN_OUTLIER \
-volreg_align_e2a \
-volreg_tlrc_warp \
-align_opts_aea -cost lpc+ZZ -giant_move \
-mask_epi_anat yes \
-blur_size 5 \
-outlier_polort 0 \
-regress_polort -1 \
-regress_motion_per_run \
-regress_stim_files \
sing.1D nosing.1D \
-regress_stim_labels \
sing nosing \
-regress_stim_types file file \
-regress_extra_ortvec all_ort.1D \
-regress_extra_ortvec_labels ort \
-regress_opts_3dD \
-jobs 2 \
-gltsym 'SYM: sing' \
-glt_label 1 sing \
-gltsym 'SYM: nosing' \
-glt_label 2 nosing \
-gltsym 'SYM: sing -nosing' \
-glt_label 3 sing_vs_nosing \
-regress_reml_exec \
-regress_censor_motion 1 \
-regress_censor_outliers 0.1 \
-regress_compute_fitts \
-regress_make_ideal_sum sum_ideal.1D \
-regress_est_blur_epits \
-regress_est_blur_errts \
-execute
# If successful
if ( $?tcsh ) then
    exit 0
else
    return 0
endif
Nothing looks bad to me here. You might plot the final file to be sure of how it looks though:
1dplot -sepscl nuisance.1D
Are there slightly earlier warnings in the 3dDeconvolve output? It might say something more specific earlier.
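The full set of warnings should also be in the 3dDeconvolve.err file that the error message points to, under the results directory, e.g.:
cat 3dDeconvolve.err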
It might not help, but you could keep the noise and polynomial regressors separate, if that would seem nicer. Also, you can put the columns together at any time; 1d_tool.py can do the same operations with multi-column inputs. For example:
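Here is a minimal sketch of what I mean, using the file names from your script (the output names are just made up):
# apply the same row selection directly to the multi-column nuisance file
1d_tool.py -infile nuisance.1D -select_rows '0..167' -write nuisance_template.1D
# or pad a whole multi-column file into the run structure in one call
1d_tool.py -infile nuisance_template.1D -pad_into_many_runs 1 6 \
           -set_run_lengths 168 168 168 168 168 168 -write nuisance_r1_padded.1D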
Anyway, let me know what else 3dDeconvolve may be saying. Or even send me the files.
-rick