First run missing from design matrix after 3dDeconvolve

Hi experts,
I am running 3dDeconvolve on data from a task with 6 conditions. The data were collected with a block design consisting of 5 runs, with 1 block of each condition per run. Each run is 160 s long, so the total scan time is 800 s. When I look at the design matrix files generated by 3dDeconvolve (X.xmat.1D, X.nocensor.xmat.1D, X.jpg), the blocks from run 1 are not being modeled (see attached files; the first block should occur at 4 s). The time axis in X.nocensor.xmat.1D spans the 800 s I would expect, but only 4 events are modeled for each condition when there should be 5, and the timing of those events matches the blocks in runs 2-5. This happens for all subjects. Please let me know if you have any suggestions. Below are my 3dDeconvolve command and an example stim_times file.

3dDeconvolve -input parDir/func/*foodcue*{temp}blur6-scale.nii
-censor parDir/func/{subID}foodcue-allruns_censor${TRcen}.tsv
-mask parDir/func/foodcue_full_mask*{temp}.nii
-polort 2
-num_stimts 21
-stim_times 1 onsetDir/{subID}_HighLarge*.txt 'BLOCK(18,1)'
-stim_label 1 HighLarge
-stim_times 2 onsetDir/{subID}_HighSmall*.txt 'BLOCK(18,1)'
-stim_label 2 HighSmall
-stim_times 3 onsetDir/{subID}_LowLarge*.txt 'BLOCK(18,1)'
-stim_label 3 LowLarge
-stim_times 4 onsetDir/{subID}_LowSmall*.txt 'BLOCK(18,1)'
-stim_label 4 LowSmall
-stim_times 5 onsetDir/{subID}_OfficeLarge*.txt 'BLOCK(18,1)'
-stim_label 5 OfficeLarge
-stim_times 6 onsetDir/{subID}_OfficeSmall*.txt 'BLOCK(18,1)'
-stim_label 6 OfficeSmall
-stim_file 7 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[0]' -stim_base 7 -stim_label 7 trans_x_01
-stim_file 8 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[1]' -stim_base 8 -stim_label 8 trans_y_01
-stim_file 9 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[2]' -stim_base 9 -stim_label 9 trans_z_01
-stim_file 10 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[3]' -stim_base 10 -stim_label 10 rot_x_01
-stim_file 11 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[4]' -stim_base 11 -stim_label 11 rot_y_01
-stim_file 12 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[5]' -stim_base 12 -stim_label 12 rot_z_01
-stim_file 13 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[6]' -stim_base 13 -stim_label 13 csf
-stim_file 14 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[7]' -stim_base 14 -stim_label 14 white_matter
-stim_file 15 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[8]' -stim_base 15 -stim_label 15 glob_signal
-stim_file 16 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[9]' -stim_base 16 -stim_label 16 trans_x_02
-stim_file 17 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[10]' -stim_base 17 -stim_label 17 trans_y_02
-stim_file 18 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[11]' -stim_base 18 -stim_label 18 trans_z_02
-stim_file 19 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[12]' -stim_base 19 -stim_label 19 rot_x_02
-stim_file 20 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[13]' -stim_base 20 -stim_label 20 rot_y_02
-stim_file 21 parDir/func/{subID}_foodcue-allruns_confounds-header.tsv'[14]' -stim_base 21 -stim_label 21 rot_z_02
-gltsym 'SYM: HighLarge -HighSmall' -glt_label 1 HighLarge-Small
-gltsym 'SYM: LowLarge -LowSmall' -glt_label 2 LowLarge-Small
-gltsym 'SYM: HighLarge -LowLarge' -glt_label 3 LargeHigh-Low
-gltsym 'SYM: HighSmall -LowSmall' -glt_label 4 SmallHigh-Low
-gltsym 'SYM: HighLarge LowLarge -HighSmall -LowSmall' -glt_label 5 Large-Small_allED
-gltsym 'SYM: HighLarge HighSmall -LowLarge -LowSmall' -glt_label 6 High-Low_allPS
-gltsym 'SYM: 0.5*HighLarge 0.5*HighSmall 0.5*LowLarge 0.5*LowSmall -OfficeLarge -OfficeSmall' -glt_label 7 Food-Office
-jobs 8
-fout -tout -x1D X.xmat.1D -xjpeg X.jpg
-x1D_uncensored X.nocensor.xmat.1D
-fitts fitts.$subID
-errts errts.$subID
-bucket stats.$subID

Example stim_times file, onsetDir/{subID}_HighLarge*.txt (the separator between each onset time and the * is always a tab):
4.0 *
108.0 *
126.0 *
108.0 *
30.0 *

That is strange. Would you please email me one or more of those timing files? Click on my name.

  • rick

Thanks for sending the files. Indeed, they have non-printable characters in them, meaning they are not in pure ASCII text format.

You can use file_tool to fix them, for example. But it would be good to figure out an editing method that does not introduce such characters in the first place; they can cause trouble in scripts, too.

For now, you can test a file using something like:

file_tool -test -infiles TIMING_FILE

Add something like -prefix to fix it. For example, using tcsh, not bash:

# write fixed copies of each timing file under fixed.files/
mkdir fixed.files
foreach file ( sub*.txt )
   file_tool -infiles $file -test -prefix fixed.files/$file
end
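
If you want a quick sanity check outside of file_tool, a rough Python sketch like this (the sub*.txt glob pattern is just a placeholder for your timing files) will also report any bytes that are not plain printable ASCII:

# rough check: report bytes outside printable ASCII (tab, newline, CR allowed)
from glob import glob

for fname in sorted(glob("sub*.txt")):
    with open(fname, "rb") as f:
        data = f.read()
    bad = [(i, b) for i, b in enumerate(data)
           if b > 126 or (b < 32 and b not in (9, 10, 13))]
    if bad:
        print(f"{fname}: {len(bad)} suspicious byte(s), first at offset {bad[0][0]}")
    else:
        print(f"{fname}: plain ASCII")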

Does that seem reasonable?

  • rick

Thank you! I was able to specify ASCII encoding when generating the files with Python. That got rid of the bad characters, and the matrices generated by 3dDeconvolve look correct now.

This was also the solution to my issue with getting 3dMVM to read my separate -dataTable file.
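
In case it helps anyone else, the change was essentially just passing an explicit ASCII encoding when writing the text files from Python. A rough sketch of the idea (the subject ID in the file name is a placeholder; the onsets are the ones from the example file above):

# Rough sketch: write a stim_times file as plain ASCII.
# One line per run: the local onset time, a tab, then '*'.
# encoding="ascii" makes Python raise an error if a non-ASCII character sneaks in.
onsets = [4.0, 108.0, 126.0, 108.0, 30.0]   # one block per run, from the example above

with open("sub-001_HighLarge.txt", "w", encoding="ascii", newline="\n") as f:
    for t in onsets:
        f.write(f"{t:.1f}\t*\n")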

Thanks again :-D

That’s great, thanks!

  • rick