3dDeconvolve: turning bucket sub-bricks into volumes in a 4D dataset

Hi all,

I ran 3dDeconvolve with the arguments listed in the script below and saved the bucket to a NIfTI file. When I view the verbose header information for the file containing the bucket output, I can confirm that the file contains all of the expected sub-bricks with beta weights for each regressor specified in 3dDeconvolve. However, when I view the bucket output in a program like FSLview, only one volume is present.

Is there any way to write each sub-brick from this bucket dataset onto the time dimension of a 4D dataset? To clarify, this would mean that each sub-brick would be represented in a 3D volume of a 4D dataset.

I am hoping to index through the sub-bricks of the bucket dataset to perform a beta-series correlation analysis, and it will be easier to carry out this analysis in MATLAB if all of the sub-bricks are listed along the 4th dimension of a single dataset.

Thanks in advance!

Bill

3dDeconvolve -input
…/epi_run1_fnirt_6run_melodic.ica/denoised_data.nii.gz
…/epi_run2_fnirt_6run_melodic.ica/denoised_data.nii.gz
…/epi_run3_fnirt_6run_melodic.ica/denoised_data.nii.gz
…/epi_run4_fnirt_6run_melodic.ica/denoised_data.nii.gz
…/epi_run5_fnirt_6run_melodic.ica/denoised_data.nii.gz
…/epi_run6_fnirt_6run_melodic.ica/denoised_data.nii.gz
-jobs 4
-polort 2 -num_stimts 4
-local_times
-stim_times_IM 1 /deep/heller/work/savorema/1DDirs/${i}_1Dfiles/AFNI_win_local_times.txt 'SPMG3'
-stim_label 1 Win
-local_times
-stim_times 2 /deep/heller/work/savorema/1DDirs/${i}_1Dfiles/AFNI_loss_local_times.txt 'SPMG3'
-stim_label 2 Loss
-stim_file 3 /deep/heller/work/savorema/anatDirs/${i}_anat/WM_all.txt
-stim_label 3 WM_mask
-stim_file 4 /deep/heller/work/savorema/anatDirs/${i}_anat/CSF_all.txt
-stim_label 4 CSF_mask
-xsave -xjpeg /deep/heller/work/savorema/${i}/FinancialWin/win_glm/win_glm
-errts /deep/heller/work/savorema/${i}/FinancialWin/win_glm/residuals_win_IM.nii
-bout
-bucket /deep/heller/work/savorema/${i}/FinancialWin/win_glm/win_glm.nii

Hi Bill,

There seem to be some reasonable options here. To split
a dataset into a collection of single-volume datasets,
try 3dTsplit4D, e.g.

3dTsplit4D -prefix beta.nii bucket.nii

If you want only volumes 2 through 17 with a step of 3,
try something like:

3dTsplit4D -prefix beta.nii bucket.nii'[2..17(3)]'

Or perhaps you simply want another 4D dataset, but with
only the volumes of interest:

3dbucket -prefix betas.nii bucket.nii'[2..17(3)]'

How do those options seem?

- rick

Hi,

I believe I have the same question as Bill, but I don’t think either of the above options addresses it.

After running 3dDeconvolve with a single “-stim_times_IM”, I have a bucket dataset with beta weights for each regressor (including each separate event of the individually modulated stimulus). Like Bill, I’d like to end up with a single 3D+time dataset with each beta weight for the individually modulated stimulus represented as one point along the time dimension.

However, I can’t get a time dimension when I use 3dbucket to select the volumes of interest as described above. Although the output of 3dbucket contains the sub-bricks that I want, it is another bucket dataset, and other software reads it as a 5D dataset rather than a 4D dataset (with dimensions [number of R-to-L voxels, number of A-to-P voxels, number of slices, 1 time point, number of beta weights]).

Is there a simple way to get a 3D+time dataset instead?

Thanks!

Dillon

Probably easiest to use 3dTcat in place of 3dbucket here.
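For example, keeping the same placeholder names from the earlier commands (bucket.nii and the [2..17(3)] selector are just illustrations, not your actual file or sub-brick indices), that would look something like:

3dTcat -prefix betas.nii bucket.nii'[2..17(3)]'

Unlike 3dbucket, 3dTcat writes its output as a 3D+time dataset, so the selected betas should end up along the 4th (time) dimension, which is what MATLAB and FSL expect.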