3dSurf2Vol

I want to map the "ts_normalized" surface file to the AFNI volume domain (group level), but the brick datasets in the generated "lh/rh_ts_normalized_sb_tlrc" files don't have the same number of volumes as the original "ts_normalized_tlrc" file.
Why?

foreach infile ( ts_normalized )
foreach hemi ( lh rh )
    3dSurf2Vol                                                              \
        -spec        $spath/group/freesurfer/SUMA/subAvg_"$hemi"+tlrc.spec  \
        -surf_A      smoothwm                                               \
        -surf_B      pial                                                   \
        -sv          $spath/group/subAvg_SurfVol_at+tlrc.nii.gz             \
        -grid_parent "$spath"/"$sub"/"$infile"at+tlrc.nii.gz                \
        -sdata_1D    "$hemi"_"$infile".1D                                   \
        -datum       float                                                  \
        -map_func    max_abs                                                \
        -f_steps     15                                                     \
        -f_index     voxels                                                 \
        -f_p1_fr     -0.2 -f_pn_fr 0.4                                      \
        -prefix      ./"$hemi"_"$infile"_sb
end

You seem to be missing an “end” for one of your loops.
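In tcsh, each foreach needs its own matching end. A minimal sketch of the intended skeleton, with the 3dSurf2Vol call from above elided:

foreach infile ( ts_normalized )
    foreach hemi ( lh rh )
        # ... 3dSurf2Vol command as above ...
    end
end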

What are the dimensions of your 1D file, the one holding the data to map? This will tell you how many time points are in your surface dataset of choice. From the 3dSurf2Vol help file:


Note: With -sdata_1D,  the first column of the file 
      should contain a node's index, and following columns are
      that node's data. See the '-sdata_1D' option for more info.
      Option -sdata takes NIML or GIFTI input which contain
      node index information in their headers.
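For illustration, such a file might look like the following (made-up values; the first column is the node index and each remaining column is one time point, so the number of data columns should match the number of volumes you expect in the output):

   0   0.12   0.08  -0.03
   1  -0.05   0.11   0.02
   2   0.31   0.27   0.19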

To find out the dimensions, for example, try:


1d_tool.py -infile "$hemi"_ts_normalized.1D -show_rows_cols
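Or, to check both hemispheres in one go (assuming the same filenames as in your script):

foreach hemi ( lh rh )
    1d_tool.py -infile "$hemi"_ts_normalized.1D -show_rows_cols
end

The reported column count, minus the node-index column, is the number of time points that should end up as sub-bricks in the output.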

How many volumes are in your << original "ts_normalized_tlrc" file >>?
Check this with something like:


3dinfo -nv FILE
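For example, to compare the volume counts side by side (these dataset names are guesses based on your post; substitute your actual prefixes):

3dinfo -nv ts_normalized+tlrc lh_ts_normalized_sb+tlrc rh_ts_normalized_sb+tlrc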

And finally, how did you generate the 1D surface files? What is your afni_proc.py command, for example? Did you choose to remove TRs from the beginning of the dataset?
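If TRs were dropped at the start, your afni_proc.py command would typically contain an option like the following (the count of 2 here is just a placeholder):

-tcat_remove_first_trs 2

That would make the surface time series shorter than the original volumetric time series, which could explain a mismatch in volume counts.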

--pt