Hi experts,
I am trying to run surface-based preprocessing on fMRI data. The script failed when reading the input file surf.smooth.params.1D, and when I checked the file it turned out to be empty. I found a related topic on the board that suggested removing some SurfSmooth options (-blurmaster and -detrend_master, https://afni.nimh.nih.gov/afni/community/board/read.php?1,146445,146459#msg-146459), but that did not work.
I have tried to fix it myself but cannot find any more related information. Could you please help me with it?
If you are doing a surface-based analysis, then I don’t think you want the “tlrc” block (nor the accompanying “-tlrc_* …” options).
FreeSurfer creates a mesh from your anatomical volume, and when you run @SUMA_Make_Spec_FS on it, SUMA generates standard-mesh versions of the surfaces (at different densities, i.e., surface resolutions), so that your data become “standardized” when projected onto the surface. That is, every node with index X is supposed to correspond to the same physiological feature in every subject (assuming the surface alignments/etc. went well).
So, if you remove that block/option, does that solve your issue?
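For reference, here is a sketch of what a surface-based afni_proc.py call might look like without any “tlrc” block; the subject ID, dataset names, and paths below are placeholders, not taken from your script:

```shell
# Sketch only: a surface-based proc command with no "tlrc" block.
# subj01, anat.nii, epi_r01+orig, and the SUMA/ paths are placeholders.
afni_proc.py                                              \
    -subj_id subj01                                       \
    -blocks tshift align volreg surf blur scale regress   \
    -copy_anat anat.nii                                   \
    -dsets epi_r01+orig.HEAD                              \
    -surf_anat SUMA/subj01_SurfVol.nii                    \
    -surf_spec SUMA/subj01_?h.spec                        \
    -blur_size 6
```

The point is just that the block list contains “surf” and no “tlrc”, and no -tlrc_* options appear.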
Another note: when you ran @SUMA_Make_Spec_FS, did you include one of the following options: -GNIFTI, -GIFTI, -NIFTI?
You should be doing so.
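For example, something like the following (the subject ID and FreeSurfer path are placeholders for your own):

```shell
# Sketch: regenerate the SUMA surfaces in GIFTI/NIfTI format.
# subj01 and $SUBJECTS_DIR/subj01 are placeholders.
@SUMA_Make_Spec_FS -NIFTI -sid subj01 -fspath $SUBJECTS_DIR/subj01
```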
Thanks, Taylor!
It did not throw any errors after I removed the tlrc-related commands, but it ended with “** surf name 6, ‘pial’: multiple matches: some paths”, and there are many files starting with “rm” in the results folder. Are these standard output files? I wonder whether they are temporary files that should have been deleted when the whole process completed successfully.
As for @SUMA_Make_Spec_FS, I included the -NIFTI flag, based on the SUMA tutorial on the AFNI website.
Sorry, I was wrong. When I ran the script with “tcsh -xef proc.av |& tee output.proc.av”, the errors did not show up. But when I run it with just “tcsh proc.av”, the errors are still there:
[size=large]Errors:[/size]
Failed to read dset pb04.18001.lh.r01.surf.niml.dset
** ERROR: mri_read_ascii: can’t read any valid data from file surf.smooth.params.1D
** FATAL ERROR: Can’t read input file ‘surf.smooth.params.1D’
** Program compile date = Sep 24 2019
params: Subscript out of range.
Thanks for the output. The “** surf name 6, ‘pial’: multiple matches: some paths” error is probably the basis of the problem. Maybe FreeSurfer is generating multiple ‘pial’ surfaces now, and we have to deal with that.
Until we do, you might need to copy the spec file and remove one of those pial surfaces before giving the new file to afni_proc.py. I am not sure about the smoothed pial surface; it might be appropriate, but we would have to review such data before being more sure.
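One way to do that copy-and-edit step is with a small script. This is only a sketch against a made-up, simplified spec file (real SUMA spec files contain more fields per surface block); it drops whichever pial entry you decide not to keep:

```shell
# Hypothetical, simplified spec file for illustration only;
# a real SUMA spec file has more fields in each NewSurface block.
cat > subj_lh.spec <<'EOF'
NewSurface
    FreeSurferSurface = lh.pial.gii
NewSurface
    FreeSurferSurface = lh.pial-outer-smoothed.gii
NewSurface
    FreeSurferSurface = lh.smoothwm.gii
EOF

# Copy every surface block except the one naming pial-outer-smoothed.
# Blocks are delimited by lines beginning with "NewSurface".
awk '/^NewSurface/ { if (buf != "" && buf !~ /pial-outer-smoothed/) printf "%s", buf; buf = "" }
     { buf = buf $0 "\n" }
     END { if (buf !~ /pial-outer-smoothed/) printf "%s", buf }' \
    subj_lh.spec > subj_lh.edited.spec
```

The edited copy (subj_lh.edited.spec here) is what you would then pass to afni_proc.py, leaving the original spec file untouched.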
Thanks, Rick!
I tried keeping either “pial-outer-smoothed” or “pial.nii” in the .spec file. The former version still throws 8 errors like “invalid surfaces, different # nodes ()”, while the plain ‘pial.nii’ version works well.
I checked the recon-all.log file, and the FreeSurfer files were generated by the standard “recon-all” command. The FreeSurfer version is “freesurfer-Linux-centos6_x86_64-stable-pub-v6.0.0”.
Did FreeSurfer actually generate pial-outer-smoothed.gii? We don’t have that file. The fact that you see different node numbers suggests the file is coming from somewhere else.
If you want to compare across subjects, it is generally preferable to use the standard mesh surfaces, like std.141, rather than the original surfaces. It would be good to specify those spec files in the afni_proc.py command.
Note that you can also be more explicit about the surface applied in the afni_proc.py command. That might be easier. Consider something like:
-surf_A pial.gii
That way, if you happen to still have the pial-outer-smoothed surface, it will be skipped. It would not be necessary to alter the spec files.
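To show where that option sits, here is a minimal sketch of a surface-based afni_proc.py fragment; everything other than -surf_A pial.gii is a placeholder, not from this thread:

```shell
# Hypothetical fragment: only "-surf_A pial.gii" comes from the advice above;
# subj01, the EPI dataset, and the SUMA/ paths are placeholders.
afni_proc.py                               \
    -subj_id subj01                        \
    -blocks surf blur scale regress        \
    -dsets epi_r01+orig.HEAD               \
    -surf_anat SUMA/subj01_SurfVol.nii     \
    -surf_spec SUMA/subj01_lh.spec         \
    -surf_A pial.gii
```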
I copied the FS files from someone else; he probably did some other processing on them.
Good to know there is an easier way to do it!
Thanks a lot for the help and suggestions!
Best,
Xin