Surface-based analysis of the NMT template

Hi AFNI!
Hello! As a new user, I wanted to express my appreciation for the NMT template and the provided MACAQUE_DEMO, which have been incredibly helpful in my research. However, being new to the system, I might have made some mistakes along the way.

Currently, I am interested in conducting surface-based analysis, and I have made some modifications to the 'do_21_ap_all.tcsh' script in the MACAQUE_DEMO.

    -surf_anat  ...../NMT_v2.1_sym/NMT_v2.1_sym.nii.gz               \
    -surf_spec  ......./NMT_v2.1_sym_surfaces/NMT_v2.1_sym_?h.spec   \
    -surf_A white_surface                                            \
    -surf_B gray_surface                                             \

However, upon reviewing the "face vs. object" results, I have noticed some discrepancies. As a result, I have a couple of questions and would greatly appreciate your guidance and expertise.

  1. After aligning the monkey images to the NMT template using '@animal_warper', if I wish to proceed with NMT-surface-based analysis, should I use the NMT template or the native T1 image for 'surf_anat'?
  2. In the 'NMT_v2.1_sym_rh.spec' file, I noticed that there are no 'smoothwm' and 'pial' surface files included. Should I utilize 'white_surface' and 'gray_surface' instead?

Thank you very much! Please correct me if I have made any mistakes.

-Yipeng

Hi, Yipeng-

Typically, surface-based analysis and warping-to-template are not done in the same analysis. That is, the 'surf' block and 'tlrc' blocks don't appear together, as you have here.

This is because---at least in human datasets---one generates surfaces from each subject's anatomical (like using FreeSurfer), and these sit geometrically on the subject anatomical; we create standardized surface mesh versions of these with @SUMA_Make_Spec_FS. In that case, one uses the 'surf' block to map the data from the subject's anatomical to that tailored surface, and then processing continues there (e.g., blurring, scaling, regression). There is no mapping to standard volumetric reference space with 'tlrc'.
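In the human case, that conversion step looks roughly like this (the subject ID and FreeSurfer output path below are just placeholders for illustration):

```shell
# Hypothetical sketch: convert a FreeSurfer recon-all output directory into
# standardized-mesh surfaces plus SUMA spec files (paths are placeholders).
@SUMA_Make_Spec_FS                                \
    -sid    sub-001                               \
    -fspath /data/freesurfer/sub-001              \
    -NIFTI
```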

I think you are including both here because for the macaque, the only surface around is one defined in standard space---not on each subject's own anatomical. The command really isn't built for that kind of assumption, and probably a number of checks would have to be run to see how viable that is. I am frankly surprised that it ran through to completion!

So, it might not be impossible to do that, but it is probably not something that is currently advisable. Are you able to either run a volumetric analysis, or to generate surface meshes on each macaque's own anatomical?
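For the volumetric route, the template alignment step might look something like this (file names here are placeholders; check the program help for the full set of options):

```shell
# Hypothetical sketch: nonlinearly align one subject's anatomical to the
# NMT template with @animal_warper (file names are placeholders).
@animal_warper                                    \
    -input  sub-001_T1w.nii.gz                    \
    -base   NMT_v2.1_sym_SS.nii.gz                \
    -outdir aw_sub-001
```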

--pt

Hi Taylor,
Thank you so much for your kind reply! I did include both the 'surf' and 'tlrc' blocks in this analysis because I needed to perform surface-based analysis in common space (with surfaces generated from NMT v2.1). However, generating individual surfaces for the macaque brain using recon-all has been challenging for me. In light of this, I explored an alternative approach using the command below, and fortunately, it yielded some results:
    -surf_anat           …..NMT_v2.1_sym_SS.nii.gz   \
    -surf_spec           …..NMT_v2.1_sym_?h.spec     \
    -surf_anat_aligned   yes                         \
    -surf_anat_has_skull no                          \
    -surf_A              white_surface               \
    -surf_B              gray_surface                \
Moving forward, I will try to generate macaque surfaces following the VisionandCognition/NHP-Freesurfer repository on GitHub (a procedure to create monkey brain surfaces and project results onto them). I will also begin by exploring volume analysis as my next step. Once again, I genuinely appreciate your guidance and encouragement. If you have any further suggestions or ideas, I would be more than happy to hear them.

Thank you,
Yipeng

Hi Yipeng, I'll reply here instead of to your email. I have created and used the NHP-Freesurfer workflow to deal with the recon-all issues. It still requires a bit of manual processing, though, and I haven't used it in a while, so I hope everything still works as described. I have not done surface-based group analysis. Instead, I've used the individual surfaces to display individual results using my adaptation of PyCortex. You can find it on GitHub as NHP-pycortex. There are a few small changes compared to the original that make it work with these macaque brains/surfaces.

Hi Chris!
Thank you for the suggestion. If surface-based group analysis proves to be challenging, I will focus on volume-based analysis and try to project the results onto the cortex using pycortex.
Thank you once again for your support and guidance!
Best regards,
Yipeng

Those datasets were generated with a surface processing package named CIVET. That is supposed to do something similar to FreeSurfer, but I haven't tried it myself. You can find out more about that here:

If someone else has experience with that, then they can comment.

As Paul pointed out, the surfaces provided are in the NMT standard space, so you can use them to present group results from a volumetric analysis.

Another way, which might be similar to what Chris suggests with PyCortex, is to generate surfaces for each macaque in their native space. That could be done with IsoSurface, but those surfaces won't have nodewise correspondence across subjects. An ROI approach would still work, though: transform the CHARM atlases to native space (@animal_warper) and then project them onto the surface (3dVol2Surf).
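A rough sketch of that ROI approach might look like the following; all file names are placeholders, and the exact flags should be double-checked against the AFNI program helps:

```shell
# Hypothetical sketch of a native-space ROI-to-surface projection.
# All file names are placeholders.

# 1) Build a per-subject surface from a binarized white-matter mask.
IsoSurface                                        \
    -input   sub-001_WM_mask.nii.gz               \
    -isoval  1                                    \
    -o_gii   sub-001_wm

# 2) Make a minimal spec file for that lone surface.
quickspec -tn GII sub-001_wm.gii

# 3) Project the CHARM atlas (already warped to native space with
#    @animal_warper) onto the surface, sampling the ROI label at each node.
3dVol2Surf                                        \
    -spec        quick.spec                       \
    -surf_A      sub-001_wm.gii                   \
    -sv          sub-001_T1w.nii.gz               \
    -grid_parent CHARM_in_native.nii.gz           \
    -map_func    mask                             \
    -out_niml    sub-001_charm.niml.dset
```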

Hi Daniel:
Thank you for your suggestion! The README file in the NMT folder mentions the CIVET pipeline, but I'm afraid I can't run @SUMA_Make_Spec_FS directly on surfaces generated with that method. I think projecting results from a volumetric analysis onto the surface is a good solution. Thank you!
Best regards,
Yipeng

Somewhere there is a program called @SUMA_Make_Spec_CIVET, which should be the equivalent/analogue of @SUMA_Make_Spec_FS for converting the outputs to AFNI usability. It is referenced here:

But I guess it will take some tracking down, since (oddly) I am not sure where it lives on the interwebs.

--pt

This was the version I could find (with some minor tweaks for formatting), which I have put in a branch on GitHub here:

Thank you for the code! I'll try to use it for my NMT-surface-based analysis.