Using SUMA to create fun figures

AFNI version info (afni -ver):
Precompiled binary macos_13_ARM_clang: Feb 12 2024 (Version AFNI_24.0.06 'Caracalla')

Hi AFNI family,

I've got a beginner's question that I'm angry at myself about, because I made this work a couple of years ago.
Simple task: I have a volumetric AFNI mask with a couple of ROIs. For a publication, I would like to visualize those ROIs on a 3D brain using SUMA. It doesn't have to be 100% accurate since it is more of a schematic figure to illustrate the resting state networks we looked at.
My mask is in MNI space, and I have successfully downloaded the SUMA MNI152_2009 template and displayed it in SUMA.
How do I now overlay my mask onto said brain in SUMA?
Apologies for this basic question but I am a bit stuck here.

Thanks - as always - for your tremendous work and support!

Warm regards from cold cold Germany
Jonas

Hi, Jonas-

OK, let's see if we can help bring some SUMA-based springtime cheer to your processing.

I like using isosurfaces, and you can create them from a map of ROIs (each one having a different integer value), called DSET_ROIS here, with:

IsoSurface                               \
    -isorois+dsets                       \
    -o native.gii                        \
    -input DSET_ROIS                     \
    -noxform                             \
    -Tsmooth 0.01 6

Then, you can view the results in SUMA with some reference volume DSET_VOL:

suma                               \
    -onestate                      \
    -i    native*.gii              \
    -vol  DSET_VOL &

... and adjust the view/angle/rotation as you wish. You can make snapshots with Ctrl+r---make sure the SUMA viewer is as large as you can make it, to get an effectively higher-resolution image output (because snapshots work by screen capture at desktop resolution).

Attached is a quick example of doing that with ventricle ROIs from the AFNI_data6/FT_analysis/FT/SUMA/ dataset:

IsoSurface                               \
     -isorois+dsets                       \
     -o native.gii                        \
     -input aparc.a2009s+aseg_REN_vent.nii.gz   \
     -noxform                             \
     -Tsmooth 0.01 6

suma                               \
     -onestate                      \
     -i    native*.gii              \
     -vol FT_SurfVol.nii &


--pt

You can show ROIs in different ways in SUMA---as surfaces, as volumetric slices, as volumetrically rendered regions, or as surface annotation patch ROIs or datasets. Assuming you want the regions to show up as their own individual surfaces, you can use IsoSurface to generate a surface for each of the ROIs. ROIs can be based on atlas regions, regions you have drawn yourself, or clusters. Here are a couple of example scripts to do that.

clusters in 3D
   cd ~/AFNI_data6/afni
   3dClusterize -clust_nvox 200 -bisided -5 5 -ithr 2 -idat 1 -NN 1 \
        -inset func_slim+orig. -pref_map myclusters -overwrite
   3dZeropad -P 1 -prefix myclusters_zp -overwrite myclusters+orig
   IsoSurface -isorois+dsets -o clusttest.gii -overwrite -input myclusters_zp+orig
   afni -niml
   suma -onestate -i *.gii -vol strip+orig. -sv anat+orig

show atlas in suma
   mkdir n27_ml_surfs
   cd n27_ml_surfs
   IsoSurface -isorois+dsets -input ~/abin/MNI_caez_ml_18+tlrc -o ml.gii  
   Then explore the display settings: opacity, showing points while hiding the mesh, and showing only some or all of the surfaces to limit what is displayed.

   For specific regions:
   suma -vol ~/abin/MNI_caez_N27+tlrc. -onestate -i ml.*_Thalamus.*.gii

Hi Paul and Daniel,

you know, writing on here sometimes feels like a cheat code. You sit in front of your monitor and tinker around for hours with unsatisfying results, feeling frustrated, and then you write here and you guys come up with such a great solution! This already helps tremendously.

I have found lots of ways to alter the appearance of the ROIs, but I haven't seen an option to change their colors like I would in the AFNI viewer. Do you know if that is possible?

And please allow a second question: it seems like viewing the ROIs like this does not allow for "prying" the brain the way I normally would in SUMA with Ctrl + Mouse 1.
I know that back when I created this figure

I was able to pry the brain in half to get the lateral and medial views. I know this is a tough question to ask, but would you know how I went from my ROI mask in AFNI to the surface views above?

Thanks again for your help!

Best,
Jonas

For your question about recoloring data in SUMA, there are (again) multiple ways to do this:

  1. Color the convexity. By default, you will notice bright and dark striping on the surface; that represents the convexity, and it is treated as "background". You can toggle it on and off with the 'b' key. To control how the convexity is colored, bring up the surface object controller with ctrl-s and switch to the background dset: on the lower left of that menu, "Switch Dset" -> "bk: Convexity". Change the colormap with "Cmp" (right-click to the left of the Cmp button). You can choose a monochromatic map---Red, Green, Blue, Amber, or a Grayscale type---or pick a multi-color one for ROIs, like ROI_i128. With an ROI colormap, the default range and coloring method create a messy-looking image; change the coloring type "Col" to "Dir". The surface should then be a single color, and you can switch among colors using the up and down arrows over the colorbar.


  2. Color with datasets. Surfaces get colored by datasets, often stored as .niml.dset files, so one could create a dataset with a single value for all nodes. You can apply any constant value you want with 3dcalc; for this case, a value of 1 is all that's required. Load that dataset into SUMA with "Load Dset". With any colormap, rotate through the colors with the up/down arrows over the colorbar, as in the previous case. You can use any colorbar you like for this approach, so a continuous spectrum will work too. You may also need to turn down the "Dim" factor.

3dcalc -a myregion.k1.niml.dset -expr 1 -prefix test.niml.dset

  3. Merged IsoSurface dataset. IsoSurface has an option to create a single combined dataset of all the isosurfaces with '-mergerois+dsets'. These can be colored with a single colormap that assigns one color per input surface's value---isosurface 1, 2, 3, .... This technique was used in the ALICE ECOG package to color the electrodes interactively and to dim or turn off each of the hundred or more surfaces. You can convert existing surfaces to a merged surface with ConvertSurface -merge_surfs.

https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/tutorials/rois_corr_vis/suma_spheres.html#:~:text=sphere_01_surf.png-,Make%20icosahedrons,-¶

  4. Change the default isosurface coloring. By default, each IsoSurface gets a color from one of the ROI colormaps, but we can change those in the niml.dset files. You can use ConvertDset to apply a label and colormap to a niml.dset file; see example 2 in the help for that program. You can also edit the RGB values directly in the niml.dset file that IsoSurface creates by default. The RGB values are fractional Red/Green/Blue values for each color and label, and the table can be edited to adjust the values for each ROI label.

A lot of these topics are covered in more detail in this webpage showing how to work with spherical surfaces in afni and suma with some example scripts:

https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/tutorials/rois_corr_vis/suma_spheres.html

  5. Color planes. You can color a surface by node index with color planes, by selecting "Load Col" in the surface object controller. In this case, you would most likely want the R, G, and B values to be the same for all nodes.
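As a sketch of that last approach: a constant-color file can be generated with a one-line awk command. This is hedged on the assumption that the color file is plain text with one "node_index R G B" row per node, RGB given as 0..1 fractions; the node count and filename here are illustrative, so set nnodes to your surface's actual number of nodes.

```shell
# Hedged sketch for "Load Col": write the same color (a reddish tone)
# for every node, one "node_index R G B" row per line.
nnodes=198812    # assumption: adjust to your surface's node count
awk -v n="$nnodes" 'BEGIN { for (i = 0; i < n; i++) printf "%d 0.8 0.2 0.2\n", i }' \
    > const_color.1D.col
```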

Hi, Jonas-

I think to make the image you showed with the open halves of the brain, you have 2 choices for sending the volumetric ROI information to the surface:

  • to make a new surface dataset of that ROI information, you can project your volumetric ROIs onto the surface with 3dVol2Surf. You can use various options there to manage the projection as you see fit.
  • you can open the afni and suma GUIs together using the -niml option for each, making sure to use the -sv .. option to load a SurfVol in SUMA to provide coordinates. Then, hit the 't' key to have AFNI and SUMA talking. Load the ROIs as an overlay in the AFNI GUI, and you will see them projected onto the SUMA surfaces. There might be some gaps in the projected ROIs, because of how the SUMA surfaces intersect the ROIs. This is described in this AFNI Academy video on SUMA. The commands/examples for running the afni and suma GUIs together are in the Bootcamp data, and I'll put those below.

Then, to get the "inflated surfaces" in SUMA, you cycle through the list in the *.spec file by hitting the '.' (period) key and moving through the available list (this won't change anything about the projection from AFNI, if you are using that method). To open the hemispheres in SUMA, you can Ctrl+left-click and drag side-to-side---what we call "walnut brain" functionality---which is described here.

--pt


To start AFNI and SUMA so they are prepped to talk (in the simple case, assuming no other AFNI or SUMA GUI instances are open---having others open is OK, but then you need to specify the port channel numbers for the communication):

# note the option used here
afni -niml &

# note that `-sv ..` is included
suma -spec std.141.FT_both.spec -sv FT_SurfVol.nii &

Then, load your underlay of choice in AFNI and your ROI dset, and then go to the SUMA window and hit t to start the GUIs talking. You should see the SUMA surface info appear in the AFNI GUI as lines, and you should see the overlay colors that intersect those lines appear on the surfaces in SUMA.

It's pretty tricky to get multiple IsoSurfaces into a pryable walnut brain. It's easier to just look at one hemisphere at a time, or to hide either of the hemispheres with the left and right bracket keys.

Hi Paul and Daniel,

thanks again for your instructions. I took some time to play around with your options. For the IsoSurface, I think coloring the convexity is the most straightforward way for me, and it worked really well! However, when I assigned the colors of my liking to the ROIs, I noticed that this choice did not get saved when using the "Save View" option from the Viewer to reproduce what I did before (switching the dset to convexity, picking a colormap and a corresponding color, and so on). Is this the intended behavior?

For displaying the volumetric ROIs on the surface, I was not as lucky.
I tried 3dVol2Surf using the preprocessed MNI surface kindly provided by AFNI like this:

3dVol2Surf -spec ../data/MNI152_2009_surf/MNI152_2009_both.spec \
           -surf_A ../data/MNI152_2009_surf/lh.pial.gii -surf_B ../data/MNI152_2009_surf/lh.smoothwm.gii \
           -sv ../data/MNI152_2009_surf/MNI152_2009_SurfVol.nii -grid_parent ../../charles_BNA_mask_as_networks_1mm+tlrc \
           -map_func ave -f_steps 50 -f_index nodes \
           -out_niml all_networks_vol2surf_out.niml.dset

Apart from the fact that only the left hemisphere gets colored (I guess that is the intended behavior when running 3dVol2Surf like that, but preferably I would like to see it on both hemispheres), I also get different colors even when selecting a classical colormap for this purpose, like ROI_i64.

And unfortunately, with the straightforward option that I also thought would get me the result---running both AFNI and SUMA with the -niml option and having them talk to each other---nothing really happened. But I do get some error messages that the idcode of the suma dataset was not found in AFNI.

I launched AFNI and SUMA with

afni -niml &

suma -niml -spec ../data/MNI152_2009_surf/MNI152_2009_both.spec -sv ../data/MNI152_2009_surf/MNI152_2009_SurfVol.nii &

I guess this somehow has to do with my mask not having any relationship with the preprocessed MNI surface data provided by AFNI?

And one final issue (as if this wasn't enough already): if I try to save my fun rotating-brain recording as a .mpeg file, the console reports a successful output, but the file is nowhere to be found, which is quite odd. I have noticed that AFNI uses the version of ffmpeg that I had installed on my mac before to compile the video, which is different from what I saw in the Bootcamp video. Maybe it has something to do with that?

++ Running '/opt/homebrew/bin/ffmpeg -loglevel quiet -y -r 24 -f image2 -i realmovie.EXH.%06d.ppm -b 400k -qscale 11 -intra realmovie.mpg' to produce realmovie.mpg
. **DONE**

As always, I have learned a lot in the process and understood some fundamentals, especially with the AFNI academy video you kindly referenced, Paul.

Warm regards,
Jonas

Hi, Jonas-

For this part:

For displaying the volumetric ROIs on the surface, I was not as lucky...

... one thing you should do is change -map_func ave to -map_func mode. With "ave", at boundaries where you might have 2 different ROI values (like 57 and 19) between the boundary surfaces, you end up with a weighted average of them (like 38), which can be a different number entirely, when what you would rather have is one of the original ROI values---the more prevalent one along that line. On a minor note, -f_steps 50 could probably be made much smaller, like -f_steps 9 or even less, because you don't need that many sample points between the two surfaces---there just isn't that much space between them.
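That difference can be seen with a toy example outside of AFNI (plain awk, not an AFNI command; the sample values are just the ones from above):

```shell
# Suppose the ROI labels sampled along one node's projection line are:
echo "57 57 57 19 19" | awk '{
    s = 0
    for (i = 1; i <= NF; i++) { s += $i; cnt[$i]++ }   # sum + label counts
    m = $1
    for (v in cnt) if (cnt[v] > cnt[m]) m = v          # most frequent label
    printf "ave=%.1f  mode=%d\n", s / NF, m
}'
# -> ave=41.8  mode=57 : "ave" invents a value, "mode" keeps a real label
```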

I also prefer using standard meshes---at some point, you might find it useful to display the results on another brain, and you can only transfer across surfaces with the nodal correspondence that these bring. I see no reason not to do so, so might as well.

Finally, as to lh and rh: you can make both in separate commands. I think that by default, when you "Load Dset" one hemisphere's dset into SUMA, the other will still get loaded (that happened for me in my test case).

So, perhaps try:

3dVol2Surf                                                           \
    -spec         ../data/MNI152_2009_surf/MNI152_2009_both.spec     \
    -surf_A       ../data/MNI152_2009_surf/std.141.lh.pial.gii       \
    -surf_B       ../data/MNI152_2009_surf/std.141.lh.smoothwm.gii   \
    -sv           ../data/MNI152_2009_surf/MNI152_2009_SurfVol.nii   \
    -grid_parent  ../../charles_BNA_mask_as_networks_1mm+tlrc        \
    -map_func     mode                                               \
    -f_steps      10                                                 \
    -f_index      nodes                                              \
    -out_niml     all_networks_vol2surf_out.lh.niml.dset

3dVol2Surf                                                           \
    -spec         ../data/MNI152_2009_surf/MNI152_2009_both.spec     \
    -surf_A       ../data/MNI152_2009_surf/std.141.rh.pial.gii       \
    -surf_B       ../data/MNI152_2009_surf/std.141.rh.smoothwm.gii   \
    -sv           ../data/MNI152_2009_surf/MNI152_2009_SurfVol.nii   \
    -grid_parent  ../../charles_BNA_mask_as_networks_1mm+tlrc        \
    -map_func     mode                                               \
    -f_steps      10                                                 \
    -f_index      nodes                                              \
    -out_niml     all_networks_vol2surf_out.rh.niml.dset

... and then run suma, open the object controller and go to "Load Dset" -> all_networks*dset
... and hopefully that looks better?
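As an aside, since the two commands above differ only in the hemisphere label, a small loop can generate both. Here is a dry-run sketch---it only echoes the commands, so remove the leading "echo" to actually execute them (paths are the ones assumed above):

```shell
# Dry-run: print the per-hemisphere 3dVol2Surf calls instead of running them.
sdir=../data/MNI152_2009_surf
for hemi in lh rh; do
    echo 3dVol2Surf \
        -spec         "${sdir}/MNI152_2009_both.spec" \
        -surf_A       "${sdir}/std.141.${hemi}.pial.gii" \
        -surf_B       "${sdir}/std.141.${hemi}.smoothwm.gii" \
        -sv           "${sdir}/MNI152_2009_SurfVol.nii" \
        -grid_parent  ../../charles_BNA_mask_as_networks_1mm+tlrc \
        -map_func     mode -f_steps 10 -f_index nodes \
        -out_niml     "all_networks_vol2surf_out.${hemi}.niml.dset"
done
```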

--pt

Hi again, Jonas-

Soooo, the thing about ID codes is an unfortunate current blip in the codebase: AFNI and SUMA talk, but the volumetric dataset in question doesn't have an ID code (which the *SurfVol* dsets have tended not to have, we have noticed). The simple fix for this is to run:

3drefit -newid DSET

on your *SurfVol* dset. Then AFNI and SUMA will talk more completely (at the moment, without that fix, the overlay from AFNI won't appear in SUMA, even though the crosshairs will still move in the other GUI).

--pt

Hi Paul,

you are correct---loading only one hemisphere's dset did indeed load both hemispheres!
I don't think it's hugely important, but I ran into issues trying to use the standard meshes. When I run the script as you suggested, I get

** surface name 'std.141.lh.pial.gii' not found

If I choose "-spec std.141.MNI152_2009_both.spec" as the specification table instead, the script executes, but I have trouble loading the resulting .niml.dsets, and SUMA fails, saying that it can't load the data. Hence I have stuck with not using the standard meshes for now.

Would you happen to have an idea why saving as .mpeg is not working, even though the console prints a success message (see my last post)? This is probably the last issue I'm having, even though .gifs work for now.

Lastly, I want to point out how useful some of the AFNI Academy videos have been in the week since writing my initial post. For example, this gem here:

basically solved a lot of the questions I still had about isosurfaces, and I think I could have saved you some time if I had found it earlier. Maybe there is merit in somehow linking these incredible tutorials in the -help files of the respective programs? I normally try to become more knowledgeable (and often fail) by reading the help files of the programs before I bother people, and I think some of my questions would have been obsolete had I found that Academy video earlier.

Anyhow, hope you two have a great weekend!

Best,
Jonas

Hi, Jonas-

Thaaat is weird, that std.141.lh.pial.gii is not found. This is in suma_MNI152_2009 that you downloaded from our website here, right? What is the output of:

ls std.141*lh*gii

?

For MPEGs not working: do you have ffmpeg installed? Is there terminal text that mentions a failure? Also, what does failure mean---no file, or a file that doesn't play afterward? (Note that GIFs and MPEGs play back more easily with different programs, I find; on Ubuntu, I use eog to open GIFs and vlc to open MPEGs.)

That is a good idea about adding links to the help files.

thanks,
pt

Hi Paul,

yeah, that is why I was so confused, because the standard mesh is right there in the folder:

(base) jonassteinhauser@MacBook-Pro-von-Jonas data % ls MNI152_2009_surf/std.141*lh*gii
MNI152_2009_surf/std.141.lh.inf_200.gii    MNI152_2009_surf/std.141.lh.sphere.gii
MNI152_2009_surf/std.141.lh.inflated.gii   MNI152_2009_surf/std.141.lh.sphere.reg.gii
MNI152_2009_surf/std.141.lh.pial.gii       MNI152_2009_surf/std.141.lh.white.gii
MNI152_2009_surf/std.141.lh.smoothwm.gii

I used the version downloaded from the website; a quick diff between the contents of my folder and the one you provided showed that they contain the same files.

Regarding the MPEG outputs: this is my console output after hitting "Save as MPEG" in the Recording/Snapshot window.

++ Running '/opt/homebrew/bin/ffmpeg -loglevel quiet -y -r 24 -f image2 -i testpt.WNW.%06d.ppm -b 400k -qscale 11 -intra testpt.mpg' to produce testpt.mpg
. **DONE**

However, I can't find the file in the SUMA_Recordings folder that gets generated automatically, nor in the folder from which I launched SUMA. I noticed in your AFNI Academy video that the path to the ffmpeg being called is different, but that may be because I'm running SUMA on macOS. ffmpeg is correctly installed, however---or at least it works when I call it from Python scripts to do video processing.

Thanks
Jonas

Hi, Jonas-

For the first issue, what happens if you run the command from within the directory that contains the *.spec and *.gii files? I am wondering whether my memory is correct that the files referenced by the *.spec file are searched for locally.

For the MPEG stuff, I am not sure. Maybe we could Zoom about that, if you have it installed. The movie would not go into SUMA_Recordings (the snapshot images go there). I would have thought it would be in the folder from which suma was launched, indeed... Just to be sure, could you search for the filename with the macOS Finder, to be certain it isn't somewhere else? I guess, since you have used ffmpeg from Python, it isn't some security setting requiring "OK"ing, either. Weird.

--pt

Hi Paul,

so my earlier error is not reproducible anymore; I guess I just messed something up somewhere along the way. Sorry for the trouble. For the world to find this solution (and for my own foggy brain in probably about 2-3 months), these were the steps that led to a working view of the ROIs on a surface model (walnut brain!!1!!1):

#!/bin/bash

BASE_DIR="/Users/jonassteinhauser/scripts/GitLab_TUD/jonas_scripts/afni-masks/charles_ukbiobank/figures/data/MNI152_2009_surf"
cd "$BASE_DIR"
printf "Current working directory: %s\n" "$PWD"

3dVol2Surf                                                  \
    -spec         std.141.MNI152_2009_both.spec             \
    -surf_A       std.141.lh.pial.gii                       \
    -surf_B       std.141.lh.smoothwm.gii                   \
    -sv           MNI152_2009_SurfVol.nii                   \
    -grid_parent  ../../../charles_BNA_mask_CEN_1mm+tlrc    \
    -map_func     mode                                      \
    -f_steps      10                                        \
    -f_index      nodes                                     \
    -out_niml     ../../CEN/CEN_vol2surf_out.lh.niml.dset

3dVol2Surf                                                  \
    -spec         std.141.MNI152_2009_both.spec             \
    -surf_A       std.141.rh.pial.gii                       \
    -surf_B       std.141.rh.smoothwm.gii                   \
    -sv           MNI152_2009_SurfVol.nii                   \
    -grid_parent  ../../../charles_BNA_mask_CEN_1mm+tlrc    \
    -map_func     mode                                      \
    -f_steps      10                                        \
    -f_index      nodes                                     \
    -out_niml     ../../CEN/CEN_vol2surf_out.rh.niml.dset

afni -niml &

suma -niml -spec std.141.MNI152_2009_both.spec -sv MNI152_2009_SurfVol.nii &

Then "Load Dset" in the Surface Controller to load the newly created CEN_vol2surf_out.lh.niml.dset.

Re: MPEG
I used the Finder GUI and

find / -name "testpt.mpg" 2>/dev/null

in the Terminal, but with no success. We can surely do a quick Zoom if it is not taking too much of your valuable time---I have already benefited greatly from all your support. If we do, it'd be great if we could do it some day before 11 am EDT because of the time difference to Germany (I personally wouldn't mind Zooming in the evening hours, but I'm afraid of risking my good standing with my girlfriend and kid if I continue to babble on about brains after, like, 5 PM).

Best
Jonas

Hi, Jonas-

Glad some of the issues have disappeared along the way---indeed, this is why we like scripting, and thanks for posting your code here.

I will send you a message about scheduling a chat, hopefully ensuring that AFNI usage does not disrupt your familial harmony.

--pt