Functional connectivity analyses questions (3dUndump and 3dNetCorr)

Dear AFNI experts,

I am interested in analyzing the correlation matrix between nodes defined by the coordinates from Power et al. (2011). There are 264 regions of interest in the Power atlas.

Power, J. D., Cohen, A. L., Nelson, S. M., Wig, G. S., Barnes, K. A., Church, J.
A., . . . Petersen, S. E. (2011). Functional network organization of the
human brain. Neuron, 72(4), 665-678. doi:10.1016/j.neuron.2011.09.006

I used the below 3dUndump script and a text file that has all the x,y,z coordinates from the Power paper to create nodes.

3dUndump -master /subj/errts.subj.fanaticor+tlrc \
    -orient LPI                                  \
    -prefix nodes_264                            \
    -srad 5                                      \
    -xyz powerrois_264.txt

The above script created a single tlrc file with all 264 nodes. After creating the spherical regions of interest, I ran the 3dNetCorr command below.

3dNetCorr -inset /subj/errts.subj.fanaticor+tlrc \
    -prefix subj.netcorr                         \
    -in_rois nodes_264+tlrc                      \
    -ts_out -ts_label -ts_indiv

The .netcc output file looked like the following, which I think is weird.

1 # Number of network ROIs

2 # Number of netcc matrices

I was expecting to have a 264 x 264 correlation matrix (z score) between the regions of interest from the .netcc output. Could you please let me know what would be an issue here?


Hi, Jim-

Hmm, the *.netcc output makes it look like there is only 1 ROI there. What does your powerrois_264.txt file look like—does it have 3 columns (namely, x-, y- and z-coord values), or 4 columns (x-, y- and z-coord values and an ROI value to be bestowed)?

If that *.txt file has three columns, then each ROI created will have the default value of 1. And it is the integer value of a voxel that determines which ROI it belongs to, so even non-connected voxels that share the same value will be part of the same ROI.

Is that what might be happening here?
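As a quick sanity check from the shell (just a sketch; the first `printf` line stands in for your real coordinate file, which you would of course not overwrite), you can count the columns per row of the file you fed to 3dUndump:

```shell
# Stand-in for your real coordinate file (two Power-style rows, 3 columns):
printf '%s\n' '-25 -98 -12' '27 -97 -13' > powerrois_264.txt

# Print the distinct column counts found in the file; a lone "3" means no
# ROI values are present, so every sphere gets the default value of 1,
# while a lone "4" means each row carries its own ROI value.
awk '{print NF}' powerrois_264.txt | sort -n | uniq
```

If that prints only `3`, then all 264 spheres collapsed into a single ROI with value 1, which would explain the *.netcc header you saw.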

Also note: that is a lot of ROIs to pack in, and if the created spheres overlap, then the later ones overwrite the voxels of the earlier ones (that is, the ordering of the ROIs in the text file matters). I also think the program will warn you if there is overlap, with messages like the following (but with different index values):

*+ WARNING: Overwrite short voxel 32 40 17


Hi pt,

Thank you (always) for your help and quick response.

Yes, the txt file has 3 columns (x-, y- and z-coord values). Could you clarify what the 'ROI value' in the 4th column should be? I attached a screenshot of the coordinate Excel file from Power et al. (2011). Should the 4th column be the Master Assignment column that divides the coordinates into networks (e.g., DMN, DAN, etc.)? Would you suggest that the txt file look like the below (I just put in some random numbers)?

x y z network
10 20 30 1
5 15 25 2

I didn’t see the warning message though. Below is the message that appeared when running the 3dNetCorr.

++ Reading in.
++ Allocating…
++ User didn’t enter mask: will make my own, based on where I find nonzero time series.
++ Applying mask to ROIs.
++ Labelling regions internally.
++ No refset labeltable for naming things.
++ Getting volumes.
++ Calculating average time series.
++ Calculating correlation matrix.
++ Writing output: subj.netcorr …
3.492u 0.615s 0:04.16 98.5% 0+0k 0+2io 29pf+0w


Hi, Jim-

I would put the integer that is in the leftmost column—each ROI needs to be given its own integer value. (In AFNI, and probably in other software, an additional string label can be attached, if you wish, but the integer is the primary+necessary identifier.)

So, if using MNI space from that file, the powerrois_264.txt file could look like:

-25 -98 -12 1
27 -97 -13 2
24 32 -18 3

… etc.
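If your source file has only the three coordinate columns, one way to append a unique integer per row is with awk (a minimal sketch; `power_coords.txt` is a placeholder name for your 3-column file):

```shell
# Stand-in 3-column input (first three Power-style MNI coordinates):
printf '%s\n' '-25 -98 -12' '27 -97 -13' '24 32 -18' > power_coords.txt

# Append the 1-based row number as a 4th column, so each sphere
# gets its own integer ROI value (1..264 for the full Power list):
awk '{print $1, $2, $3, NR}' power_coords.txt > powerrois_264.txt
cat powerrois_264.txt
```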

And, as I think you have noted in your 3dUndump command, just having the coordinate values per row isn’t enough—you need to say what orientation system they are in; so I assume your “-orient LPI” reflects that, and I am wasting valuable photons prattling on about it here.

Note: the mentioned warning about potential overlap of spheres:

*+ WARNING: Overwrite short voxel 32 40 17

would occur when running the 3dUndump command.


Hi pt,

Thank you for the suggestion. After editing the text file according to your suggestion, a matrix with r values and z scores was generated.

There is one thing that still makes me confused though. In my analysis, there are two time points for each subject. I just ran 3dNetCorr for a single subject and in the .netcc output, there were 232 ROIs for time 1 and 249 ROIs for time 2, even though there were a total of 264 ROIs in the text file used for 3dUndump. Could you please let me know why this occurred?


Hi, Jim-

Glad that the ROIs are being made happily.

Re. the two runs: was masking applied to the EPIs, and do the two runs share the exact same mask? I wonder if the coverage of each FMRI output is the same. Were they processed together?

I imagine that the reason why there are <264 ROIs in each output is that there are some empty ROIs. It doesn’t look like you are using a mask explicitly in your 3dNetCorr command. You could add this option to output the internally-determined mask (by where time series are non-constant) in each case, and overlay the ROI map on each, to check if that is what is happening.

  -output_mask_nonnull :internally, this program checks for where there are
                      nonnull time series, because we don't like those, in
                      general.  With this flag, the user can output the
                      determined mask of non-null time series.


Hi pt,

Thank you again! I used the example 11 resting-state analysis of afni_proc.py, which has the FreeSurfer option. The anat_final file (MNI warped) is used for each errts file's mask, so the two runs don't share the exact same mask. Do you think I need to rerun the preprocessing?

I also googled about the approach using the Power atlas and it seems reasonable to exclude some ROIs (especially subcortical ROIs) for which at least one subject had 10% or more null data. Could you please let me know if this makes sense?


I see. I guess I might have processed the 2 time series in a single command, and then split them at the end. That way, they would share the same volreg volume, and should be essentially aligned and in the same space at the end. Even without that, they should be pretty similar (unless the sessions were very different, brightness patterns and coverage changed, etc.).

Note that in afni_proc.py, masks are generated and used for a couple of things, but typically not applied to the datasets themselves; so there is data everywhere, and most programs have a "-mask …" option if you need to provide one.

In terms of concatenating the runs during processing, that could be chosen either way. But typically I might not mask the time series themselves.

That being said, it is good to be aware of the practical coverage of the data—if you have data with extremely low SNR and effectively no real signal in it in a region, do you want to include it in your matrix? I am not sure. You could include it, and hopefully it would just be noisy/zero, or you could choose to exclude some of the Power atlas because of how it overlaps (or where it doesn’t) with your mask.


Hi pt,

Thank you so much for the thoughtful comments. I’ll be really careful about this and will talk to my advisor about this too.

Could I ask one more question related to 3dNetCorr? In the 3dNetCorr output, I think the command automatically excludes ROIs with all-zero values. I am wondering if there is any way to include those 0 values in the output so that I can easily gather information on those nodes.


Wait, I have to do some work now?

OK, OK, I will look at 3dNetCorr for outputting zeros. This will likely be a behavior that can be flagged with an option.

One thing to note is: this new output (where zero rows/columns representing “missing” ROIs are included) should then probably not be put through the current fat_mvm_* pipeline for group analysis, as it currently works. For *.netcc files from 3dNetCorr, the assumption is that all parts of the matrix are “real” values to be processed—and zeros are a possible correlation value. So, you might run 3dNetCorr twice, then: once where the empty ROIs are excluded from the matrix outputs, and once where they are included, depending on what you want to do with them.


Hi pt,

Thank you! I’ll wait for the outputting zeros options. Keep me posted!

Have a good weekend,

Hi, JW-

The updated 3dNetCorr is now in the distribution, which was just rebuilt this morning. So, if you type

@update.afni.binaries -d

you should be able to get the new version (version number >= AFNI_21.2.06).

There are 2 new options, and I suspect you will want to use both simultaneously for having a “complete” NxN matrix output, even when some of the N ROIs are completely filled with time series that are all zeros:

-allow_roi_zeros   :by default, this program will end unhappily if any ROI
                      contains only time series that are all zeros (which
                      might occur if you applied a mask to your data that
                      is smaller than your ROI map).  This is because the
                      correlation with an all-zero time series is undefined.
                      However, if you want to allow ROIs to have all-zero
                      time series, use this option; each row and column
                      element in the Pearson and Fisher-Z transformed
                      matrices for this ROI will be 0.  NB: you cannot
                      use -part_corr when this option is used, to avoid
                      mathematical badness.
                      See the NOTE about this option, below.

  -automask_off      :if you do not enter a mask, this program will
                      make an internal automask of where time series are
                      not uniformly zero.  However, if you don't want this
                      done (e.g., you have a map of N ROIs that has greater
                      extent than your masked EPI data, and you are using
                      '-allow_roi_zeros' to get a full NxN matrix, even if
                      some rows and columns are zero), then use this option.

There is also a NOTE in the help file that comments a bit more about it.

Please let me know how it goes.


Hi pt,

It works well! Thank you so much! Sorry for the continuous questions but could I add one more question on this thread?

I would like to compute within-network connectivity (the average of all edges (nij) within a network) and between-network connectivity (the average of all edges between nodes of two different networks). Could you please let me know if it is possible to compute these directly with 3dNetCorr?
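In case it helps clarify what I mean, here is a rough sketch of the computation done outside AFNI. The filenames `matrix.txt` and `nets.txt` are made up for illustration: a plain NxN correlation matrix and one network label per ROI (a real *.netcc file has header and label lines that would need to be stripped first).

```shell
# Toy inputs standing in for real data: a 4x4 correlation matrix and
# one network label per ROI (ROIs 1-2 in network "1", ROIs 3-4 in "2").
printf '%s\n' 1 1 2 2 > nets.txt
cat > matrix.txt <<'EOF'
1 0.5 0.1 0.2
0.5 1 0.3 0.4
0.1 0.3 1 0.6
0.2 0.4 0.6 1
EOF

# Average each undirected edge (i < j) within and between networks:
# same-network pairs give within-network connectivity, mixed pairs
# give between-network connectivity.
awk '
  NR == FNR { net[FNR] = $1; n = FNR; next }
  { for (j = 1; j <= NF; j++) m[FNR, j] = $j }
  END {
    for (i = 1; i <= n; i++)
      for (j = i + 1; j <= n; j++) {
        a = net[i]; b = net[j]
        key = (a <= b) ? a "-" b : b "-" a
        sum[key] += m[i, j]; cnt[key]++
      }
    for (k in sum) printf "%s %.4f\n", k, sum[k] / cnt[k]
  }
' nets.txt matrix.txt | sort | tee net_averages.txt
```

With the toy inputs above, "1-1" and "2-2" are the within-network averages and "1-2" is the between-network average.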