global brain connectivity - correlation matrices

Hi,
I’d like to perform a global brain connectivity (GBC) analysis and replicate the following procedure described in a paper:
“…first computed GBC maps independently for the safe and threat conditions by correlating each voxel’s timecourse with every other voxel’s timecourse, applying the Fisher’s Z transformation, and averaging across these correlation maps”.

The gray matter mask that I use from the TT_N27 atlas has 73500 voxels, which means creating a 73500 x 73500 correlation matrix for each subject, applying Fisher’s Z transform, and then averaging, for each voxel, all of its Z values to obtain a final BRIK with one value per voxel.
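Just to check that I understand the logic, in matrix terms I imagine the computation as something like this rough numpy sketch (array names and sizes are made up for illustration; the real mask would of course have ~73500 voxels):

import numpy as np

# toy stand-in for the real data: n_vox voxel timecourses, each of length n_t
rng = np.random.default_rng(0)
n_vox, n_t = 500, 200
data = rng.standard_normal((n_vox, n_t))

# correlate each voxel's timecourse with every other voxel's timecourse
r = np.corrcoef(data)              # (n_vox, n_vox) Pearson correlation matrix

# Fisher's Z transform, dropping each voxel's self-correlation (r = 1 -> Z = inf)
np.fill_diagonal(r, np.nan)
z = np.arctanh(r)

# average the Z values for each voxel -> one value per voxel
gbc_z = np.nanmean(z, axis=1)
gbc_r = np.tanh(gbc_z)             # optionally back to correlation units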

What are the steps that I need to do after the deconvolution to replicate this procedure?

thanks

ns

Hi, ns-

Sounds like a job for 3dTcorrMap -- but do read the warnings about memory consumption and computational demands. You might want to try it with a smaller mask first.

https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/programs/3dTcorrMap_sphx.html#ahelp-3dtcorrmap
In particular, you would want this output option:


-Zmean pp = Save tanh of mean arctanh(correlation) into 'pp'
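
For example, the call might look something like the following (dataset names here are placeholders, and you could swap in a smaller test mask first, per the note above):

  3dTcorrMap -input SUBJ_errts+tlrc -mask GM_mask+tlrc -Zmean SUBJ_GBC_Zmean

That would output one dataset per subject/condition, with the tanh of the mean Fisher Z in each voxel of the mask.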

-pt

Thanks, it seems to be exactly what I’m looking for.

How many time points do I need to perform this type of correlation? I have 8 time points per trial, and this is the message that I received when I tried to run it:
ERROR: Input dataset ‘./0128136.CSPLINz.ThreatNoPic.1stHalf.IRF+tlrc.HEAD’ is too short: 8 time points

I understand that this type of correlation might be well suited for resting-state analysis, or at least for blocks of conditions that last a few minutes, but is there something similar that I can do on short time intervals?

thank you

ns

Looks like you are on the cusp of runnability: you need at least 9 time points.
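(As an aside, you can double-check the number of time points in any dataset with 3dinfo, e.g. "3dinfo -nt DSET+tlrc", where DSET is a placeholder for your dataset name.)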

I am not sure what correlation values from so few time points would mean. The stdev of a Pearson correlation value scales with 1/sqrt(N-2), so for N=8 that would be 1/sqrt(6) ~ 0.4, which is preeeettty huge, considering that the Pearson r range is [-1, 1]. That means that your estimates with so few time points would be dominated by noise.

–pt

… and actually I have just been corrected by a real statistician: the estimate of the standard error of the Pearson r (strictly, of its Fisher Z transform) for N time points is 1/sqrt(N-3).

That actually makes it worse: for N=8, that is 1/sqrt(8-3) = 1/sqrt(5) ~ 0.45.

–pt

I didn’t consider this huge drawback. In that case, it’s better not to try this analysis with the current data set.
Thanks much for the suggestions/advice.

ns

Okeydoke, sorry to be a buzzkill. It does sound like some other type of analysis might be better suited to that particular data set.

–pt