Hi. Is the -weight_ts option to 3dNetCorr equivalent to the -censor option to 3dDeconvolve? I assume so, but it never hurts to ask. In other words, are time points weighted with zeros effectively dropped (the way censored points have their residuals set to zero), rather than the zero values themselves being used in the correlations?
Hi. Okay, now I am shamelessly rebumping, because the subject of the post would be identical to this one. Unless you say otherwise, I will assume that you prefer rebumping when the subject is very similar versus starting new threads, or that you have no preference. The only reason I say that is that I very much appreciate the help from SSCC and want to make things as easy as possible for the team.
I ran some tests with 3dNetCorr, both with and without -weight_ts, using otherwise identical code; the only change was the -weight_ts option. Without the weighting file, everything was fine. With the weighting file, the entire correlation matrix was 1.000 (on the diagonal) or 0.999 (elsewhere). Within my 77 functional MRI runs, whenever the censor file happened to contain no zeros, the results were fine again. See below for the hideous details.
Like Antaeus of old, run-ins with hideous details just (counterintuitively, perhaps) make AFNI programs stronger!
I ran your commands with different data (the FATCAT_DEMO data), using a weights file of all ones, and things went fine: the correlation values were unchanged from the original. So I wonder if the issue you are seeing is particular to your data. What does the *.1D file of weights look like? Note that the diagonals of the 'CC' (Correlation Coefficient) matrix should always be 1s, because the correlation of a time series with itself is unity.
Of course, a correlation of something with itself is 1.0. I do not know why I included that detail. I should have just said that all off-diagonal correlations are 0.999 and left it at that.
I replicated your result. When the censor file is all ones, there is no problem. The problem occurs when at least one time point is zero in the censor file. I am almost certain that I am doing something wrong or there is a problem with my data, but before we close this case in our minds, would you humor me and try to recreate the problem with at least one time point being censored?
Ooook, I think this can happen when the time series is not already demeaned before it is entered into 3dNetCorr. If a residual time series is entered, which should have a mean of zero, then this large change would not happen. If you have a time series with a mean far from zero and then apply a weight that is zero in some places, those zeroed points look like giant spikes relative to the mean, and because the same points are zeroed in every ROI's series, that shared feature drives the correlations toward 1. Thanks for pointing that out.
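To make that concrete, here is a minimal numpy sketch (not AFNI code; the series lengths and values are made up) showing how zero weights on un-demeaned time series inflate the correlation, and how demeaning first avoids it:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    ts_a = 1000 + rng.normal(0, 5, n)   # un-demeaned series with a large mean
    ts_b = 1000 + rng.normal(0, 5, n)   # generated independently of ts_a
    w = np.ones(n)
    w[40:45] = 0                        # a few zero-weighted ("censored") points

    print(np.corrcoef(ts_a, ts_b)[0, 1])          # ~0: the series are unrelated
    print(np.corrcoef(w * ts_a, w * ts_b)[0, 1])  # ~1: the shared drops to zero dominate
    a0, b0 = ts_a - ts_a.mean(), ts_b - ts_b.mean()
    print(np.corrcoef(w * a0, w * b0)[0, 1])      # ~0 again once the means are removed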
Okeydoke. I assumed you would opt for that, so I have put that into the code base, and we will aim to do a build this evening with it, so it will be available tomorrow. Your option of interest will be:
-weight_corr WTS :input a 1D file WTS of weights that will be applied
to estimate a weighted Pearson Correlation. This
is different than the '-weight_ts ..' weighting.
… and after last night's build, AFNI ver 22.3.05 contains this discussed update. It should already be on Biowulf, and/or you can '@update.afni.binaries -d' your local OS to update the binaries there.
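For reference, a weighted Pearson correlation is typically computed with weighted means, variances, and covariance. Below is a small numpy sketch of that standard formula; that -weight_corr uses exactly this definition is an assumption here for illustration, and note that with 0/1 weights it reduces to simply dropping the zero-weighted time points:

    import numpy as np

    def weighted_corr(x, y, w):
        # standard weighted Pearson: weighted means, then weighted (co)variances
        w = np.asarray(w, dtype=float)
        mx = np.average(x, weights=w)
        my = np.average(y, weights=w)
        cov = np.average((x - mx) * (y - my), weights=w)
        vx = np.average((x - mx) ** 2, weights=w)
        vy = np.average((y - my) ** 2, weights=w)
        return cov / np.sqrt(vx * vy)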
Hi! At first pass, the CC and FZ matrices look good. Thanks!
The PC matrix is all zeros. However, it is all zeros regardless of whether I use the -weight_corr option. Plus, I assume the input to the fetal fat_lme_prep.py program will be the unstandardized regression coefficients in the PCB matrix, so I assume the PC matrix is not terribly important. Plus, plus, because these are bivariate correlations, I assume the CC and PC correlation matrices would be identical.
Well, with my test data here, PC and PCB are not full of zeros, even when using either '-weight_* ...' option. I might have to get your data to see about that?
I also don't know about using the partial correlation ('PC') matrix with the 3dMVM- or 3dLME-based approach as the next step. Those use the whole set of correlation values for comparisons altogether, and I would have thought that using the Fisher-Z version of the output would be the most useful/appropriate. Is there a specific reason you would prefer the partial correlation?
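As a reminder, the Fisher-Z values are just the inverse hyperbolic tangent (arctanh) of the correlation coefficients, which is what makes them better behaved for group-level statistics; a quick numpy check of the value quoted above:

    import numpy as np
    print(np.arctanh(0.999))   # Fisher-Z value corresponding to a correlation of 0.999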
Ohhhhh. Thank you for explaining the gestalt of the associated math from the help file. I appreciate your patience. That is awesome: the unique variance explained after removing the variance explained by every other region. Wow. That reduces the concern that a correlation between regions A and B is not direct but is instead driven by a third region C. I will need to think about that a ton and put it into an entirely separate publication.
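To help myself internalize it, here is a small numpy sketch of the standard way to get partial correlations from the inverse covariance (precision) matrix; whether 3dNetCorr computes its PC matrix exactly this way is my assumption, and the array shape is just for illustration:

    import numpy as np

    def partial_corr(ts):
        # ts: (time points) x (ROIs) array of ROI-average time series
        # partial correlation from the precision matrix P:
        #   pc[i, j] = -P[i, j] / sqrt(P[i, i] * P[j, j])
        P = np.linalg.inv(np.cov(ts, rowvar=False))
        d = np.sqrt(np.diag(P))
        pc = -P / np.outer(d, d)
        np.fill_diagonal(pc, 1.0)
        return pc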
For now, because I will use the FZ matrix in the fetal fat_lme_prep.py program, my PC matrix zeros are only a concern in case I made a mistake somewhere. Thank you again for your help and patience. I checked all 77 runs, and I get the same result. Would you be willing to look at a run? If yes, is there somewhere that I could copy the data on Biowulf? On the other hand, maybe I should run some diagnostics on my own. Does anything come to mind? For example, should I try demeaning the data first?
OK, looking at the data, here is the reason that PC calculations run into trouble (this runtime warning appears as another sign of it):
*+ WARNING: Badness in partial correlation calculation.
Normalizing factor is <=0 (how to take sqrt?)!
-> making all zeros.
The reason is that the number of ROIs in the '-inset ..' is greater than the number of time points in the input data; this is especially true when the weight is applied, because here the weighting censors time points out. This leads to collinearity within the matrix, making the denominator calculation for the partial correlation mathematically impossible. Here, there are 57 time points after censoring; when the number of ROIs is >= 57, this badness should be expected for partial correlation (here, it actually happens when the number of ROIs is 55; I think that might be because the matrix is diagonal, so there would be 56 independent columns, and the one extra for... a reason I don't know).
Anyways, that appears to be the issue: the time series are too short (or the number of ROIs in the corr matrix too large) for the matrix to be invertible.
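A quick numpy illustration of that limit (the sizes are made up, chosen to mirror the 57-time-point case):

    import numpy as np

    rng = np.random.default_rng(1)
    n_tp, n_rois = 57, 60
    ts = rng.normal(size=(n_tp, n_rois))
    cov = np.cov(ts, rowvar=False)       # 60 x 60 covariance across ROIs
    print(np.linalg.matrix_rank(cov))    # at most 56 (< n_rois): singular, so no inverse
                                         # (the extra -1 comes from the mean removal in np.cov)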