I guess this one is for Peter: Would it be possible to add a -weights option to 3dNetCorr, so that one could compute a weighted average for each TR rather than a straight-up arithmetic mean?
-weights would cause a 1D file to be read; it would have to contain as many values as there are TRs in the -inset dataset.
Having looked over the code briefly, I think it would yield something like RAT[n] = (ts[n]*w[n])/Nv[0] at line 41 in rsfc.c, where w[n] is the weight for TR n.
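In rough pseudocode, the per-TR weighting being requested might look like the sketch below. This is not the actual rsfc.c code; the names (ts, w, Nv/nvox, RAT) are just taken from the formula in the post, with ts[n] read as the sum over the ROI's voxels at TR n:

```c
#include <assert.h>

/* Minimal sketch (illustrative only, not AFNI source) of the weighted
   per-TR ROI average described above:
     RAT[n] = (ts[n] * w[n]) / Nv
   ts[n]  : sum over the ROI's voxels at TR n
   w[n]   : weight for TR n (from the proposed -weights 1D file)
   nvox   : number of voxels in the ROI (Nv in the post)          */
void weighted_roi_avg(const double *ts, const double *w,
                      int ntr, int nvox, double *out)
{
    for (int n = 0; n < ntr; n++)
        out[n] = (ts[n] * w[n]) / nvox;
}
```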
That’s definitely a Paul question. Though if he says no, I’m sure you could persuade me to do it.
Howdy, Colm-
Yep, that should be implementable in the next short while.
–pt
Hi, Colm-
The dreaded “-weight_ts …” option now awaits you in 3dNetCorr. All you have to do is “@update.afni.binaries -d”, and the power shall be yours.
Note that I think the correlation values will be pretty sensitive to this. In many ways, I don’t think it is so much a weight as a scaling: it doesn’t just “emphasize” some points more than others, and I wouldn’t use it for censoring. Even if your initial time series were zero-meaned already, this would then alter that.
I’m a bit curious: what is the application of weighting/scaling time points in the average ROI time series?
–pt