I have a question about 3dNetCorr that I think relates to a recent change to the function. Specifically, I am trying to run 3dNetCorr on preprocessed resting-state data that has been censored based on a motion threshold, and I am hitting issues that appear related to a change made last month to check more carefully for null time series within an ROI. I noticed that certain scans in my data fail 3dNetCorr with an error about excessive null time series, while other scans run without issue. The output from the failing scans indicates that every single ROI is filled with null time series.
After verifying that the failed scans did not actually contain any voxel with an all-zero time series in any ROI, my first thought was that the failed scans might be those with the most censored time points, and that the large number of 0 values in the time series was somehow being treated as a "null" time series. On further investigation I found this was not the case, since 3dNetCorr had no issue with many of the scans that had larger amounts of censoring.
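For reference, here is a rough numpy sketch of the check I did by hand to confirm that no ROI actually contains an all-zero voxel time series. The array names, shapes, and the toy data here are just for illustration (in practice I loaded the real 4D dataset and ROI atlas, e.g. via nibabel):

```python
import numpy as np

# Toy stand-ins for the 4D data (x, y, z, t) and an ROI atlas (x, y, z);
# these shapes and values are illustrative only.
rng = np.random.default_rng(0)
data = rng.standard_normal((4, 4, 4, 50))
rois = rng.integers(0, 3, size=(4, 4, 4))  # 0 = background, 1..2 = ROIs

# Censored volumes are zero-filled across the whole brain, including t=0.
censored = [0, 7, 23]
data[..., censored] = 0.0

def rois_with_null_series(data, rois):
    """Return ROI labels containing at least one all-zero voxel time series."""
    bad = []
    for label in np.unique(rois):
        if label == 0:
            continue  # skip background
        ts = data[rois == label]  # (n_voxels, n_timepoints)
        if np.any(np.all(ts == 0.0, axis=1)):
            bad.append(int(label))
    return bad

print(rois_with_null_series(data, rois))  # -> [] : no voxel is zero at every TR
```

Even with several censored (zero-filled) volumes, no voxel's series is zero at *every* time point, so by this definition no ROI should contain a null time series.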
In the end, I discovered that the common theme across the failed scans was that the first time point had been censored, and I could run 3dNetCorr successfully on the previously failed scans with only one change: removing that initial censored volume so that the first remaining volume was uncensored.
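To be clear about what I suspect is happening: I have not read the 3dNetCorr source, so this is purely a guess, but a null-series check that only inspects the first sample of each series would reproduce exactly the behavior I see, whereas a whole-series check would not. A minimal sketch of the two hypothetical checks:

```python
import numpy as np

# Hypothetical illustration only -- I do not know how 3dNetCorr actually
# tests for null series; these two functions just contrast a first-sample
# check with a whole-series check.
def looks_null_first_point_only(ts):
    """Flag a series as null by testing only its first sample (suspected bug)."""
    return bool(ts[0] == 0.0)

def looks_null_whole_series(ts):
    """Flag a series as null only if every sample is zero."""
    return bool(np.all(ts == 0.0))

# A voxel series whose first volume was censored but whose remaining
# values are ordinary data.
ts = np.array([0.0, 1.3, -0.7, 2.1])

print(looks_null_first_point_only(ts))  # True  -> every ROI would "fail"
print(looks_null_whole_series(ts))      # False -> the series is not null
```

If the first-sample style of check is in play, then any scan whose first volume is censored (zero-filled everywhere) would report every ROI as full of null time series, which matches what I observed.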
I am wondering whether I am misunderstanding something with regard to the concept of a "null" time series, and whether there is a reason 3dNetCorr seems not to like scans where the first time point of a time series is 0.
Thanks for any help/information related to this!