Hi Gang,
Appreciate the link.
I sincerely doubt the issue is a lack of memory: the same error arose when I attempted the analysis on a machine with 256 GB of memory, and the program never came close to exhausting it. Apparently, I am not the only one to have experienced this - fortunately, the solution proposed in this thread did the trick.
Since this thread appears high on Google, I will briefly summarize the steps:
- Split your input images, masks, etc. into subsets with `3dZcutup` so they can be processed individually. As of October 2024, I observe that this memory issue occurs at around 1200 NIFTI-1 images, each of float32 dtype and about 7.5 MB when not gzipped. For my purposes, and with a personal workstation of 128 GB of memory, splitting 1400 volumes into a lower half and an upper half was sufficient to avoid these memory issues. Your mileage may vary.
- Process each split with `3dLMEr` separately. Assign unique names to the statistics and residuals outputs so that they don't get overwritten.
- Recreate the intended statistics and residuals files by concatenating these splits along the Z-axis with `3dZcat`. You can sanity-check yourself if you happen to have a copy of the statistics file from an earlier run that failed only at producing the residuals file due to this memory-address error: check that all voxels are essentially the same between the two cases (using `nibabel` and `numpy` with Python, for example; see the sketch after this list).
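For that sanity check, here is a minimal Python sketch using nibabel and numpy. The filenames are hypothetical placeholders (stats_full.nii.gz for the statistics from the earlier full run, stats_zcat.nii.gz for the 3dZcat-reassembled version); substitute your own paths and a tolerance appropriate for your data.

```python
# Minimal sanity check: compare the statistics dataset from an earlier
# full (non-split) run against the 3dZcat-reassembled version.
# The filenames below are hypothetical placeholders.
import nibabel as nib
import numpy as np

full = nib.load("stats_full.nii.gz").get_fdata(dtype=np.float32)
zcat = nib.load("stats_zcat.nii.gz").get_fdata(dtype=np.float32)

# Shapes must match before a voxelwise comparison makes sense.
assert full.shape == zcat.shape, f"shape mismatch: {full.shape} vs {zcat.shape}"

# Voxel values should be essentially identical, up to float32 round-off.
max_abs_diff = np.max(np.abs(full - zcat))
print(f"max |difference| across voxels: {max_abs_diff:.6g}")
print("essentially identical:", np.allclose(full, zcat, atol=1e-5))
```

If the two runs used the same data and model, any differences should be at the level of floating-point round-off; anything larger suggests the splits were mis-ordered or mis-aligned before concatenation.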
I can appreciate that the root cause of this may be very difficult to pinpoint - is it AFNI? R? Some shared library or dependency with an imposed upper memory ceiling?
Given the emergence of more "big-data" endeavors, this issue may begin to crop up more often, and at large enough scan counts even this Z-axis split trick may begin to fail.
In the spirit of FOSS, let me know if there is a way I can help.