You can estimate the total memory size by multiplying one subject-pair’s data size by the total number of subject pairs. If memory is the issue, there are two solutions: 1) find a computer with more memory; 2) break the data into two or more smaller portions (e.g., halves or quarters).
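To make the estimate concrete (the numbers here are purely illustrative): with 30 subjects there are 30 × 29 / 2 = 435 subject pairs, so if one pair’s data occupies about 0.5 GB, the full analysis would need on the order of 435 × 0.5 ≈ 220 GB of memory.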
I’ve tried it on a computer with more memory (512 GB). Then the error message changed to:
Error: package or namespace load failure for ‘lme4’
.
.
.
Error:
Quitting due to the model test failure…
However, when I tested the same code with a small number of subjects (n = 10), it worked fine without any errors.
I’ve also tried to split my data set into 5 portions; my question is, in that case, should I average all 5 results to get the whole data set’s results?
That is not a good idea: averaging results obtained from subsets of the subject pairs is not equivalent to analyzing the full data set. Instead, cut each subject pair along the Z axis into multiple (e.g., 5) portions, so that each portion still contains all the subject pairs and only the voxels differ. Suppose you have 100 slices along the Z axis.
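A sketch with 3dZcutup (the input name data+tlrc and the dataZ* output prefixes are placeholders for your own files; slice indices are 0-based):

# split one pair's dataset into 5 portions of 20 slices each
3dZcutup -prefix dataZ1 -keep  0 19 data+tlrc
3dZcutup -prefix dataZ2 -keep 20 39 data+tlrc
3dZcutup -prefix dataZ3 -keep 40 59 data+tlrc
3dZcutup -prefix dataZ4 -keep 60 79 data+tlrc
3dZcutup -prefix dataZ5 -keep 80 99 data+tlrc

Run this once per subject pair, so that every pair is cut at the same slice boundaries.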
Then run each of these 5 subsets with 3dISC separately. In the end, use 3dZcat to glue the results together.
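A minimal sketch of both steps in tcsh, assuming the simplest ISC model; the model formula, the subject labels (s1, s2, s3), and the file names are illustrative, and in practice the -dataTable needs one row per subject pair:

# run the same 3dISC model on each of the 5 Z portions
foreach k (1 2 3 4 5)
   3dISC -prefix ISC_Z$k -jobs 4          \
         -model '1+(1|Subj1)+(1|Subj2)'   \
         -dataTable                       \
         Subj1 Subj2 InputFile            \
         s1    s2    pair_s1_s2_Z$k+tlrc  \
         s1    s3    pair_s1_s3_Z$k+tlrc  \
         s2    s3    pair_s2_s3_Z$k+tlrc
end

# glue the 5 partial results back together along the Z axis
3dZcat -prefix ISC_all ISC_Z1+tlrc ISC_Z2+tlrc ISC_Z3+tlrc ISC_Z4+tlrc ISC_Z5+tlrc

Since each portion covers a different set of slices, the pieces are simply stacked back in order; no averaging is involved.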