I’m using more than 130 subjects for 3dISC, which gives me about 1,000 ISC pairs.
When I run 3dISC with those datasets, the data-reading step seems to finish successfully, but the program exits as soon as all the data are read.
I think it’s a memory problem, but I’m not sure. If that’s the case, is there any way to solve it other than upgrading the memory?
This was the log:
Reading input files for effect estimates: Done!
You can estimate the total memory size by multiplying one subject-pair’s data size by the total number of subject pairs. If memory is the issue, there are two solutions: 1) find a computer with more memory; 2) break the data into two or more smaller portions (e.g., halves or quarters).
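As a rough back-of-the-envelope check, something like the following works; the numbers here are placeholders, not values from this thread, so substitute your own pair count and the actual per-pair file size (e.g., from `du -m` on one `+tlrc` BRIK file):

```shell
# Hypothetical example values -- replace with your own numbers.
n_pairs=1000     # total number of subject pairs fed to 3dISC
pair_mb=64       # size of one pair's dataset in MB (check with: du -m SubjectPair1+tlrc.BRIK)

# Integer arithmetic is fine for a ballpark figure.
total_gb=$(( n_pairs * pair_mb / 1024 ))
echo "Approximate memory needed: ${total_gb} GB"
```

If that estimate is anywhere near (or above) your machine’s RAM, splitting the data is the practical route.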
Thank you for the advice!
I’ve tried it on a computer with more memory (512 GB).
The error message then changed to:
Error : package or namespace load failure for ‘lme4’
Quitting due to the model test failure…
However, when I test the same code with a small number of subjects (n=10), it works fine without any errors.
I’ve also tried splitting my dataset into 5 portions. In that case, should I average all 5 results to get the whole dataset’s results?
That is not a good idea. Instead, cut each subject pair along the Z axis into multiple (e.g., 5) portions. Suppose you have 100 slices along the Z axis:
3dZcutup -prefix SubjectPair1A -keep 0 19 SubjectPair1+tlrc
3dZcutup -prefix SubjectPair1B -keep 20 39 SubjectPair1+tlrc
...
3dZcutup -prefix SubjectPair1E -keep 80 99 SubjectPair1+tlrc
Then run 3dISC separately on each of these 5 subsets. In the end, use 3dZcat to glue the results back together.
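A minimal dry-run sketch of the per-pair splitting loop, assuming the 100-slice example and the dataset names from above. It only prints the 3dZcutup commands (remove the `echo` to actually run them), and you would wrap it in an outer loop over all subject pairs:

```shell
# Generate 5 Z-slabs of 20 slices each (slices 0-99 total) for one pair.
# Dataset names (SubjectPair1+tlrc) follow the example in this thread.
i=0
for label in A B C D E; do
  lo=$(( i * 20 ))          # first slice kept in this slab
  hi=$(( lo + 19 ))         # last slice kept in this slab
  echo "3dZcutup -prefix SubjectPair1${label} -keep $lo $hi SubjectPair1+tlrc"
  i=$(( i + 1 ))
done

# After running 3dISC on each slab, glue the 5 result datasets back
# together along Z (result names here are hypothetical):
#   3dZcat -prefix FullResult ResultA+tlrc ResultB+tlrc ResultC+tlrc ResultD+tlrc ResultE+tlrc
```

The slab order passed to 3dZcat must match the original inferior-to-superior slice order, so that the glued dataset lines up with the input grid.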