3dttest++ -Clustsim failure

Hello AFNI-

I have experienced a strange failure with -Clustsim in 3dttest++. The t-test itself seems to be working fine; only the clustsim step is failing. I saw this mentioned here once before, but that thread did not seem to reach a resolution. I am hoping I can just run 3dClustSim on the output I already have from the t-test instead of rerunning the whole thing, but I am not sure which output to use.

The command I am calling is:

3dttest++ -prefix belvdes -setA ./individual_subj_data/deoblw_shellgame_*_belvdes_glt_template.nii -mask ./individual_subj_data/TLRC_full_resampled_mask+tlrc -Clustsim 8

Here is the error:

++ 3dttest++: AFNI version=AFNI_18.3.05 (Nov 19 2018) [64-bit]
++ Authored by: Zhark++
++ option -setA :: processing as SHORT form (all values are datasets)
++ 1579657 voxels in -mask dataset
++ Number of -Clustsim threads set to 8

 + Default clustsim prefix set to 'belvdes'
++ loading -setA datasets
*+ WARNING: 3dttest++ -setA :: 1185 vectors are constant
++ Memory usage now = 428,985,156 (429 million [mega])
++ t-testing:0123456789.0123456789.0123456789.0123456789.0123456789.!
 + skipped 1185 voxels completely for having constant data
++ ---------- End of analyses -- freeing workspaces ----------
++ Memory usage now = 2,165,737,072 (2.2 billion [giga])
++ Creating FDR curves in output dataset
++ Smallest FDR q [1 SetA_Zscr] = 2.30799e-05
 + Added 1 FDR curve to dataset
++ Output dataset ./belvdes+tlrc.BRIK
++ Output dataset ./belvdes.resid.nii
++ ================ Starting -Clustsim calculations ================
 + === temporary files will have prefix belvdes ===
 + === running 8 -randomsign jobs (1250 iterations per job) ===
 + === creating 31,593,140,000 (32 billion [giga]) bytes of pseudo-data in .sdat files ===
 + --- 3dClustSim reads .sdat files to compute cluster-threshold statistics ---
 + --- there is 270,277,365,760 (270 billion [giga]) bytes of memory on your system ---
++ 3dttest++: AFNI version=AFNI_18.3.05 (Nov 19 2018) [64-bit]
++ Authored by: Zhark++
++ 1579657 voxels in -mask dataset
++ option -setA :: processing as SHORT form (all values are datasets)
++ random seeds are 106707279 420866544
++ opened file ./belvdes.0000.sdat for output
++ loading -setA datasets
*+ WARNING: 3dttest++ -setA :: 1185 vectors are constant
++ Memory usage now = 265,643,405 (266 million [mega])
++ t-test randomsign:0123456789.0123456789.0123456789.0123456789.0123456789.!
++ saving main effect t-stat MIN/MAX values in ./belvdes.0000.minmax.1D
++ output short-ized file ./belvdes.0000.sdat
 + 3dttest++ ===== simulation jobs have finished (11638.6 s elapsed)
 + successfully read all 8 minmax.1D files; computing 5percent.txt outputs
++ 3dttest++ ----- Global % FPR points for simulated z-stats:
 3.938 = 1-sided 9% FPR A [Bonferroni=5.303]
 4.189 = 2-sided 9% FPR A [Bonferroni=5.428]
 3.982 = 1-sided 8% FPR A [Bonferroni=5.324]
 4.227 = 2-sided 8% FPR A [Bonferroni=5.449]
 4.035 = 1-sided 7% FPR A [Bonferroni=5.349]
 4.268 = 2-sided 7% FPR A [Bonferroni=5.473]
 4.091 = 1-sided 6% FPR A [Bonferroni=5.376]
 4.317 = 2-sided 6% FPR A [Bonferroni=5.500]
 4.154 = 1-sided 5% FPR A [Bonferroni=5.409]
 4.377 = 2-sided 5% FPR A [Bonferroni=5.532]
 4.227 = 1-sided 4% FPR A [Bonferroni=5.449]
 4.444 = 2-sided 4% FPR A [Bonferroni=5.571]
 4.317 = 1-sided 3% FPR A [Bonferroni=5.500]
 4.542 = 2-sided 3% FPR A [Bonferroni=5.621]
 4.444 = 1-sided 2% FPR A [Bonferroni=5.571]
 4.660 = 2-sided 2% FPR A [Bonferroni=5.691]
 4.660 = 1-sided 1% FPR A [Bonferroni=5.691]
 4.834 = 2-sided 1% FPR A [Bonferroni=5.808]
 + [above results also in file belvdes.A.5percent.txt]

 + 3dttest++ ===== starting 3dClustSim A: elapsed = 11724.9 s
++ 3dClustSim: AFNI version=AFNI_18.3.05 (Nov 19 2018) [64-bit]
++ Authored by: RW Cox and BD Ward
++ Loading -insdat datasets
** FATAL ERROR: dataset grid too big -- must recompile to use shorts as indexes :(
** Program compile date = Nov 19 2018
** FATAL ERROR: ===== 3dClustSim command failed :-((( =====
** Program compile date = Nov 19 2018
++ 3dcalc: AFNI version=AFNI_18.3.05 (Nov 19 2018) [64-bit]
++ Authored by: A cast of thousands
++ Output dataset ./desvbel+tlrc.BRIK
++ 3drefit: AFNI version=AFNI_18.3.05 (Nov 19 2018) [64-bit]
++ Authored by: RW Cox
*+ WARNING: Can't open -atrcopy dataset AFNI_CLUSTSIM_NN1_1sided
*+ WARNING: Can't open -atrcopy dataset AFNI_CLUSTSIM_NN2_1sided
++ Processing AFNI dataset desvbel+tlrc
 + atrcopy
++ applying attributes
++ 3drefit processed 1 datasets

The output that was generated was the following:
belvdes.000.sdat
belvdes.001.sdat
belvdes.002.sdat
belvdes.003.sdat
belvdes.004.sdat
belvdes.005.sdat
belvdes.006.sdat
belvdes.007.sdat
belvdes.A.5percent.txt
belvdes.resid.nii
belvdes+tlrc.BRIK
belvdes+tlrc.HEAD
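
As far as I can tell, those per-job .sdat files are what the -Clustsim step hands to 3dClustSim via -insdat, so in principle they could be reused without redoing the t-test. A rough sketch of what that call might look like (the exact option order is my assumption, and on the current grid it would presumably hit the same error):

3dClustSim -insdat ./individual_subj_data/TLRC_full_resampled_mask+tlrc \
           belvdes.*.sdat \
           -prefix belvdes.CSim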

Thank you!!!
Regan

Hi Regan,

It looks like the Clustsim code is written to be lighter/faster by using unsigned bytes as index values, but your datasets are too high resolution for that (some dimension is above 255 voxels). The message refers to being able to compile a version that allows for the larger dimensions.
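
A quick way to check is to print the voxel counts along each axis, e.g. for the mask from your command (just a sketch with 3dinfo; any of the first three numbers above 255 would trigger this):

3dinfo -n4 ./individual_subj_data/TLRC_full_resampled_mask+tlrc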

Is this particularly high-resolution for some reason (perhaps non-human data)?

- rick

Hi Rick, Thanks for responding so quickly! The data are from humans and were collected on a 3T scanner. Moreover, I have analyzed this exact dataset before with 3dttest++ and -Clustsim. The only difference is that in the original analyses I warped the data to TLRC using AFNI, whereas this time I warped it with ANTs, so I assume the issue is caused by the ANTs warping. I noticed that the grid spacing of the ANTs-warped data is 1x1x1, while the grid spacing of the original AFNI-warped data is 2.2x2.2x2.2 (the same as the raw data). Could this be the cause of the issue? If so, it seems like it could be easily fixed by resampling the ANTs-warped data to, say, 2x2x2, right?
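
If resampling turns out to be the fix, I am guessing something along these lines would do it for each warped dataset (a sketch; the file name is a placeholder and the -rmode choice is just a reasonable default, with NN being the safer pick for the mask so it stays binary):

3dresample -dxyz 2 2 2 -rmode Cu \
           -input  deoblw_shellgame_SUBJ_belvdes_glt_template.nii \
           -prefix deoblw_shellgame_SUBJ_belvdes_glt_template_2mm.nii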

Thank you again!
Regan

Yes. You probably have 256 voxels in some direction, which is just 1 more than the program allows for (with the current compile flag).

- rick

Yup! That worked. Thanks again Rick!

Hi all,

I am in the same boat here, with higher resolution (normalized in ANTs) than AFNI seems to allow. I’d prefer not to resample my data again. Is there a way to bypass this with some flag to allow more voxels for Clustsim?

Thanks!
Rachel

Hi Rachel,

We will take a look at this, thanks!

- rick

Hi AFNI Experts,
I am getting the same error message:
++ Loading -insdat datasets
** FATAL ERROR: dataset grid too big -- must recompile to use shorts as indexes :(
** Program compile date = Jul 25 2018
** FATAL ERROR: ===== 3dClustSim command failed :-((( =====
** Program compile date = Jul 25 2018
Is there a way to resolve this without resampling my data?
Thanks,
Lisa

Hi Lisa,

I recall finding the source of that, but not sure whether it was fixed (last May). Let me peek again…

- rick

Hi Lisa,

That seems to have been fixed last April (to allow 256). The restriction had been a mistake.

Are you able to update the binaries?
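
If so, the usual self-update should be enough (assuming a standard binary install):

@update.afni.binaries -d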

Thanks,

- rick