Issue running ANOVA


I recently attempted to run a 3x3x2 ANOVA on a Linux computer and got a memory error. We tried increasing swap space, but we still got the same error. We also tried running the ANOVA on a Mac that has handled a similarly sized data set; it loaded the appropriate files but was then “killed” shortly afterward. We tried running a 3x2 ANOVA on the Linux computer this afternoon, and it ran, but it would be ideal if the whole data set could be processed at once. I would appreciate any suggestions we could try!

Thanks so much,


Please post your exact commands as well as the output of: afni_system_check.py -check_all

Thank you for your help!

The command that I ran was:

tcsh -x MVManovaFixed.txt > diary.txt &

And here is the relevant output:

-------------------------------- general ---------------------------------
architecture: 64bit ELF
system: Linux
release: 3.13.0-165-generic
version: #215-Ubuntu SMP Wed Jan 16 11:46:47 UTC 2019
distribution: Ubuntu 14.04 trusty
number of CPUs: 8
apparent login shell: bash
shell RC file: .bashrc (exists)

--------------------- AFNI and related program tests ---------------------
which afni : /home/abclab/linux_openmp_64/afni
afni version : Precompiled binary linux_openmp_64: Mar 3 2017
: AFNI_17.0.13
AFNI_version.txt : AFNI_17.0.13, linux_openmp_64, Mar 03 2017
which python : /usr/bin/python
python version : 2.7.6
which R : /usr/bin/R
R version : R version 3.4.4 (2018-03-15) -- "Someone to Lean On"
which tcsh : /usr/bin/tcsh

instances of various programs found in PATH:
afni : 1 (/home/abclab/linux_openmp_64/afni)
R : 1 (/usr/bin/R)
python : 1 (/usr/bin/python2.7)
python2 : 1 (/usr/bin/python2.7)
python3 : 1 (/usr/bin/python3.4)

testing ability to start various programs...
afni : success
suma : success
3dSkullStrip : success
3dAllineate : success
3dRSFC : success
SurfMesh : success

checking for R packages…
rPkgsInstall -pkgs ALL -check : success

checking for $HOME files…
.afnirc : missing
.sumarc : missing
.afni/help/all_progs.COMP : missing

------------------------------ python libs -------------------------------
++ module 'PyQt4' found at /usr/lib/python2.7/dist-packages/PyQt4
++ module loaded: PyQt4

-------------------------------- env vars --------------------------------
PATH = /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/abclab/MATLAB/R2016b/bin:/home/abclab/Slicer-4.6.3:/home/abclab/linux_openmp_64


------------------------------ data checks -------------------------------
data dir : missing AFNI_data6
data dir : missing AFNI_demos
data dir : missing suma_demo
data dir : missing afni_handouts
atlas : found TT_N27+tlrc under /home/abclab/linux_openmp_64

------------------------------ OS specific -------------------------------
which apt-get : /usr/bin/apt-get
apt-get version : apt 1.0.1ubuntu2 for amd64 compiled on Jan 18 2019 20:24:31

have Ubuntu system: Ubuntu 14.04 trusty

========================= summary, please fix: =========================

  • login shell 'bash', trusting user to translate from 'tcsh'
  • shell bash: consider sourcing (non-login) .bashrc from (login) .profile
  • consider running: cp /home/abclab/linux_openmp_64/AFNI.afnirc ~/.afnirc
  • consider running "suma -update_env" for .sumarc
  • consider running: apsearch -update_all_afni_help
  • insufficient data for AFNI bootcamp

AFNI_17.0.13, linux_openmp_64, Mar 03 2017

Your AFNI is two years old. It’s highly recommended that you update your software:

@update.afni.binaries -d

However, the memory issue is likely unrelated to the AFNI version in this particular case. How many input files are involved in this analysis? What is the voxel size, and how many voxels are there in each dimension?

Thanks for your response! We’ve updated our AFNI, just to be safe. There are 576 input files: 32 participants and 9 conditions, with one file for oxy and one for deoxy per condition (this is image-based fNIRS data). Each image has 256^3 voxels, and each voxel is 1 cubic cm. The files are too big to attach, but here’s a link to an example on Dropbox:

> each voxel is 1 cubic cm

Presumably you mean 1 cubic mm. That means you have over 67 GB of input data, a size your computer was not happy with. Two possible solutions:

  1. down-sample the data
  2. analyze each slice separately
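For concreteness, here is a rough sketch of what those two options could look like with standard AFNI command-line programs (3dresample for down-sampling; 3dZcutup and 3dZcat for slab-wise analysis). The dataset names (e.g. subj01_oxy+tlrc) are hypothetical placeholders, and the size estimate assumes each value occupies 8 bytes once loaded into memory, which is an assumption rather than a measured figure:

```shell
# Back-of-the-envelope size check: 576 files x 256^3 voxels each,
# assuming 8 bytes per value in memory (an assumption).
files=576
voxels=$((256 * 256 * 256))
echo "approx $(( files * voxels * 8 / 1000000000 )) GB in memory"

# Option 1: down-sample each input, e.g. from 1 mm to 2 mm voxels,
# which cuts the voxel count by a factor of 8 (names are placeholders):
#   3dresample -dxyz 2 2 2 -prefix subj01_oxy_ds -inset subj01_oxy+tlrc

# Option 2: cut each dataset into z-slabs with 3dZcutup, run the
# ANOVA on each slab separately, then reassemble with 3dZcat:
#   3dZcutup -keep 0 127   -prefix subj01_oxy_slab1 subj01_oxy+tlrc
#   3dZcutup -keep 128 255 -prefix subj01_oxy_slab2 subj01_oxy+tlrc
#   ...after analyzing each slab...
#   3dZcat -prefix result_full result_slab1+tlrc result_slab2+tlrc
```

Either way, each individual job then touches only a fraction of the full 67+ GB at once.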