I’m currently testing the ETAC method on some data, and while it produces output that makes sense (in that it is similar to the standard 3dttest++ output), it is also producing errors that do not.
Specifically, it states that it cannot load a few 1D files during processing, but on examining the folder, those files are present and contain data. From the output in the terminal window, it looks as though there was an attempt to load the files before they were written. It is just an error that doesn’t crash processing, so I am not sure whether this is normal or expected.
Relevant output:
[truncated]
3443444455455556666566777776788888789.9.9.9.9.89.!
++ saving main effect t-stat MIN/MAX values in ./group/etac/hc/pre_pain_only_etac.0004.minmax.1D
++ output short-ized file ./group/etac/hc/pre_pain_only_etac.0004.sdat
!
++ saving main effect t-stat MIN/MAX values in ./group/etac/hc/pre_pain_only_etac.0005.minmax.1D
++ output short-ized file ./group/etac/hc/pre_pain_only_etac.0005.sdat
!
++ saving main effect t-stat MIN/MAX values in ./group/etac/hc/pre_pain_only_etac.0000.minmax.1D
++ output short-ized file ./group/etac/hc/pre_pain_only_etac.0000.sdat
+ 3dttest++ ===== simulation jobs have finished (1343.5 s elapsed)
** ERROR: Can't read file ./group/etac/hc/pre_pain_only_etac.0001.minmax.1D
** ERROR: Can't read file ./group/etac/hc/pre_pain_only_etac.0002.minmax.1D
** ERROR: Can't read file ./group/etac/hc/pre_pain_only_etac.0003.minmax.1D
** ERROR: Can't read file ./group/etac/hc/pre_pain_only_etac.0006.minmax.1D
+ 3dttest++ ===== starting 3dXClustSim : elapsed = 1344.9 s
++ Environment variable AFNI_AUTOGZIP already set to 'NO'. Value of 'YES' from /home/dowdlelt/.afnirc is ignored.
To kill such warnings Set AFNI_ENVIRON_WARNINGS to NO
++ 3dXClustSim: AFNI version=AFNI_18.2.00 (Jul 2 2018) [64-bit]
++ Authored by: Lamont Cranston
++ Loading -insdat datasets
++ Single FPR goal: 5.0%
++ minimum cluster size = 5 voxels
++ 3dXClustSim: Using 7 OpenMP threads
++ MultiThresh cluster FOM threshold method = cdf 90.0%
++ STEP 1a: start 1-sided clustering with NN=2
!
++ saving main effect t-stat MIN/MAX values in ./group/etac/hc/pre_pain_only_etac.0003.minmax.1D
++ output short-ized file ./group/etac/hc/pre_pain_only_etac.0003.sdat
!
++ saving main effect t-stat MIN/MAX values in ./group/etac/hc/pre_pain_only_etac.0001.minmax.1D
++ output short-ized file ./group/etac/hc/pre_pain_only_etac.0001.sdat
!
[truncated]
Could this be due to the parallel processing being used? Thanks for the help; I just want to make sure the output is valid.
As an update to this, I tested reducing OMP_NUM_THREADS from 7 down to 3, 2, and 1. The original example above used 7 threads and had 4 apparent load errors. With 3 and 2 threads I still had 2 and 1 errors, respectively. Turning off multi-threading (the 1-thread case) led to no errors whatsoever (but took forever, of course).
So if the error matters, it appears to depend on the number of parallel threads used.
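If the loaded-before-written guess is right, the symptom can be mimicked with a toy writer/reader race. This is just an illustrative sketch, not AFNI code; the file name `demo.minmax.1D` and the timings are made up:

```shell
# Toy reproduction of the suspected race: the reader runs before the
# background "writer" job has created the file, so the read fails even
# though the file exists (with data) moments later.
f=demo.minmax.1D
rm -f "$f"
( sleep 1; printf '1 2 3\n' > "$f" ) &          # writer finishes late
cat "$f" 2>/dev/null || echo "** ERROR: Can't read file $f"
wait                                            # writer completes
cat "$f"                                        # now the file reads fine
rm -f "$f"
```

The first `cat` fails exactly the way the `** ERROR: Can't read file` lines above do, which is consistent with the errors scaling with the number of threads.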
dowdlelt@hypnos:~$ afni_system_check.py -check_all
-------------------------------- general ---------------------------------
architecture: 64bit
system: Linux
release: 4.15.0-041500-generic
version: #201801282230 SMP Sun Jan 28 22:31:30 UTC 2018
distribution: debian stretch/sid
number of CPUs: 16
apparent login shell: bash
shell RC file: .bashrc (exists)
--------------------- AFNI and related program tests ---------------------
which afni : /home/dowdlelt/abin/afni
afni version : Precompiled binary linux_ubuntu_16_64: Jul 6 2018
: AFNI_18.2.04
AFNI_version.txt : AFNI_18.2.04, linux_ubuntu_16_64, Jul 06 2018
which python : /home/dowdlelt/anaconda3/bin/python
python version : 3.6.3
which R : /usr/bin/R
R version : R version 3.4.1 (2017-06-30) -- "Single Candle"
which tcsh : /usr/bin/tcsh
instances of various programs found in PATH:
afni : 1 (/home/dowdlelt/abin/afni)
R : 1 (/usr/bin/R)
python : 2
/home/dowdlelt/anaconda3/bin/python3.6
/usr/bin/python2.7
python2 : 1 (/usr/bin/python2.7)
python3 : 2
/home/dowdlelt/anaconda3/bin/python3.6
/usr/bin/python3.5
testing ability to start various programs...
afni : success
suma : success
3dSkullStrip : success
uber_subject.py : success
3dAllineate : success
3dRSFC : success
SurfMesh : success
3dClustSim : success
checking for R packages...
rPkgsInstall -pkgs ALL -check : FAILURE
oo Warning:
These packages are not installed on the computer: brms!
checking for $HOME files...
.afnirc : found
.sumarc : found
.afni/help/all_progs.COMP : found
------------------------------ python libs -------------------------------
** python module not found: PyQt4
-- PyQt4 is no longer needed for an AFNI bootcamp
-------------------------------- env vars --------------------------------
PATH = /home/dowdlelt/bin:/home/dowdlelt/anaconda3/bin:/home/dowdlelt/soft/code/other/pestica4/slomoco:/home/dowdlelt/soft/code/other/pestica4:/usr/local/fsl/bin:/home/dowdlelt/bin:/home/dowdlelt/soft/mrtrix3/bin:/home/dowdlelt/bin:/usr/local/freesurfer/bin:/usr/local/freesurfer/fsfast/bin:/usr/local/freesurfer/tktools:/usr/local/fsl/bin:/usr/local/freesurfer/mni/bin:/home/dowdlelt/bin:/home/dowdlelt/bin:/home/dowdlelt/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/local/cuda/bin:/snap/bin:/home/dowdlelt/abin:/home/dowdlelt/soft/code/BROCCOLI/compiled/Bash/Linux/Release
PYTHONPATH =
R_LIBS = /home/dowdlelt/R
LD_LIBRARY_PATH = :/usr/lib:/home/dowdlelt/soft/code/BROCCOLI/code/BROCCOLI_LIB/clBLASLinux
DYLD_LIBRARY_PATH =
DYLD_FALLBACK_LIBRARY_PATH =
------------------------------ data checks -------------------------------
data dir : found AFNI_data6 under $HOME/afni_class_data
top history: ...PPI scripts are now based on FT_analysis/s05.ap.uber
data dir : found AFNI_demos under $HOME/afni_class_data
top history: ... [pault]: remove rank from FATCAT_DEMO, FAT_MVM_DEMO
data dir : missing suma_demo
data dir : found afni_handouts under $HOME/afni_class_data
atlas : found TT_N27+tlrc under /home/dowdlelt/abin
------------------------------ OS specific -------------------------------
which apt-get : /usr/bin/apt-get
apt-get version : apt 1.2.27 (amd64)
========================= summary, please fix: =========================
* login shell 'bash', trusting user to translate from 'tcsh'
* have python version 3.6.3, some programs need 2.7.x
* missing R packages (see rPkgsInstall)
* insufficient data for AFNI bootcamp
* consider running: apt-get install python-qt4
That is interesting. It looks like perhaps 3dttest++ continued when thread 0 finished. However, we have run this many times on other computers and have not seen such a problem.
Puzzling. For the record, I am using Ubuntu 16.04.4 LTS with 16 GB RAM (which is not exceeded during this process, as far as I can tell) and an AMD Ryzen 7 1700 8-core processor.
If I get a chance, I will try some other method, perhaps just -Clust, rather than -ETAC, to see if that provides any insight.
This is annoying, and I have seen it before: when other people run ETAC, but not when I have run it. Probably because I use a Mac (mostly).
The failure to read these files means that the ‘5percent’ files will not be produced (read the 3dttest++ -help output), but that is all.
What seems to be happening is a conflict when the multiple instances of 3dttest++ spawned by the master program try to write the files simultaneously. I’ll look at the code again and try to figure out some way around the problem.
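One common way around such write/read races (a general sketch of the technique, not what 3dttest++ actually does) is to write each file under a temporary name and rename it into place once it is complete. On a POSIX filesystem, a rename within the same filesystem is atomic, so a reader either sees no file at all or the finished file, never a half-written one. The file name below is a placeholder:

```shell
# Sketch: publish a file atomically via write-to-temp + rename.
out=demo.minmax.1D
printf '1 2 3\n' > "${out}.tmp.$$"   # writer fills a temp file first
mv "${out}.tmp.$$" "$out"            # mv on one filesystem is an atomic rename(2)
cat "$out"
rm -f "$out"
```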
Thanks for the follow-up. If it is just the *.5percent.txt files, I am less concerned; while I do like free things, I don’t think I would be using those. It is good to know that it is not an issue that makes the other results deceptive and/or inaccurate.
At this moment, I’m trying to figure out how to avoid this problem, but the trouble I’m having now is that I cannot duplicate the issue.
However, I’ve added a message to inform the user, when such ‘minmax.1D’ files can’t be found, that the only loss is the ‘5percent.txt’ files, which I only put in there because one person asked for them and because they are essentially trivial to compute.
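A workaround is also possible on the reader side (again just an illustrative sketch with made-up names, not AFNI code): poll briefly for a late file before giving up, so a writer that is a second behind does not trigger a spurious error:

```shell
# Sketch: wait up to ~5 s for a file that a parallel job may still be writing.
f=demo.0001.minmax.1D
rm -f "$f"
( sleep 1; printf 'ok\n' > "$f" ) &   # simulated late writer
for i in 1 2 3 4 5; do
  [ -s "$f" ] && break                # stop as soon as the file has data
  sleep 1
done
cat "$f"
wait
rm -f "$f"
```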