Downgrade AFNI possible? New version breaks pipeline

I am having the same issue with 3dTcorr1D in C-PAC after updating AFNI in July; the error is below. I am on Ubuntu 16.04.
I need to get an older version of AFNI (pre-June 2019) for Ubuntu 16.04. Is this possible? Otherwise I am quite stuck.
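Until the underlying bug is fixed, one workaround is to keep an older precompiled `linux_openmp_64` package unpacked next to the current one and pick which install a shell session uses via `PATH`. This is only a sketch: the directory names `abin_old` and `abin_new` are placeholders, not anything AFNI creates for you.

```shell
# Hypothetical layout (names are examples, not AFNI conventions):
#   ~/abin_new  - the current (Jun 25 2019) binaries
#   ~/abin_old  - an older linux_openmp_64 package unpacked here

# Put the older install first on PATH for this shell session only:
export PATH="$HOME/abin_old:$PATH"

# Confirm which binaries are now being picked up:
which 3dTcorr1D
```

Because the `export` affects only the current shell, your normal environment (and any other pipelines) keeps using the new binaries.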

error:
3dTcorr1D main
** Command line was:
3dTcorr1D -pearson -prefix scrubbed_preprocessed_correlation.nii.gz /home/jlee38/RS/wrk/resting_preproc_113051-100/sca_roi_0/_scan_funcf/_threshold_0.25/_compcor_ncomponents_5_selector_pc10.linear1.wm1.global1.motion1.quadratic0.gm0.compcor0.csf1/_bandpass_freqs_0.008.0.08/_mask_tt_mask_pad/3dTCorr1D/scrubbed_preprocessed.nii.gz /home/jlee38/RS/wrk/resting_preproc_113051-100/roi_timeseries_for_sca_0/_scan_funcf/_threshold_0.25/_compcor_ncomponents_5_selector_pc10.linear1.wm1.global1.motion1.quadratic0.gm0.compcor0.csf1/_bandpass_freqs_0.008.0.08/_mask_tt_mask_pad/3dROIstats/roi_stats.csv
** AFNI compile date = Jun 25 2019
** [[Precompiled binary linux_openmp_64: Jun 25 2019]]
** Program Crash **

------ CRASH LOG ------------------------------**
Fatal Signal 11 (SIGSEGV) received
... recent internal history ...
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=452) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
THD_datablock_apply_atr -- brick statistics {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=452) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
+++++++THD_find_atr [7]: {ENTRY (file=thd_atr.c line=408) from THD_datablock_apply_atr {500 ms}
-------THD_find_atr [7]: EXIT} (file=thd_atr.c line=448) to THD_datablock_apply_atr {500 ms}
------THD_datablock_apply_atr [6]: EXIT} (file=thd_initdblk.c line=1178) to THD_open_nifti {500 ms}
-----THD_open_nifti [5]: EXIT} (file=thd_niftiread.c line=800) to THD_open_one_dataset {500 ms}
+++++THD_patch_brickim [5]: {ENTRY (file=thd_loaddblk.c line=1274) from THD_open_one_dataset {500 ms}
-----THD_patch_brickim [5]: EXIT} (file=thd_loaddblk.c line=1313) to THD_open_one_dataset {500 ms}
+++++THD_report_obliquity [5]: {ENTRY (file=thd_coords.c line=704) from THD_open_one_dataset {500 ms}
++++++AFNI_process_environ [6]: {ENTRY (file=afni_environ.c line=112) from THD_report_obliquity {500 ms}
+++++++AFNI_suck_file [7]: {ENTRY (file=afni_environ.c line=25) from AFNI_process_environ {500 ms}
-------AFNI_suck_file [7]: EXIT} (file=afni_environ.c line=30) to AFNI_process_environ {500 ms}
------AFNI_process_environ [6]: EXIT} (file=afni_environ.c line=132) to THD_report_obliquity {500 ms}
-----THD_report_obliquity [5]: EXIT} (file=thd_coords.c line=711) to THD_open_one_dataset {500 ms}
----THD_open_one_dataset [4]: EXIT} (file=thd_opendset.c line=252) to THD_open_dataset {500 ms}
++++THD_patch_brickim [4]: {ENTRY (file=thd_loaddblk.c line=1274) from THD_open_dataset {500 ms}
----THD_patch_brickim [4]: EXIT} (file=thd_loaddblk.c line=1313) to THD_open_dataset {500 ms}
---THD_open_dataset [3]: EXIT} (file=thd_mastery.c line=171) to 3dTcorr1D main {500 ms}
+++mri_read_1D [3]: {ENTRY (file=mri_read.c line=2997) from 3dTcorr1D main {500 ms}

I am not sure 3dTcorr1D would know what to do with a .csv file. Are there actual commas in it? If so, that is surely the problem.

If it is a simple time series, please rename it to a .1D file, try again, and let us know how that goes.

Thanks,

  • rick

Looking more closely, it seems this is intended to work. However, there is a function call that needs special defines depending on the system type. I expect this worked for you on a Mac, but not on Ubuntu.

Also, it seems that .csv files are required to have a header line. Does yours?
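A quick way to answer both questions above, and to produce a comma-free copy while waiting on the new build, is a couple of standard shell commands (again, `roi_stats.csv` is a placeholder for the full path in the log):

```shell
# Does the first line look like a header, and how many lines contain commas?
head -n 1 roi_stats.csv
grep -c ',' roi_stats.csv

# If commas turn out to be the issue, a space-separated copy may sidestep
# them until the fixed build lands:
tr ',' ' ' < roi_stats.csv > roi_stats.1D
```

`tr` only swaps delimiter characters, so the numeric values themselves are untouched.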

I am making an update so that this should work under Linux, and will run a new build tonight.

Thanks for letting us know,

  • rick