Hi AFNI experts (most directly, Gang),
I am trying to run a 3dLME command and am having trouble with what appears to be the reading of the input files.
I am trying to test whether the association between my IV (SSRT) and BOLD activity differs between two time points (time point 1 = BL, time point 2 = FU2). The model pasted below is what I am using to run this. What I think I have set up is: GLT1 = the effect of SSRT on BOLD at BL; GLT2 = the effect of SSRT on BOLD at FU2; GLT3 = the difference between the effect of SSRT at BL and the effect of SSRT at FU2. Here are a few details that might be helpful:
- there are 538 total participants (only 5 are shown here for brevity). All participants have data for both the BL and FU2 time points. In my dataTable, the last line of the table does not end with a trailing backslash ('\').
- The input files that I am calling on are separate contrasts for stop-related activity generated from the first-level GLM. Therefore, there are no sub-bricks; each file contains only one value per voxel (a quick 3dinfo check of this is sketched right after this list).
- I tried running the same model using 3dMVM (because I have an equal number of subjects at both time points), but the results were completely null (either "N/A" or "0.00" throughout the results table).
- I have run this same command both with and without the single quotes surrounding the filenames, and it does not make a difference.
- the file paths are correct
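For the single-volume and file-path points above, this is roughly the check I have in mind (a minimal sketch; the wildcard patterns are just my shorthand for the file naming shown in the dataTable below):

# Each output line should show an existing file and nv = 1 (a single sub-brick per dataset)
3dinfo -header_line -prefix -nv \
    /Volumes/NickBackUp/StopConRegressions/LME/BL_538/*_BL_stopcon.nii \
    /Volumes/NickBackUp/StopConRegressions/LME/FU2_538/*_FU2_stopcon.nii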
Here is the command I am using:
3dLME -prefix SSRT -jobs 1 \
-mask StdGMmask_resampled+tlrc.HEAD \
-model 'SSRT*Time' \
-qVars 'SSRT' \
-qVarCenters '215' \
-ranEff '~1+SSRT' \
-SS_type 3 \
-num_glt 3 \
-gltLabel 1 'BL_Effects' -gltCode 1 'Time : 1*BL SSRT :' \
-gltLabel 2 'FU2_Effects' -gltCode 2 'Time : 1*FU2 SSRT :' \
-gltLabel 3 'BL_v_FU2' -gltCode 3 'Time : -1*BL 1*FU2 SSRT :' \
-dataTable \
Subjects Time SSRT InputFile \
s01 BL 255.5 '/Volumes/NickBackUp/StopConRegressions/LME/BL_538/000000240546_BL_stopcon.nii' \
s01 FU2 257 '/Volumes/NickBackUp/StopConRegressions/LME/FU2_538/000000240546_FU2_stopcon.nii' \
s02 BL 202.46 '/Volumes/NickBackUp/StopConRegressions/LME/BL_538/000000297685_BL_stopcon.nii' \
s02 FU2 206 '/Volumes/NickBackUp/StopConRegressions/LME/FU2_538/000000297685_FU2_stopcon.nii' \
s03 BL 275 '/Volumes/NickBackUp/StopConRegressions/LME/BL_538/000000469693_BL_stopcon.nii' \
s03 FU2 175 '/Volumes/NickBackUp/StopConRegressions/LME/FU2_538/000000469693_FU2_stopcon.nii' \
s04 BL 246.83 '/Volumes/NickBackUp/StopConRegressions/LME/BL_538/000000556983_BL_stopcon.nii' \
s04 FU2 223 '/Volumes/NickBackUp/StopConRegressions/LME/FU2_538/000000556983_FU2_stopcon.nii' \
s05 BL 272.25 '/Volumes/NickBackUp/StopConRegressions/LME/BL_538/000000613223_BL_stopcon.nii' \
s05 FU2 244 '/Volumes/NickBackUp/StopConRegressions/LME/FU2_538/000000613223_FU2_stopcon.nii' \
…
When I run this, I get the following output:
Loading required package: nlme
Package nlme loaded successfully!
Loading required package: phia
Loading required package: car
Loading required package: carData
Package phia loaded successfully!
Warning messages:
1: package ‘car’ was built under R version 3.4.4
2: package ‘carData’ was built under R version 3.4.4
++++++++++++++++++++++++++++++++++++++++++++++++++++
***** Summary information of data structure *****
#####this goes on to list the summary of the subjects and data…
Contingency tables of subject distributions among the categorical variables:
Tabulation of subjects against all categorical variables
Subj vs Time:
BL FU2
s01 1 1
s02 1 1
s03 1 1
s04 1 1
s05 1 1
... ###### this goes on throughout all 538 participants....
###### AND THEN THIS ERROR:
No traceback available
** ERROR: Dset s01 could not be loaded
Warning in parse.AFNI.name(filename) :
filename >>s01<< not a character string
###### for each participant...
######and this at the very end:
** Error:
At least one of the input files has different dimensions!
Run "3dinfo -header_line -prefix -same_grid -n4 *.HEAD" in the directory where
the files are stored, and pinpoint out which file(s) is the trouble maker.
Replace *.HEAD with *.nii or something similar for other file formats.
###### when I run this 3dinfo command, the output is identical for every file (same grid and dimensions throughout).
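For reference, this is approximately how I ran that check, pointing 3dinfo at both time-point directories in one call (the same options the error message suggests; the wildcards again assume the file naming shown in the dataTable above):

# Lists prefix, grid-match flag, and n4 dimensions for every input from both time points,
# so any file on a different grid should stand out
3dinfo -header_line -prefix -same_grid -n4 \
    /Volumes/NickBackUp/StopConRegressions/LME/BL_538/*_BL_stopcon.nii \
    /Volumes/NickBackUp/StopConRegressions/LME/FU2_538/*_FU2_stopcon.nii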
Please let me know where you think the error is coming from and what I should do to address this.
Thank you for the help,
Nick