I want to use this function with the “Conv_PRF” signal model
in order to make a pRF visual field coverage plot similar to Silson et al., Journal of Neuroscience (2015),
but I could not find any information on what to supply for “AFNI_MODEL_PRF_STIM_DSET”, either in the function's help, the paper, or via Google.
I ran the help command:
3dNLfim -signal Conv_PRF -noise Zero -DAFNI_MODEL_HELP_CONV_PRF=Y -load_models
but it only mentions:
"1. Generate the stimulus time series (currently, images must be square).
This should be a 2D+time dataset of visual stimuli over time. They
are viewed as binary masks by the model function.
If results are computed on a restricted grid (which is much faster
and is the default (see AFNI_MODEL_PRF_ON_GRID)), the resolution of
those X,Y results will come directly from this stimulus dataset.
It might be reasonable to have this be 100 or 200 (or 101 or 201)
voxels on a side."
(Minor question: what does this mean? How can I tell whether my results are computed on the restricted grid (which is much faster and is the default; see AFNI_MODEL_PRF_ON_GRID) or not?)
"The amount of memory used for the precomputation should be the size
of this dataset (in float format) times AFNI_MODEL_PRF_SIGMA_NSTEPS.
It is converted to floats because it is blurred internally.
The default AFNI_MODEL_PRF_SIGMA_NSTEPS is 100."
(Minor question: what does this mean? How can I constrain the precomputation memory to the size of "this" dataset, whichever dataset that refers to, times AFNI_MODEL_PRF_SIGMA_NSTEPS?)
The help also mentions “setenv AFNI_MODEL_PRF_STIM_DSET stim.144.bmask.resam+tlrc”.
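For reference, my current guess (which may well be wrong) is that these are environment variables read by the model at run time, so a csh setup would be something like the following sketch, with a placeholder dataset name:

```shell
# hypothetical csh setup before running 3dNLfim (dataset name is a placeholder)
setenv AFNI_MODEL_PRF_STIM_DSET    stim.360+orig   # 2D+time stimulus masks
setenv AFNI_MODEL_PRF_ON_GRID      Y               # restricted grid (the default?)
setenv AFNI_MODEL_PRF_SIGMA_NSTEPS 100             # default number of sigma steps
```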
Therefore, here are my major questions. What type of file should I supply when using the Conv_PRF signal model?
How can I put a 2D+time dataset of visual stimuli over time into .tlrc form?
Can I supply it in another form, such as BRIK/HEAD or a .mat file?
(For example, I made a binary (.mat) matrix of 100x100 pixels of the moving-bar stimulus aperture, TR by TR,
when I tried to implement Kay’s MATLAB code for pRF analysis.)
Since you are in matlab, can you make a 2D+time NIFTI
dataset from those images? Note that it would not
be a 3D dataset, exactly, but with non-trivial
dimensions in X, Y and T (where nZ = 1).
In our case, they started out as BMP images, and so
I used to3d to put them together. If there were
JPEG images, 3dTcat would work more directly. The
orientation was set to be LIA (meaning left-to-right
is fastest, then inf-to-sup is next, and there is
only one slice for ant-to-post). The dimensions
were resampled to 201x201 to basically make a [-1,1]
range at a resolution of 0.01.
The resolution of these stimulus images defines
the resolution of the solution space, since the X,Y
coordinates are based on the images.
Anyway, if you can put them into NIFTI format, say,
or even JPEG, then I can help put them together. It
might help for me to get a copy of that data though.
I have converted my pRF retinotopic scan stimulus into JPEG images, 201 x 201 pixels x 360 TRs.
Here is the URL for the stimulus zip folder, in case it is the copy of the data you were asking for: https://www.dropbox.com/s/trrdwgpq6hjwwwx/stimJpgResize.zip?dl=0
I was looking for a way to use 3dTcat to put these JPEG images into a 2D+time NIFTI dataset,
but I was not sure whether I could use JPEG images as inputs instead of HEAD/BRIK/nii files,
or how to set the orientation to LIA.
Could you tell me how to turn these JPEG files into the Conv_PRF model input format?
Again I appreciate your detailed answer!
Thank you!
Yes, you can directly input JPEG images to 3dTcat, etc.,
but you might first need to zero-pad the image numbers
so that the files sort alphabetically. Consider something like:
mkdir rename
cd stimJpgResize
foreach file ( *.jpg )
   # extract the numeric index, zero-pad it, insert into the new name
   set ind = `echo $file:r | cut -b 8-`
   set pad = `printf "%03d" $ind`
   cp -pv $file ../rename/stim.$pad.jpg
end
cd ..
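(If your shell is bash rather than tcsh, an equivalent sketch of the same zero-padding rename would be the following; the sample file names here are just for illustration, assuming names of the form stimJpgN.jpg:)

```shell
# bash version of the zero-padding rename (sketch with hypothetical file names)
mkdir -p demo/stimJpgResize demo/rename
touch demo/stimJpgResize/stimJpg1.jpg demo/stimJpgResize/stimJpg2.jpg \
      demo/stimJpgResize/stimJpg10.jpg        # sample inputs for illustration
cd demo/stimJpgResize
for file in *.jpg; do
    ind=${file#stimJpg}                        # strip the fixed prefix
    ind=${ind%.jpg}                            # strip the extension, keep the index
    pad=$(printf "%03d" "$ind")                # zero-pad to 3 digits
    cp -pv "$file" ../rename/stim.$pad.jpg
done
cd ../..
ls demo/rename    # lists stim.001.jpg, stim.002.jpg, stim.010.jpg
```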
Then run 3dTcat (setting the TR?), then set the orientation:
3dTcat -prefix stim.360 -TR 2 rename/stim.*.jpg
3drefit -orient LIA stim.360+orig
How does that seem?
rick