Dear AFNI team,
I am looking to downscale an image (PET, BOLD) to a lower resolution in order to match images with a lower resolution and SNR.
I am using images from different species, scanned with different machines, with very different SNRs.
Can you think of a meaningful way of downscaling a high resolution/SNR image to match an image at a lower resolution and SNR that would make them more “comparable”?
I guess smoothing would be one step, but that alone would not really add noise to the high-resolution/high-SNR image.
Thank you in advance!
Can you show an example of what you mean?
For example, I have a human PET image, with its associated SNR and resolution.
I also have a mouse PET image, but with a lower SNR and higher resolution.
I would like to add “noise” to the human image in order to fit the mouse SNR and maybe signal distribution.
I hope that this is clearer :-D
I did mean “show” with images, but okay. You don’t mean a TSNR map, but just a single volume of PET imaging from some scan of a radioisotope that shows whole-brain coverage? I am assuming FDG as the tracer, then? Are you looking at the mean, the median image, the peak, or something else?
Is the SNR computed as the ratio of the mean signal within the brain to its own standard deviation, or to the standard deviation outside the brain (in air, in the corners, …), or is it only used in a more qualitative way? Are the uptake volumes normalized by a ratio to a particular part of the brain, like the cerebellum?
In a general sense, 3dUnifize normalizes data, and it has a radius parameter that can be adjusted for different species. This is a non-smoothing operation: it just varies intensities, scaling them to have a value of about 1000 in a sphere that covers a fair chunk of the input. I think you can ignore the GM option for noisy PET data. 3dLocalUnifize does a similar job, but scales by the median value in a spherical neighborhood to values around 1.0. Again, the radius is an option.
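As a rough sketch of the radius bookkeeping (the 1.8 mm target radius, the 0.1 mm voxel size, and the filenames are hypothetical mouse-scale values, not a recommendation), you could pick -Urad from a physical radius like this:

```shell
#!/bin/sh
# Sketch: choose 3dUnifize's -Urad (a radius in voxels) so that it covers
# a given physical radius.  radius_mm and voxel_mm are hypothetical
# mouse-scale example numbers.
radius_mm=1.8
voxel_mm=0.1
urad=$(awk -v r="$radius_mm" -v v="$voxel_mm" 'BEGIN { printf "%.0f", r / v }')
echo "Using -Urad ${urad}"

# Only run the AFNI command if AFNI is actually installed.
if command -v 3dUnifize >/dev/null 2>&1; then
  3dUnifize -input mouse_pet.nii.gz -Urad "$urad" -prefix mouse_pet_un.nii.gz
fi
```

The same kind of proportional scaling applies to the radius option of 3dLocalUnifize.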
Making data noisier can be done with 3dcalc. Its expression language has various random-number generators. I think I have only used gran in the past to add random noise, but you have choices:
- gran(m,s) returns a Gaussian deviate with mean=m, stdev=s
- uran(r) returns a uniform deviate in the range [0,r]
- iran(t) returns a random integer in the range [0..t]
- eran(s) returns an exponentially distributed deviate with parameter s; mean=s
- lran(t) returns a logistically distributed deviate with parameter t; mean=0, stdev=t*1.814
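Since independent noise adds in variance, the stdev of the Gaussian noise needed to drag a high-SNR image down to a target SNR is sqrt((mean/SNR_target)^2 - stdev_current^2). A sketch with hypothetical numbers (mu, sd_cur, snr_target, and the filenames are made up; in practice you would measure the mean and stdev inside a brain mask, e.g. with 3dBrickStat):

```shell
#!/bin/sh
# Sketch: pick the stdev of added Gaussian noise to hit a target SNR.
# mu, sd_cur, and snr_target are hypothetical example values.
mu=1000        # mean signal in the brain
sd_cur=50      # current stdev (so current SNR = 20)
snr_target=10  # desired lower SNR
sig_add=$(awk -v m="$mu" -v s="$sd_cur" -v t="$snr_target" \
  'BEGIN { v = (m/t)^2 - s^2; printf "%.2f", (v > 0) ? sqrt(v) : 0 }')
echo "Adding Gaussian noise with stdev ${sig_add}"

# Only run the AFNI command if AFNI is actually installed.
if command -v 3dcalc >/dev/null 2>&1; then
  3dcalc -a human_pet.nii.gz -expr "a+gran(0,${sig_add})" \
         -prefix human_pet_noisy.nii.gz
fi
```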
You may also want to blur the data in a similar spirit with 3dBlurToFWHM. The FWHM should be approximately proportional to the voxel size across the species.
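One way to keep the blur comparable across species is to express the target FWHM as a fixed multiple of the voxel size. A sketch (the 2.0 mm voxel size, the factor of 3, and the filenames are hypothetical):

```shell
#!/bin/sh
# Sketch: set the target FWHM as a fixed multiple of the voxel size,
# so the blur is species-proportional.  voxel_mm and blur_voxels are
# hypothetical example values.
voxel_mm=2.0
blur_voxels=3
fwhm=$(awk -v v="$voxel_mm" -v n="$blur_voxels" 'BEGIN { printf "%.1f", v * n }')
echo "Blurring to FWHM ${fwhm} mm"

# Only run the AFNI command if AFNI is actually installed.
if command -v 3dBlurToFWHM >/dev/null 2>&1; then
  3dBlurToFWHM -input human_pet.nii.gz -FWHM "$fwhm" \
               -prefix human_pet_blur.nii.gz -automask
fi
```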
Basically, everywhere a particular method uses a distance in mm, you would have to adjust it to accommodate the different species. That’s the purpose of the “feature_size” option in @animal_warper, which passes a radius to 3dAllineate for alignment. Nonlinear alignment with 3dQwarp works in voxel distances and doesn’t need adjustment.
I’m not sure at all if any of that is helpful, so feel free to follow up with more information and questions.