question about analysing preprocessed HCP data

Hi All,

I am analysing some minimally preprocessed resting-state data from the Human Connectome Project and need some pointers. I have attached a file that shows the different files available for a subject. The data has been spatially normalized to MNI space and minimally preprocessed. It looks like I have the motion-corrected data and the motion parameters.

I need to apply detrending, band-pass filtering, despiking and motion scrubbing to this data. As you can see from the screenshot, the motion parameters have already been estimated. They are output in a twelve-column text file with the following format: <translation> <rotation> <derivative of translation> <derivative of rotation>. A demeaned and linearly detrended motion parameter file is provided as well, for nuisance regression.

I am looking at 3dBandpass and 3dTproject to do the above steps: detrending, band-pass filtering, despiking and motion scrubbing.

  1. Is the order of the processing steps above correct?

  2. How can I estimate what needs to be scrubbed? Which programs can I use to decide which time points to scrub?

Please let me know.

Thanks a lot


This query may have fallen through the cracks, so I am reposting it.

Please let me know your thoughts.


afni_proc.py is the program that will do all of this. It is able to call all of the other programs you mentioned. See the help for that program, paying particular attention to the examples, which get progressively more complicated. We also have the AFNI Academy videos (Start to Finish) and tutorials on
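Regarding question 2, the scrubbing decision can be made automatically from the motion parameters. As a rough illustration (not AFNI's exact code), the rule applied by the -regress_censor_motion option can be sketched as: take the per-TR backward differences of the six motion parameters, compute their Euclidean norm ("enorm"), and censor any time point whose norm exceeds a limit (values around 0.2-0.3 are common for resting state). A minimal sketch, assuming a plain-text motion file with one row per TR and six columns:

```python
import numpy as np

def censor_from_motion(motion, limit=0.2):
    """Return a 0/1 censor vector: 0 means scrub that time point.

    motion: (T, 6) array of motion parameters (rotations and translations).
    A time point is flagged when the Euclidean norm of the backward
    difference of the parameters exceeds `limit` (AFNI-style 'enorm').
    """
    motion = np.asarray(motion, dtype=float)
    deriv = np.diff(motion, axis=0)            # backward differences, (T-1, 6)
    enorm = np.sqrt((deriv ** 2).sum(axis=1))  # per-TR motion magnitude
    censor = np.ones(len(motion), dtype=int)
    bad = np.where(enorm > limit)[0]
    # censor both time points spanning a large motion
    censor[bad] = 0
    censor[bad + 1] = 0
    return censor

# tiny example: a 0.5 mm jump in one translation column at TR 3
mot = np.zeros((6, 6))
mot[3, 0] = 0.5
print(censor_from_motion(mot, limit=0.2))  # -> [1 1 0 0 0 1]
```

In practice you would not hand-roll this: afni_proc.py (or 1d_tool.py directly) builds the censor file for you; the sketch just shows what is being decided.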

Thank you for the quick reply. Will I be able to invoke it on the HCP minimally preprocessed dataset? The data I have downloaded from HCP has the motion parameters estimated and is nonlinearly registered to MNI space.


You could run it with options like:

-blocks blur scale regress
-regress_motion_file motion_params.txt

Then only include options for those 3 processing blocks.
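A fuller command along those lines might look like the sketch below. This is only an illustration under assumptions, not a validated pipeline: the input and motion file names (rfMRI_REST1_LR.nii.gz, Movement_Regressors.txt) are taken from the standard HCP layout, and the censor limit and band edges are common resting-state choices rather than recommendations.

```shell
# Hypothetical sketch for HCP minimally preprocessed data; adjust names and paths.
afni_proc.py \
    -subj_id sub001 \
    -dsets rfMRI_REST1_LR.nii.gz \
    -blocks blur scale regress \
    -regress_motion_file Movement_Regressors.txt \
    -regress_apply_mot_types demean deriv \
    -regress_censor_motion 0.2 \
    -regress_bandpass 0.01 0.1
```

Here -regress_censor_motion answers question 2: time points whose per-TR motion exceeds the limit are censored from the regression. To inspect the censoring on its own, 1d_tool.py with -censor_motion can build a censor file directly from the motion parameter file.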

  • rick