No FT file under AFNI_analysis


I have been following the "FMRI Analysis: Start to End: Part 1 of 5" video, and I downloaded AFNI_data6.
In the video, I can see that when Rick types "ls" in AFNI_analysis, he has an FT input data directory, while in my attachment you can see that I do not. Rick then runs "cd FT" and then "ls" in order to reach the timing files, etc. My question is: why don't I have the FT directory under AFNI_analysis?

Thanks so much for the help!


Hi, Jessica-

If you go into the distributed AFNI_data6/FT_analysis/ directory and type “ls”, then you should see these items (some files, some directories; if you have run some scripts, there might be more items):

compare.s05.vs.ex6b  README.txt           s03.ap.surface  s12.proc.FT.align
compare.s05.vs.NL    results.clustsim.05  s04.cmd.usubj   s15.proc.FT.uber
FT                   results.QC_FT.05     s05.ap.uber     s15.proc.FT.uber.NL
index.html           results.QC_FT.05.NL  s05.ap.uber.NL  tutorial
PPI                  s01.ap.simple        s09.cleanup
Qwarp                s02.ap.align         s11.proc.FT

If you aren’t seeing all of these, I can only guess that there was an error in downloading or unpacking the CD.tgz file. Perhaps you could move aside the directories that were unpacked by the s2.* script in CD/, and then try re-downloading and extracting the files:

curl -O
tar xvzf CD.tgz
cd CD
tcsh s2.cp.files . ~
cd ..


Hi Jessica,

Indeed, your ‘ls’ output does not show an FT directory. I verified that FT is part of the AFNI_data6.tgz package, near the end of it. You are also missing the directories that come after FT, such as roi_demo and group_results.

Anyway, as Paul suggested, try installing the data package again, and please let us know if you have trouble.

  • rick

Hi PT,

Thank you so much for your quick reply. Yes, it could be an error in downloading and unpacking the directory. How can I delete AFNI_data6 entirely so that I can re-download it?

I am very new to AFNI, so I could not work out how to write the command you told me to use: "Perhaps you could move these directories that were unpacked by s2.* in CD/".

Can you kindly tell me clearly how to write this command? Does it all go on one line?

Many thanks for your help!


Hi, Jessica-

Are the existing directories in your home directory? That would be the default location from our online setup instructions. You could make a new directory called "backup_bootcamp", say, and put them there (just in case you want to check something later; if not, you can delete them).
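For concreteness, here is a hedged sketch of that backup step, assuming the Bootcamp data was unpacked directly into your home directory ("backup_bootcamp" is just an example name):

```shell
# Make a holding directory and move the existing Bootcamp data into it,
# rather than deleting anything outright.
mkdir -p ~/backup_bootcamp

# Move each AFNI_data* directory found in the home directory.
for d in ~/AFNI_data*; do
    if [ -d "$d" ]; then
        mv "$d" ~/backup_bootcamp/
    fi
done
```

Nothing is removed here, so if anything in the old copies turns out to be useful later, it is still available under ~/backup_bootcamp/.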

Then the commands to install+unpack the Bootcamp data can be copy+pasted from here (for any operating system, the copy+paste commands are the same, I just happened to pick the Linux set):
After running the commands in the green text box, the directories should be unpacked in your home directory. You should be able to navigate to ~/AFNI_data6/FT_analysis/, and type “ls” and see the full set of directories and files, listed above.

Please let us know how that goes.


Hi PT,

Do you mean I only need to copy+paste the following?

curl -O
tar xvzf CD.tgz
cd CD
tcsh s2.cp.files . ~
cd ..

If yes, before doing that, how can I completely delete the already-downloaded AFNI_data6 so that my storage space does not fill up in Ubuntu?

Thank you so much for your time and help.


Hi, Jessica-

Yes, those lines are what should be copy+pasted.

----------------------- preamble to removing directories

Firstly, I cannot stress enough that removing files/directories with the "rm" command does not put them into a trash bin; it removes them permanently. If you later change your mind or make a mistake, you cannot get them back… They gone, gone, gone. So:

  1. be really sure you want to delete something.
  2. be really sure you have specified that/those thing(s) correctly.

If you have the space currently, I would recommend moving the files/directories first and removing them later. To check your disk space in the terminal, there is:

df -mh

… and you can navigate to the main “Filesystem” in question (it probably starts with /dev/SOMETHING).

Additionally, you can use the "Files" application in Ubuntu to navigate a GUI of the directory structure (like you would in Windows, for example): select a directory name, right-click it, move it to the trash, and then empty your trash. If you have lots of directories to remove, this can become tedious, though.

Finally, wildcard characters can speed up the process of selecting files or directories with similar names. Sometimes we use them when listing things, such as listing everything in your home directory whose name starts with "AFNI_data" followed by any other characters, including no further characters at all (the "-d" option means that only items in the base directory are listed, rather than recursively listing the contents of any directories that start with "AFNI_data"):

ls -d ~/AFNI_data*

Wildcards are useful for selecting a lot of files/directories systematically and efficiently, but if you use them while removing, you reeeeeally want to be sure you are only selecting exactly what you want. I would not use wildcards in combination with removing unless you are very comfortable with them and with shell commands.

Linux-wise, Rick has a nice tutorial on getting comfortable with command line (shell) syntax, which I would recommend everyone read and practice with. Linux commands are really useful, and it is a nice introduction for starting to get comfortable with them.

------------------- ok, to the question at hand:

Removing files and directories has slightly different syntax.

To delete a file in the terminal, you would type:

rm FILE_NAME

or, if you have several, you can remove more than one at a time:

rm FILE_NAME1 FILE_NAME2 FILE_NAME3

You could use wildcards to specify your list of FILE_NAME* values, like "remove all files that start with pineapple":

rm pineapple*

… and be sure that you know exactly which files start with "pineapple": if you just wanted to remove "pineapple_tmp.txt" and "pineapple_old.txt", but NOT "pineapple_thekeytolife.txt", then do not use the wildcard like this, but instead specify the individual files.
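The pineapple example can be played out safely in a sandbox; a small sketch (everything happens under /tmp, so nothing real is at risk):

```shell
# Sandbox: create three throwaway files whose names start with "pineapple".
mkdir -p /tmp/wildcard_demo
cd /tmp/wildcard_demo
touch pineapple_tmp.txt pineapple_old.txt pineapple_thekeytolife.txt

# Preview what the wildcard would match -- listing is harmless:
ls pineapple*

# Then remove only the two intended files, by name:
rm pineapple_tmp.txt pineapple_old.txt

# pineapple_thekeytolife.txt survives, because it was never named in rm.
ls pineapple*
```

The habit worth taking away: run "ls" on a wildcard pattern first, and only reach for "rm" once you have seen exactly what the pattern matches.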

To remove a directory, you specify the directory name and provide an option that says to recursively remove everything inside it, too:

rm -r DIR_NAME

Note: if you type the wrong directory name, the terminal won't check with you; the shell will just remove the stuff. Sooo, be very careful that DIR_NAME is not mistyped. Accidental spaces would split the name into multiple apparent directories, and the shell would try to remove each piece, which could be very bad depending on how the pieces split. You can use wildcards here, too, but I would not recommend that until you are very comfortable with shell syntax, because it is too easy to remove too much (an accidental space before a wildcard can be catastrophic).
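One way to guard against the accidental-space problem is to quote the directory name, so the shell treats it as a single argument no matter what it contains; a small sketch in /tmp:

```shell
# A throwaway directory whose name contains a space:
mkdir -p "/tmp/demo dir"

# Unquoted, rm -r /tmp/demo dir would be parsed as TWO arguments,
# "/tmp/demo" and "dir".  Quoted, it is one argument, removed as intended:
rm -r "/tmp/demo dir"
```

Quoting costs nothing and makes the command mean exactly what the name says, which is especially valuable with "rm".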

Again, I don't mean to be scary; playing around with Linux/shell commands is the only way to get better with them and to feel comfortable. It is just that "rm" is quite powerful and not forgiving, so one really has to be careful and sure of what is being specified for removal. I guess everyone has removed something they reeeeaallly didn't want to at some point, and been very sad afterward, as part of the learning curve. I know I certainly have :(.


I will try to re-do this with caution and will let you know if the problem is fixed. Thank you sooo much for your time and efforts!!

Hi again PT,

Just to update you: I tried to re-download AFNI_data6 several times. For some reason, the download stops and the 'current speed' goes from 300k to 0!

Please find attached a screenshot showing exactly what happens when the download stops.

Why is this happening? Is it because there is a prerequisite that I still haven't installed, or is it because of my internet speed?

Many thanks and looking forward to your reply!


Hi, Jessica-

I think it is your internet speed. That curl command is simply downloading (no dependencies necessary). It is a big download—3.2GB.

I am using the same curl command on my desktop, and it worked OK, taking just over 6 minutes.


Yes, you are right. I connected to a stronger wifi connection and it fully downloaded the 3.2 GB. I now have the FT directory in AFNI_data6 :)

On another note, I did the system check ( -check_all) and it said that R packages are missing, although I have successfully run the following:
setenv R_LIBS $HOME/R
mkdir $R_LIBS
echo ‘export R_LIBS=$HOME/R’ >> ~/.bashrc
echo ‘setenv R_LIBS ~/R’ >> ~/.cshrc
curl -O

Please find attached the output. Is there something that I still need to copy+paste in order not to have missing R packages?

Many thanks and looking forward to your reply.

Kind Regards,


Hi, Jessica-

Glad you got the data downloaded.

Could you please copy+paste the entire text of: -check_all

(the screenshot images are hard to read and not comprehensive).


Hi PT,

I did copy+paste the entire text. It showed that R packages are missing.

Thank you,

Hi, Jessica-

I am not seeing the full copy+pasted text. The full text is probably about 100 lines long in the terminal. The first line of it is:

-------------------------------- general ---------------------------------

I see an image called Missing_R_Packages.png that contains the bottom part only of the full “ -check_all” output. It doesn’t contain all the information about the system setup.


Okay, sure, I will copy+paste the 100 lines from the terminal.
But I don't know why my Ubuntu on VirtualBox just went to a black screen and I am unable to log in! Very weird.
I will try to see how to fix this and will get back to you with the full text.

Thank you!

Hi, Jessica-

OK, thanks.

Note that on a Windows 10 machine, a native Linux/Ubuntu kernel is now available (the Windows Subsystem for Linux). See here and links therein for setting one up and using AFNI there:

In general, these seem pretty nice, probably use memory on a given system better than virtual boxes do, and might be simpler for analyses. This might be something to consider switching to on your system at some point.


Yes, I think that when I downloaded the 3.2 GB, my Ubuntu crashed due to storage limitations.

Thank you for your suggestion. If I do that, will I lose all my files on my Windows 10?



Hi, Jessica-

The Windows Subsystem for Linux (WSL) does not replace your Windows OS: it is an application that runs within Windows, a program that you open, use, and close (e.g., like a browser or Word). But it is not a virtual machine, either, and it has better performance. You can send files back and forth between the Windows filesystem and the Linux section, and you can use a browser natively from your Linux part.

This WSL is different from partitioning your hard drive and adding Linux as a separate OS, or from overwriting your Windows (neither of which I am recommending here).


Hi PT,
Thanks so much for the clarification! Your constant help is much appreciated!

Take care,

I am trying to download the Bootcamp data, too. Perhaps it is a link issue? I'm not able to access the site from my home MacBook.

The commands I ran, and their output, are below:

(base) srijaseenivasan@Srijas-MacBook-Air / % curl -O
tar xvzf CD.tgz
cd CD
tcsh s2.cp.files . ~
cd …
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Warning: Failed to open the file CD.tgz: Read-only file system
100   226  100   226    0     0   1978      0 --:--:-- --:--:-- --:--:--  2092
curl: (23) Failure writing output to destination
tar: Error opening archive: Failed to open 'CD.tgz'
cd: no such file or directory: CD
s2.cp.files: No such file or directory.
cd: no such file or directory: …

In case needed, this is output from: -check_all

========================= summary, please fix: =========================

  • just be aware: login shell 'zsh', but our code examples use 'tcsh'
  • missing R packages (see rPkgsInstall)
  • dot file test : want 1 modifications across 3 files:
  • insufficient data for AFNI bootcamp
    (see "Prepare for Bootcamp" on install pages)