installing AFNI from local copy

Dear AFNI experts,

I have some questions regarding the installation of AFNI on virtual machines (Linux-based systems). Our IT infrastructure requires us to access and analyse fMRI data using Linux-based virtual machines (VMs), but the VMs expire after one month. Every time I launch a new VM, I need to install AFNI from scratch, for which I usually follow the official installation instructions. The problem is that each time I install AFNI on a new VM, I am installing the latest version, so I either end up having to redo all analyses (including pre-processing) with the latest version, or I have inconsistent software versions within the same project.

This is why I was wondering whether there is something I can do to keep the version stable. What I mean is that it would be great to, e.g., download and install the latest version now (e.g., version AFNI_21.0.16 'Titus') and save it locally, so that I can then install it from that local source on each new VM I launch while working on this project. Is something like that possible? After reading the documentation, I was wondering whether the following steps are correct or whether I am missing something:

1. download the latest binaries

2. save them at location "PATH_TO_FILE"

3. run:

tcsh @update.afni.binaries -local_package PATH_TO_FILE/linux_ubuntu_16_64.tgz -do_extras

4. follow the remaining installation steps
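For concreteness, step 3 on a fresh VM could look something like the sketch below (PATH_TO_FILE is a placeholder for wherever the package was saved, e.g. a mounted USB stick; the existence check is just a safeguard, not part of the official instructions):

```shell
# Placeholder path to the pre-downloaded AFNI package (adjust to your setup).
PKG=PATH_TO_FILE/linux_ubuntu_16_64.tgz

if [ -f "$PKG" ]; then
    # Install AFNI from the saved local copy instead of fetching the latest:
    tcsh @update.afni.binaries -local_package "$PKG" -do_extras
else
    echo "package not found at $PKG"
fi
```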
I am not very versed in this area, so please excuse me if the questions are very basic.

I am looking forward to your feedback.

Best regards,

Hi, Stef-

Yes, that is exactly how you would want to do that procedure. You could download whatever package version you want, put it on a USB stick, say, and then use the "-local_package PATH_TO_USB/BINARY_PACKAGE" option to install it from there.

Sorry you have to reinstall everything (with dependencies? Like R???) every month.

Something to note: one way to approach this would be to keep your initial data and some scripts, and then keep building the scripts over time, even as the AFNI version changes. At the end, get the latest version of AFNI and re-run the scripts from the start, so that everything is run with The Newest version at once. That may or may not be practicable, depending on how interleaved things are with other software with long runtimes (e.g., FreeSurfer's recon-all). But it is another way to approach this.


Hey Paul,

Thank you for your reply. I successfully managed to install from the local copy without further problems.

With respect to the R installation, should I also save, e.g., the @add_rcran_ubuntu_18.04.tgz file in the same directory to install the R dependencies corresponding to the AFNI version I have installed now, or do these not vary over time?

Also, thank you for your suggestion, this is probably good practice in the interest of computational reproducibility anyway.

Best wishes,

If you are using R, and the system gets fully wiped every month, you might have to keep the @add_rcran* script around.

Note that I suspect you might be able to copy the ~/R directory and propagate it to the new system, so you spend less time waiting for the packages to build (this assumes you have the same operating system from session to session).

So, you could try setting up a system, and then:

tar -zcf R.tgz R

on it, and save R.tgz somewhere (e.g., on a USB stick); then, after installing R next time, copy R.tgz back to the new home directory and unpack it with:

tar -xf R.tgz

If that looks good, then great; you can even run

rPkgsInstall -pkgs ALL

again at that point, and it should hopefully tell you that everything is up to date.
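To illustrate the pack/unpack round trip, here is a self-contained sketch using a stand-in directory in a scratch location (so it does not touch a real ~/R; the directory contents are made up for the demonstration):

```shell
# Work in a scratch directory so nothing real is disturbed.
scratch=$(mktemp -d)
cd "$scratch"

# Stand-in for the built ~/R package tree.
mkdir -p R/library/somepkg
echo "demo" > R/library/somepkg/DESCRIPTION

tar -zcf R.tgz R        # pack (this is what you would save to the USB stick)
rm -rf R                # simulate the wiped system
tar -xf R.tgz           # restore on the new VM

cat R/library/somepkg/DESCRIPTION   # -> demo
```

The same two tar commands apply to the real ~/R directory; only the paths change.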


Other ideas are Docker (like NeuroDocker), Singularity, or a VirtualBox virtual machine. For Windows machines, we sometimes create VirtualBox OVA images, which are basically whole operating systems that run inside another operating system. You haven't said whether you are working within Windows, but the same idea could work for other operating systems too.

It seems fairly disruptive to have systems wiped so frequently, so I would hope your IT department has a solution already. If it is Windows, would they allow WSL (Windows Subsystem for Linux) without the wipes? There are also cloud options, like setting up Amazon (AWS) or Azure virtual workstations, which can be restarted and cloned almost instantly.
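As a sketch of the Docker idea: a Dockerfile that pins a specific image tag rather than "latest" would give you the same AFNI version on every rebuild. The image name and tag below are assumptions for illustration; check what is actually published before relying on them.

```dockerfile
# Hypothetical example: pin a specific AFNI image tag instead of "latest",
# so every new container uses the same AFNI version.
# (Image name and tag are assumptions; verify against the registry.)
FROM afni/afni_make_build:AFNI_21.0.16

# Project-specific additions would go here, e.g.:
# COPY my_scripts/ /opt/my_scripts/
```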