Hi Paul,

LZC: It looks complicated on paper, but it's actually simple and fun once you get the hang of it. Suppose you have this time series:

0.5, 0.1, 0.1, 0.4, 0.5, 0.9, 5, 0.23, 999

First, you take the median (this will be our threshold). Sorted, the series is:

0.1, 0.1, 0.23, 0.4, 0.5, 0.5, 0.9, 5, 999

The median is 0.5.

Second, we threshold the values to turn the series into a binary sequence (binary here is arbitrary; you could just as well make it ternary (that is a word) or even septenary (also a word, surprisingly)). Values equal to or greater than the median become 1s, values below the median become 0s:

0.5, 0.1, 0.1, 0.4, 0.5, 0.9, 5, 0.23, 999 ==> 1 0 0 0 1 1 1 0 1
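In code, this thresholding step is a one-liner (a sketch using numpy):

```python
import numpy as np

x = np.array([0.5, 0.1, 0.1, 0.4, 0.5, 0.9, 5, 0.23, 999])
binary = (x >= np.median(x)).astype(int)  # median is 0.5
print(binary)  # [1 0 0 0 1 1 1 0 1]
```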

After that, we place a separator.

1 | 0 0 0 1 1 1 0 1 ==> LZC= 1

We look at the value(s) just right of the separator and check whether that pattern already appears on the left. It might sound tricky, but follow the example and it will become clear:

1 | (0) 0 0 1 1 1 0 1 ==> LZC= 1

1 0 | 0 0 1 1 1 0 1 ==> LZC = 2 (0 was new so we upped the LZC)

1 0 | (0) 0 1 1 1 0 1 ==> LZC = 2 (0 was included on the left so LZC remains the same)

1 0 | (0 0) 1 1 1 0 1 ==> 1 0 0 0 | 1 1 1 0 1 ==> LZC = 3 (0 0 was new so we upped the LZC)

1 0 0 0 | (1) 1 1 0 1 ==> LZC = 3 (1 was included. LZC remains the same)

1 0 0 0 | (1 1) 1 0 1 ==> (1 1 is new) ==> 1 0 0 0 1 1 | 1 0 1 ==> LZC = 4

1 0 0 0 1 1 | (1) 0 1 ==> (1 is included. LZC remains the same.)

1 0 0 0 1 1 | (1 0) 1 ==> (1 0 is included (the first two symbols), so LZC remains the same)

1 0 0 0 1 1 | (1 0 1) ==> 1 0 1 is new, increase LZC ==> 1 0 0 0 1 1 1 0 1 ==> LZC = 5
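The separator walk above can be sketched in a few lines of Python. This is my own sketch of the variant used in this walkthrough, which checks each candidate only against the part strictly left of the separator (some published variants search a slightly larger window):

```python
from statistics import median

def lzc(seq):
    """Count Lempel-Ziv 'words' with the separator walk described above."""
    s = ''.join(str(v) for v in seq)
    c = 1  # the first symbol is always a new word
    i = 1  # separator position: s[:i] is everything seen so far
    k = 1  # length of the candidate to the right of the separator
    while i + k <= len(s):
        if s[i:i + k] in s[:i]:
            k += 1        # candidate already appears on the left: extend it
        else:
            c += 1        # new word: count it and move the separator past it
            i += k
            k = 1
    if k > 1:
        c += 1            # an unfinished candidate at the end still counts as a word
    return c

x = [0.5, 0.1, 0.1, 0.4, 0.5, 0.9, 5, 0.23, 999]
m = median(x)
binary = [1 if v >= m else 0 for v in x]  # 1 0 0 0 1 1 1 0 1
print(lzc(binary))  # 5, matching the walkthrough
```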

Basically, this is how LZC is calculated. If you want to eliminate the effect of the length of the signal, you can normalize LZC by dividing it by its asymptotic upper bound n/log_a(n), where a is the number of distinct symbols in your transformed time series (2 in our case) and n is the number of time points. That way you can compare LZC values across signals. But this normalization method doesn't seem very convincing to me for some reason. It tickles the skeptic inside me.
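As a quick sketch of that normalization (a is the alphabet size, n the length): note that for very short sequences like our 9-point toy example the result can come out above 1, which is part of why it can feel unsatisfying:

```python
import math

def normalized_lzc(c, n, a=2):
    # divide the raw count by its asymptotic upper bound n / log_a(n),
    # so sequences of different lengths become comparable
    return c / (n / math.log(n, a))

print(normalized_lzc(5, 9))  # roughly 1.76 for our 9-point example
```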

For a really fun use of LZC: someone compared pop lyrics from different time periods: https://pudding.cool/2017/05/song-repetition/

For uses of LZC in biomedical signal analysis: https://www.researchgate.net/publication/242745630_Application_of_the_Lempel-Ziv_complexity_measure_to_the_analysis_of_biosignals_and_medical_images and https://www.researchgate.net/publication/6723694_Interpretation_of_the_Lempel-Ziv_Complexity_Measure_in_the_Context_of_Biomedical_Signal_Analysis

For an application of this method to the fMRI signal: https://www.biorxiv.org/content/10.1101/2020.06.11.106476v1

For an implementation (and many other cool measures) in MATLAB: https://github.com/SorenWT/dynameas (It would be really cool to have this in AFNILAND. Currently, I am extracting the signal using 3dmaskave and doing the rest in MATLAB. Maybe a program called 3dLZC? 8-))

The interpretation of LZC: this is a tricky one. The authors of the preprint above think it can be used to measure the complexity (duh) of the fMRI signal, that it differs between so-called core and periphery regions, and that it changes from rest to task. Also, the median frequency (the frequency that separates the power spectrum into two equal halves) modulates the complexity. I have some hypotheses too, but they are hypotheses at best (right now).

fSD: This is a simpler one. SD is the good old standard deviation. How can an SD be fractional? Take the SD of the signal in the slow-3 frequency band and divide it by the SD over the whole frequency band. That is the fSD of slow-3 (you can do the same for whatever frequency band you want).
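Here is a sketch of that ratio using numpy. The band-limited SD is computed by zeroing the spectrum outside the band and inverting the FFT (a simple brick-wall filter; real pipelines usually use a proper bandpass), and the slow-3 range of 0.073–0.198 Hz is the convention I have seen, so double-check it against your reference:

```python
import numpy as np

def band_sd(x, fs, f_lo, f_hi):
    """SD of the signal restricted to [f_lo, f_hi] Hz, via an FFT mask."""
    x = np.asarray(x, float) - np.mean(x)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # zero everything outside the band
    return np.std(np.fft.irfft(spec, n=len(x)))

def fsd(x, fs, f_lo, f_hi):
    """Fractional SD: SD in the band divided by SD of the whole signal."""
    return band_sd(x, fs, f_lo, f_hi) / np.std(np.asarray(x, float))

# toy example: an fMRI-like series at TR = 2 s (so fs = 0.5 Hz),
# with slow-3 taken as 0.073-0.198 Hz (an assumption, as noted above)
rng = np.random.default_rng(0)
x = rng.standard_normal(240)
print(fsd(x, fs=0.5, f_lo=0.073, f_hi=0.198))
```

Since masking the spectrum can only remove power, the ratio always lands between 0 and 1.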

Interpretation: It gives you a proxy for the brain signal's adaptability (and, if it is too high, instability). For a healthy case: https://www.jneurosci.org/content/31/12/4496

For a pathological case: https://www.researchgate.net/publication/328563566_Opposing_patterns_of_neuronal_variability_in_the_sensorimotor_network_mediate_cyclothymic_and_depressive_temperaments

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4855585/#si1

So that’s all I guess. Sorry for the really long post. I got carried away and wrote a lot. And thank you.

Yasir