Flicker noise, also called pink noise or ${}^1\!/_{f}$ noise, was first observed by J.B. Johnson in experiments on early electronic amplifiers in 1925, and has since been seen in an amazingly diverse array of places: almost all electronic devices, heartbeats, DNA sequences, and film shot lengths (that is, the distribution of times you have to wait from cut to cut, see here). You have probably heard of white noise, and may be aware that there are also pink, brown (random walks), blue (Cherenkov radiation) and other colours. To characterise the different noise colours we have to use a tool called a power spectrum.

The basic question that a power spectrum answers is “How much energy is accounted for by each frequency?”. Signals we measure, which are just functions of time $x(t)$, can be decomposed into various frequency components. That means we can find the various pure tones which together make up the observed signal, and ask how much energy is accounted for by each tone. Some equations may help: the energy of a signal is

\[

E = \int_{-\infty}^{\infty} |x(t)|^2 dt

\]

We take all the measurements, square them and add them all up. Now the idea of a “Spectral Density” is that instead of calculating energy from the signal in time, we take the signal, transform it to a function of frequency, $x(t) \rightarrow \hat{x}(f)$, and calculate the same quantity by summing over all frequencies instead of over all times (that the two integrals agree is Parseval's theorem).

\[

E = \int_{-\infty}^{\infty} |\hat{x}(f)|^2 df

\]

The quantity $|\hat{x}(f)|^2$ is the spectral density, and it tells you how much energy is contributed by each frequency.
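We can check numerically that the time-domain and frequency-domain energies agree. This is a minimal NumPy sketch; the test signal, sample rate and duration are arbitrary choices:

```python
import numpy as np

# Discrete check that energy computed in the time domain equals
# energy computed in the frequency domain (Parseval's theorem).
rng = np.random.default_rng(0)
dt = 0.01                       # sample spacing (arbitrary choice)
t = np.arange(0, 10, dt)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)

# Energy in time: the integral of |x(t)|^2 dt, approximated by a sum
E_time = np.sum(np.abs(x) ** 2) * dt

# Energy in frequency: transform, then integrate |xhat(f)|^2 df
xhat = np.fft.fft(x) * dt       # approximates the continuous transform
df = 1.0 / (t.size * dt)
E_freq = np.sum(np.abs(xhat) ** 2) * df

print(E_time, E_freq)           # the two numbers agree
```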

Pink noise is a signal with a spectral density proportional to ${}^1\!/_{f}$. White noise has a spectral density proportional to a constant, brown noise goes like ${}^1\!/_{f^2}$, and blue noise goes like $f$. Pink noise will be our main interest here. If you are very sharp, you will have noticed that the integral of ${}^1\!/_{f}$ over all values of $f$ diverges, so no noise can be completely pink. In practice, though, many real observations of ${}^1\!/_{f}$ processes show no indication of anything other than ${}^1\!/_{f}$ behaviour. This is because the divergence is of a particularly mild sort,

\[

\int_{f_1}^{f_2} \frac{1}{f} \, df = \ln\left(\frac{f_2}{f_1}\right)

\]

Following Milotti, we can estimate that the lowest possible frequency in any conceivable experiment is the inverse age of the universe, $\sim 10^{-17}\,$Hz, and the highest possible frequency is something like the inverse Planck time, $\sim 10^{43}\,$Hz. Using these cutoffs, our divergent integral gives $\ln(10^{60}) \approx 138 \ll \infty$.
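The noise colours above can be produced directly from their definitions: take white noise, reshape its spectrum, and transform back. A sketch for pink noise (length and seed are arbitrary choices):

```python
import numpy as np

# Generate approximately pink (1/f) noise by spectrally shaping
# white noise, then confirm the slope of its power spectrum.
rng = np.random.default_rng(1)
n = 2 ** 16
white = rng.standard_normal(n)

spectrum = np.fft.rfft(white)
f = np.fft.rfftfreq(n)          # frequencies in cycles per sample
spectrum[0] = 0.0               # drop the DC component
spectrum[1:] /= np.sqrt(f[1:])  # |xhat|^2 now falls off like 1/f
pink = np.fft.irfft(spectrum, n)

# Fit the log-log slope of the power spectrum; expect a slope near -1
psd = np.abs(np.fft.rfft(pink)) ** 2
slope, _ = np.polyfit(np.log(f[1:]), np.log(psd[1:]), 1)
print(slope)
```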

One of the most famous ${}^1\!/_{f}$ processes is the Gutenberg-Richter law, which states that the number of earthquakes of magnitude $M$, $N(M)$, is proportional to $10^{-bM}$ for some number $b \sim 1$ or

\[

\log_{10}(N) = c - bM.

\]

Data on real earthquakes across quite a range of magnitudes obey this law. The classic model of earthquakes is the so-called Burridge-Knopoff model, which is running here. One tectonic plate (the ‘marble table’) moves to the left, and the others (the green blocks) are kept from sliding with the table by an attachment to the ‘ceiling’. The interplay between friction and the restoring force of the vertical springs causes the green blocks to stick, then slip, and each slip counts as an earthquake. The details of the model, and how accurately it reproduces the frequency spectrum of real earthquakes, depend on what you choose for the friction force. Above I used $F_0 v / (|v| + |v|^2)$, but there are many different forms. For an alternative to my simulation try here. Predicting and modelling earthquakes is tricky: the Burridge-Knopoff model seems to break down at low frequencies, and real earthquakes are likely to be more complicated than a system of blocks and springs.
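The $b$ value itself is easy to estimate from a catalogue of magnitudes: the law $N(M) \propto 10^{-bM}$ means magnitudes above a completeness threshold are exponentially distributed, which gives a simple maximum-likelihood (Aki) estimator. A sketch on synthetic data (the catalogue here is simulated, not real):

```python
import numpy as np

# Recover the Gutenberg-Richter b value from synthetic magnitudes.
# N(M) ~ 10^{-bM} means magnitudes above a threshold M_min are
# exponential with rate b*ln(10), so the maximum-likelihood (Aki)
# estimator is  b = log10(e) / (mean(M) - M_min).
rng = np.random.default_rng(2)
b_true, m_min = 1.0, 2.0
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10)),
                               size=50_000)

b_est = np.log10(np.e) / (np.mean(mags) - m_min)
print(b_est)  # close to b_true = 1.0
```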

Since pink noise is observed in so many places, is there a universal explanation? We might hope so: much of the physics of white noise, for example, is explained by independent, uncorrelated random variables and the framework of the central limit theorem. Similarly, brown noise is explained by the theory of random walks. Pink noise, on the other hand, resisted such a unifying mathematical framework until very recently. Where it was originally found, in vacuum tubes, an explanation was given by Schottky. The cathode surface has sites which trap electrons and release them according to an exponential relaxation process $N = N_0 e^{-at}$, which contributes to the observed current. Unfortunately this particular mechanism doesn’t seem applicable to earthquakes, much less movies.

Self-organised criticality is one of the ‘C’ subjects that emerged in the late 20th century (catastrophe theory, chaos theory, cellular automata…) and that nowadays are studied under the umbrella word – Complexity. Complexity is hard to define, even for the people studying it. For our purposes I will give the following recursive definition: Complexity is the study of how complex behaviour can arise from simple interactions. Those interactions are usually local, and the ‘complex behaviour’ is difficult to specify precisely, but you know it when you see it. Usually it involves some kind of interesting macroscopic pattern, or some non-obvious collective motion, emerging from simple, local rules.

The famous “BTW” paper, “Self-organized criticality: An explanation of the ${}^1\!/_{f}$ noise”, proposed the sandpile model reproduced above. This is a cellular automaton (the sort beloved by Wolfram) that obeys the following rules. There is a grid of sites, and the number of sand grains at site $(x,y)$ is given by $z(x,y)$. When the height reaches a critical value (four, in the simplest version) the pile topples and sends one grain to each of its four neighbours:

\[

z(x,y) \rightarrow z(x,y) - 4\\

z(x\pm 1,y) \rightarrow z(x\pm 1,y) + 1\\

z(x,y\pm 1) \rightarrow z(x,y\pm 1) + 1

\]

By adding sand and studying the distribution of avalanche sizes, they showed that there is no length scale in the problem and that scale-invariant structures emerge. Empirically, the power spectrum of avalanche lifetimes goes like ${}^1\!/_{f^\alpha}$, where $\alpha$ was measured to be close to 1.
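The toppling rule is only a few lines of code. Here is a minimal sketch of the model (lattice size, grain count and the convention that edge grains fall off are my choices, not taken from the paper's figures):

```python
import numpy as np

# Minimal BTW sandpile: drop grains at random sites, topple any site
# holding 4 or more grains (one grain to each of its four neighbours,
# edge grains are lost), and record avalanche sizes: the number of
# topplings triggered by a single dropped grain.
rng = np.random.default_rng(3)
L = 16
z = np.zeros((L, L), dtype=int)

def relax(z):
    """Topple until every site is stable; return the number of topplings."""
    count = 0
    while True:
        unstable = np.argwhere(z >= 4)
        if unstable.size == 0:
            return count
        for x, y in unstable:
            z[x, y] -= 4
            count += 1
            for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if 0 <= nx < L and 0 <= ny < L:  # edge grains are lost
                    z[nx, ny] += 1

sizes = []
for _ in range(10_000):
    x, y = rng.integers(L, size=2)
    z[x, y] += 1
    sizes.append(relax(z))

# After a transient the pile organises itself into a stationary state
# with a broad, scale-free distribution of avalanche sizes.
print(max(sizes), float(np.mean(sizes)))
```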

The sandpile model came at a time when people were first starting to explore critical phenomena and collective behaviour in other areas of physics, after the spectacular successes of renormalization group theory in statistical and particle physics. The sandpile authors clearly hoped that self-organised criticality was a widespread phenomenon; the original paper mentions everything from the flow of the river Nile to cosmic strings. In the language of dynamical systems, self-organised criticality is observed in systems with a critical point as an attractor: left to their own devices, these systems will organise themselves into a state with structure on all scales. While many models exhibiting SOC have been constructed, there is no good criterion for determining when a model will exhibit SOC. Some Danish interest – Jensen, Christensen and Fogedby showed that real sandpiles actually have a ${}^1\!/_{f^2}$ power spectrum. Real sandpiles do not exhibit SOC, but rice piles between two glass plates do.

This is the crux of why SOC is no longer seen as a good explanation of pink noise processes. The observations of pink noise are diverse and robust, while SOC seems difficult to define and to observe, so it probably lacks the universality that was originally hoped for. There are alternative, more rigorous mathematical explanations for pink noise. Given a sequence of data $X(t)$, assume we have stationary increments,

\[

X(i+t) - X(i) \overset{d}{=} X(t) - X(0)

\]

and self-similarity (both equalities holding in distribution)

\[

X(at) \overset{d}{=} a^H X(t)

\]

where $0 < H < 1$ is called the Hurst index and is something like a fractal dimension. We then find the miraculous relation
\[
\mathrm{Cov}(X(t), X(s)) = \tfrac{1}{2}\left[\, |t|^{2H} + |s|^{2H} - |t-s|^{2H} \,\right] \mathrm{Var}(X(1)).
\]
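As a quick sanity check: for $H = 1/2$ the process is ordinary Brownian motion, and the covariance formula (with the conventional factor of $\tfrac{1}{2}$ in front of the bracket) reduces to $\min(t,s)\,\mathrm{Var}(X(1))$. A simulation sketch (path counts, step size and the two times are arbitrary choices):

```python
import numpy as np

# Check the H = 1/2 case of the covariance formula against
# simulated Brownian motion, where Var(X(1)) = 1.
rng = np.random.default_rng(4)
n_paths, n_steps, dt = 50_000, 100, 0.01
steps = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
X = np.cumsum(steps, axis=1)              # Brownian paths from 0

t, s = 0.5, 0.9
i, j = round(t / dt) - 1, round(s / dt) - 1   # grid indices of t and s
sample_cov = np.mean(X[:, i] * X[:, j])       # paths have mean zero

H = 0.5
formula = 0.5 * (t ** (2 * H) + s ** (2 * H) - abs(t - s) ** (2 * H))
print(sample_cov, formula)   # both near min(t, s) = 0.5
```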
This relationship implies that if we bin the data into bins of size $m$ and let the bin averages $Z$ be the new variables then
\[
Var(Z) \sim E(Z)^{2H}.
\]
Then, using a very cool result in Fourier analysis called the Wiener–Khinchin theorem, we get that any process with a variance-to-mean power law displays ${}^1\!/_{f}$ noise, or the more general ${}^1\!/_{f^\gamma}$ noise with $0 < \gamma < 1$. This is called the Tweedie convergence theorem; through its central-limit-like effect on Tweedie distributions, which obey exactly these power laws, it implies that ${}^1\!/_{f}$ noise is ubiquitous, because these distributions are very general (encompassing the Poisson, the Gaussian and many others). These results are relatively new, and if you want a better explanation there is some local expertise. It may be that future statistics courses will teach Tweedie distributions alongside Gaussian ones, or perhaps SOC will be repaired. Physics itself has not yet converged.
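To make the variance-to-mean power law concrete, here is a sketch using the simplest Tweedie case, Poisson counts, where the power is exactly 1 (the rate and bin sizes are arbitrary choices):

```python
import numpy as np

# Bin a Poisson event stream at several scales m; for each scale,
# record the mean and variance of the binned counts. The slope of
# log Var against log Mean is the power p in Var ~ Mean^p, which
# for Poisson counts is exactly 1.
rng = np.random.default_rng(5)
events = rng.poisson(lam=2.0, size=2 ** 16)   # counts per unit interval

means, variances = [], []
for m in (1, 2, 4, 8, 16, 32):
    binned = events[: events.size - events.size % m].reshape(-1, m).sum(axis=1)
    means.append(binned.mean())
    variances.append(binned.var())

slope, _ = np.polyfit(np.log(means), np.log(variances), 1)
print(slope)   # near 1 for Poisson counts
```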