February 1, 2021

Infinite Possibility

All mimsy were the borogoves,
And the mome raths outgrabe.
Lewis Carroll (1832-1898), "Jabberwocky"

An ape sits hunched over a keyboard. A long hairy finger bangs a key, and the letter a appears on the computer screen. Another random stab produces n, then a space, then a, p, and e.

That an ape would generate this particular sequence of characters is, of course, highly improbable. In the realm of random processes, however, any conceivable sequence of characters is possible, from utter gibberish to the full text of this article.

The seemingly infinite possibilities offered by randomness have long intrigued me. Years ago when I was a high school student, I came across a provocative statement by Arthur Stanley Eddington (1882-1944), a prominent astronomer and physicist. "If an army of monkeys were strumming on typewriters, they might write all the books in the British Museum," he noted in his book The Nature of the Physical World (1928).


Eddington wanted to emphasize the improbability of such an outcome, and his remark was meant as an example of something that could happen in principle but never in practice.

The article that you are now reading contains nearly two thousand words, or roughly ten thousand characters, including spaces. An ape at a keyboard could choose among twenty-six letters and two dozen or so additional keys for punctuation, numbers, and other characters.

Suppose the ape has one chance in fifty of hitting a as the first letter, one chance in fifty of picking n next, and so on. To write the entire article, the ape would have to make the correct choice again and again.

The probability of such an occurrence is one in fifty multiplied by itself ten thousand times, or one in 50^10,000. That figure is overwhelmingly larger than the estimated number of atoms in the universe, which is merely on the order of 10^80.
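
A few lines of Python make that arithmetic concrete; the fifty equally likely keys and the ten-thousand-character length are simply the rough estimates used above.

    # Rough odds against an ape typing this article by chance, assuming a
    # keyboard of 50 equally likely keys and an article of 10,000 characters.
    import math

    keys = 50
    length = 10_000

    # The base-10 logarithm gives the number of digits in 50**10,000
    # without ever forming the gigantic integer itself.
    digits = length * math.log10(keys)   # roughly 16,990 digits
    print(f"50**10,000 is about 10**{digits:.0f}")
    print("The observable universe holds only about 10**80 atoms.")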

You would have to wait an exceedingly long time before a member of a troop of apes (see "Infinite Monkey Theorem") happened to compose this article by chance, let alone the millions of volumes in the British Library or the Library of Congress.

Sifting through the troop's vast output to find the flawless gems, including original works of significant merit, would itself be a notably frustrating, unrewarding task. A human author, on the other hand, by eschewing randomness can generate a meaningful string of characters far more efficiently than an ape, and the output generally requires considerably less editing.

Most people, including mathematicians and scientists, would say that they have a good idea of what the word random (randomness) means. They can give all sorts of examples of random processes, from the flipping of a fair coin to the decay of a radioactive atomic nucleus.

They can also list phenomena in which chance doesn't appear to play a role, from the motion of Earth around the sun to the ricochets of a ball between the cushions of a billiard table and the steady vibrations of a violin's plucked string.

Often, we use the word random loosely to describe something that is apparently disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise.

Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty.

We think of flipping a coin as a way of making a blind choice, yet in the hands of a skilled magician the outcome may be perfectly predictable. Moreover, a process governed entirely by chance can lead to a completely ordered result, whether in the domain of monkeys pounding on keyboards or atoms locking into place to form a crystal.

At the same time, a deterministic process can produce an unpredictable outcome, as seen in the waywardness of a ball rebounding within a stadium-shaped billiard table (see "Billiards in the Round") or heard in the screeches of an irregularly vibrating violin string.

We can even invent mathematical formulas to generate predictable sequences of numbers (see "Random Bits") that computers can then use to simulate the haphazard wanderings of perfume molecules drifting through the air (see "Random Quivers").
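
As a rough sketch of that idea (the seed, unit steps, and step count below are arbitrary choices for illustration), a deterministic pseudorandom generator can drive a simulated molecule on a two-dimensional random walk:

    # A deterministic formula (Python's seeded Mersenne Twister) driving a
    # two-dimensional random walk, a crude stand-in for a perfume molecule
    # jostled by air. The same seed reproduces the same "haphazard" path.
    import random

    random.seed(12345)                 # fixed seed: nothing here is left to chance
    x = y = 0
    for _ in range(1000):
        x += random.choice([-1, 1])    # one unit step left or right
        y += random.choice([-1, 1])    # one unit step down or up

    print(f"After 1,000 steps the simulated molecule sits at ({x}, {y}).")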

It is useful to distinguish between a random process and the results of such a process. For example, we think of typing monkeys as generators of random strings of characters. If we know that such a random process is responsible for a given string, we may be justified in labeling or interpreting the string as random.

However, if we don't know the source of a given string, we are forced to turn to other methods to determine what, if anything, the string means. Indeed, reading itself invokes just such a search for meaning among the lines of characters printed on a page or displayed on a computer screen.

Consider the passage from Lewis Carroll's poem "Jabberwocky" that starts off this essay. From the presence of a few familiar words, the pattern of spaces, and the vowel-consonant structure of the remaining words, we would surmise that the author intended those lines to mean something, even though we don't understand many of the words.

If the same passage were to come from typing monkeys, however, we might very well reject it as gibberish, despite the fragments of structure and pattern.

Similarly, in flipping a coin we know from experience (or theory) that we're likely to obtain roughly equal numbers of heads and tails in a long sequence of tosses. So if we see twenty-five heads in a row, it might be the legitimate though improbable result of a random process.

At the same time, it might also be advisable to check whether the coin is fair and to find out something about the individual who's doing the flipping. The context determines how we interpret the data.

Moreover, just because we happen to see a sequence of roughly equal numbers of heads and tails doesn't mean that the results arise from tosses of a fair coin. It's possible to program a computer with a mathematical recipe that involves no randomness yet gives the same distribution of heads and tails.

Thus, given an arbitrary sequence of heads and tails, there's really no way to tell with confidence whether it's the result of a random process or whether it's been generated by a formula based on simple arithmetic.
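
Here, as an illustration of that point, is one such recipe: a simple linear congruential formula (the particular constants are a standard textbook choice) that contains no randomness at all yet delivers heads and tails in roughly equal numbers.

    # A deterministic "coin": the linear congruential formula x -> (a*x + c) mod m.
    # Nothing random happens anywhere, yet heads and tails split roughly evenly.
    a, c, m = 1664525, 1013904223, 2**32   # widely used textbook constants
    x = 1                                   # fixed starting value

    tosses = 10_000
    heads = 0
    for _ in range(tosses):
        x = (a * x + c) % m
        if x < m // 2:                      # low half of the range counts as heads
            heads += 1

    print(f"{heads} heads, {tosses - heads} tails in {tosses} deterministic tosses")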

"From a purely operational point of view…the concept of randomness is so elusive as to cease to be viable," Mark Kac wrote in a 1983 essay on the nature of randomness. Kac also took a critical look at the different ways in which we sometimes interpret randomness in different contexts.

For example, in the 1970 book Chance and Necessity: Essay on the Natural Philosophy of Modern Biology, biochemist Jacques Monod (1910-1976) suggested that a distinction be made between "disciplined" chance, as used in physics to describe, say, radioactive decay, and "blind" chance. As an example, he cited the death of a doctor who, on his way to see a patient, was killed by a brick that fell from the roof of a building.

Kac argued that the distinction Monod makes really isn't meaningful. Although statistics on doctors killed by falling bricks aren't readily available, there are extensive data on Prussian soldiers kicked to death by horses—events that also fall under the category of blind chance.

When you compare data on the number of soldiers killed in specified time intervals with data on the number of radioactive decays that have occurred in analogous periods, the two distributions of events look very similar.
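
The shared shape is the Poisson distribution, which describes counts of rare, independent events per interval. A short simulation (the rate of 0.6 events per interval is an arbitrary illustrative figure, not drawn from either data set) shows how closely such counts track the Poisson formula:

    # Counting rare, independent events per interval -- decays per second or
    # horse-kick deaths per corps per year -- produces the Poisson pattern.
    # The rate of 0.6 events per interval is an illustrative value only.
    import math
    import random
    from collections import Counter

    random.seed(0)
    rate = 0.6
    intervals = 10_000

    counts = Counter()
    for _ in range(intervals):
        # Chop each interval into 1,000 slices, each with a tiny chance of an event.
        events = sum(random.random() < rate / 1000 for _ in range(1000))
        counts[events] += 1

    for k in range(5):
        expected = intervals * math.exp(-rate) * rate**k / math.factorial(k)
        print(f"{k} events: observed {counts[k]:5d}, Poisson predicts {expected:7.0f}")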

Mathematics and statistics provide ways to sort through the various meanings of randomness and to distinguish between what we can and cannot know. They help us shape our expectations in different situations.

In many cases, we find that there are no guarantees, only probabilities. We need to learn to recognize such limitations on certainty.

The search for pattern is a pervasive theme in mathematics. It is this pursuit that brings to our attention the curious interplay of order hidden in randomness and the randomness that is embedded in order. It's part of what makes mathematics such an alluring sport for mathematicians.

Mathematics serves as a framework for understanding a wide range of phenomena, from the vagaries of roulette wheels to the synchronization of cells in a beating heart. It's like opening up a watch to see what makes it tick. Instead of gears, levers, springs, and wheels, we see equations and other pieces of mathematical apparatus.

Characterizing the vibrations of a drum's membrane (see "Drums That Sound Alike" and "Fractal Drum"), arranging points on the surface of a sphere, modeling the synchronized blink of a cloud of fireflies in Thailand, and playing games of chance are among the mathematical pastimes that provide connections to various aspects of everyday life.

Each of these playful activities has prompted new thinking in mathematics, and each one brings randomness into play.

Intriguingly, the mathematics of randomness, chaos, and order furnishes what may be a vital escape from absolute certainty—an opportunity to exercise free will in a deterministic universe. Indeed, in the interplay of order and disorder that makes life interesting, we appear perpetually poised in a state of precarious perplexity.

The universe is neither so crazy that we can't understand it at all nor so predictable that there's nothing left for us to discover.
