The mathematician and computer scientist Gregory Chaitin, who is well known for contributions to algorithmic information theory, questions the "reality" of real numbers [Chaitin, 2004, 2005]. He points out mathematical, philosophical, and computational difficulties with real numbers and concludes that these difficulties undermine the common assumption that real numbers underlie physical reality, strongly suggesting that physical reality may in fact be discrete, digital, and computational.
As a first example of the difficulties posed by real numbers, Chaitin cites the French mathematician Émile Borel, best known for his foundational work in measure theory and probability. In 1927, Borel, in Chaitin's words,
pointed out that if you really believe in the notion of a real number as an infinite sequence of digits 3.1415926 ..., then you could put all of human knowledge into a single real number. [Chaitin, 2005]
Chaitin calls this number "Borel's amazing know-it-all real number."
One way to construct Borel's number is to list, in some order, all the yes-no questions that have answers (Borel's questions were in French, and they could be ordered by length in characters and then alphabetically). These can be listed because the set of all texts in any fixed written language is countable. Then Borel's number can be represented in binary as 0.b1b2b3 ... where bi is 0 if the answer to the i-th question is NO and 1 if it is YES. The resulting number is a real number between 0 and 1.
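This construction can be illustrated with a short sketch. Here a toy two-letter alphabet stands in for the French character set, and the "oracle" that answers every question is a placeholder passed in as a function, since no actual know-it-all oracle is available to us; only finite prefixes of the number can ever be computed, and only relative to some assumed answer function.

```python
from itertools import count, islice, product

# Order all finite texts over a toy alphabet by length, then
# alphabetically, mirroring Borel's ordering of French texts. Every
# text appears at some finite position, so the set is countable.
ALPHABET = "ab"  # stand-in for the French character set

def all_texts():
    for n in count(1):
        for chars in product(ALPHABET, repeat=n):
            yield "".join(chars)

def borel_prefix(oracle, k):
    """First k binary digits of Borel's number: digit i is 1 if the
    oracle answers YES to the i-th text (read as a question). A real
    know-it-all oracle cannot be constructed; any callable passed
    here is a toy stand-in."""
    answers = (oracle(t) for t in islice(all_texts(), k))
    return "0." + "".join("1" if a else "0" for a in answers)

print(list(islice(all_texts(), 6)))  # ['a', 'b', 'aa', 'ab', 'ba', 'bb']
print(borel_prefix(lambda t: len(t) % 2 == 1, 4))  # 0.1100
```

The enumeration is the whole point: because the texts can be listed, the answers can be packed into the digits of a single real number between 0 and 1.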
Chaitin implies that Borel's number could not exist in any reasonable sense, presumably because it is not possible to "know it all." There are some problems with this argument. First, what does it mean for a number to "exist"? The philosopher Immanuel Kant made the distinction between the world as it is, the thing-in-itself (das Ding an sich in German), and the phenomenal world, or the world as it appears to us. Let us assume that what Chaitin means by "exists" is that it is a thing-in-itself, in which case, whether the know-it-all number reveals itself to us, becoming a phenomenon, is irrelevant to its existence.
In fact, if Borel's number exists as a thing-in-itself, outside ourselves, then it cannot reveal itself to us. In Plato and the Nerd, I review Claude Shannon's channel capacity theorem [Shannon, 1948], which states that any noisy observation of anything conveys only a finite number of bits of information. Borel's know-it-all number cannot be encoded with a finite number of bits unless the list of all possible yes-no questions is finite, which it is not. It is easy to construct an infinite sequence of valid yes-no questions. For example, let the first question be "Is one a whole number?" Let the second question be "Is the answer to the first question YES?" Let the third question be "Is the answer to the second question YES?" And so on. As a consequence, Borel's number cannot reveal itself to us unless we invent some noiseless way of observing a thing-in-itself. Assuming no such noiseless channel exists, the channel capacity theorem implies that we cannot know Borel's number. But this in no way undermines its existence.
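The self-referential construction above can be sketched as a generator that never runs out of questions; the wording follows the text, while the numbering of earlier questions is an assumption added for readability.

```python
from itertools import islice

def questions():
    """Yield an unending sequence of valid yes-no questions: each new
    question asks about the answer to the previous one, so the
    sequence never terminates."""
    q = "Is one a whole number?"
    n = 1
    while True:
        yield q
        q = f"Is the answer to question {n} YES?"
        n += 1

for q in islice(questions(), 3):
    print(q)
```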
Returning to the original question, are real numbers real, I have to ask: are numbers real? There is a real risk here of confusing the map with the territory. For numbers to be real, we have to assume a Platonic heaven where universal truths exist independent of humans. Numbers, be they whole numbers, rational numbers, or reals, would be premier citizens of such a heaven. Since that heaven's existence is independent of the existence of humans, our knowledge of anything in it must be conveyed to us somehow, through observation or through introspection. If it is conveyed to us through observation, then it will be subject to Shannon's channel capacity theorem, in which case we can only know about things that can be encoded in a finite number of bits. If it is conveyed to us through introspection, then its existence is a matter of faith, since its existence is independent of us, and there is no connection between that introspection and the thing-in-itself, by definition of introspection.
A second example of the problems posed by real numbers is Richard's paradox, first stated by the French mathematician Jules Richard in a letter in 1905. Richard pointed out that all possible texts in French can be listed in some order in a manner similar to Borel's yes-no questions. A subset of these texts describe or name real numbers. But it is easy to describe a number that is not described on the list. Consider the phrase "the smallest number not describable in fewer than eleven words." These ten words seem to define a number that cannot be on the list of described numbers. A more rigorous form of the argument would use Cantor's diagonalization technique (see chapter 8 of P & N): a text could describe the diagonalization procedure itself, and thereby describe a number that is not in the list of all describable or nameable numbers.
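The diagonal step can be made concrete in a minimal sketch. The assumption here is that the listed numbers are given as digit sequences through a hypothetical function `digit_of(i, j)`; both that function and the toy enumeration below are illustrations, not part of Richard's original argument.

```python
def diagonal_digits(digit_of, k):
    """First k decimal digits (after the point) of a number that
    differs from every number in an enumerated list: digit i is chosen
    to differ from digit i of the i-th listed number, so the result
    cannot appear anywhere in the list (Cantor's diagonal argument).
    digit_of(i, j) is assumed to return digit j of listed number i."""
    return [(digit_of(i, i) + 1) % 10 for i in range(k)]

# Toy enumeration: listed number i has every digit equal to i % 10.
toy = lambda i, j: i % 10
print(diagonal_digits(toy, 5))  # [1, 2, 3, 4, 5]
```

The paradox arises because this procedure can itself be described in a finite text, which by assumption already appears somewhere in the list.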
Every interpretation of a text in French or English as a number depends on the notion of semantics, discussed in chapter 9 of P & N, where I point out that "the notion of semantics can leverage the countable world of software in an uncountable variety of ways." Semantics connects human cognition with the formal and countable world of symbols, but we have no evidence that the cognitive world is formal and countable. Arguably, Richard's paradox demonstrates that written language must be ambiguous, perhaps because it bridges a countable world with an uncountable (cognitive) one. Chaitin points out that when we eliminate semantics, we lose a great deal:
Formal languages avoid the paradoxes by removing the ambiguities of natural languages. The paradoxes are eliminated, but there is a price. Paradoxical natural languages are evolving open systems. Artificial languages are static closed systems subject to limitative meta-theorems. You avoid the paradoxes, but you are left with a corpse! [Chaitin, 2004]
In formal languages, paradoxes reduce to incompleteness (Gödel) and undecidability (Turing), soulless concepts compared to the nuance and ambiguity of natural language.
Let us assume that numbers are models (maps) reflective of some reality (territories). Under this assumption, Chaitin's question becomes one of whether real numbers are accurate models of some physical reality. We could ask the question of whether real numbers are useful models, but that question seems trivial; we know they are. So let's focus on whether real numbers are accurate models of reality. Chaitin observes that some physicists argue that they are not:
The latest strong hints in the direction of discreteness come from quantum gravity [Smolin, 2000], in particular from the Bekenstein bound and the so-called "holographic principle." According to these ideas the amount of information in any physical system is bounded, i.e., is a finite number of 0/1 bits. [Chaitin, 2004]
As I show in chapter 8 of P & N, this "digital physics" hypothesis is not falsifiable, and therefore not scientific according to the philosophy of Karl Popper. It can only be taken on faith. Moreover, the arguments for digital physics are based on a flawed interpretation of the Bekenstein bound that fails to recognize the distinction between the entropy of a discrete random variable (which represents information in bits) and the entropy of a continuous random variable (which does not represent information in bits).
Chaitin also leverages biology to bolster his digital faith:
Other hints come from ... molecular biology where DNA is the digital software for life ... [Chaitin, 2004]
As pointed out by George Dyson in Turing's Cathedral,
the problem of self-reproduction is fundamentally a problem of communication, over a noisy channel, from one generation to the next. [Dyson, 2012]
Since reproduction is a noisy channel, it can convey only information that can be encoded with a finite number of bits. DNA, therefore, might as well be encoded digitally. There would be no point in a richer encoding. Does DNA encode humans? From P & N:
Only features that can be encoded with a finite number of bits can be passed from generation to generation, according to the channel capacity theorem. If the mind, or features of the mind such as knowledge, wisdom, and our sense of self, cannot be encoded with a finite number of bits, then these features cannot be inherited by our offspring. It certainly appears that DNA does not encode the mind because the mind of your offspring is not your own or even a combination of those of both biological parents. ...
Chaitin's objection is fundamentally to the notion of a continuum, which most certainly does lead to conceptual difficulties in the formal languages of logic and mathematics that humans have invented. But these formal languages live in a countable world, so it should not be surprising that they have difficulty comprehensively handling an uncountable world. Despite these difficulties, the cognitive notion of a continuum is not at all difficult to grasp. The difficulties arise only when trying to communicate, for example by naming or describing all the real numbers. But one can understand without communicating. In fact, conveying understanding is notoriously difficult. We call it "teaching."
The hypothesis that the mind can be encoded digitally is, like digital physics, not falsifiable unless we can invent some way of noiselessly measuring the mind. Since we have no such noiseless measurement, this hypothesis is not scientific. If in fact the mind relies on a continuum for its cognitive functions, this would explain why cognitive functions cannot be inherited and why our minds can deal with real numbers, despite the paradoxes.
Chaitin rests on the ancient Greeks when drawing his sweeping conclusion:
According to Pythagoras everything is number, and God is a mathematician. This point of view has worked pretty well throughout the development of modern science. However now a neo-Pythagorian doctrine is emerging, according to which everything is 0/1 bits, and the world is built entirely out of digital information. In other words, now everything is software, God is a computer programmer, not a mathematician, and the world is a giant information-processing system, a giant computer [Chaitin, 2004].
This statement describes a faith, not a scientific principle.
Edward Ashford Lee