9.7 Angels and Archetypes 

The Stranger is the preparer of the way of the quaternity which he follows. Women and children follow him gladly and he sometimes teaches them. He sees his surroundings and particularly me as ignorant and uneducated. He is no anti-Christ, but in a certain sense an anti-scientist… 
Pauli describing a dream to Anna Jung, 1950 

Impressed by the seemingly limitless scope and precision of Newton’s laws, some of his successors during the Enlightenment imagined a fully deterministic world. Newton himself had tried to forestall any such conclusion by often referring to an active role for the divine spirit in the workings of nature, especially in establishing the astonishingly felicitous “initial conditions”, but also in restoring the world’s vitality and order from time to time. Nevertheless, he couldn’t resist demonstrating the apparently perfect precision with which the observed phenomena (at least in celestial mechanics) adhered to the mathematical principles expressed by the laws of motion and gravity, and this was the dominant impression given by his work. One of the most prominent and influential proponents of Newtonian determinism was Laplace, who famously wrote 

Present events are connected with preceding ones by the principle that a thing cannot occur without a cause which produces it. This axiom, known as the principle of sufficient reason, extends even to actions which are considered indifferent… We ought then to regard the present state of the universe as the effect of the anterior state and as the cause of the one that is to follow… If an intelligence, for one instant, recognizes all the forces which animate Nature, and the respective positions of the things which compose it, and if that intelligence is also sufficiently vast to subject these data to analysis, it will comprehend in one formula the movements of the largest bodies of the universe as well as those of the minutest atom. Nothing would be uncertain, and the future, as the past, would be present to its eyes. 

Notice that he initially conceives of determinism as a temporally ordered chain of implication, but then he describes a Gestalt shift, leading to the view of an atemporal "block universe" that simply exists. He doesn’t say so, but the concepts of time and causality in such a universe would be (at most) psychological interpretations, lacking any active physical significance, because in order for time and causality to be genuinely active, a degree of freedom is necessary; without freedom there can be no absolute direction of causal implication. For example, we ordinarily say that electromagnetic effects propagate at the speed of light, because the state of the electromagnetic field at any given event (time and place) is fully determined by the state of the field within the past light cone of that event, but since the laws of electromagnetism are time-symmetrical, the state of the field at any event is just as fully determined by the state of the field within its future light cone, and by the state of the field on the same time slice as the event. We don’t conclude from this that electromagnetic effects therefore propagate instantaneously, let alone backwards in time. This example merely shows that when considering just a deterministic and time-symmetrical field, there is no unambiguous flow of information, because there can be no source of information in such a field. In order to even consider the flow of information, we must introduce a localized source of new information, i.e., an effect not implied by the field itself. Only then can we examine how this signal propagates through the field. 

Of course, although this signal is independent of the “past”, it is certainly not independent of the “future”. Owing to the time-symmetry of electromagnetism, we can begin with the effects and project “backwards” in time using the deterministic field equations to arrive at the supposedly freely produced signal. So even in this case, one can argue that the introduction of the signal was not a “free” act at all. We can regard it as a fully deterministic antecedent of the future, just as other events are regarded as fully deterministic consequences of the past. Acts that we regard as “free” are characterized by a kind of singularity, in the sense that when we extrapolate backwards from the effects to the cause, using the deterministic field laws, we reach a singularity at the cause, and cannot extrapolate through it back to a state that would precede it according to the deterministic field laws. The information emanating from a “free act”, when extrapolated backwards from its effects to the source, must annihilate itself at the source. This is analogous to extrapolating the ripples in a pond backwards in time according to the laws of surface wave propagation, until reaching the event of a pebble entering the water, prior to which there is nothing in the quiet surface of the pond that implies (by the laws of the surface) the impending disturbance. Such backward accounts seem implausible, because they require such a highly coordinated arrangement of information from separate locations in the future, so we ordinarily prefer to conceive of the flow of information in the opposite direction. 
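The pond analogy can be made concrete with a small numerical sketch (an illustration of the general point, not drawn from the text): the standard explicit leapfrog scheme for the one-dimensional wave equation is exactly time-symmetric, so after letting the ripples from a localized "pebble" disturbance spread, we can swap the two time levels, run the very same rule, and watch the ripples reconverge and annihilate at their source.

```python
import numpy as np

def leapfrog_step(u_prev, u_curr, c2=0.25):
    """One step of the explicit leapfrog scheme for the 1D wave equation
    on a periodic line; the stencil is symmetric in time, so the same
    formula steps the field either forward or backward."""
    lap = np.roll(u_curr, 1) - 2.0 * u_curr + np.roll(u_curr, -1)
    return 2.0 * u_curr - u_prev + c2 * lap

# A localized "pebble" disturbance on an otherwise quiet line.
n = 200
u0 = np.zeros(n)
u0[n // 2] = 1.0
u_prev, u_curr = u0.copy(), u0.copy()

# Run forward 300 steps, letting the ripples spread out.
for _ in range(300):
    u_prev, u_curr = u_curr, leapfrog_step(u_prev, u_curr)

# Swap the two time levels and apply the identical rule: this is the
# backward extrapolation, and the ripples annihilate at the source.
u_prev, u_curr = u_curr, u_prev
for _ in range(300):
    u_prev, u_curr = u_curr, leapfrog_step(u_prev, u_curr)

print(np.allclose(u_curr, u0, atol=1e-6))  # True: the quiet pond and spike return
```

Note that nothing in the field at the end of the forward run "looks like" it implies the spike; only the full, finely coordinated pattern of ripples, extrapolated backward by the deterministic rule, reconstructs it.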

Likewise, even in a block universe, it may be that certain directions are preferred based on the simplicity with which they can be described and conceptually grasped. For example, it may be possible to completely specify the universe based on the contents of a particular cross-sectional slice, together with a simple set of fixed rules for recursively inferring the contents of neighboring slices in a particular sequence, whereas other sequences may require a vastly more complicated “rule”. However, in a deterministic universe this chain of implication is merely a descriptive convenience, not an effective mechanism by which the events “come into being”. 

The concept of a static complete universe is consistent not only with the Newtonian physics discussed by Laplace, but also with the theory of relativity, in which the worldlines of objects (through spacetime) can be considered already existent in their entirety. In fact, it can be argued that this is a necessary interpretation for some general relativistic phenomena such as genuine black holes merging together in an infinite universe, because, as discussed in Section 7.2, the trousers model implies that the event horizons for two such black holes are continuously connected to each other in the future, as part of the global topology of the universe. There is no way for two black holes that are not connected to each other in the future to ever merge. This may sound tautological, but the global topological feature of the spacetime manifold that results in the merging of two black holes cannot be formed “from the past”, it must already be part of the final state of the universe. So, in this sense, relativity is perhaps an even more deterministic theory than Newtonian mechanics. The same conclusion could be reached by considering the lack of absolute simultaneity in special relativity, which makes it impossible to say which of two spacelike separated events preceded the other. 

Admittedly, the determinism of classical physics (including relativity) has sometimes been challenged, usually by pointing out that the long-term outcome of a physical process may be exponentially sensitive to the initial conditions. The concept of classical determinism relies on each physical variable being a real number (in the mathematical sense), representing an infinite amount of information. One can argue that this premise is implausible, and it certainly can’t be proven. We must also consider the possibility of singularities in classical physics, unless they are simply excluded on principle. Nevertheless, if the premise of infinite information in each real variable is granted, and if we exclude singularities, classical physics exhibits the distinctive feature of determinism. 
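The exponential sensitivity is easy to exhibit with a standard chaotic example (the logistic map at r = 4, chosen here for illustration, not taken from the text): two initial conditions agreeing to twelve decimal places diverge by many orders of magnitude within a few dozen iterations, so any finite-precision specification of the initial state soon exhausts its predictive power, even though the rule itself is perfectly deterministic.

```python
def logistic(x, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a classic chaotic system
    whose nearby trajectories separate roughly exponentially."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Two initial conditions agreeing to 12 decimal places.
a, b = 0.3, 0.3 + 1e-12

d10 = abs(logistic(a, 10) - logistic(b, 10))
d50 = abs(logistic(a, 50) - logistic(b, 50))
print(d10, d50)  # the separation grows by many orders of magnitude
```

After roughly 40 iterations the initial twelve-digit agreement is entirely used up, and the two trajectories are effectively unrelated, which is the practical (though not the logical) failure of determinism that such challenges point to.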

In contrast, quantum mechanics is widely regarded as decidedly nondeterministic. Indeed, as we saw in Section 9.6, there is a famous theorem of von Neumann that purports to rule out determinism (in the form of hidden variables) in the realm of quantum mechanics. However, as Einstein observed 

Whether objective facts are subject to causality is a question whose answer necessarily depends on the theory from which we start. Therefore, it will never be possible to decide whether the world is causal or not. 

The word “causal” is being used here as a synonym for deterministic, since Einstein had in mind strict causality, with no free choices, as summarized in his famous remark that “God does not play dice with the universe”. We've seen that von Neumann’s proof was based on a premise which is effectively equivalent to what he was trying to prove, nicely illustrating Einstein’s point that the answer depends on the theory from which we start. An assertion about what is recursively possible can be meaningful only if we place some constraints on the allowable recursive "algorithm". For example, the nth state vector of a system may consist of digits kn+1 through k(n+1) of π. This would be a perfectly deterministic system, but the relations between successive states would be extremely obscure. In fact, assuming the two transcendental numbers π and e are normal (as is widely believed, though not proven), any finite string of decimal digits occurs infinitely often in their decimal expansions, and each string occurs with the same frequency in both expansions. (It's been noted that, assuming normality, the digits of π would make an inexhaustible source of high-quality "random" number sequences, of higher quality than anything we can get out of conventional pseudo-random number generators.) Therefore, given any finite number of digits (observations), we could never even decide whether the operative “algorithm” was π or e, nor whether we had correctly identified the relevant occurrence in the expansion. Thus we can easily imagine a perfectly deterministic universe that is also utterly unpredictable. 
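As a toy realization of such a universe (a sketch, with the block size k = 3 chosen arbitrarily), a spigot algorithm can stream the decimal digits of π, and the successive k-digit "state vectors" read off from the stream are fully deterministic, yet exhibit no discernible law relating one state to the next:

```python
from itertools import islice

def pi_digits():
    """Gibbons' unbounded spigot algorithm: yields the decimal digits
    of pi one at a time (3, 1, 4, 1, 5, 9, ...)."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

def state(n, k=3):
    """The n-th 'state vector' of the toy universe: digits kn+1
    through k(n+1) of pi (the block size k is arbitrary)."""
    digits = list(islice(pi_digits(), k * (n + 1)))
    return "".join(map(str, digits[k * n:]))

print([state(n) for n in range(4)])  # ['314', '159', '265', '358']
```

Each state follows from the "law of π" with perfect necessity, but an observer shown only the sequence '314', '159', '265', … would find no tractable rule connecting them, nor any way to rule out that the digits of e (suitably offset) were the operative law instead.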
(Interestingly, the recent innovation that enables computation of the nth hexadecimal digit of π (with much less work than required to compute the first n digits) implies that we could present someone with a sequence of digits and challenge them to determine where it first occurs in the hexadecimal expansion of π, and it may be practically impossible for them to find the answer.) 
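The innovation in question is the Bailey–Borwein–Plouffe (BBP) formula. A minimal Python transcription (low-precision floating point, adequate only for modest positions n) shows how the digit at position n is reached without computing its predecessors, by using modular exponentiation inside the series:

```python
def bbp_series(j, n):
    """Fractional part of 16^n * sum_k 1/(16^k * (8k+j)), computed with
    modular exponentiation so no digits before position n are needed."""
    s = 0.0
    for k in range(n + 1):
        denom = 8 * k + j
        s = (s + pow(16, n - k, denom) / denom) % 1.0
    # A few rapidly vanishing tail terms with negative exponents.
    for k in range(n + 1, n + 15):
        s += 16.0 ** (n - k) / (8 * k + j)
    return s % 1.0

def pi_hex_digit(n):
    """Hexadecimal digit of pi at position n after the point (n >= 1),
    via the BBP formula pi = sum_k 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))."""
    m = n - 1
    x = (4 * bbp_series(1, m) - 2 * bbp_series(4, m)
         - bbp_series(5, m) - bbp_series(6, m)) % 1.0
    return "0123456789abcdef"[int(x * 16)]

print("".join(pi_hex_digit(i) for i in range(1, 9)))  # 243f6a88
```

Thus a challenger can cheaply read off a run of digits from some deep position of the hexadecimal expansion, while locating that run from scratch would require searching the expansion digit by digit.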

Even worse, there need be no simple rule of any kind relating the events of a deterministic universe. This highlights the important distinction between determinism and the concepts of predictability and complexity. There is no requirement for a deterministic universe to be predictable, or for its complexity to be limited in any way. Thus, we can never prove that any finite set of observations could only have been produced by a nondeterministic process. In a sense, this is trivially true, because a finite Turing machine can always be written to generate any given finite string, although the algorithm necessary to generate a very irregular string may be nearly as long as the string itself. Since determinism is inherently undecidable, we may try to define a more tractable notion, such as predictability, in terms of the complexity manifest in our observations. This could be quantified as the length of the shortest Turing machine required to reproduce our observations, and we might imagine that in a completely random universe, the size of the required algorithm would grow in proportion to the number of observations (as we are forced to include ad hoc modifications to the algorithm to account for each new observation). On this basis it might seem that we could eventually assert with certainty that the universe is inherently unpredictable (on some level of experience), i.e., that the length of the shortest Turing machine required to duplicate the results grows in proportion with the number of observations. In a sense, this is what the "no hidden variables" theorems try to do. 
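The shortest-program measure (Kolmogorov complexity) is not computable, but a general-purpose compressor gives a crude, computable upper bound on it, which suffices to exhibit the contrast the argument appeals to. In this sketch (zlib merely stands in for the "shortest Turing machine", which it is not), a string generated by a tiny rule admits a very short description, while a random-looking string of the same length admits essentially none shorter than itself:

```python
import random
import zlib

def complexity_proxy(s: bytes) -> int:
    """Computable stand-in for program-size (Kolmogorov) complexity,
    which is itself uncomputable: the compressed length of the string."""
    return len(zlib.compress(s, 9))

regular = b"01" * 5000                 # 10000 bytes generated by a tiny rule
random.seed(0)
irregular = bytes(random.getrandbits(8) for _ in range(10000))

print(complexity_proxy(regular))       # a few dozen bytes: a short description exists
print(complexity_proxy(irregular))     # close to 10000: no short description found
```

Of course, failure of a particular compressor to find a short description proves nothing, which is precisely the point Chaitin's theorem makes rigorous in the next paragraph.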

However, we can never reach such a conclusion, as shown by Chaitin's proof that there exists an integer k such that it's impossible to prove that the complexity of any specific string of binary bits exceeds k (where "complexity" is defined as the length of the smallest Turing program that generates the string). This is true in spite of the fact that "almost all" strings have complexity greater than k. Therefore, even if we (sensibly) restrict our meaningful class of Turing machines to those of complexity less than a fixed number k, rather than allowing the complexity of our model to increase in proportion to the number of observations, it's still impossible for any finite set of observations (even if we continue gathering data forever) to be provably inconsistent with a Turing machine of complexity less than k. Naturally we must be careful not to confuse the question of whether "there exist" sequences of complexity greater than k with the question of whether we can prove that any particular sequence has complexity greater than k. 

When Max Born retired from his professorship at the University of Edinburgh in 1953, a commemorative volume of scientific papers was prepared. Einstein contributed a paper, in which (as Born put it) Einstein’s “philosophical objection to the statistical interpretation of quantum mechanics is particularly cogently and clearly expressed”. The two men took up the subject in their private correspondence (which had begun decades earlier when they were close friends in Berlin during the First World War), and the ensuing argument strained their friendship nearly to the breaking point. Eventually they appealed to a mutual friend, Wolfgang Pauli, who tried to clarify the issues. Born was sure that Einstein’s critique of quantum mechanics was focused on the lack of determinism, but Pauli explained (with the benefit of discussing the matter with Einstein personally at Princeton) that this was not the case. Pauli wrote to Born that 

Einstein does not consider the concept of ‘determinism’ to be as fundamental as it is frequently held to be (as he told me emphatically many times), and he denied energetically that he ever put up a postulate such as (your letter, para 3) ‘the sequence of such conditions must also be objective and real, that is, automatic, machinelike, deterministic’. In the same way, he disputes that he uses as criterion for the admissibility of a theory the question ‘Is it rigorously deterministic?’ 

This should not be surprising, given that Einstein knew it is impossible to ever decide whether or not the world is deterministic. Pauli went on to explain the position that Einstein himself had already described in the EPR paper years earlier, i.e., the insistence on what might be called complete realism. Pauli summarized his understanding of Einstein’s view, along with his own response to it, in the letter to Born, in which he tried to explain why he thought it was “misleading to bring the concept of determinism into the dispute with Einstein”. He wrote 

Einstein would demand that the 'complete real description of the System', even before an observation, must already contain elements which would in some way correspond with the possible differences in the results of the observations. I think, on the other hand, that this postulate is inconsistent with the freedom of the experimenter to select mutually exclusive experimental arrangements… 

Born accepted Pauli’s appraisal of the dispute, and conceded that he (Born) had been wrong in thinking Einstein’s main criterion was determinism. Born’s explanation of his misunderstanding was that he simply couldn’t believe Einstein would demand a “complete real description” beyond that which can be perceived. The great lesson that Born, Heisenberg, and the other pioneers of quantum mechanics had taken from Einstein’s early work on special relativity was that we must insist on operational definitions for all the terms of a scientific theory, and deny meaning to concepts or elements of a theory that have no empirical content. But Einstein did not hold to that belief, and even chided Born for adopting the positivistic maxim esse est percipi. 

There is, however, a certain irony in Pauli’s position, since he asserts the irrelevance of the concept of determinism, but at the same time criticizes Einstein’s “postulate” by saying that it is “inconsistent with the freedom of the experimenter to select mutually exclusive experimental arrangements”. As discussed in the previous section, this freedom is itself a postulate, an unprovable proposition, and one that is obviously inconsistent with determinism. Einstein argued that determinism is an undecidable proposition in the absolute sense, and hence not a suitable criterion for physical theories, whereas Born and Pauli implicitly demanded nondeterminism of a physical theory. 

By the way, Pauli and his psychoanalyst Carl Jung spent much time developing a concept which they called synchronicity, loosely defined as the coincidental occurrence of noncausally related events that nevertheless exhibit seemingly meaningful correlations. This was presented as a complementary alternative to the more scientific principle of causation. One notable example of synchronicity was the development of the concept of synchronicity itself, alongside Einstein’s elucidation of nonclassical correlations between distant events implied by quantum mechanics. But Pauli (like Born) didn’t place any value on Einstein’s “realist” reasons for rejecting their quantum mechanics as a satisfactory theory. Pauli wrote to Born 

One should no more rack one’s brain about the problem of whether something one cannot know anything about exists all the same, than about the ancient question of how many angels are able to sit on the point of a needle. But it seems to me that Einstein’s questions are ultimately always of this kind. 

It’s interesting that Pauli referred to the question of how many angels can sit on the point of a needle, since one of his most important contributions to quantum mechanics was the exclusion principle, which in effect answered the question of how many electrons can fit into a single quantum state. He and Jung might have cited this as an example of the collective unconscious reaching back to the scholastic theologians. Pauli seems to have given credence to Jung’s theory of archetypes, according to which the same set of organizing principles and forms (the “unus mundus”) that govern the physical world also shape the human mind, so there is a natural harmony between physical laws and human thoughts. To illustrate this, Pauli wrote an essay on Kepler, which was published along with Jung’s treatise on synchronicity. 

The complementarity interpretation of quantum mechanics, developed by Bohr, can be seen as an attempted compromise with Einstein over his demand for realism (similar to Einstein’s effort to reconcile relativity with the language of Lorentz’s ether). Two requirements of a classical description of phenomena are that they be strictly causal and that they be expressed in terms of space and time. According to Bohr, these two requirements are mutually exclusive. As summarized by Heisenberg 

There exists a body of exact mathematical laws, but these cannot be interpreted as expressing simple relationships between objects existing in space and time. The observable predictions of this theory can be approximately described in such terms, but not uniquely… This is a direct result of the indeterminateness of the concept “observation”. It is not possible to decide, other than arbitrarily, what objects are to be considered as part of the observed system and what as part of the observer’s apparatus. The concept “observation” … can be carried over to atomic phenomena only when due regard is paid to the limitations placed on all spacetime descriptions by the uncertainty principle. 

Thus any description of events in terms of space and time must include acausal aspects, and conversely any strictly causal description cannot be expressed in terms of space and time. This of course was antithetical to Einstein, who maintained that the general theory of relativity tells us something exact about space and time. (He wrote in 1949 that “In my opinion the equations of general relativity are more likely to tell us something precise than all other equations of physics”.) To accept that the fundamental laws of physics are incompatible with space and time would require him to renounce general relativity. He occasionally contemplated the possibility that this step might be necessary, but never really came to accept it. He continued to seek a conceptual framework that would allow for strictly causal descriptions of objects in space and time – even if it required the descriptions to involve purely hypothetical components. In this respect his attitude resembled that of Lorentz, who, in his later years, continued to argue for the conceptual value of the classical ether and absolute time, even though he was forced to concede that they were undetectable. 
