Global Reversibility of Cellular Automata

 

Consider the one-dimensional Boolean cellular automata defined such that the new value of each cell is a fixed function of the current values of the cell and its two nearest neighbors. There are eight possible 3-bit strings, and the function ("rule") assigns a successor value to each of these strings. (These automata are discussed in more detail in Cellular Automata.) One of the most interesting of the 256 possible rules is the one based on the mapping indicated below:

    111  110  101  100  011  010  001  000
     0    0    0    1    1    1    1    0

Since the binary number 00011110 equals decimal 30, this is commonly called "Rule 30". The result of applying this local rule iteratively to a 200-bit string (loop) beginning with a random set of initial values is shown below.

[Figure: Rule 30 applied iteratively to a 200-cell loop beginning with random initial values.]

In this figure the initial string is the bottom row, and the successor rows progress in the upward direction. The light colored cells are logical "0", and the dark cells are logical "1".
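
For concreteness, here is a minimal Python sketch (mine, not part of the original article) of this forward evolution. It assumes the usual convention that the new value of cell i is determined by the current values of cells i-1, i, and i+1, in which case Rule 30 can be written as new[i] = old[i-1] XOR (old[i] OR old[i+1]) on a closed loop.

    import random

    def rule30_step(cells):
        # New value of cell i = cells[i-1] XOR (cells[i] OR cells[i+1]), cyclically.
        n = len(cells)
        return [cells[i - 1] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

    # The outputs for the neighborhoods 111, 110, ..., 000 spell out 00011110 = 30.
    neighborhoods = [(1,1,1), (1,1,0), (1,0,1), (1,0,0), (0,1,1), (0,1,0), (0,0,1), (0,0,0)]
    assert [a ^ (b | c) for a, b, c in neighborhoods] == [0, 0, 0, 1, 1, 1, 1, 0]

    # Evolve a random 200-cell loop, as in the figure above.
    loop = [random.randint(0, 1) for _ in range(200)]
    for _ in range(100):
        loop = rule30_step(loop)
    print(''.join('#' if c else '.' for c in loop))   # a textual rendering of the final row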

 

In the previous article we discussed the prospects for reversing this type of cellular automaton, i.e., given the values of all the cells at one instant, finding the predecessors. We noted that, of the 256 possible rules (or the 88 equivalence classes), only a few are directly invertible. Most rules cannot be locally inverted, because (in general) more than one predecessor string leads to the same successor string. Thus, in a sense, information is lost by the application of the rule. This is very clear in cases (such as Rule 37) where many strings cannot possibly be produced by the application of the rule to any predecessor string. Such rules represent contractions. (See Impossible Strings for Cellular Automata.) On the other hand, there are some rules (such as Rule 30) that are capable of producing all possible successor strings (from a suitable predecessor string), so they are not contractions, but they are nevertheless not invertible. In fact, as mentioned in the previous article, the inverse of Rule 30 is "completely indeterminate" in the sense that no 3-bit pattern yields a unique antecedent for the central bit. However, these comments all refer to local reversibility. If we consider global reversibility, especially in the context of a closed loop of cells, we find that Rule 30 actually is reversible. In other words, although the predecessor values in any range of cells cannot be inferred from local knowledge of the successor cells, it is possible to infer the entire predecessor string from knowledge of the entire successor string.

 

To evaluate the distribution of (linear) predecessor strings over the n-bit successor strings, we have two options: we can consider n-bit predecessors or (n+2)-bit predecessors. For Rule 30, each n-bit string is the successor of exactly four (n+2)-bit strings, and it is a possible successor of exactly three n-bit strings. To put this another way, for each n-bit string there are three possible n-bit predecessors (neglecting any boundary interference). For example, the possible 3-bit predecessors of each of the 3-bit strings under Rule 30 are listed below.

    successor      possible 3-bit predecessors
      111            001, 010, 100
      110            010, 011, 100
      101            000, 101, 110
      100            000, 110, 111
      011            001, 010, 100
      010            010, 011, 100
      001            000, 101, 110
      000            000, 110, 111

Referring to three consecutive cells as a "character", we might think that since a character in one part of the space has three possible predecessor characters, and a character in another region also has three possible predecessors, there are 3 × 3 = 9 possible combinations of predecessors for these two characters. But that is not the case, because the overall string extending from one character to the other has just three possible predecessors. Thus, for an infinite space with no boundaries, there is necessarily a correlation between distant predecessors imposed by the global requirements of the rule, even though each predecessor is indeterminate with respect to local information.
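
The table above can be reconstructed by brute force. The following sketch (my own illustration, not taken from the article) counts a 3-bit string p as a predecessor of a 3-bit string s whenever some choice of the two unseen boundary bits makes Rule 30 map p, with those boundary bits appended, onto s; this is what "neglecting any boundary interference" is taken to mean here.

    from itertools import product

    def f(a, b, c):
        # Rule 30 local function.
        return a ^ (b | c)

    def predecessors(s):
        # 3-bit predecessors of the 3-bit successor s, neglecting boundary effects.
        preds = set()
        for p in product((0, 1), repeat=3):
            for left, right in product((0, 1), repeat=2):
                ext = (left,) + p + (right,)
                if tuple(f(*ext[i:i + 3]) for i in range(3)) == tuple(s):
                    preds.add(p)
        return sorted(preds)

    for s in product((0, 1), repeat=3):
        print(''.join(map(str, s)), [''.join(map(str, p)) for p in predecessors(s)])

Each of the eight successor strings should come out with exactly three predecessors, in agreement with the table.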

 

To see how this works, suppose the following 17-bit string has been produced by applying Rule 30 to some unspecified predecessor string:

[Figure: a 17-bit successor string; its first four bits are 1101.]

What can we infer about the predecessor string? Taking the left-most "character" (i.e., the bits 110), we know from the table above that the predecessor string must begin with either 010, 011, or 100. The next (overlapping) character in the successor is 101, so the next character of the predecessor must be either 000, 101, or 110, and this must overlap with two bits of the first character. Hence there are only three possibilities: 0101, 0110, or 1000, so these are the three possible predecessors of the first four bits. Proceeding in this way, we can add bits one at a time, and at each stage we will find that exactly three strings are viable as the predecessor. The entire process is summarized in the figure below.

[Figure: the three viable predecessor threads, extended one bit at a time as successive bits of the 17-bit string are processed.]

Each line represents a viable predecessor string, each open circle is a logical 0, and each dark circle is a logical 1. A few interesting regularities are apparent, if we set aside the first three columns, which represent the initial conditions. First, a string "splits" only at a logical 1, never at a logical 0. Second, a split occurs when, and only when, the new bit of the successor string is a logical 0. Third, there is exactly one logical 1 at each stage.
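
The bit-by-bit extension described above can be expressed directly in terms of the predecessors function from the previous sketch; this too is just an illustrative sketch, not code from the article. At every stage the list of viable partial predecessors should contain exactly three strings.

    def viable_prefixes(s):
        # Partial predecessor strings of the open successor string s,
        # extended one bit at a time as in the text.
        prefixes = predecessors(s[0:3])              # predecessors of the first character
        yield prefixes
        for k in range(3, len(s)):
            char = tuple(s[k - 2:k + 1])             # next overlapping successor character
            allowed = set(predecessors(char))
            prefixes = [p + (b,) for p in prefixes for b in (0, 1)
                        if (p[-2], p[-1], b) in allowed]
            yield prefixes

    # The first four bits of the successor string discussed above are 1101.
    for stage in viable_prefixes([1, 1, 0, 1]):
        print([''.join(map(str, p)) for p in stage])
    # Prints ['010', '011', '100'] and then ['0101', '0110', '1000'].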

 

We could continue indefinitely adding more arbitrary bits to the successor string, and inferring the three possible predecessor strings. However, suppose we create a closed finite string with no boundary by wrapping around from the right-most end of the string back to the left. This gives a closed loop consisting of a cyclical arrangement of 17 bits (in the above example). This effectively defines the bits to the left of the initial bit in the above figure, which enables us to refine the representation of the three predecessor characters of the left-most character. There are several equivalent ways of expressing those characters, and we now need to use one that is compatible with the leftward bits that are wrapped around from the right. Also, we need to identify which of the threads leading off the right end of the drawing connect with the threads entering from the left. The result is illustrated below.

[Figure: the predecessor threads of the closed 17-bit loop, with the wrap-around connections identified.]

For any sub-string in this loop the three possible predecessors are found by beginning with the three distinct right-most bits and following the threads to the left. However, if we consider the entire loop (finite but without boundary), we find that there is only one predecessor string that is globally consistent, the "γ" string, colored in red. Repeating this process, we can determine the unique predecessor of this new loop (with respect to Rule 30) as shown below.

[Figure: the predecessor threads of the resulting loop, again yielding a unique globally consistent predecessor.]

In special circumstances it is possible for a closed loop not to have a unique predecessor under Rule 30, at least formally. For example, consider the 9-bit "repunit" loop, i.e., the loop {111111111}. In this case the predecessor algorithm yields the result shown below.

[Figure: the predecessor threads of the 9-bit repunit loop {111111111}.]

It could be argued that each thread is a distinct loop, but of course they are all just rotated versions of each other, and the descendant loop is symmetrical under rotations, so there is essentially just a single predecessor loop, namely, {001001001}, and all of its rotations.
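
These closed-loop statements can be checked by brute force. The sketch below (again mine, not from the article) simply tries every loop of the given length; for the 9-bit repunit loop it should return exactly the three distinct rotations of 001001001.

    def rule30_step(cells):
        # One application of Rule 30 to a closed loop (as in the earlier sketch).
        n = len(cells)
        return [cells[i - 1] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

    def loop_predecessors(loop):
        # Every loop of the same length that Rule 30 maps onto the given loop.
        n = len(loop)
        candidates = ([(k >> i) & 1 for i in range(n)] for k in range(2 ** n))
        return [c for c in candidates if rule30_step(c) == list(loop)]

    print(loop_predecessors([1] * 9))   # the three rotations of 001001001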

 

Although Rule 30 can produce any open string, it is limited in the closed loops that it can produce. For example, a "repunit" successor loop such as the one shown above is possible only if the loop length is a multiple of 3. If, instead, we posit a repunit loop of length 5 (for example), then we find that no thread will close, so there is no predecessor loop. Therefore, despite the fact that every finite open string has a predecessor with respect to Rule 30, not all closed loops have predecessors, which shows that this rule (when applied to closed loops) is actually a contraction. In other words, the set of possible states in the future is smaller than the set of possible states in the past (where the terms "future" and "past" refer to the usual directions of automaton evolution). In order to achieve closure for a 5-bit loop, at least one of the bits must be 0. The figure below shows the predecessor threads for one such possibility.

[Figure: the predecessor threads of the 5-bit loop {11110}.]

Only the "α" thread (shown in red) is globally consistent, so the only 5-bit loop predecessor for the loop {11110} is {10010}.
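
Using the loop_predecessors sketch above, the two 5-bit claims can be checked directly:

    print(loop_predecessors([1, 1, 1, 1, 1]))   # []                -- no predecessor loop exists
    print(loop_predecessors([1, 1, 1, 1, 0]))   # [[1, 0, 0, 1, 0]] -- the unique predecessor 10010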

 

The behavior of simple cellular automata in the forward and reverse directions can be used to illustrate, and perhaps even to model, some interesting ideas in physics. For example, consider the possibility that the forward direction of physical time, as we ordinarily perceive it, may actually correspond to the reverse direction of "automata-time" in the fundamental machinery underlying the phenomena of nature. It is usually held that the physical future, at least on the level of quantum phenomena, cannot be inferred from the past (contrary to Laplace's notion of determinism), but if the future state of the system can not be inferred from the information contained in the past, then, in a sense, information must be created. For example, although often not fully appreciated, the hypothesis of free will necessarily entails the creation of new information. Even the occurrence of truly random events represents, in some sense, new information.

 

According to this view, information is created in the forward direction of physical time, so there is a loss of information in the reverse physical time direction. Indeed, the physical past is, almost by definition, that which is determined, and as we conceptually move into the past we lose information about the future. There is some plausibility to the idea that sentient beings living within (and as part of) a cellular automaton would necessarily regard the automata-past as the physical-future, and vice versa. Relative to any given state, the automata-future is fully determined, i.e., the information of the automata-future is immediately available, in a sense that is very much like our consciousness of the physical-past. In the opposite direction (i.e., the physical future), information about the automata-past is not available - at least not locally - because the reverse automaton algorithm is locally under-specified. As a consequence, information is actually lost as the system advances in automata-time, which implies that a being in an automata-universe would have perfect ability to predict and anticipate the automata-future, whereas the automata-past would be undecided and indeterminate (locally). We cannot locally infer the predecessor of our current state, in a sense that is very similar to our lack of knowledge of the physical future, and our inability to infer the physical future from the current state.

 

We have qualified our statements about indeterminism with the word "locally", because we saw in the reverse behavior of Rule 30 that it's possible for the predecessor string to be locally indeterminate and yet globally determinate. In addition, we saw how two separate localities, each with their own respective sets of locally indeterminate predecessors (which they may regard as their unpredictable physical futures), would nevertheless find that when their experiences are combined into a single joint locality, there is a definite correlation between them. The degrees of freedom of the global physical future are far fewer than the Cartesian product of the local degrees of freedom. In fact, we saw that for Rule 30 the degrees of freedom are identical at all levels (until reaching global closure, at which point there is no freedom at all). This is conceptually and formally very similar to the phenomenon of quantum entanglement, as illustrated by EPR experiments. In such experiments, supposedly free and independent choices are made at separate locations, and classically we would expect the joint results to be totally uncorrelated, and yet we find that they are rigidly correlated, to a degree that defies explanation in terms of the classical view of causality and the flow of information.

 

Most people who have studied quantum entanglement have eventually felt compelled to question the directionality of physical time. This leads some to the idea of an atemporal universe, or of interacting advanced and retarded waves going back and forth between past and future, or of causality flowing along null light-like intervals. Each of these has some analogy in the behavior of simple cellular automata, especially if physical time is identified with reverse automata-time. Another school of thought invokes the idea of branching worlds, multiplying at each quantum event. The proliferation of worlds in this model - and the failure to convincingly relate these worlds to our experience - has limited the acceptance of these ideas. It's interesting to consider that, although we might expect our reverse automata to imply the multiplication of local alternatives at each point, we find that in fact the degrees of freedom remain constant at all scales, and vanish completely with global closure. This can be regarded as an illustration of the idea of relative states.

 

From the standpoint of thermodynamics, it has often been suggested that the "arrow of time" is a consequence of the universe having begun in a very special state of extremely low entropy. According to this view, as physical time advances, the entropy of the universe naturally tends to increase, as the state proceeds from its very improbable initial macro state of low entropy to more probable macro states of high entropy. The implication is that the (physically) early macro states represent only a small number of potential micro states, whereas each (physically) later macro state corresponds to far more possible micro states. (Naturally this mode of speaking presupposes a particular partitioning ("coarse graining") of the underlying micro state space into physically meaningful macro states.) Again we find that the direction of physical time seems to be opposite the direction of automata-time for cellular automata that model this situation, because it is when we run an automaton in reverse that each actual state is just one of multiple possible states (at least locally).

 

The model of the universe as a time-reversed automaton suggests an answer to a puzzling question involving the second law of thermodynamics. The naive "reason" for the inexorable tendency for entropy to increase is that the present universe is in a very small macro state relative to all possible macro states, and hence any random excursions in state space are bound to move the universe into larger macro states. The puzzle arises when we try to extrapolate the present state of the universe into the physical past. On the premise that the fundamental laws of physics governing the evolution of the state vector are time-symmetric, we would expect to find that the entropy of the universe was greater in the physical past than it is at the present low-entropy moment. In other words, just as we would expect entropy to increase in the physical future direction, we would also expect it to increase in the direction of the physical past, i.e., that entropy must have decreased in arriving at the present state from the physical past. (This assumes the set of all possible states is not changing significantly over the relevant time scale. We might hypothesize, to the contrary, that the set of all possible states is continuously increasing, or at least that the sizes of the other possible macro states are increasing, but we won't pursue that possibility here.) Therefore, the naive explanation is unsatisfactory not only because it pushes the question back onto the inexplicably low entropy of the initial universe, but also because it fails to account for the manifest asymmetry of the present situation.

 

The answer suggested (albeit vaguely) by time-reversed automata models is that the "earlier" states of the universe correspond to the physical future, and that the effect of the automata evolution rule (like Maxwell's tiny demons) in the forward automaton-time direction is to drive the system inexorably into progressively smaller macro states, i.e., to conditions of lower entropy, beginning from a huge initial macro state. The increasing size of the universe's macro state as physical time advances then corresponds to the (local) ambiguity of the reverse automaton rule. For example, with Rule 30, each state (of an open string) has three predecessors, and if we regard each of these possibilities as a micro state of the actual macro state, then the size of the macro state of the system n steps previously (in automata-time) was 3^n. This corresponds to an increase in entropy with physical time. Of course, in a sense this "explanation" simply rests on the premise that the automaton rule underlying the laws of physics is not time-symmetric, so it then becomes necessary to explain why the physical laws emerging from this automaton are perceived by us to be symmetrical with respect to physical time. Also, the global reversibility of Rule 30 (discussed above), despite the local predecessor ambiguity, seems to conflict with the notion that the predecessor states represent a larger number of possible micro states.

 

Another interesting and suggestive phenomenon arising from cellular automata loops is the potential for loop doubling, which is similar to slicing a Möbius strip down the middle to create a single strip of twice the length. To illustrate, recall that we considered the 9-bit loop {111111111} (which happens to be the predecessor under Rule 30 of the null loop {000000000}), and we found that the essentially unique (up to rotations) global predecessor is {001001001}. Repeating this reverse procedure, the next predecessor is {101101101}, but this is as far as we can go, because this last loop has no self-consistent predecessor - at least not in the form of a 9-bit loop. However, if we examine the local predecessor threads, and allow them to be "analytically continued" as shown below, we notice an interesting fact.

[Figure: the predecessor threads of {101101101}, analytically continued around the loop a second time.]

None of the local predecessor threads closes over a single loop, but they do close if we go twice around the loop. This is reminiscent of the topology of Riemann surfaces in complex analysis. In a sense, we allow a loop to be multi-valued (like complex functions), so that the predecessor braid can be analytically continued, without requiring closure over each trip around the loop. Super-imposing the two "covers" {101000101} and {000101000}, we notice that some of the bits are consistent, which we may represent using the notation {*0**0**0*}, so these can be regarded as resonant nodes of the analytically continued predecessor braid.
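
The same brute-force sketch can be used to check this double-cover behavior (again, this is only my illustration, and the alignment of the bit strings below is an assumption consistent with the convention used in the earlier sketches): the 9-bit loop 101101101 has no 9-bit predecessor, but the 18-bit loop obtained by traversing it twice does have predecessors, one of which follows the cover 101000101 on the first circuit and the cover 000101000 on the second.

    nine = [1, 0, 1, 1, 0, 1, 1, 0, 1]
    print(loop_predecessors(nine))              # [] -- the threads fail to close in one circuit

    doubled = nine * 2                          # go around the loop twice
    cover = [1, 0, 1, 0, 0, 0, 1, 0, 1] + [0, 0, 0, 1, 0, 1, 0, 0, 0]
    print(cover in loop_predecessors(doubled))  # True (brute force over 2**18 loops; takes a few seconds)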

 
