Tuesday, April 1, 2008

Relative Discretion, Relative Complexity

In “Probable Geometries” Greg Lynn defines the joint concepts of discrete and continuous using Jakob Bernoulli’s 1713 description of discrete probability. Using the statistical analogy of a coin toss, Bernoulli states that a discrete system would include a coin whose symmetry [and geometric form] permits the speculator to state in advance that the coin will almost always land on one of its two sides, and thus almost always come to rest in a position that reveals either a “head” or a “tail”. A continuous system, according to Lynn and Bernoulli, includes a coin which, due to certain deformations in its geometry, does not permit the speculator to state in advance the possible positions in which the object might come to rest. An experimental series must therefore be performed to establish the frequency [or probability] of these possible positions.
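A crude way to restage this distinction in code, with coins swapped for random draws (the five resting positions and their hidden weights below are entirely hypothetical placeholders of my own): for the symmetric system the odds can be stated before any toss, while for the deformed system the speculator can only run the series and tabulate frequencies.

```python
import random

# A priori knowledge: the symmetric coin's outcome space and odds
# are stated in advance, no experiment required.
a_priori = {"heads": 0.5, "tails": 0.5}

# The deformed coin: five hypothetical resting positions whose weights
# are unknown to the speculator; only an experimental series reveals them.
positions = ["ponytail", "Liberty", "mint date", "seam", "Unum"]
weights = [0.31, 0.24, 0.22, 0.13, 0.10]  # hidden from the speculator

trials = 10_000
counts = {p: 0 for p in positions}
for _ in range(trials):
    counts[random.choices(positions, weights)[0]] += 1

# The observed frequencies approach the hidden weights only in the limit;
# before the series is run, nothing can be stated in advance.
for p in positions:
    print(f"{p}: {counts[p] / trials:.3f}")
```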

To construct one such experiment, take a coin, a nickel as we know it today (Jefferson on one side, his home always on the opposite), and inflate it from its centroid to form a perfect sphere. Jefferson and Monticello are deformed equally into two hemispheres connected along their equator by a seam—previously the sides of the original nickel cylinder—now similar to the make-up of a red rubber bouncy ball, less the red, the rubber, the bouncy. The nickel, once approximately the diameter of a marble, is now, in sphere form, closer to the size of a BB, as no alchemy has been performed here. No additional molecules have been added to the nickel in order to bring it to its new shape; both the original and the new nickel are equal in weight, mass, density and the like. The first nickel, when tossed, performs as Bernoulli suspects, while the second now rolls until coming to rest on the ground and reveals, when viewed orthogonally in plan, most of Jefferson’s ponytail, “Liberty”, the mint date, along with “Unum”, the right third of M-O-N-T-I-C-E-L-L-O, the word “America” and the smooth shiny seam holding the two hemispheres together. The probability that the nickel will land to reveal exactly the same image again once in the next one hundred tries is slim at best.
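For what it’s worth, the shrinkage itself can be checked with nothing more than volume conservation. The sketch below assumes the published US Mint dimensions for a nickel (the numbers are mine, not the essay’s); holding the volume fixed, the sphere’s diameter comes out at roughly half the coin’s.

```python
from math import pi

# Assumed US Mint dimensions for a nickel (not from the essay).
d, t = 21.21, 1.95             # diameter and thickness, in mm
v = pi * (d / 2) ** 2 * t      # volume of the original cylinder, mm^3

# No alchemy: the sphere must hold exactly the same volume,
# so its radius follows from v = (4/3) * pi * r^3.
r = (3 * v / (4 * pi)) ** (1 / 3)
print(f"cylinder volume: {v:.0f} mm^3, sphere diameter: {2 * r:.1f} mm")
```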

As a system we can understand Bernoulli’s analogy conceptually in architecture, in literature, and in most other arts and soft sciences. To a physicist, however, I imagine that both of our nickels are either both discrete or both continuous. Both are made up of atoms and are divisible as such; economically each is still divisible into five pennies. What is important, though, is that the difference between one and the other lies in their distinct formal qualities. Yet taste, smell, sound, and other typical analyses all produce the conclusion that each is qualitatively the same; neither is more or less continuous than the other. Still, Bernoulli, and Lynn, and I continue to believe in the coin toss analogy. This has to do with the notion of relative discretion. Each system must be able to define, on its own terms, its own discretion. Under this law a toroid may be considered discrete if it is the primitive geometry from which the system begins. Until acted on by forces (fluid, dynamic, or otherwise), the toroid will remain discrete and may serve as a constant against which all post-operative geometries within the same experiment may be measured. In this way the coffee cup, the toroid’s famous topological twin, may be considered continuous although the two geometries may contain the same number of vertices, edges, faces, etcetera.
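The last point can be made concrete in mesh terms. The sketch below is my own construction, not Lynn’s: a quad mesh of a toroid is pushed toward a “coffee cup” vertex by vertex, and its counts of vertices, edges, and faces (and therefore its Euler characteristic) never change. Nothing in the combinatorics distinguishes the two geometries; only the system’s own starting point does.

```python
import math

M, N = 16, 8                       # quad resolution around the two loops
R, r = 2.0, 0.5                    # major and minor radii of the toroid

# Vertices of the primitive (discrete) toroid.
verts = [((R + r * math.cos(2 * math.pi * j / N)) * math.cos(2 * math.pi * i / M),
          (R + r * math.cos(2 * math.pi * j / N)) * math.sin(2 * math.pi * i / M),
          r * math.sin(2 * math.pi * j / N))
         for i in range(M) for j in range(N)]

# Quad faces and shared edges, by index; these never reference position.
faces = [(i * N + j, ((i + 1) % M) * N + j,
          ((i + 1) % M) * N + (j + 1) % N, i * N + (j + 1) % N)
         for i in range(M) for j in range(N)]
edges = {frozenset((f[k], f[(k + 1) % 4])) for f in faces for k in range(4)}

# A stand-in "coffee cup" operation: stretch and bulge every vertex.
cup = [(x * (1 + 0.3 * z), y * (1 + 0.3 * z), 1.5 * z) for x, y, z in verts]

# Same V, E, F before and after, and V - E + F = 0 either way: the two
# geometries differ formally, not in their counts.
V, E, F = len(cup), len(edges), len(faces)
print(V, E, F, V - E + F)          # 128 256 128 0
```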

In each of these models an operation is performed on the discrete to produce the continuous.

It would seem also that the notion of relative discretion may pertain to the algorithm itself. We have discussed at length the complexity produced by such simple rule-based algorithms as cellular automata and Turing’s universal machine. Specific conversations in class have debated the actual complexity of the cellular automaton. During one particular session the argument was made that in both the CA and the UM, because the rules are stated before the experiment is begun, it is conceivable that the result of each may be predicted with some certainty. The opposite side of the debate acknowledged that, while indeed the rules are known from the start, the aggregation of these simple rules cannot be predicted, and that the only sure way to realize the results would be to run the algorithm. Statistically this must be true, although the complexity of the rule set will always have a direct relationship with the odds of predicting the outcome. The coin, for instance, must be tossed in order for us to know the result with 100% certainty; before the coin is tossed, though, the speculator is given 50% odds. The CA must be predictable to some degree of certainty in the same way.
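The debate is easy to restage in a few lines. Below is a sketch of an elementary cellular automaton, using Wolfram’s rule 30 purely as a familiar example: the entire rule set fits in a single byte and is fixed before the experiment begins, yet the only sure way to know the hundredth row is to compute the ninety-nine rows before it.

```python
# Elementary cellular automaton; the rule number is an 8-bit lookup
# table fixed before the run begins. Rule 30 is chosen as an example.
RULE = 30

def step(cells):
    # Each new cell is read off the rule byte, indexed by the
    # three-cell neighborhood above it (wrapping at the edges).
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31 + [1] + [0] * 31    # start from a single live cell
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = step(row)
```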


An analogy can be made to the game of blackjack, which comes with its own set of rules. Similar to the rules of a cellular automaton, the rules of blackjack are established at the start. These rules are largely Boolean—hit or stand—with some added complexity in split, double, and bust. Given three cards, the two in the speculator’s hand and the one dealer card shown, a single response is expected. The speculator’s hand includes an ace and a nine while the dealer shows a two—the speculator is expected to stand. The speculator has two eights while the dealer shows a nine—the speculator is expected to split the pair, doubling his or her wager. Although the strategy chart would indicate that blackjack is significantly more complex than the coin toss, the pit boss knows that the speculator’s odds of winning a given hand are always roughly 42%, tipped only slightly in the house’s favor. Before each card is dealt the speculator knows what to do with each new circumstance and exactly what those new circumstances might be; the ability to count cards, where he or she is seated at the table, the size of the deck, and other advantages may even help to bring these 42% odds closer to the 50% available at the coin toss. As discretion is relative, so too is complexity, and 42% can hardly be considered complex. This is not to say that complexity is not available in blackjack; it does suggest that the complexity must come from a secondary influence, or a second-level algorithm. Studying a full blackjack table, you will notice that the response of the first speculator to act will directly impact the game of the next, and so on around the table. It is not uncommon for tempers to flare if one speculator offers a response not specified by the given rule set. Such a response disrupts the flow of the next player’s game, resulting in him or her receiving the card before or after the card they would have been dealt had the game been carried out by a computer alone. The possibility of this type of random operator introduces an open-endedness into the system and allows for the unpredictable result.
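The deterministic half of that argument can be written as a bare lookup table. The fragment below encodes only the two situations named above (a toy of my own making, not the full chart): before any card is dealt, every circumstance already maps to exactly one response.

```python
# A toy fragment of the blackjack strategy chart; only the two rows
# discussed above are encoded, the rest of the chart is omitted.
STRATEGY = {
    ("soft 20", 2): "stand",   # ace + nine against a dealer two
    ("pair 8s", 9): "split",   # two eights against a dealer nine
}

def respond(hand, dealer_card):
    # Situations outside the fragment would defer to the full chart.
    return STRATEGY.get((hand, dealer_card), "consult the full chart")

print(respond("soft 20", 2))   # -> stand
print(respond("pair 8s", 9))   # -> split
```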

Similarly, Lynn uses the term incomplete to describe the random section method used by both Le Corbusier and Rem Koolhaas, at the Maison Citrohan and the Très Grande Bibliothèque respectively, to produce probable geometries. In “Strategizing the Void” Koolhaas uses the terms regular and irregular.

“Imagine a building consisting of regular and irregular spaces, where the most important parts of the building consist of an absence of building. The regular is the storage, the irregular is the reading rooms, not designed, simply carved out. Could this formulation liberate us from the sad mode of simulating invention?”

Here a discrete system is established by regular floor plates and structural organizations, and an algorithm is deployed for networking the system vertically through elevators and escalators. While the system is technically made continuous by this operation alone, it is still relatively un-complex. A secondary, random operator is then run through the system to “excavate where efficient”—thus producing the irregular.
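Read schematically (the reading below is entirely my own construction, not Koolhaas’s method), the sequence might be sketched as a regular, discrete stack of floor plates through which a secondary random operator carves the irregular:

```python
import random

random.seed(0)                                # repeatable for the sketch
FLOORS, BAYS = 9, 12
block = [[1] * BAYS for _ in range(FLOORS)]   # 1 = regular storage bay

# Secondary operator: a random walk stands in for "excavate where
# efficient," removing a band of bays as it climbs through the section.
bay = BAYS // 2
for floor in range(FLOORS):
    for b in range(max(0, bay - 1), min(BAYS, bay + 2)):
        block[floor][b] = 0                   # 0 = carved-out void
    bay = max(1, min(BAYS - 2, bay + random.choice((-2, -1, 0, 1, 2))))

# Print the section, top floor first: the regular made irregular.
for floor in reversed(block):
    print("".join("#" if cell else "." for cell in floor))
```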

A similar process can also be seen in Tschumi, which he describes as Space/Event (Movement) and tests at the Parc de la Villette. He includes the term movement, but always secondarily, and often removes it completely to say Event Space. A more indexical description of his theory would reorder the terms: space-movement-event. Here a discrete system is established with the construction of space; the movement of the user then activates the space to produce the event.
