I’m not sure what to do with this article about the Tracy-Widom distribution, but it seems like it might contribute to Boyd’s philosophy of conflict on several levels. I’ll offer it in the spirit that Boyd spent a lot of time mining physics and biology for parts to use in his snowmobiles.
The article is “At the Far Ends of a New Universal Law,” by Natalie Wolchover in the October 15th edition of Quanta Magazine.
Usually I’m skeptical about applying physical or statistical concepts to the problems of strategy because these laws assume that the particles don’t behave like participants in a conflict, i.e., that they don’t lie, engage in deception, try to panic the scientists, and so on. As Boyd put it, on chart 132 of Patterns, instead of choosing the alternative that you think will be the most effective, select the one that your opponent will least expect, ideally something your opponent thinks is impossible. Do this just for the panic effect if nothing else. Typically, particles obeying the laws of physics don’t do this.
This article, though, starts off with conflict:
[Biologist Robert May] wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others.
Reminiscent of Boyd’s reliance on the theory of evolution as one of the foundations of his work.
Then there’s the notion of orders of phase transitions:
Similarly, the physicists realized, the energy curves of certain strongly correlated systems have a kink at √2N. The associated peak for these systems is the Tracy-Widom distribution, which appears in the third derivative of the energy curve — that is, the rate of change of the rate of change of the energy’s rate of change. This makes the Tracy-Widom distribution a “third-order” phase transition.
Recall that Boyd (in “New Conception”) began by considering the state of an aircraft — its altitude, airspeed, and direction. Maneuverability was defined as the rate of change of the state, things like turn rate and rate of climb. Agility was the ability to change maneuver state, go from a 9 deg/sec turn in one direction to 9 deg/sec in another, for example. The F-16 is an example of a highly agile aircraft.
The third derivative of state is sometimes called the “jerk,” and this fits Boyd nicely because he often described his philosophy as “jerky and disorienting.” But there’s a deeper way to look at this.
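The ladder from state to maneuverability to agility to jerk can be sketched with successive finite differences. The heading values below are invented sample data for illustration (a hypothetical 9 deg/sec turn that reverses), not flight-test numbers:

```python
# Successive finite differences: state -> rate -> agility -> jerk.
# The heading samples (degrees, one per second) are invented for illustration.

def diff(series):
    """Finite difference between consecutive samples (dt = 1 s assumed)."""
    return [b - a for a, b in zip(series, series[1:])]

heading = [0, 9, 18, 27, 27, 18, 9, 0]  # aircraft heading in degrees

turn_rate = diff(heading)      # 1st derivative: maneuver (deg/sec)
agility = diff(turn_rate)      # 2nd derivative: change of maneuver state
jerk = diff(agility)           # 3rd derivative: the "jerk"

print(turn_rate)  # [9, 9, 9, 0, -9, -9, -9]
print(agility)    # [0, 0, -9, -9, 0, 0]
print(jerk)       # [0, -9, 0, 9, 0]
```

Each application of `diff` climbs one rung: the turn rate is steady until the reversal, the agility spikes where the maneuver changes, and the jerk spikes where the agility itself changes — the level at which, in the article’s terms, a “third-order” transition lives.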
Near the end of his life, Boyd floated the idea of Behendigkeit, which usually translates as “agility.” He regarded it as “mental agility,” however, and based his concept on the idea that you can get trapped in a pattern of ideas and ride it to the bitter end. Behendigkeit was the ability to change patterns, hop on a new snowmobile as it were. If you’re in a conflict, for example, and you see that agility (the second derivative) isn’t working, then Behendigkeit is the ability to come up with something else on the fly. In other words, we’re talking not so much about the aircraft’s ability to change states but our own ability to change how we think about things.
This can be fiendishly difficult to pull off, especially if you were the person who sold your organization the original snowmobile. So Behendigkeit really is a deep concept and will repay a lot of pondering and experimenting. You might find something useful in this article.
Chet, I’m sending you an e-mail with a piece I did some time ago that’s related to the severe negative-start OODA sequence, or unconventional crisis (from Dr. Erwan Lagadec, now at the Elliott School, GWU).
The question for me is what to do with the math. Does it help to explain to senior decision makers that the possibility of flawed decisions in highly complex, severely novel situations has a true foundation in math related to events that are indeed highly coupled in a community environment?
Doing high-availability computer products, we studied how things failed: no single point of failure, “fencing” failures, redundancies, failure recovery, etc.
In highly optimized infrastructures, “fencing” (off failures) and redundancies tend to be eliminated. In ’80s financial operations this was frequently referred to as “systemic risk” … where single failures can cascade into taking down the whole infrastructure.
Many times, the people responsible will try and claim that it was just too complex to understand … when what they really mean is that, at the time, they didn’t care.
A periodic problem is that if you do too good a job and never let a failure/problem manifest itself, those in charge frequently start wanting to cut resources (directly contributing to future problems manifesting themselves).
“Many times, the people responsible will try and claim that it was just too complex to understand … when what they really mean is that, at the time, they didn’t care.”
OR they delude themselves, with or WITHOUT a formal risk assessment, into not spending the money on redundancy and backup.
http://www.quantamagazine.org/20141015-at-the-far-ends-of-a-new-universal-law/
this link worked
Thanks! Fixed now.
“Typically, particles obeying the laws of physics don’t do this.”
At the quantum level, though, there is randomness; it’s been characterized as a quantum foam, or boiling froth. The further removed you become in scale and time, the more smoothness and predictability one perceives. We live at this scale, and our senses are after all geared to it, if not to the scale of intergalactic superclusters, where time passes so very slowly that a human time scale is but a picosecond.
And so, in the organizational or conflict scenario, randomness does occur. While it’s difficult to say when and where exactly, from an outside or hierarchical perspective it’s easier to predict with confidence that, sooner or later, an individual soldier might end up killing his comrades and/or commanding officer.
I like the bell-curve analysis technique, being familiar with the mechanics of such in engineering applications. Another approach that might be interesting could be based on the pervasive current approach to data compression, video in particular. This involves the conversion of raw data (binaries) to frequency components (Fourier transformation). Take all the plans, decisions, and events, successful and unsuccessful, in military campaigns, and then analyze those on a frequency basis. You could distribute those categorically (which I am sure has been done), or from the perspective of just what happens the most frequently. Or, if you prefer: how, when, and why “S%^t Happens” …
Quite true. Randomness, but not lying and deceit.
Agreed. I think it was Einstein (Albert, not Bob) who had difficulty accepting quantum mechanics: “God does not play dice.”
“God does not play dice with the universe.” Whereupon Niels Bohr told him, “Albert, stop telling God what to do.”
“randomness does occur, and while it’s difficult to say when and where exactly,”
Heaven help us if it occurs in our decision-making process, because there is nothing worse than randomness when it comes to defining a tipping point.
I mean in the context of how important Orientation is in the overall process.