VII. Hierarchy Theory Literature

The aim of the preceding sections was to introduce the general reader to the concept of hierarchical thinking – to raise awareness of how both the observer and the observed contribute to perception, not unlike how the fishing net and the contents of the water interact to determine what kind of fish is caught. Only potential patterns exist independently of the observer. At its very simplest, hierarchical thinking is a position of self-consciousness about how our knowledge, while not an arbitrary construct, is the exploitation of dynamics that arise from mindless self-complication processes which, given the right filter, translate into useful regularities.

Hierarchical thinking therefore naturalizes “knowledge” and “understanding” as psychological phenomena, dependent on reality, but equally subject to scientific study themselves. The study of knowledge, “epistemology”, is, as commonly taught, purely philosophical and lacks this scientific component, and most high school students graduate with the impression that pi is indeed in the sky and that knowledge is something metaphysically absolute, “up there somewhere”. More worryingly, they graduate with the belief that science is the pursuit of truth, and not just the pursuit of the most useful description to serve mankind’s purposes. There have been well-intentioned initiatives to make epistemology part of mainstream education, such as the “Theory of Knowledge” course of the IB Diploma Programme, but that course is based on classical (and obsolete) Western philosophy, without the coherent cognitive/systems perspective that makes hierarchy theory so powerful.

The perspective that regards knowledge as a tool and not as a transcendent truth is called “pragmatic”, to distinguish it from “platonic”. The pragmatism of hierarchical thinking is what allows it to guide more productive debate, free from semantic squabbling and cognitive biases, and to encourage more effective, unified and intellectually satisfying education, free from intuition-hampering misconceptions and “just so” attitudes. Hierarchy theory has yet to penetrate the public consciousness, but I passionately believe that teaching hierarchical thinking to the next generation – along with a healthy dose of skepticism and humility – is imperative, long overdue, and potentially transformative of the entire educational experience.

It is a shame, then, that there is no cohesive literature available for further reading, for “hierarchy theory” straddles a bustling borderland of areas including cognitive science, informatics, and systems theory – a borderland that is more meta-disciplinary than inter-disciplinary. This section therefore provides brief digests of the books that informed this discussion. They were found through a mix of serendipity and relentless perusal of Amazon recommendation lists. My contribution has been a modest one of synthesis and illustration, with little original to add to the discussion – most examples are borrowed from the books listed here. The book list is very nontechnical and introductory simply because I myself am not literate in the technical literature – this text represents my own understanding stretched to its snapping point. Any advanced high school student will have no difficulty reading them.

Ahl, Valerie & Allen, T.F.H. (1996): “Hierarchy Theory: A Vision, Vocabulary, and Epistemology”

This is probably the book to which the current discussion is most indebted – indeed, it is the only nontechnical book I am aware of that explicitly calls for a hierarchical framework as an analytical approach in all sciences (it is already a rather well-established concept within ecology). When I chanced upon this light, unassuming volume in the university library, it resonated profoundly with everything I had read in complexity science. If philosophy syllabus-writers across the world had to choose one work to include, I suppose this would be the one.

The book defines the current epistemology of science as “naïve realism”, the idea that knowledge derives from the external world independently of observer decisions. Ahl and Allen then attempt the monumental task of characterizing the interaction between the observer’s constructivist role and the impinging reality’s passive role, and of showing why understanding this interaction matters for developing more reliable predictions. The analogy of observation as a fishing net and the qualitative description of subsystems as “functional components held in freefall” are both courtesy of them. Essential concepts include:

  • “Entity” / “Context”: Discrete entities emerge from a continuous, undifferentiated backdrop depending on observer criteria like grain and extent (spatiotemporal scales).
  • “Frequency characteristics”: Entities at lower levels show relatively fast-moving, high-frequency behavior, so that higher levels serve as constraining contexts for lower ones.
  • “Surface”: The discontinuities that arise from different reaction rates inside and outside an entity.
  • “Response rate”: As windows in surfaces let information through, they change structurally. Some windows may be “long-term”, averaging information over time, while others respond more sensitively.
  • “Definitional / empirical entity”: The ordering of levels in an empirical hierarchy depends on the research question, whereas a definitional hierarchy is a logical exercise to organize thoughts.
  • “Nested / non-nested hierarchy”: Some hierarchies are nested, e.g. some soldiers have a higher rank than others but are still soldiers, while others are non-nested, e.g. the judicial system, where judges are not policemen. Whether a system is characterized as nested or non-nested depends on what is most useful.

Mathematics has developed concepts that make “hierarchy” precise: a hierarchy is a partially ordered set, a collection of elements with an ordered, asymmetric relation defined among them. Apart from Herbert Simon, Ahl and Allen also recognize chemist Ilya Prigogine and author Arthur Koestler (who in his classic work “The Ghost in the Machine” presents a related concept called the “holon”) as famous representatives of hierarchy theory.
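For readers who want the formal definition hinted at above, here it is in standard order-theoretic notation (my own addition, not Ahl and Allen’s): a strict partial order on a set S is a relation ≺ that, for all a, b, c in S, satisfies

    \begin{align*}
      &\neg(a \prec a)                                        && \text{(irreflexivity)}\\
      &a \prec b \;\Rightarrow\; \neg(b \prec a)              && \text{(asymmetry)}\\
      &a \prec b \;\wedge\; b \prec c \;\Rightarrow\; a \prec c && \text{(transitivity)}
    \end{align*}
    % e.g. with S = \{\text{cell}, \text{tissue}, \text{organ}, \text{organism}\} and
    % x \prec y read as ``x is contained in (constrained by) y'', the chain
    % cell \prec tissue \prec organ \prec organism is one ordering of levels.

The order is “partial” because not every pair of elements need be comparable, which is exactly what allows several parallel branches to coexist in one hierarchy.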

Cohen, Jack & Stewart, Ian: “The Collapse of Chaos” (1994) and “Figments of Reality” (1997)

I mostly read e-books, but of these I have physical copies. I have read them multiple times, and my annotations in them are so dense as to defeat their purpose. Figments can be considered a continuation of Collapse, with quite substantial overlap in content; both follow a whimsical, self-aware format of quirky chapter titles, poetic imagery, and expository, metaphorical dialogue between two aliens. They ambitiously address many of the big questions, such as the notions of complexity and simplicity, determinism and randomness, how best to characterize evolution, the nature of consciousness, perception, and free will. What binds it all together is their notion of “contextualism”, which they offer as an explanatory paradigm complementary to reductionism. It is the “why?” to reductionism’s “how?”.

Cohen and Stewart are best friends with “Discworld” fantasy writer Terry Pratchett and are often consulted to ensure that his imaginative alternative world feels internally consistent and plausible. Fictional worlds can be considered points in a “World Space”, the phase space of all logically conceivable worlds; returning to more earthly matters, contextualism can be understood as the attempt to explain why the state is exactly this one and not another possible state in the phase space. For example, a wing can be reductionistically “explained” in terms of genetics, but genetics cannot answer the question of why wings evolved in the first place, or why wings evolved when balloon-like flying devices would also have worked. Are there attractors inherent in the phase space dynamics (“universals”), or is the outcome due to random accidents and historical contingencies (“parochials”)? If evolution were a time-lapse movie that could be re-run with slightly different initial conditions, just how different would the world look?

What Cohen and Stewart’s argument boils down to, throughout the two books, is that “complicity” – the interaction of phase spaces – leads to “simplexities” (emergent simplicities) wherever there are complex systems, so that the hierarchies that result are a source of universals in the unfolding of evolution. Evolution is explained as the complicity of DNA space and phenotype space, and consciousness as the complicity of brain and language/culture. Their vivid prose and wide scope make these books very suitable for high school students who want a richer appreciation of the emergence of complexity without the still-ingrained nature-nurture dichotomy.

Mitchell, Melanie (2009): “Complexity: A Guided Tour”

This is a very lucid book that provides the general reader with a personal insider’s view of the activities of the legendary Santa Fe Institute in New Mexico, which is dedicated to studying the behavior of complex systems, and of the breakthroughs that preceded its establishment. It is rich in diagrams, pictures, and profiles of key figures such as Boltzmann, Poincaré, Gödel, Turing, Gould, von Neumann, Holland, Wolfram, and Kauffman, among many others. Hierarchy plays no central role in this account, probably because she uses it only in the sense of structural nestedness, rather than in our more expanded sense.

I particularly appreciate Mitchell’s persistence in the first section in trying to clarify the many vague notions with which scientific discourse is rife. A complex system can without controversy be defined as “a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution” (p. 13). Her discussion of how complexity can be quantified is much more interesting.

“Complexity” can be qualitatively defined as behavior that violates our intuitive expectations of how matter should behave. Randomly generated patterns, such as sand scattered on the pavement, may be very complicated, but they do not necessarily feel “unnatural” and do not exemplify complexity, because they accomplish nothing interesting. We expect matter to behave in a disorderly way (“increase entropy”) unless something intelligent intervenes, because randomly moving entities are statistically more likely to spread out, much as air molecules diffuse instead of accumulating in a corner.

If matter behaves in statistically more surprising ways (as in the evolved solar-energy-dissipating beings that are humans, as well as the artifacts humans produce), then it can be said to contain information, for it correlates with (informs about) whatever influence caused this improbable behavior. Think about a handwritten letter: the patterns of ink on it are not what you would expect from someone dropping an inkpot on the paper. Instead, its conspicuous orderliness suggests intention, the intervention of a human – that the letter is a communicative act. But equally, a population of neurons in a rat brain will not alter its connections in response to random stimulation – it takes repeated stimulation that only patterns (improbable dynamics) in the environment can bring about. The brain encodes information just like the ink does, and the environment is like the letter-writer – an intervener. The concept of information has therefore expanded beyond that of a conscious, human communicator, and its application is wide, if sometimes not very straightforward.

Complexity is obviously linked with information content, but the link is subtle. Borrowing the language of computer science, a feature ceases to be surprising (i.e. informative) if it can be predicted from some generating mechanism, the way a sequence of numbers can be specified by an algorithm. Similarly, considerable biological complexity is known to arise from relatively simple genetic rules, so that information is compressed in these subroutines. The complexity that remains after such information-compression has taken place must stem from an evolutionary past, so that a certain amount of information about the past behavior of a system is needed to predict its future behavior (e.g. how the organism will develop). This quantification of complexity is known as “statistical complexity”. (Mitchell also mentions “degree of hierarchy” as another measure of complexity, name-dropping Simon.)
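To make the compression intuition concrete, here is a small illustrative sketch of my own (not from Mitchell’s book), using an off-the-shelf compressor as a crude stand-in for “the shortest generating mechanism”: a regular sequence can be summarized far more briefly than a random one of the same length.

    import random
    import zlib

    def compressed_size(data: bytes) -> int:
        # Length after zlib compression: a rough proxy for how short a
        # "generating mechanism" for the data could be.
        return len(zlib.compress(data, 9))

    # A highly regular sequence: predictable, therefore compressible.
    regular = ("0123456789" * 1000).encode()

    # A random sequence of the same length: no pattern to exploit.
    random.seed(42)
    noise = bytes(random.randrange(256) for _ in range(len(regular)))

    print("regular:", len(regular), "->", compressed_size(regular))
    print("random: ", len(noise), "->", compressed_size(noise))
    # The regular string shrinks to a few dozen bytes; the random one barely
    # shrinks at all (it may even grow slightly), i.e. it admits no shorter
    # description than itself.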

In the third section she also has a fascinating discussion of what is really meant by the notion of “computation”. Just as “information” originally referred to communication performed by humans, “computation” originally referred to calculations performed by humans. Physicists describe the universe as one big “quantum computer”, and in biology the term is applied, for example, to ant colonies and the immune system. In machines, “information” can be said to be what the program (instructions) acts upon, as opposed to “meaning”, which is the human knowledge of the task performed. Programs are ultimately strings of bits – 1s and 0s – but it is important to understand that a single string does not process the information to accomplish the task, just as a single cell does not process information in the immune system. Instead, information processing resides in the collective actions – in their statistics and dynamics. The immune system, for example, “represents” the pathogen population in the body through the spatial distribution and temporal dynamics of lymphocytes, by having each (randomly moving) lymphocyte sample concentrations of certain molecules and act accordingly. Mitchell mentions Hofstadter’s “parallel terraced scan” to sum this up.
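A toy simulation of my own (not Mitchell’s, and far cruder than her immune-system account) may help fix the idea that the representation lives in population statistics rather than in any single agent: many randomly wandering “lymphocytes” each sense only their local site, yet their pooled activations map out where the “infection” is.

    import random

    random.seed(0)

    GRID = 50                      # a 1-D "tissue", positions 0..49
    pathogen = [0.0] * GRID
    for p in range(20, 30):        # an infection localized to one region
        pathogen[p] = 0.8

    positions = [random.randrange(GRID) for _ in range(500)]  # 500 wandering cells
    activation_map = [0] * GRID    # where activations have occurred

    for _ in range(200):           # simulation steps
        for i in range(len(positions)):
            # each cell moves randomly and senses only its local site
            positions[i] = (positions[i] + random.choice([-1, 1])) % GRID
            if random.random() < pathogen[positions[i]]:
                activation_map[positions[i]] += 1

    # No single cell "knows" where the infection is, but the population's
    # activation statistics represent it:
    peak = max(range(GRID), key=lambda i: activation_map[i])
    print("most activations around position", peak)   # lands inside 20..29
    print("activations inside the infected region:",
          sum(activation_map[20:30]), "of", sum(activation_map))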

Meadows, Donella H.: “Thinking in Systems: A Primer” (2008)

I was all bubbly with delight when I found this book. Meadows has written an introduction to systems theory that is brimful of examples and amazingly approachable. You can feel your IQ surge as you revel in the parallels and analogies transecting each other in this fabric vibrating with insight. Its aim is to promote a more sophisticated understanding of topics often fraught with over-simplification and cognitive biases. We readily allocate blame for economic recessions to political leaders, even though recessions are inherent in the dynamics of the market economy. We try to find one cause for drug addiction – either the addict or his environment – rather than seeing it as embedded in a large set of factors. We want linear, arrow-like chains of causation to put in our textbooks, and seem unable to resist our reductionist instincts to zoom in and pick apart. But rational decision-making and redesign of dynamics can be done only if we think of these phenomena as systems.

What is a system? Meadows defines “system” as “a set of things interconnected in such a way that they produce their own pattern of behavior over time” (p. 2), thus focusing on self-maintenance and resilience. Sand on a road is not a system, since there are no interrelations between the grains to prevent the arrangement from being disturbed by a passing wind. That is also why a dead organism is no longer a system – the molecules cease to interact in interesting ways. But systems are partly mental constructs – they rarely have a real, or any single legitimate, boundary – and they often behave in surprising ways as a result of counter-intuitive “nonlinear relationships”, where a change does not cause a proportionate effect. Imagine a curvy graph – one step along the x-axis may cause a negligible change in the y-value, whereas the next step may change it dramatically.
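As a small numerical illustration of such a nonlinear relationship (my own, not Meadows’), take a steep logistic curve: equal steps along x produce wildly unequal changes in y.

    import math

    def response(x):
        # a steep logistic curve: nearly flat, then suddenly responsive
        return 1 / (1 + math.exp(-3 * (x - 5)))

    for x in [3.0, 4.0, 5.0, 6.0]:
        print(f"x = {x}: y = {response(x):.3f}")
    # Stepping from x = 3 to 4 changes y by only ~0.04,
    # while the equally sized step from 4 to 5 changes it by ~0.45.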

The book goes through key concepts such as feedback (the mechanisms that restore the system after perturbations – its “homeostasis”), self-organization (a system’s tendency to complexify), limiting factor (the input that matters most – e.g. capital invested in developing countries to promote technological development there may fail because capital is not the limiting factor), bounded rationality (a concept of Simon’s about how apparently rational decision-making on the basis of incomplete information can lead to suboptimal results) and stock (the memory of the changing flows within the system – like the gene pool, the increasing complexity of the Earth’s crust can be regarded as an information-storing device).
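Here is a minimal stock-and-flow sketch of my own (not from the book) to show how this vocabulary fits together: a stock (room temperature) is changed by flows (heating and leakage), one of which is governed by a balancing feedback loop pulling the stock toward a goal.

    # A room whose temperature (the stock) is pushed toward a thermostat
    # setting (the goal) while heat constantly leaks to the outside.
    goal = 20.0          # thermostat setting, degrees C
    outside = 5.0        # outside temperature
    temp = 10.0          # the stock: current room temperature
    leak_rate = 0.1      # fraction of the indoor-outdoor gap lost per step
    heating_gain = 0.3   # how strongly the feedback corrects the gap to the goal

    for step in range(30):
        inflow = heating_gain * (goal - temp)   # balancing feedback: heat more
                                                # the further below the goal
        outflow = leak_rate * (temp - outside)  # leakage to the environment
        temp += inflow - outflow                # the stock integrates the flows
        if step % 5 == 0:
            print(f"step {step:2d}: temperature = {temp:5.2f}")

    # The temperature settles not at the goal of 20 but at the point where
    # heating inflow exactly balances the leak - the second loop the
    # thermostat loop must continually fight against.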

It even has its own section on hierarchy. Meadows observes that systems often generate hierarchies of aggregated subsystems from the bottom up, and that such hierarchies are a source of resilience. Hierarchies may become unstable due to “sub-optimization”, when a subsystem’s goals come to dominate to the detriment of the total system, or due to “over-centralization”, to the detriment of the subsystems’ self-maintenance mechanisms. Feeding into our discussion of understanding is her section on “leverage points” – the aim of science, she argues, is to locate the places in systems where a small change leads to large behavioral shifts, because these findings are the most powerful.

Skyttner, Lars: “General Systems Theory” (2001)

This book is an engrossing, hubris-inducing read that will enrich any learning experience in any subject. It is more in-depth and historical in perspective than Meadows, but equally lucid. Throughout the book Skyttner includes witty poems and quotes, as well as stimulating review questions at the end of each chapter.

Skyttner begins by situating systems theory against the historical backdrop of holistic world views, moving from Medieval scholasticism’s esoteric idea of reality as an encyclopedic collection of facts that could be classified, to the Machine Age’s “clockwork universe”, in which phenomena were uniformly explained using mechanical metaphors (for example, the heart was viewed as a pump), and to the influential concept of “Laplace’s demon” – the idea that if we knew the position and speed of every particle in the universe, we could use Newton’s laws to predict anything forwards or backwards. Reality, in other words, would be computationally reversible. This paved the way for “reductionism”, the position that phenomena can be explained from their constituents without accounting for context or observer – a position that proved untenable in the 20th century, when quantum physics exotica shed new light on the role of the observer in observation.

Such was the philosophical climate in the 1920s, when it became increasingly evident in biology and psychology that the phenomena studied would lose important properties if reduced to their components. A need was recognized for approaches that focus on dynamic properties rather than structural decomposition. Meanwhile there was a strong desire to synthesize the different disciplines by abstracting away fine details and identifying common design principles across them, with holistic movements like “cybernetics” and “Gestalt psychology” cropping up across the sciences. In 1954, the International Society for General Systems Theory was founded, hoping to create a skeleton for science by integrating disparate areas through analogies. Its view was that “Man is more the creator of reality than its discoverer” and that reality could be recast in new terms like “goal seeking”, “transformation”, “input/output”, “convergent evolution” and “divergent evolution”.

Skyttner includes an interesting discussion of the nature of a system. According to biologist Paul Weiss, “A system is anything unitary enough to deserve a name”, while according to Kenneth Boulding it is anything “that is not chaos”. It is a way of organizing thoughts around things with order, pattern, purpose and constancy, with an apparent identity and goal. He includes an extensive taxonomy of Greek-sounding distinctions, and a treasure chest of concepts and cognitive tools, among them hierarchy, of which he writes: “Virtually all complex systems recognized in the real world have the tendency to organize hierarchically” (p. 60).

Other sources:

  • Barrow, John D. (2007): “New Theories of Everything”. A popular book that mostly covers exotic theories from fundamental physics, but also has a nice section on how this reductionist quest relates to the quest for basic organizational principles, thermodynamics, and the metaphysics of mathematics, information and computation. Less cognitive-scientific than ideal, but with very good explanations.
  • Davies, Paul & Gregersen, Niels Henrik (2010): “Information and the Nature of Reality”. A really nice compilation of essays on the history of world views and materialism, and on the concept of computation/information in fundamental physics and biology.
  • Holland, John H. (1996): “Hidden Order: How Adaptation Builds Complexity” & (2012): “Signals and Boundaries”. Holland is a pioneer of genetic algorithms, which simulate evolution by featuring a virtual population of instruction scripts (genomes) that randomly recombine with each other (sexual reproduction) and are subject to random noise (mutations), so that complex diversity results downstream over the running course of the program (a minimal sketch of this mechanism follows after this list). With this background it makes sense that he models the evolution of complex adaptive systems in terms of strings of bits, where boundaries are conditional-statement strings that operate on signal strings. As the conditional string is modified by reproduction and mutation, its range changes, and the result is a “default hierarchy” in which more general rules remain to fall back on as the genome continually explores the competence of more specific rules. The two books overlap a lot in content and contain a bit of math.
  • Waldrop, M. Mitchell (1992): “Complexity: The Emerging Science at the Edge of Order and Chaos”. An exciting book with similar coverage to “Complexity: A Guided Tour”.
  • McShea, Daniel W. & Brandon, Robert N. (2010): “Biology’s First Law”. I haven’t really read this book, but it is about how the incorporation of error in reproductive systems leads to inevitable complexification, so it seems to be a good supplement to Holland.
  • Morowitz, Harold J. (2002): “The Emergence of Everything”. Basically covers milestones in the self-complication of the universe from the Big Bang to language and technology. He does not explain the process so much as go into the details of each milestone in depth, so the book requires patience to read.
  • Simon, Herbert A. (1962): “The Architecture of Complexity”. This is the foundational work in hierarchy theory. It is a sharp, light read with some mathematics you can skim through. Google Simon while you’re at it.
  • Vedral, Vlatko (2010): “Decoding Reality”. Takes an informatics viewpoint on everything from human concerns like biology and economics down to determinism and quantum physics.
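To make the genetic-algorithm bullet above concrete, here is a minimal sketch of my own (a generic toy example, not Holland’s classifier-system code): bit-string genomes recombine and mutate, and fitness-weighted selection lets order accumulate over generations.

    import random

    random.seed(1)

    TARGET = [1] * 32                 # a toy "environment": all-ones is fittest
    POP, GENERATIONS, MUT = 60, 80, 0.01

    def fitness(genome):
        # count how many bits match the target pattern
        return sum(g == t for g, t in zip(genome, TARGET))

    def crossover(a, b):
        # single-point recombination ("sexual reproduction")
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def mutate(genome):
        # random noise: each bit may flip with a small probability
        return [1 - g if random.random() < MUT else g for g in genome]

    population = [[random.randint(0, 1) for _ in range(32)] for _ in range(POP)]

    for gen in range(GENERATIONS):
        # fitness-weighted selection of parents, then recombination + mutation
        weights = [fitness(g) + 1 for g in population]
        parents = random.choices(population, weights=weights, k=2 * POP)
        population = [mutate(crossover(parents[2 * i], parents[2 * i + 1]))
                      for i in range(POP)]
        if gen % 20 == 0:
            print(f"generation {gen:2d}: best fitness =",
                  max(fitness(g) for g in population))

    print("final best fitness:", max(fitness(g) for g in population))  # approaches 32

Nothing in the population is told what the target looks like; the order accumulates purely from the statistics of reproduction, recombination, mutation and selection, which is the sense in which Holland speaks of adaptation building complexity.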