Complexity

Complexity, like all other human notions, involves blending, to varying degrees, a subjective (extrinsic, context-dependent) interpretation with an objective (intrinsic) description of some system. However, because the observer interacts so strongly with the system of interest, the subjective aspect is especially pronounced. In fact, for some, such as Rosen (1977) and von Foerster (1984), complexity is NOT an intrinsic property of systems at all; rather, they consider it a completely subjective, context- and language-dependent property. Take, for example, the complexity of the brain from the perspectives of a neuroscientist and a five-year-old.

The interaction of observer with observed is generally considered anathema by scientists practicing the venerable scientific method. Thus, even though a rather formidable Schrödinger’s cat lurks in the study of complexity, it tends to be politely ignored by most respectable scientists. For the latter, complexity is simply a feature of the structural and behavioral/functional components of systems. The primary statistics used include the number of components/functions, the number of interactions/connectivity, and so on. Often these numbers are scaled in some manner (e.g., by following an information-theoretic approach). While this approach has been somewhat useful, the subjectivists’ concerns are completely ignored.
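
To make those counts concrete, here is a minimal sketch in Python. The toy "system", its adjacency-list representation, and the function name are my own illustrative assumptions, not anything from the literature cited here; it simply shows the kind of statistics meant above: number of components, number of interactions, and an information-theoretic scaling via the Shannon entropy of the degree distribution.

```python
# A minimal sketch: structural statistics of a hypothetical system represented
# as an undirected adjacency list (all names and data are illustrative).
import math
from collections import Counter

def structural_metrics(adjacency):
    """Return (number of components, number of interactions,
    Shannon entropy of the degree distribution in bits)."""
    n_components = len(adjacency)
    # Each undirected link appears twice in the adjacency list.
    n_interactions = sum(len(nbrs) for nbrs in adjacency.values()) // 2
    degree_counts = Counter(len(nbrs) for nbrs in adjacency.values())
    total = sum(degree_counts.values())
    entropy = -sum((c / total) * math.log2(c / total)
                   for c in degree_counts.values())
    return n_components, n_interactions, entropy

# Usage: a toy 4-node system with undirected links.
system = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}
print(structural_metrics(system))  # (4, 4, 1.5)
```

For this toy system the counts are 4 components, 4 interactions, and 1.5 bits of degree-distribution entropy; of course, deciding what counts as a "component" in the first place is exactly the subjectivists’ point.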

Perhaps the most general and promising approach I am aware of, and one that seems to offer a solution potentially acceptable to both camps, is the algorithmic or computational complexity of a system (e.g., see von Foerster 1984). That is, what is the shortest program that can describe a given system (in the context of a Turing machine)? The length of such a program can change dramatically depending on how clever the programmer is and on the availability of meta-algorithms in the language (summing a sequence of numbers and dividing by their count in one language is a more laborious equivalent of a built-in mean in another, hence the context dependence). Nevertheless, if the programs are compiled to some standard machine code, there may be an asymptotic levelling-off of what “cleverness” can achieve, and so a convergence to some threshold number of computational steps may be expected. And of course, through the application of Occam’s razor, the convergent solution may be considered a heuristic but nearly objective measure of system complexity.
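
As a rough illustration of both points, here is a hedged sketch in Python. The toy data, the use of compression as a stand-in for the shortest program (which is uncomputable in general), and all names are my own assumptions, not part of the argument above: the same mean can be written verbosely or with a built-in, and a compressed description length serves as a crude practical proxy for algorithmic complexity.

```python
# A hedged sketch; the toy data and the compression proxy are illustrative
# assumptions, not a definitive implementation of algorithmic complexity.
import random
import statistics
import zlib

# (1) Context dependence: the same quantity, computed verbosely versus with a
# built-in; program length depends on the language/meta-algorithms available.
def mean_verbose(xs):
    total = 0.0
    for x in xs:
        total += x
    return total / len(xs)

def mean_builtin(xs):
    return statistics.mean(xs)

# (2) A common practical proxy for algorithmic complexity: the length of a
# compressed encoding of the system's description.
def description_length(description: bytes) -> int:
    return len(zlib.compress(description))

regular = b"ab" * 500  # a very short program can regenerate this string
random.seed(0)
irregular = bytes(random.randrange(256) for _ in range(1000))  # little structure

print(mean_verbose([1, 2, 3]), mean_builtin([1, 2, 3]))
print(description_length(regular), description_length(irregular))
```

The absolute numbers depend on the compressor chosen (the context dependence again), but the highly regular string compresses to a few dozen bytes while the structureless one barely compresses at all, which is the kind of convergent, nearly objective ordering meant above.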

The next question is how such measures of complexity evolve and change under the influence of time and perturbations. Sound familiar?

[Obviously, this page is not complete and likely will never be considered complete. But I will add more as time permits.]

References

Heinz von Foerster. 1984. Disorder/Order: Discovery or Invention. In: P. Livingston (ed.), Disorder and Order: Proceedings of the Stanford International Symposium, Anma Libri, Saratoga, pp. 177-189.