06 June 2022

Systems Thinking: On Mechanisms (Quotes)

"The concept of teleological mechanisms, however it may be expressed in many terms, may be viewed as an attempt to escape from these older mechanistic formulations that now appear inadequate, and to provide new and more fruitful conceptions and more effective methodologies for studying self-regulating processes, self-orienting systems and organisms, and self-directing personalities. Thus, the terms feedback, servomechanisms, circular systems, and circular processes may be viewed as different but equivalent expressions of much the same basic conception." (Lawrence K Frank, 1948)

"As a metaphor - and I stress that it is intended as a metaphor - the concept of an invariant that arises out of mutually or cyclically balancing changes may help us to approach the concept of self. In cybernetics this metaphor is implemented in the ‘closed loop’, the circular arrangement of feedback mechanisms that maintain a given value within certain limits. They work toward an invariant, but the invariant is achieved not by a steady resistance, the way a rock stands unmoved in the wind, but by compensation over time. Whenever we happen to look in a feedback loop, we find the present act pitted against the immediate past, but already on the way to being compensated itself by the immediate future. The invariant the system achieves can, therefore, never be found or frozen in a single element because, by its very nature, it consists in one or more relationships - and relationships are not in things but between them." (Ernst von Glasersfeld, "Cybernetics, Experience and the Concept of Self", 1970)

"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium. To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)
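Heylighen's formal mechanism - noise-driven variation exploring the state space until it enters an attractor - can be sketched numerically. In this illustration the double-well potential, step size, noise level, and seed are all assumptions chosen for the sketch, not anything from the quote:

```python
import random

def self_organize(x0, steps=2000, eta=0.01, noise=0.05, seed=1):
    """Noise-driven variation on the double-well potential
    V(x) = (x**2 - 1)**2, whose minima at x = -1 and x = +1 play
    the role of attractors."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        grad = 4 * x * (x * x - 1)                  # dV/dx
        x += -eta * grad + noise * rng.gauss(0, 1)  # descent + noise
    return x

# Starting between the wells, variation explores until one basin
# captures the state; afterwards it stays confined near that attractor.
final = self_organize(x0=0.0)
print(abs(final))   # close to 1, whichever well was entered
```

Once inside a basin, the restoring gradient dominates the noise, so further variation outside the attractor is precluded - the "restriction of freedom" the quote describes.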

"It is the intertwined and interacting mechanisms of evolution and ecology, each of which is at the same time a product and a process, that are responsible for life as we see it, and as it has been." (James W. Valentine, "Evolutionary Paleoecology of the Marine Biosphere", 1973)

"The unfoldings are called catastrophes because each of them has regions where a dynamic system can jump suddenly from one state to another, although the factors controlling the process change continuously. Each of the seven catastrophes represents a pattern of behavior determined only by the number of control factors, not by their nature or by the interior mechanisms that connect them to the system's behavior. Therefore, the elementary catastrophes can be models for a wide variety of processes, even those in which we know little about the quantitative laws involved." (Alexander Woodcock & Monte Davis, "Catastrophe Theory", 1978)
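The sudden jumps Woodcock and Davis describe can be made concrete with the cusp, the simplest two-factor catastrophe. The sketch below is illustrative only: the potential, the sweep range, and the relaxation scheme are assumptions, not from the book. One control factor rises continuously while the system tracks its locally stable state, which drifts smoothly and then jumps:

```python
def track_state(a=1.0, eta=0.05, relax=500):
    """Follow the locally stable state of the cusp potential
    V(x) = x**4/4 - a*x**2/2 - b*x while the control factor b
    rises continuously from -1 to 1."""
    b_values = [i / 100 - 1.0 for i in range(201)]
    x = -1.0                        # start in the left-hand well
    path = []
    for b in b_values:
        for _ in range(relax):      # relax toward the nearest minimum
            x -= eta * (x**3 - a * x - b)
        path.append(x)
    return path

path = track_state()
jump = max(abs(p - q) for p, q in zip(path, path[1:]))
# The state drifts smoothly, then jumps suddenly once the left well
# vanishes (near b ≈ 0.38), although b changes by only 0.01 per step.
```

The jump happens where the left-hand minimum of V disappears; nothing discontinuous was put into the control factors themselves.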

"Every system of whatever size must maintain its own structure and must deal with a dynamic environment, i.e., the system must strike a proper balance between stability and change. The cybernetic mechanisms for stability (i.e., homeostasis, negative feedback, autopoiesis, equifinality) and change (i.e., positive feedback, algedonodes, self-organization) are found in all viable systems." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)

"We define a semantic network as 'the collection of all the relationships that concepts have to other concepts, to percepts, to procedures, and to motor mechanisms' of the knowledge." (John F Sowa, "Conceptual Structures", 1984)

"Mental models are the mechanisms whereby humans are able to generate descriptions of system purpose and form, explanations of system functioning and observed system states, and predictions of future system states." (William B Rouse & Nancy M Morris, "On looking into the black box: Prospects and limits in the search for mental models", Psychological Bulletin (3), 1986)

"In cybernetics a system is normally described as a black box whereby the whole of a system's generative mechanisms are lumped into a single transfer function (TF). This acts on an input to produce an output. To ensure that the output is monitored, so that a system may remain homeostatic (the critical variables remain within acceptable limits) or attain a new steady state (according to input decisions, say), the output of the TF is brought back into its input where the difference between the desired and actual levels is identified. This is known as feedback." (Robert L Flood & Ewart R Carson, "Dealing with Complexity: An introduction to the theory and application of systems", 1988)
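The loop Flood and Carson describe can be sketched in a few lines. Here the transfer function is reduced to a simple proportional response; the function name, gain, and setpoint are illustrative assumptions:

```python
def closed_loop(setpoint, steps=100, gain=0.5):
    """The system's generative mechanisms are lumped into one
    transfer function; the output is fed back so the TF acts on the
    difference between desired and actual levels."""
    output = 0.0
    for _ in range(steps):
        error = setpoint - output      # feedback: desired minus actual
        output += gain * error         # TF acts on the error
    return output

print(closed_loop(10.0))   # 10.0 (output settles at the desired level)
```

With the feedback path in place the critical variable is held at the desired level - the homeostasis the quote refers to; removing the `error` comparison would leave an open loop with no such guarantee.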

"Man's attempts to control, service, and/ or design very complex situations have, however, often been fraught with disaster. A major contributory factor has been the unwitting adoption of piecemeal thinking, which sees only parts of a situation and its generative mechanisms. Additionally, it has been suggested that nonrational thinking sees only the extremes (the simple 'solutions' ) of any range of problem solutions. The net result of these factors is that situations exhibit counterintuitive behavior; outcomes of situations are rarely as we expect, but this is not an intrinsic property of situations; rather, it is largely caused by neglect of, or lack of respect being paid to, the nature and complexity of a situation under investigation." (Robert L Flood & Ewart R Carson, "Dealing with Complexity: An introduction to the theory and application of systems", 1988)

"Negative feedback only improves the precision of goal-seeking, but does not determine it. Feedback devices are only executive mechanisms that operate during the translation of a program." (Ernst Mayr, "Toward a New Philosophy of Biology: Observations of an Evolutionist", 1988)

"[…] the standard theory of chaos deals with time evolutions that come back again and again close to where they were earlier. Systems that exhibit this 'eternal return' are in general only moderately complex. The historical evolution of very complex systems, by contrast, is typically one way: history does not repeat itself. For these very complex systems with one-way evolution it is usually clear that sensitive dependence on initial condition is present. The question is then whether it is restricted by regulation mechanisms, or whether it leads to long-term important consequences." (David Ruelle, "Chance and Chaos", 1991)

"The systems' basic components are treated as sets of rules. The systems rely on three key mechanisms: parallelism, competition, and recombination. Parallelism permits the system to use individual rules as building blocks, activating sets of rules to describe and act upon the changing situations. Competition allows the system to marshal its rules as the situation demands, providing flexibility and transfer of experience. This is vital in realistic environments, where the agent receives a torrent of information, most of it irrelevant to current decisions. The procedures for adaptation - credit assignment and rule discovery - extract useful, repeatable events from this torrent, incorporating them as new building blocks. Recombination plays a key role in the discovery process, generating plausible new rules from parts of tested rules. It implements the heuristic that building blocks useful in the past will prove useful in new, similar contexts." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)
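Of the three mechanisms Holland names, recombination is the easiest to sketch. Assuming a bit-string encoding of rules (an illustrative choice, not Holland's notation), one-point crossover generates a plausible new rule from parts of two tested ones:

```python
import random

def recombine(rule_a, rule_b, rng):
    """One-point crossover: a plausible new rule built from
    building blocks of two tested rules."""
    cut = rng.randrange(1, len(rule_a))     # crossover point
    return rule_a[:cut] + rule_b[cut:]

rng = random.Random(0)
parent_a = "110010"   # condition/action bits of one tested rule
parent_b = "001101"
child = recombine(parent_a, parent_b, rng)
# child keeps a prefix of one parent and a suffix of the other,
# implementing the heuristic that past building blocks will prove
# useful in new, similar contexts
```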

"There must be, however, cybernetic or homeostatic mechanisms for preventing the overall variables of the social system from going beyond a certain range. There must, for instance, be machinery for controlling the total numbers of the population; there must be machinery for controlling conflict processes and for preventing perverse social dynamic processes of escalation and inflation. One of the major problems of social science is how to devise institutions which will combine this overall homeostatic control with individual freedom and mobility." (Kenneth Boulding, "Economics of the coming spaceship Earth", 1994)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modifications of a precursor system, because any precursor to an irreducibly complex system that is missing a part is by definition nonfunctional." (Michael Behe, "Darwin’s Black Box", 1996)

"This distinction is familiar in natural science, where one is not expected to mistake, say, the cardiovascular system for the circulation of the blood or the brain with mental processes. But it is unusual in social studies. [...] Mechanism is to system as motion is to body, combination (or dissociation) to chemical compound, and thinking to brain. [In the systemic view], agency is both constrained and motivated by structure, and in turn the latter is maintained or altered by individual action. In other words, social mechanisms reside neither in persons nor in their environment – they are part of the processes that unfold in or among social systems. […] All mechanisms are system-specific: there is no such thing as a universal or substrate-neutral mechanism." (Mario Bunge, "The Sociology-philosophy Connection", 1999)

"We are accustomed to thinking that a System acts like a machine, and that if we only knew its mechanism, we could understand, even predict, its behavior. This is wrong. The correct orientation is: the machine acts like a System - and if the machine is large and complex enough, it will act like a large System. We simply have our metaphors backwards." (John Gall, "Systemantics: The Systems Bible", 2002)

"A perturbation in a system with a negative feedback mechanism will be reduced whereas in a system with positive feedback mechanisms, the perturbation will grow. Quite often, the system dynamics can be reduced to a low-order description. Then, the growth or decay of perturbations can be classified by the systems’ eigenvalues or the pseudospectrum." (Gerrit Lohmann, "Abrupt Climate Change Modeling", 2009)
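Lohmann's low-order classification can be illustrated directly. For a system reduced to dx/dt = Ax with a 2x2 matrix A, perturbations decay when every eigenvalue has negative real part and grow when any has positive real part (the pseudospectrum refinement is omitted; the example matrices are assumptions for the sketch):

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic
    polynomial lam**2 - (a + d)*lam + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def perturbation_grows(a, b, c, d):
    """A perturbation of dx/dt = A x grows iff some eigenvalue of A
    has positive real part."""
    return any(lam.real > 0 for lam in eigenvalues_2x2(a, b, c, d))

# Damped oscillator (net negative feedback): perturbations decay.
print(perturbation_grows(0.0, 1.0, -1.0, -0.5))   # False
# Positive feedback dominating: perturbations grow.
print(perturbation_grows(0.5, 1.0, 0.0, 0.3))     # True
```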

"What advantages do diagrams have over verbal descriptions in promoting system understanding? First, by providing a diagram, massive amounts of information can be presented more efficiently. A diagram can strip down informational complexity to its core - in this sense, it can result in a parsimonious, minimalist description of a system. Second, a diagram can help us see patterns in information and data that may appear disordered otherwise. For example, a diagram can help us see mechanisms of cause and effect or can illustrate sequence and flow in a complex system. Third, a diagram can result in a less ambiguous description than a verbal description because it forces one to come up with a more structured description." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"Cyberneticists argue that positive feedback may be useful, but it is inherently unstable, capable of causing loss of control and runaway. A higher level of control must therefore be imposed upon any positive feedback mechanism: self-stabilising properties of a negative feedback loop constrain the explosive tendencies of positive feedback. This is the starting point of our journey to explore the role of cybernetics in the control of biological growth. That is, the assumption that the evolution of self-limitation has been an absolute necessity for life forms with exponential growth." (Tony Stebbing, "A Cybernetic View of Biological Growth: The Maia Hypothesis", 2011)
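Stebbing's pairing of explosive positive feedback with an overriding negative feedback is captured in miniature by the logistic equation, used here as a generic stand-in (the parameters are assumptions, not the book's): the r*n term on its own grows exponentially, and the (1 - n/K) factor is the higher-level loop that constrains it.

```python
def grow(n0=1.0, r=0.2, K=1000.0, dt=0.5, steps=200):
    """Logistic growth: r*n is the explosive positive-feedback term;
    the factor (1 - n/K) is the negative feedback imposed on it."""
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1 - n / K)
    return n

print(grow())   # approaches the limit K = 1000 instead of running away
```

Dropping the (1 - n/K) factor recovers pure exponential growth - the runaway the higher-level control exists to prevent.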

"Stated loosely, models are simplified, idealized and approximate representations of the structure, mechanism and behavior of real-world systems. From the standpoint of set-theoretic model theory, a mathematical model of a target system is specified by a nonempty set - called the model’s domain, endowed with some operations and relations, delineated by suitable axioms and intended empirical interpretation." (Zoltan Domotor, "Mathematical Models in Philosophy of Science" [Mathematics of Complexity and Dynamical Systems, 2012])

"Systems subjected to randomness - and unpredictability - build a mechanism beyond the robust to opportunistically reinvent themselves each generation, with a continuous change of population and species." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"The work around the complex systems map supported a concentration on causal mechanisms. This enabled poor system responses to be diagnosed as the unanticipated effects of previous policies as well as identification of the drivers of the sector. Understanding the feedback mechanisms in play then allowed experimentation with possible future policies and the creation of a coherent and mutually supporting package of recommendations for change." (David C Lane et al, "Blending systems thinking approaches for organisational analysis: reviewing child protection", 2015)
