05 October 2025

❄️Systems Thinking: On Equifinality (Quotes)

"There exist models, principles and laws that apply to generalized systems or their subclasses irrespective of their particular kind, the nature of the component elements, and the relations or "forces" between them. We postulate a new discipline called General System Theory. General System Theory is a logico-mathematical field whose task is the formulation and derivation of those general principles that are applicable to 'systems' in general. In this way, exact formulations of terms such as wholeness and sum, differentiation, progressive mechanization, centralization, hierarchical order, finality and equifinality, etc., become possible, terms which occur in all sciences dealing with 'systems' and imply their logical homology [...]" (Ludwig von Bertalanffy, 1947/1955)

"The classical principle of causality held that similar conditions produce similar effects, and consequently dissimilar results are due to dissimilar conditions. Bertalanffy, in analyzing the self-regulating, or morphostatic, features of open biological systems, loosened this classical conception by introducing the concept of 'equifinality'. This holds that, in ontogenesis for example, a final normal adult state may be reached by any number of devious developmental routes. Morphogenetic processes, however, go even further and suggest an opposite principle that might be called "multifinality": similar initial conditions may lead to dissimilar end-states. Thus, two cultures developing in very similar ecological environments may end up with very different sociocultural systems." (Walter F Buckley, "Sociology and Modern Systems Theory", 1967)

"If a steady state is reached in an open system, it is independent of the initial conditions, and determined only by the system parameters, i.e., rates of reaction and transport. This is called equifinality as found in many organismic processes, e.g., in growth." (Ludwig von Bertalanffy, "General System Theory", 1968)

"The fact that certain principles apply to systems in general, irrespective of the nature of the systems and of the entities concerned, explains that corresponding conceptions and laws appear independently in different fields of science, causing the remarkable parallelism in their modern development. Thus, concepts such as wholeness and sum, mechanization, centralization, hierarchical order, stationary and steady states, equifinality, etc., are found in different fields of natural science, as well as in psychology and sociology." (Ludwig von Bertalanffy, "General System Theory", 1968)

"The first is the principle of equifinality. In any closed system, the final state is unequivocally determined by the initial conditions: e.g., the motion in a planetary system where the positions of the planets at a time t are unequivocally determined by their positions at a time t0. Or in a chemical equilibrium, the final concentrations of the reactants naturally depend on the initial concentrations. If either the initial conditions or the process is altered, the final state will also be changed. This is not so in open systems. Here, the same final state may be reached from different initial conditions and in different ways. This is what is called equifinality, and it has a significant meaning for the phenomena of biological regulation." (Ludwig von Bertalanffy, "General System Theory", 1968)
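
Bertalanffy's contrast can be made concrete with a toy open system. The inflow/outflow model below is an illustrative assumption (not taken from the quote): a stock with constant inflow and first-order outflow settles into a steady state fixed by the parameters alone, no matter where it starts.

```python
def simulate(x0, inflow=2.0, k=0.5, steps=2000, dt=0.01):
    """Toy open system: dx/dt = inflow - k*x (constant inflow, first-order
    outflow), integrated with the explicit Euler method."""
    x = x0
    for _ in range(steps):
        x += dt * (inflow - k * x)
    return x

# Widely different initial conditions, identical parameters ...
finals = [simulate(x0) for x0 in (0.0, 1.0, 10.0, 100.0)]
# ... all converge on the steady state inflow/k = 4.0, set by the parameters alone.
```

A conservative closed analogue (a fixed total merely redistributed among compartments) would instead carry its initial conditions into the final state, which is exactly Bertalanffy's planetary/chemical contrast.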

"The theory of open systems is an important generalization of physical theory, kinetics and thermodynamics. It has led to new principles and insight, such as the principle of equifinality, the generalization of the second thermodynamic principle, the possible increase of order in open systems, the occurrence of periodic phenomena of overshoot and false start, etc."  (Ludwig von Bertalanffy, "General System Theory", 1968)

"[...] equifinality, the tendency towards a characteristic final state from different initial states and in different ways, based upon dynamic interaction in an open system attaining a steady state; [...]" (Ludwig von Bertalanffy, "General System Theory", 1968)

"There is, however, yet another basis for organic regulations. This is equifinality-i.e., the fact that the same final state can be reached from different initial conditions and in different ways. This is found to be the case in open systems, insofar as they attain a steady state. It appears that equifinality is responsible for the primary regulability of organic systems-i.e., for all those regulations which cannot be based upon predetermined structures or mechanisms, but on the contrary, exclude such mechanisms and were regarded therefore as arguments for vitalism." (Ludwig von Bertalanffy, "General System Theory", 1968)

"Open systems, in contrast to closed systems, exhibit a principle of equifinality, that is, a tendency to achieve a final state independent of initial conditions. In other words, open systems tend to 'resist' perturbations that take them away from some steady state. They can exhibit homeostasis." (Anatol Rapoport, "The Uses of Mathematical Isomorphism in General System Theory", 1972)

"The classical Aristotelian approach was to isolate and analyse, to reduce to simple, lineal, unidirectional cause-effect chains on a time sequence of prior cause and present effect - the 'why' of single causation. Sociological structures have replaced this by complex reticulate circuits, by networks of field and system and process, appealing to timeless, simultaneous states of related variables of 'equifinality', of 'multifinality', that is of devious developmental routes leading to similar final results or of similar conditions leading to dissimilar endstates, ranging from past and present to future and marking a revolutionary conceptual shift of attention from energy to information flow." (Patrick de Maré, "Perspectives in Group Psychotherapy", 1972)

"Positive, or deviation-amplifying feedback is seen in biological evolution and societal development, which exhibit processes of morphogenesis or, the elaboration of the system's form, organization or state discussed earlier. The implications of this distinction for the classic principle of causality have led to the further concept of equifinality in morphostatic system processes, which holds that an ultimate state may be reached by various developmental routes, and to the opposite principle of multifinality in morphogenetic situations whereby similar initial conditions may lead to dissimilar end states." (Paul H Haynes, "Toward a Concept of Monitoring", The Town Planning Review Vol. 45(1), 1974)

"The same function can be performed in different ways by using different combinations of elements. There are several routes to the same goal-the principle sometimes described as equifinality." (Albert Cherns, "The Principles of Sociotechnical Design", Human Relations Vol. 29 (8), 1976)

"Every system of whatever size must maintain its own structure and must deal with a dynamic environment, i.e., the system must strike a proper balance between stability and change. The cybernetic mechanisms for stability (i.e., homeostasis, negative feedback, autopoiesis, equifinality) and change (i.e., positive feedback, algedonodes, self-organization) are found in all viable systems." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)

"From these four propositions we can derive two basic principles of development, morphostasis and morphogenesis. Morphostasis refers to those processes in complex system-environment exchanges which tend to maintain a system's given form or organization. Morphostasis may represent developmental processes that lend themselves to observation of continuity. Morphogenic processes may by their nature be harder to observe and predict since they involve changes in a system's structure, state or functioning. Morphogenic processes may involve discontinuities in development. At the extreme morphogenesis can be related to catastrophes and radical changes (e.g. in terms of continuing levels of Y at some point leading to a steep change in X). Morphogenesis involves at least two kinds of developmental paths: 1) equifinality and 2) multifinality. Equifinality holds that a given outcome can be reached from any number of different developmental paths. In this case, similar outcomes may not be the result of similar initial conditions or mediating processes. […] Multifinality is the opposite developmental principle to equifinality, whereby similar initial conditions may lead to dissimilar outcomes." (Candice Feiring & Michael Lewis, "Equifinality and Multifinality: Diversity in Development from Infancy into Childhood", Biennial Meeting of the Society for Research in Child Development, 1987)

"Systems thinking, rooted in twentieth-century biology and cybernetics, posits open systems, accepts an observer in the system and equifinality, focuses on the kinds of relations among phenomena, on structures and forms, on exchanges of information, and on circular causality." (Tom LeClair, "The Art of Excess: Mastery in contemporary American fiction", 1989)

"[equifinality] quite simply means that different sorts of internal arrangements are perfectly compatible with identical contextual or environmental states. The principle goes against the idea of a quasi-ideal 'match' which is inherent in the principle of correspondence. Whereas correspondence [i.e. contingency] theory suggests that rigid and bureaucratic structures are not a good match for volatile and shifting product markets, equifinality theorists claim that it may very well turn out to be a good match but only if the level and diversity of the workforce is large and organization culture produces motivated and flexible actors." (Arndt M Sorge, "Organization behaviour", [Sorge and M Warner (eds), "The IEBM Handbook of Organizational Behaviour"] 1997)

"In mechanistic systems there is a direct cause-and-effect relationship between the initial conditions and the final state. Biological and social systems operate differently. Equifinality suggests that certain results may be achieved with different initial conditions and in different ways. It offers us a basis for the flexibility, agility, and choice needed in today's dynamic world." (Stephen G Haines, "The Manager's Pocket Guide to Systems Thinking & Learning", 1998)

"Equifinality and multifinality: Open systems have equally valid alternative ways of attaining the same objectives from different initial conditions (convergence) or, from a given initial state, obtain different, and mutually exclusive, objectives (divergence)."  (Lars Skyttner, "General System Theory: Ideas & applications", 2001)

"Given negative feedback, a system's equilibrium state is invariant over a wide range of initial conditions (equifinality)." (Lars Skyttner, "General System Theory: Ideas & applications", 2001)

"Equifinality is the ability to reach the same final state from different initial conditions and in different ways. It depends on the existence of feedback and regulation." (Michael C Jackson, "Systems Approaches to Management", 2002)

"In open systems, for example, biological and social systems, final states or objectives may be reached in different ways and from disparate starting points. This property of finding equally valid ways is called equifinality. The reverse condition, achievement of different ends through use of the same means, is called multifinality." (Lars Skyttner, "General System Theory: Problems, Perspectives & Practice" 2nd. Ed., 2005)
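
Skyttner's convergence/divergence pair can be sketched with a bistable gradient system, an illustrative assumption rather than his own example: dx/dt = x - x³ has two attractors, at +1 and -1, so many starting points share one final state (equifinality), while near-identical starting points can end up in different ones (multifinality).

```python
def settle(x0, steps=5000, dt=0.01):
    """Bistable system dx/dt = x - x**3: the unstable state x = 0 separates
    two attractors at x = +1 and x = -1 (explicit Euler integration)."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

# Equifinality: very different starting points reach the same final state.
same = [settle(x0) for x0 in (0.5, 2.0, 3.0)]     # all settle near +1
# Multifinality: almost identical starting points reach different final states.
diverge = (settle(+0.001), settle(-0.001))        # near +1 and near -1
```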

"General systems theory is concerned with open systems. It explains how living systems function through importing energy from a pre-given environment, across the system's boundary with that environment, transforming the imports into a form of functioning, and then exporting waste to the environment. The boundary is a given and its formation is not due to the functioning of the system. The theory explains how the system sustains homeostasis, or equilibrium, through adaptation to the environment. The history of the system is not important in that what matters is the process of adaptation to the current environment. The principle of equifinality means that the state of homeostasis can be achieved from a large number of starting points and it is this that renders history unimportant." (Ralph D Stacey, "Strategic Management and Organisational Dynamics: The Challenge of Complexity" 5th Ed., 2007)

"Equifinality [...] states that different sorts of internal arrangements and structures can be perfectly compatible with identical contextual or environmental states. Put simply, this means that there is more than one way for organisations to structure themselves in order to achieve their goals." (Bernard Burnes, "Managing change: a strategic approach to organisational dynamics" 5th ed., 2009)

"Equifinality: It is often possible to achieve similar results using different means or paths. Strategists should recognize that achieving a successful outcome is more important than imposing the method of achieving it. It may be possible to generate new alternatives that give equal results but with far greater potential for gaining commitment." (Fred R David, "Strategic Management: Concepts and cases" 13th Ed., 2011)

"Principle of Equifinality: If a steady state is reached in an open system, it is independent of the initial conditions, and determined only by the system parameters, i.e. rates of reaction and transport." (Kevin Adams & Charles Keating, "Systems of systems engineering", 2012)

"Multifinality is the opposite developmental principle to equifinality, whereby similar initial conditions lead to dissimilar outcomes. This indicates that for the investigator either the mechanisms are not understood or the relevant aspects have been left out." (Rob Dekkers, "Applied Systems Theory", 2014)

"The principles of equifinality, multifinality, homeostasis and heterostasis have far-stretching implications for the application of deductive reasoning. The paradox of equifinality and multifinality means that when observing the behaviour of a system, it might be moving towards a final state irrespective of the initial state or moving away from an initial state without being able to predict the final outcome." (Rob Dekkers, "Applied Systems Theory", 2014)

"As organizations do well in offsetting the entropic processes, they tend to grow and develop, which results in greater differentiation of products and services, along with the congruent need for integration and coordination. A final characteristic of organization according to open-system theory is the principle of equifinality, which essentially means that for any given goal, there are multiple paths that organizational members can take to achieve it. In short, it should be clear that managers of organizations need to be constantly aware that they are managing a system that has permeable boundaries, is dependent on its environment for survival, and will go out of existence unless it is actively attended to (managed)." (W Warner Burke, "Organization Change: Theory & Practice" 5th Ed., 2018)

28 September 2025

❄️Systems Thinking: On States (Quotes)

"The true nature of the universal principle of synergy pervading all nature and creating all the different kinds of structure that we observe to exist, must now be made clearer. Primarily and essentially it is a process of equilibration, i.e., the several forces are first brought into a state of partial equilibrium. It begins in collision, conflict, antagonism, and opposition, and then we have the milder phases of antithesis, competition, and interaction, passing next into a modus vivendi, or compromise, and ending in collaboration and cooperation. […] The entire drift is toward economy, conservatism, and the prevention of waste." (James Q Dealey & Lester F Ward, "A Text-book of Sociology", 1905)

"Nature prefers the more probable states to the less probable because in nature processes take place in the direction of greater probability. Heat goes from a body at higher temperature to a body at lower temperature because the state of equal temperature distribution is more probable than a state of unequal temperature distribution." (Max Planck, "The Atomic Theory of Matter", 1909)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called 'steady states', or 'flow equilibria' are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"A state of equilibrium in a system does not mean, further, that the system is without tension. Systems can, on the contrary, also come to equilibrium in a state of tension (e.g., a spring under tension or a container with gas under pressure). The occurrence of this sort of system, however, presupposes a certain firmness of boundaries and actual segregation of the system from its environment (both of these in a functional, not a spatial, sense). If the different parts of the system are insufficiently cohesive to withstand the forces working toward displacement (i.e., if the system shows insufficient internal firmness, if it is fluid), or if the system is not segregated from its environment by sufficiently firm walls but is open to its neighboring systems, stationary tensions cannot occur. Instead, there occurs a process in the direction of the forces, which encroaches upon the neighboring regions with diffusion of energy and which goes in the direction of an equilibrium at a lower level of tension in the total region. The presupposition for the existence of a stationary state of tension is thus a certain firmness of the system in question, whether this be its own inner firmness or the firmness of its walls." (Kurt Lewin, "A Dynamic Theory of Personality", 1935)

"The process moves in the direction of a state of equilibrium only for the system as a whole. Part processes may at the same time go on in opposed directions, a circumstance which is of the greatest significance for, for example, the theory of detour behavior. It is hence important to take the system whole which is dominant at the moment as basis." (Kurt Lewin, "A Dynamic Theory of Personality", 1935)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)" (Erwin Schrödinger, "What is Life?", 1944)

"Every process, event, happening - call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy - or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"The study of the conditions for change begins appropriately with an analysis of the conditions for no change, that is, for the state of equilibrium." (Kurt Lewin, "Quasi-Stationary Social Equilibria and the Problem of Permanent Change", 1947)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Physical irreversibility manifests itself in the fact that, whenever the system is in a state far removed from equilibrium, it is much more likely to move toward equilibrium, than in the opposite direction." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)
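
Planck's and Feller's point, that systems far from equilibrium are overwhelmingly likely to move toward the most probable state, is exactly what the classic Ehrenfest urn model exhibits. The simulation below is an illustrative sketch, not drawn from either source.

```python
import random

def ehrenfest(n_balls=100, steps=5000, seed=42):
    """Ehrenfest urn model: at each step one of n_balls is picked uniformly at
    random and moved to the other box. Tracking only the count in box A, a
    ball currently in A is picked with probability in_a / n_balls."""
    random.seed(seed)
    in_a = n_balls            # start far from equilibrium: every ball in box A
    trajectory = [in_a]
    for _ in range(steps):
        if random.random() < in_a / n_balls:
            in_a -= 1         # the chosen ball was in A; it jumps to B
        else:
            in_a += 1         # the chosen ball was in B; it jumps to A
        trajectory.append(in_a)
    return trajectory

traj = ehrenfest()
# The count drifts from 100 toward the most probable state, about 50 in each
# box, and thereafter only fluctuates around it.
```

Nothing in the rule prefers equalization; the drift arises purely because far more microstates correspond to a near-even split than to a lopsided one.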

"Every stable system has the property that if displaced from a state of equilibrium and released, the subsequent movement is so matched to the initial displacement that the system is brought back to the state of equilibrium. A variety of disturbances will therefore evoke a variety of matched reactions." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)

"The primary fact is that all isolated state-determined dynamic systems are selective: from whatever state they have initially, they go towards states of equilibrium. These states of equilibrium are always characterised, in their relation to the change-inducing laws of the system, by being exceptionally resistant." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)

"Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes." (Arnold Sommerfeld, "Thermodynamics and Statistical Mechanics", Lectures on Theoretical Physics Vol. V, 1956)

"Stability is commonly thought of as desirable, for its presence enables the system to combine something of flexibility and activity in performance with something of permanence. Behaviour that is goal-seeking is an example of behaviour that is stable around a state of equilibrium. Nevertheless, stability is not always good, for a system may persist in returning to some state that, for other reasons, is considered undesirable." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"Cybernetics is the general science of communication. But to refer to communication is consciously or otherwise to refer to distinguishable states of information inputs and outputs and/or to information being processed within some relatively isolated system." (Henryk Greniewski, "Cybernetics without Mathematics", 1960)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […] 'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"[...] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by 'adaptation to environment' […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, 'Living organisms', "The Nature of Psychology", 1966)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"We may state as characteristic of modern science that this scheme of isolable units acting in one-way causality has proven to be insufficient. Hence the appearance, in all fields of science, of notions like wholeness, holistic, organismic, gestalt, etc., which all signify that, in the last resort, we must think in terms of systems of elements in mutual interaction […]." (Ludwig von Bertalanffy, "General System Theory", 1968)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban Dynamics", 1969)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system's state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system's components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime becomes unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Nicolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today Vol. 25(11), 1972)

"General systems theory is the scientific exploration of 'wholes' and 'wholeness' which, not so long ago, were considered metaphysical notions transcending the boundaries of science. Hierarchic structure, stability, teleology, differentiation, approach to and maintenance of steady states, goal-directedness - these are a few of such general system properties." (Ervin László, "Introduction to Systems Philosophy", 1972)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"Open systems, in contrast to closed systems, exhibit a principle of equifinality, that is, a tendency to achieve a final state independent of initial conditions. In other words, open systems tend to 'resist' perturbations that take them away from some steady state. They can exhibit homeostasis." (Anatol Rapoport, "The Uses of Mathematical Isomorphism in General System Theory", 1972)

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"When a system is considered in two different states, the difference in volume or in any other property, between the two states, depends solely upon those states themselves and not upon the manner in which the system may pass from one state to the other." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"A system may be specified in either of two ways. In the first, which we shall call a state description, sets of abstract inputs, outputs and states are given, together with the action of the inputs on the states and the assignments of outputs to states. In the second, which we shall call a coordinate description, certain input, output and state variables are given, together with a system of dynamical equations describing the relations among the variables as functions of time. Modern mathematical system theory is formulated in terms of state descriptions, whereas the classical formulation is typically a coordinate description, for example a system of differential equations." (E S Bainbridge, "The Fundamental Duality of System Theory", 1975)

"In any system governed by a potential, and in which the system's behavior is determined by no more than four different factors, only seven qualitatively different types of discontinuity are possible. In other words, while there are an infinite number of ways for such a system to change continuously" (staying at or near equilibrium), there are only seven structurally stable ways for it to change discontinuously" (passing through non-equilibrium states)." (Alexander Woodcock & Monte Davis, "Catastrophe Theory", 1978)

"Cellular automata may be considered as discrete dynamical systems. In almost all cases, cellular automaton evolution is irreversible. Trajectories in the configuration space for cellular automata therefore merge with time, and after many time steps, trajectories starting from almost all initial states become concentrated onto 'attractors'. These attractors typically contain only a very small fraction of possible states. Evolution to attractors from arbitrary initial states allows for 'self-organizing' behaviour, in which structure may evolve at large times from structureless initial states. The nature of the attractors determines the form and extent of such structures." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"If a system is in a state of equilibrium" (a steady state), then all sub-systems must be in equilibrium. If all sub-systems are in a state of equilibrium, then the system must be in equilibrium." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)

"When one combines the new insights gained from studying far-from-equilibrium states and nonlinear processes, along with these complicated feedback systems, a whole new approach is opened that makes it possible to relate the so-called hard sciences to the softer sciences of life - and perhaps even to social processes as well. […] It is these panoramic vistas that are opened to us by Order Out of Chaos." (Ilya Prigogine, "Order Out of Chaos: Man's New Dialogue with Nature", 1984)

"We will treat problem solving as a process of search through a state space. A problem is defined by an initial state, one or more goal states to be reached, a set of operators that can transform one state into another, and constraints that an acceptable solution must meet. Problem-solving methods are procedures for selecting an appropriate sequence of operators that will succeed in transforming the initial state into a goal state through a series of steps." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)

"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric- they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Regarding stability, the state trajectories of a system tend to equilibrium. In the simplest case they converge to one point" (or different points from different initial states), more commonly to one" (or several, according to initial state) fixed point or limit cycle(s) or even torus(es) of characteristic equilibrial behaviour. All this is, in a rigorous sense, contingent upon describing a potential, as a special summation of the multitude of forces acting upon the state in question, and finding the fixed points, cycles, etc., to be minima of the potential function. It is often more convenient to use the equivalent jargon of 'attractors' so that the state of a system is 'attracted' to an equilibrial behaviour. In any case, once in equilibrial conditions, the system returns to its limit, equilibrial behaviour after small, arbitrary, and random perturbations." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"A strange attractor, when it exists, is truly the heart of a chaotic system. If a concrete system has been in existence for some time, states other than those extremely close to the attractor might as well not exist; they will never occur. For one special complicated chaotic system - the global weather - the attractor is simply the climate, that is, the set of weather patterns that have at least some chance of occasionally occurring." (Edward N Lorenz, "The Essence of Chaos", 1993)

"How can deterministic behavior look random? If truly identical states do occur on two or more occasions, it is unlikely that the identical states that will necessarily follow will be perceived as being appreciably different. What can readily happen instead is that almost, but not quite, identical states occurring on two occasions will appear to be just alike, while the states that follow, which need not be even nearly alike, will be observably different. In fact, in some dynamical systems it is normal for two almost identical states to be followed, after a sufficient time lapse, by two states bearing no more resemblance than two states chosen at random from a long sequence. Systems in which this is the case are said to be sensitively dependent on initial conditions. With a few more qualifications, to be considered presently, sensitive dependence can serve as an acceptable definition of chaos [...]" (Edward N Lorenz, "The Essence of Chaos", 1993)

"When a system has more than one attractor, the points in phase space that are attracted to a particular attractor form the basin of attraction for that attractor. Each basin contains its attractor, but consists mostly of points that represent transient states. Two contiguous basins of attraction will be separated by a basin boundary." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Complex adaptive systems have the property that if you run them - by just letting the mathematical variable of 'time' go forward - they'll naturally progress from chaotic, disorganized, undifferentiated, independent states to organized, highly differentiated, and highly interdependent states. Organized structures emerge spontaneously. [...]A weak system gives rise only to simpler forms of self-organization; a strong one gives rise to more complex forms, like life." (J Doyne Farmer, The Third Culture: Beyond the Scientific Revolution", 1995)

"Cybernetics is a science of purposeful behavior. It helps us explain behavior as the continuous action of someone" (or thing) in the process, as we see it, of maintaining certain conditions near a goal state, or purpose." (Jeff Dooley, "Thoughts on the Question: What is Cybernetics", 1995)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits" (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Linear programming and its generalization, mathematical programming, can be viewed as part of a great revolutionary development that has given mankind the ability to state general goals and lay out a path of detailed decisions to be taken in order to 'best' achieve these goals when faced with practical situations of great complexity. The tools for accomplishing this are the models that formulate real-world problems in detailed mathematical terms, the algorithms that solve the models, and the software that execute the algorithms on computers based on the mathematical theory." (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"Something of the previous state, however, survives every change. This is called in the language of cybernetics (which took it form the language of machines) feedback, the advantages of learning from experience and of having developed reflexes." (Guy Davenport, "The Geography of the Imagination: Forty Essays", 1997)

"Self-organization is seen as the process by which systems of many components tend to reach a particular state, a set of cycling states, or a small volume of their state space" (attractor basins), with no external interference." (Luis M Rocha, "Syntactic Autonomy", Proceedings of the Joint Conference on the Science and Technology of Intelligent Systems, 1998)

"Cybernetics is the science of effective organization, of control and communication in animals and machines. It is the art of steersmanship, of regulation and stability. The concern here is with function, not construction, in providing regular and reproducible behaviour in the presence of disturbances. Here the emphasis is on families of solutions, ways of arranging matters that can apply to all forms of systems, whatever the material or design employed. [...] This science concerns the effects of inputs on outputs, but in the sense that the output state is desired to be constant or predictable – we wish the system to maintain an equilibrium state. It is applicable mostly to complex systems and to coupled systems, and uses the concepts of feedback and transformations" (mappings from input to output) to effect the desired invariance or stability in the result." (Chris Lucas, "Cybernetics and Stochastic Systems", 1999)

"From a functional point of view, mental models can be described as symbolic structures which permit people: to generate descriptions of the purpose of a system, to generate descriptions of the architecture of a system, to provide explanations of the state of a system, to provide explanations of the functioning of a system, to make predictions of future states of a system." (Gert Rickheit & Lorenz Sichelschmidt, "Mental Models: Some Answers, Some Questions, Some Suggestions", 1999)

"Just as dynamics arise from feedback, so too all learning depends on feedback. We make decisions that alter the real world; we gather information feedback about the real world, and using the new information we revise our understanding of the world and the decisions we make to bring our perception of the state of the system closer to our goals." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"True systems thinking, on the other hand, studies each problem as it relates to the organization’s objectives and interaction with its entire environment, looking at it as a whole within its universe. Taking your organization from a partial systems to a true systems state requires effective strategic management and backward thinking." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"The basic concept of complexity theory is that systems show patterns of organization without organizer" (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled became correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non- quilibrium Thermodynamics and the Production of Entropy"], 2005)

"Of course, the existence of an unknown butterfly flapping its wings has no direct bearing on weather forecasts, since it will take far too long for such a small perturbation to grow to a significant size, and we have many more immediate uncertainties to worry about. So, the direct impact of this phenomenon on weather prediction is often somewhat overstated." (James Annan & William Connolley, "Chaos and Climate", 2005)

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

"Complexity arises when emergent system-level phenomena are characterized by patterns in time or a given state space that have neither too much nor too little form. Neither in stasis nor changing randomly, these emergent phenomena are interesting, due to the coupling of individual and global behaviours as well as the difficulties they pose for prediction. Broad patterns of system behaviour may be predictable, but the system's specific path through a space of possible states is not." (Steve Maguire et al, "Complexity Science and Organization Studies", 2006)

"Physically, the stability of the dynamics is characterized by the sensitivity to initial conditions. This sensitivity can be determined for statistically stationary states, e.g. for the motion on an attractor. If this motion demonstrates sensitive dependence on initial conditions, then it is chaotic. In the popular literature this is often called the 'Butterfly Effect', after the famous 'gedankenexperiment' of Edward Lorenz: if a perturbation of the atmosphere due to a butterfly in Brazil induces a thunderstorm in Texas, then the dynamics of the atmosphere should be considered as an unpredictable and chaotic one. By contrast, stable dependence on initial conditions means that the dynamics is regular." (Ulrike Feudel et al, "Strange Nonchaotic Attractors", 2006)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions."" (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)"

"The methodology of feedback design is borrowed from cybernetics" (control theory). It is based upon methods of controlled system model’s building, methods of system states and parameters estimation" (identification), and methods of feedback synthesis. The models of controlled system used in cybernetics differ from conventional models of physics and mechanics in that they have explicitly specified inputs and outputs. Unlike conventional physics results, often formulated as conservation laws, the results of cybernetical physics are formulated in the form of transformation laws, establishing the possibilities and limits of changing properties of a physical system by means of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat" ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions" (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)

"Two systems concepts lie at the disposal of the architect to reflect the beauty of harmony: parsimony and variety. The law of parsimony states that given several explanations of a specific phenomenon, the simplest is probably the best. […] On the other hand, the law of requisite variety states that for a system to survive in its environment the variety of choice that the system is able to make must equal or exceed the variety of influences that the environment can impose on the system." (John Boardman & Brian Sauser, "Systems Thinking: Coping with 21st Century Problems", 2008)

"Generally, these programs fall within the techniques of reinforcement learning and the majority use an algorithm of temporal difference learning. In essence, this computer learning paradigm approximates the future state of the system as a function of the present state. To reach that future state, it uses a neural network that changes the weight of its parameters as it learns." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"If universality is one of the observed characteristics of complex dynamical systems in many fields of study, a second characteristic that flows from the study of these systems is that of emergence. As self-organizing systems go about their daily business, they are constantly exchanging matter and energy with their environment, and this allows them to remain in a state that is far from equilibrium. That allows spontaneous behavior to give rise to new patterns." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"[…] the law of requisite complexity […] states that in order to fully regulate/control a system, the complexity of the controller has to be at least as great as the complexity of the system that’s being controlled. To put it in even simpler terms, only complexity can destroy complexity." (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Principle of Equifinality: If a steady state is reached in an open system, it is independent of the initial conditions, and determined only by the system parameters, i.e. rates of reaction and transport." (Kevin Adams & Charles Keating, "Systems of systems engineering", 2012) 

23 September 2025

❄️Systems Thinking: On Expansion (Quotes)

"From that time, the universe has steadily become more complex and less reducible to a central control. With as much obstinacy as though it were human, it has insisted on expanding its parts; with as much elusiveness as though it were feminine, it has evaded the attempt to impose on it a single will. Modern science, like modern art, tends, in practice, to drop the dogma of organic unity. Some of the mediaeval habit of mind survives, but even that is said to be yielding before the daily evidence of increasing and extending complexity. The fault, then, was not in man, if he no longer looked at science or art as an organic whole or as the expression of unity. Unity turned itself into complexity, multiplicity, variety, and even contradiction." (Henry Adams, "Mont Saint Michel and Chartres", 1904)

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"For some years now the activity of the artist in our society has been trending more toward the function of the ecologist: one who deals with environmental relationships. Ecology is defined as the totality or pattern of relations between organisms and their environment. Thus the act of creation for the new artist is not so much the invention of new objects as the revelation of previously unrecognized relation- ships between existing phenomena, both physical and metaphysical. So we find that ecology is art in the most fundamental and pragmatic sense, expanding our apprehension of reality." (Gene Youngblood, "Expanded Cinema", 1970)

"In the Systems Age we tend to look at things as part of larger wholes rather than as wholes to be taken apart. This is the doctrine of expansionism. Expansionism brings with it the synthetic mode of thought much as reductionism brought with it." (Russell L Ackoff, "Redesigning the future", 1974)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra," The Web of Life: a new scientific understanding of living systems", 1996)

"Networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power, and culture. While the networking form of social organization has existed in other times and spaces, the new information technology paradigm provides the material basis for its pervasive expansion throughout the entire social structure." (Manuel Castells, "The Rise of the Network Society", 1996)

"A more extreme form of exponential growth was probably responsible for the start of the universe. Astronomer and physicists now generally accept the Big Bang theory, according to which the universe started at an unimaginably small size and then doubled in a split second 100 times, enough to make it the size of a small grapefruit. This period of 'inflation' or exponential growth then ended, and linear growth took over, with an expanding fireball creating the universe that we know today." (Richar Koch, "The Power Laws", 2000)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"This new model of development would be based clearly on the goal of sustainable human well-being. It would use measures of progress that clearly acknowledge this goal. It would acknowledge the importance of ecological sustainability, social fairness, and real economic efficiency. Ecological sustainability implies recognizing that natural and social capital are not infinitely substitutable for built and human capital, and that real biophysical limits exist to the expansion of the market economy." (Robert Costanza, "Toward a New Sustainable Economy", 2008)

"The self-organizing map is a subtype of artificial neural networks. It is trained using unsupervised learning to produce low dimensional representation of the training samples while preserving the topological properties of the input space. The self-organizing map is a single layer feed-forward network where the output syntaxes are arranged in low dimensional (usually 2D or 3D) grid. Each input is connected to all output neurons. Attached to every neuron there is a weight vector with the same dimensionality as the input vectors. The number of input dimensions is usually a lot higher than the output grid dimension. SOMs are mainly used for dimensionality reduction rather than expansion. (Larbi Esmahi et al, "Adaptive Neuro-Fuzzy Systems", 2009)

"[…] economics is a profession grounded in the belief that 'the economy' is a machine and a closed system. The more clearly that machine is understood, the more its variables are precisely measured, the more we will be able to manage and steer it as we choose, avoiding the frenetic expansions and sharp contractions. With better indicators would come better policy, and with better policy, states would be less likely to fall into depression and risk collapse." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)

21 September 2025

❄️Systems Thinking: On Reductionism (Quotes)

"Beauty had been born, not, as we so often conceive it nowadays, as an ideal of humanity, but as measure, as the reduction of the chaos of appearances to the precision of linear symbols. Symmetry, balance, harmonic division, mated and mensurated intervals - such were its abstract characteristics." (Herbert E Read, "Icon and Idea", 1955)

"Science is the reduction of the bewildering diversity of unique events to manageable uniformity within one of a number of symbol systems, and technology is the art of using these symbol systems so as to control and organize unique events. Scientific observation is always a viewing of things through the refracting medium of a symbol system, and technological praxis is always handling of things in ways that some symbol system has dictated. Education in science and technology is essentially education on the symbol level." (Aldous L Huxley, "Essay", Daedalus, 1962)

"Whereas traditional reductionism sought to find the commonality underlying diversity in reference to a shared substance, such as material atoms, contemporary systems theory seeks to find common features in terms of shared aspects of organization." (Ervin László, "The Systems View of the World: A Holistic Vision for Our Time", 1972)

"For any system the environment is always more complex than the system itself. No system can maintain itself by means of a point-for-point correlation with its environment, i.e., can summon enough 'requisite variety' to match its environment. So each one has to reduce environmental complexity - primarily by restricting the environment itself and perceiving it in a categorically preformed way. On the other hand, the difference of system and environment is a prerequisite for the reduction of complexity because reduction can be performed only within the system, both for the system itself and its environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"There is a strong current in contemporary culture advocating ‘holistic’ views as some sort of cure-all […] Reductionism implies attention to a lower level while holistic implies attention to higher level. These are intertwined in any satisfactory description: and each entails some loss relative to our cognitive preferences, as well as some gain [...] there is no whole system without an interconnection of its parts and there is no whole system without an environment." (Francisco Varela, "On being autonomous: The lessons of natural history for systems theory", 1977)

"Systems theory is antireductionist; it asserts that no system can be adequately understood or totally explained once it has been broken down into its component parts." (Charles Zastrow, "Introduction to Social Work and Social Welfare: Empowering People", 1993)

"In such systems, the whole is more than the sum of the parts, not in an ultimate, metaphysical sense, but in the important pragmatic sense that, given the properties of the parts and the laws of their interaction, it is not a trivial matter to infer the properties of the whole. In the face of complexity, an in-principle reductionist may be at the same time a pragmatic holist." (Charles François (Ed.) "International Encyclopedia of Cybernetics and Systems", 1997)

"[...] information feedback about the real world not only alters our decisions within the context of existing frames and decision rules but also feeds back to alter our mental models. As our mental models change we change the structure of our systems, creating different decision rules and new strategies. The same information, processed and interpreted by a different decision rule, now yields a different decision. Altering the structure of our systems then alters their patterns of behavior. The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view and then redesign our policies and institutions accordingly." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In particular, complexity examines how components of a system can through their dynamic interaction 'spontaneously' develop collective properties or patterns, such as colour, that do not seem implicit, or at least not implicit in the same way, within individual components.  Complexity investigates emergent properties, certain regularities of behaviour that somehow transcend the ingredients that make them up. Complexity argues against reductionism, against reducing the whole to the parts. And in so doing it transforms scientific understanding of far-from-equilibrium structures, of irreversible times and of non-Euclidean mobile spaces. It emphasizes how positive feedback loops can exacerbate initial stresses in the system and render it unable to absorb shocks to re-establish the original equilibrium. Positive feedback occurs when a change tendency is reinforced rather than dampened clown. Very strong interactions occur between the parts of such systems, with the absence of a central hierarchical structure that unambiguously' governs' and produces outcomes. These outcomes are to be seen as both uncertain and irreversible." (John Urry, "Global Complexity", 2003)

"There exists an alternative to reductionism for studying systems. This alternative is known as holism. Holism considers systems to be more than the sum of their parts. It is of course interested in the parts and particularly the networks of relationships between the parts, but primarily in terms of how they give rise to and sustain in existence the new entity that is the whole whether it be a river system, an automobile, a philosophical system or a quality system." (Michael C Jackson, "Systems Thinking: Creative Holism for Manager", 2003)

"In general [...] a dissipative structure may become unstable at a certain threshold and break down, enabling the emergence of a new structure. As the introduction of corresponding order parameters results from the elimination of a huge number of degrees of freedom, the emergence of dissipative order is combined with a drastic reduction of complexity." (Klaus Mainzer, "Thinking in Complexity: The Computational Dynamics of Matter, Mind, and Mankind", 2004)

"This reduction principle - the reduction of the behavior of a complex system to the behavior of its parts - is valid only if the level of complexity of the system is rather low." (Andrzej P Wierzbicki & Yoshiteru Nakamori, "Creative Space: Models of Creative Processes for the Knowledge Civilization Age", Studies in Computational Intelligence Vol.10, 2006)

"Mechanistic reductionism suggested that the universe, including life, were considered as 'mechanisms'. Consequently, understanding any system required the application of the mental strategy of engineering: the whole system should be reduced to its parts. Knowing the parts was thought to imply the complete understanding of the whole." (Péter Érdi, "Complexity Explained", 2008)

"Complex systems are not easily predictable, and the principles of reductionism do not bear fruit when laboring to understand them, as system behavior emerges on all levels of the system. Although they are not fully knowable, within reason there may be some prediction possible. (Andreas Tolk et al, "Epistemological Constraints When Evaluating Ontological Emergence with Computational Complex Adaptive Systems", [in Unifying Themes in Complex Systems IX, Eds. Alfredo J Morales et al], 2018)

"In principle, reduction works and reductive logic does not fail, despite practical difficulty conducting reduction in complicated contexts. Careful reductive analysis reveals even convoluted, non-linear, evolved, self-organized, hierarchically complex, emergent phenomena, such as consciousness, remain open to unyielding reduction. However, among the anomalies associated with reductive logic," (J Rowan Scott, "Descartes, Gödel and Kuhn: Epiphenomenalism Defines a Limit on Reductive Logic", [in Unifying Themes in Complex Systems IX, Eds. Alfredo J Morales et al], 2018)

"From a modeller’s perspective a dynamic hypothesis is a particularly important step of ‘complexity reduction’ - making sense of a messy situation in the real world. A feedback systems thinker has in mind a number of structure-behaviour pairs that give valuable clues or patterns to look for when explaining puzzling dynamics." (John Morecroft, "System Dynamics", [in "Systems Approaches to Making Change: A Practical Guide", Ed. Martin Reynolds & Sue Holwell] 2020)

19 September 2025

❄️Systems Thinking: On Phase Space (Quotes)

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions. (Edward N Lorenz, "Deterministic Nonperiodic Flow", Journal of the Atmospheric Science 20, 1963)

"The ‘eyes of the mind’ must be able to see in the phase space of mechanics, in the space of elementary events of probability theory, in the curved four-dimensional space-time of general relativity, in the complex infinite dimensional projective space of quantum theory. To comprehend what is visible to the ‘actual eyes’, we must understand that it is only the projection of an infinite dimensional world on the retina." (Yuri I Manin, "Mathematics and Physics", 1981)

"[…] physicists have come to appreciate a fourth kind of temporal behavior: deterministic chaos, which is aperiodic, just like random noise, but distinct from the latter because it is the result of deterministic equations. In dynamic systems such chaos is often characterized by small fractal dimensions because a chaotic process in phase space typically fills only a small part of the entire, energetically available space." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"When a system has more than one attractor, the points in phase space that are attracted to a particular attractor form the basin of attraction for that attractor. Each basin contains its attractor, but consists mostly of points that represent transient states. Two contiguous basins of attraction will be separated by a basin boundary." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"One of the reasons we deal with the pendulum is that it is easy to plot its motion in phase space. If the amplitude is small, it's a two-dimensional problem, so all we need to specify it completely is its position and its velocity. We can make a two-dimensional plot with one axis (the horizontal), position, and the other (the vertical), velocity." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing - seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"Roughly spoken, bifurcation theory describes the way in which dynamical system changes due to a small perturbation of the system-parameters. A qualitative change in the phase space of the dynamical system occurs at a bifurcation point, that means that the system is structural unstable against a small perturbation in the parameter space and the dynamic structure of the system has changed due to this slight variation in the parameter space." (Holger I Meinhardt, "Cooperative Decision Making in Common Pool Situations", 2012)

"The impossibility of predicting which point in phase space the trajectory of the Lorenz attractor will pass through at a certain time, even though the system is governed by deterministic equations, is a common feature of all chaotic systems. However, this does not mean that chaos theory is not capable of any predictions. We can still make very accurate predictions, but they concern the qualitative features of the system’s behavior rather than the precise values of its variables at a particular time. The new mathematics thus represents the shift from quantity to quality that is characteristic of systems thinking in general. Whereas conventional mathematics deals with quantities and formulas, nonlinear dynamics deals with qualities and patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

10 September 2025

❄️Systems Thinking: On Capacity (Quotes)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state, Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig on Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"By some definitions 'systems engineering' is suggested to be a new discovery. Actually it is a common engineering approach which has taken on a new and important meaning because of the greater complexity and scope of problems to be solved in industry, business, and the military. Newly discovered scientific phenomena, new machines and equipment, greater speed of communications, increased production capacity, the demand for control over ever-extending areas under constantly changing conditions, and the resultant complex interactions, all have created a tremendously accelerating need for improved systems engineering. Systems engineering can be complex, but is simply defined as 'logical engineering within physical, economic and technical limits' - bridging the gap from fundamental laws to a practical operating system." (Instrumentation Technology, 1957)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"According to the science of cybernetics, which deals with the topic of control in every kind of system" (mechanical, electronic, biological, human, economic, and so on), there is a natural law that governs the capacity of a control system to work. It says that the control must be capable of generating as much 'variety' as the situation to be controlled." (Anthony S Beer, Management Science", 1968)

"Learning is any change in a system that produces a more or less permanent change in its capacity for adapting to its environment. Understanding systems, especially systems capable of understanding problems in new task domains, are learning systems." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"The notion that the 'balance of nature' is delicately poised and easily upset is nonsense. Nature is extraordinarily tough and resilient, interlaced with checks and balances, with an astonishing capacity for recovering from disturbances in equilibrium. The formula for survival is not power; it is symbiosis." (Sir Eric Ashby, [Encounter] 1976)

"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (John K Galbraith, "Organization Design", 1977)

"Real learning gets to the heart of what it means to be human. Through learning we re-create ourselves. Through learning we become able to do something we never were able to do. Through learning we reperceive the world and our relationship to it. Through learning we extend our capacity to create, to be part of the generative process of life." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"The organizations that will truly excel in the future will be the organizations that discover how to tap people's commitment and capacity to learn at all levels in an organization." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"Neural networks conserve the complexity of the systems they model because they have complex structures themselves. Neural networks encode information about their environment in a distributed form. […] Neural networks have the capacity to self-organise their internal structure." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"The very essence of mass communication theory is a simple but all-embracing expression of technological determinism, since the essential features depend on what certain technologies have made possible, certain technologies have made possible, especially the following: communication at a distance, the multiplication and simultaneous distribution of diverse ‘messages’, the enormous capacity and speed of carriers, and the limitations on response. There is no escaping the implication that public communication as practised in modern societies is profoundly shaped by these general features." (Denis McQuail, "McQuail's Reader in Mass Communication Theory", 2002)

"In loosely coupled systems by contrast there is plenty of slack in terms of time, resources and organizational capacity. They are much less likely to produce normal accidents since incidents can be .coped with, so avoiding the interactive complexity found within the tightly coupled system. in the latter, moreover, the effects are non-linear. Up to a point, tightening the connections between elements in the system will increase efficiency when everything works smoothly. But, if one small item goes wrong, then that can have a  catastrophic knock-on effect throughout the system. The system literally switches over; from smooth functioning to interactively complex disaster. And sometimes this results from a supposed improvement in the system." (John Urry, "Global Complexity", 2003)

"Mutual information is the receiver's entropy minus the conditional entropy of what the receiver receives - given what message the sender sends through the noisy channel. Conditioning or getting data can only reduce uncertainty and so this gap is always positive or zero. It can never be negative. You can only learn from further experience. Information theorists capture this theorem in a slogan: Conditioning reduces entropy. The channel capacity itself is the largest gap given all possible probability descriptions of what [the sender] sent. It is the most information that on average you could ever get out of the noisy channel." (Bart Kosko, "Noise", 2006)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

07 September 2025

❄️Systems Thinking: On Thresholds (Quotes)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"As the complexity of a system increases, our ability to make precise and yet significant statements about its behavior diminishes until a threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics." (Lotfi A Zadeh, 1973)

"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature. Thus, with the rapid increase in the complexity of the information processing tasks which the computers are called upon to perform, we are reaching a point where computers will have to be designed for processing of information in fuzzy form. In fact, it is the capability to manipulate fuzzy concepts that distinguishes human intelligence from the machine intelligence of current generation computers. Without such capability we cannot build machines that can summarize written text, translate well from one natural language to another, or perform many other tasks that humans can do with ease because of their ability to manipulate fuzzy concepts." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)

"Threshold functions (are described) which facilitate the careful study of the structure of a graph as it grows and specifically reveal the mysterious circumstances surrounding the abrupt appearance of the Unique Giant Component which systematically absorbs its neighbours, devouring the larger first and ruthlessly continuing until the last Isolated Nodes have been swallowed up, whereupon the Giant is suddenly brought under control by a Spanning Cycle." (Edgar Palmer, "Graphical Evolution", 1985)

"[…] an epidemic does not always percolate through an entire population. There is a percolation threshold below which the epidemic has died out before most of the people have." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In the realms of nature it is impossible to predict which way a bifurcation will cut. The outcome of a bifurcation is determined neither by the past history of a system nor by its environment, but only by the interplay of more or less random fluctuations in the chaos of critical destabilization. One or another of the fluctuations that rock such a system will suddenly 'nucleate'. The nucleating fluctuation will amplify with great rapidity and spread to the rest of the system. In a surprisingly short time, it dominates the system’s dynamics. The new order that is then born from the womb of chaos reflects the structural and functional characteristics of the nucleated fluctuation. [...] Bifurcations are more visible, more frequent, and more dramatic when the systems that exhibit them are close to their thresholds of stability - when they are all but choked out of existence." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"When a system is 'stressed' beyond certain threshold limits as, for example, when it is heated up, or its pressure is increased, it shifts from one set of attractors to another and then behaves differently. To use the language of the theory, the system 'settles into a new dynamic regime'. It is at the point of transition that a bifurcation takes place. The system no longer follows the trajectory of its initial attractors, but responds to new attractors that make the system appear to be behaving randomly. It is not behaving randomly, however, and this is the big shift in our understanding caused by dynamical systems theory. It is merely responding to a new set of attractors that give it a more complex trajectory. The term bifurcation, in its most significant sense, refers to the transition of a system from the dynamic regime of one set of attractors, generally more stable and simpler ones, to the dynamic regime of a set of more complex and 'chaotic' attractors." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"For any given population of susceptibles, there is some critical combination of contact frequency, infectivity, and disease duration just great enough for the positive loop to dominate the negative loops. That threshold is known as the tipping point. Below the tipping point, the system is stable: if the disease is introduced into the community, there may be a few new cases, but on average, people will recover faster than new cases are generated. Negative feedback dominates and the population is resistant to an epidemic. Past the tipping point, the positive loop dominates .The system is unstable and once a disease arrives, it can spread like wildfire that is, by positive feedback-limited only by the depletion of the susceptible population." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"This possibility of sudden change is at the center of the idea of the Tipping Point and might well be the hardest of all to accept. [...] The Tipping Point is the moment of critical mass, the threshold, the boiling point." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"[…] real networks not only are connected but are well beyond the threshold of one. Random network theory tells us that as the average number of links per node increases beyond the critical one, the number of nodes left out of the giant cluster decreases exponentially. That is, the more links we add, the harder it is to find a node that remains isolated. Nature does not take risks by staying close to the threshold. It well surpasses it."  (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"In the case of a complex system, nonlinear behavior can happen as disturbances or changes in the system, each one relatively small by itself, accumulate. Outwardly, everything seems to be normal: the system doesn’t generate any surprises. At some point, though, the behavior of the whole system suddenly shifts to a radically new mode. This kind of behavior is often called a threshold effect, because the shift occurs when a critical threshold - usually unseen and often unexpected - is crossed." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"But in mathematics there is a kind of threshold effect, an intellectual tipping point. If a student can just get over the first few humps, negotiate the notational peculiarities of the subject, and grasp that the best way to make progress is to understand the ideas, not just learn them by rote, he or she can sail off merrily down the highway, heading for ever more abstruse and challenging ideas, while an only slightly duller student gets stuck at the geometry of isosceles triangles." (Ian Stewart, "Why Beauty is Truth: A history of symmetry", 2007)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Flaws can be found in any research design if you look hard enough. […] In our experience, it is good scientific practice to refine one's research hypotheses in light of the data. Working scientists are also keenly aware of the risks of data dredging, and they use confidence intervals and p-values as a tool to avoid getting fooled by noise. Unfortunately, a by-product of all this struggle and care is that when a statistically significant pattern does show up, it is natural to get excited and believe it. The very fact that scientists generally don't cheat, generally don't go fishing for statistical significance, makes them vulnerable to drawing strong conclusions when they encounter a pattern that is robust enough to cross the p < 0.05 threshold." (Andrew Gelman & Eric Loken, "The Statistical Crisis in Science", American Scientist Vol. 102(6), 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"[...] living organisms manifest deep new physical principles, and that we are on the threshold of uncovering and harnessing those principles. What is different this time, and why it has taken so many decades to discover the real secret of life, is that the new physics is not simply a matter of an additional type of force - a 'life force' - but something altogether more subtle, something that interweaves matter and information, wholes and parts, simplicity and complexity." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019) 
