31 December 2025

❄️Systems Thinking: On Competition (Quotes)

"The true nature of the universal principle of synergy pervading all nature and creating all the different kinds of structure that we observe to exist, must now be made clearer. Primarily and essentially it is a process of equilibration, i.e., the several forces are first brought into a state of partial equilibrium. It begins in collision, conflict, antagonism, and opposition, and then we have the milder phases of antithesis, competition, and interaction, passing next into a modus vivendi, or compromise, and ending in collaboration and cooperation. […] The entire drift is toward economy, conservatism, and the prevention of waste." (James Q Dealey & Lester F Ward, "A Text-book of Sociology", 1905)

"An internal model allows a system to look ahead to the future consequences of current actions, without actually committing itself to those actions. In particular, the system can avoid acts that would set it irretrievably down some road to future disaster ('stepping off a cliff'). Less dramatically, but equally important, the model enables the agent to make current 'stage-setting' moves that set up later moves that are obviously advantageous. The very essence of a competitive advantage, whether it be in chess or economics, is the discovery and execution of stage-setting moves." (John H Holland, 1992)

"The systems' basic components are treated as sets of rules. The systems rely on three key mechanisms: parallelism, competition, and recombination. Parallelism permits the system to use individual rules as building blocks, activating sets of rules to describe and act upon the changing situations. Competition allows the system to marshal its rules as the situation demands, providing flexibility and transfer of experience. This is vital in realistic environments, where the agent receives a torrent of information, most of it irrelevant to current decisions. The procedures for adaptation - credit assignment and rule discovery - extract useful, repeatable events from this torrent, incorporating them as new building blocks. Recombination plays a key role in the discovery process, generating plausible new rules from parts of tested rules. It implements the heuristic that building blocks useful in the past will prove useful in new, similar contexts." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)

"The model of competitive equilibrium which has been discussed so far is set in a timeless environment. People and companies all operate in a world in which there is no future and hence no uncertainty." (Paul Ormerod, "The Death of Economics", 1994)

"Economics emphasizes competition, expansion, and domination; ecology emphasizes cooperation, conservation, and partnership." (Fritjof Capra, "The Web of Life", 1996)

"What is sustained in a sustainable community is not economic growth, development, market share, or competitive advantage, but the entire web of life on which our long-term survival depends. In other words, a sustainable community is designed in such a way that its ways of life, businesses, economy, physical structures, and technologies do not interfere with nature’s inherent ability to sustain life." (Fritjof Capra, "Ecoliteracy: The Challenge for Education in the Next Century", 1999)

"Optimization by individual agents, often used to derive competitive equilibria, are unnecessary for an actual economy to approximately attain such equilibria. From the failure of humans to optimize in complex tasks, one need not conclude that the equilibria derived from the competitive model are descriptively irrelevant. We show that even in complex economic systems, such equilibria can be attained under a range of surprisingly weak assumptions about agent behavior." (Antoni Bosch-Domènech & Shyam Sunder, "Tracking the Invisible Hand", 2000)

"Periods of rapid change and high exponential growth do not, typically, last long. A new equilibrium with a new dominant technology and/or competitor is likely to be established before long. Periods of punctuation are therefore exciting and exhibit unusual uncertainty. The payoff from establishing a dominant position in this short time is therefore extraordinarily high. Dominance is more likely to come from skill in marketing and positioning than from superior technology itself." (Richard Koch, "The Power Laws", 2000)

"To remedy chaotic situations requires a chaotic approach, one that is non-linear, constantly morphing, and continually sharpening its competitive edge with recurring feedback loops that build upon past experiences and lessons learned. Improvement cannot be sustained without reflection. Chaos arises from myriad sources that stem from two origins: internal chaos rising within you, and external chaos being imposed upon you by the environment. The result of this push/pull effect is the disequilibrium [...]." (Jeff Boss, "Navigating Chaos: How to Find Certainty in Uncertain Situations", 2015)


❄️Systems Thinking: On Closed Systems (Quotes)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Cybernetics might, in fact, be defined as the study of systems that are open to energy but closed to information and control - systems that are 'information-tight'." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Traditional organizational theories have tended to view the human organization as a closed system. This tendency has led to a disregard of differing organizational environments and the nature of organizational dependency on environment. It has led also to an over-concentration on principles of internal organizational functioning, with consequent failure to develop and understand the processes of feedback which are essential to survival." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Most of our beliefs about complex organizations follow from one or the other of two distinct strategies. The closed-system strategy seeks certainty by incorporating only those variables positively associated with goal achievement and subjecting them to a monolithic control network. The open-system strategy shifts attention from goal achievement to survival and incorporates uncertainty by recognizing organizational interdependence with environment. A newer tradition enables us to conceive of the organization as an open system, indeterminate and faced with uncertainty, but subject to criteria of rationality and hence needing certainty." (James D Thompson, "Organizations in Action", 1967)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"Open systems, in contrast to closed systems, exhibit a principle of equifinality, that is, a tendency to achieve a final state independent of initial conditions. In other words, open systems tend to 'resist' perturbations that take them away from some steady state. They can exhibit homeostasis." (Anatol Rapoport, "The Uses of Mathematical Isomorphism in General System Theory", 1972)

"Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth." (Mihajlo D Mesarovic, "Mankind at the Turning Point", 1974) 
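Mesarovic's point - that exponential growth cannot persist as a system approaches its limits - can be sketched numerically. The logistic model below is a standard illustration, not drawn from the quoted text; the growth rate `r` and carrying capacity `K` are arbitrary illustrative choices.

```python
# Logistic growth: exponential expansion that a finite limit eventually halts.
# Illustrative only -- the growth rate r and capacity K are arbitrary choices.

def logistic_step(x, r=0.05, K=1000.0):
    """One Euler-discretized time step of dx/dt = r*x*(1 - x/K)."""
    return x + r * x * (1.0 - x / K)

x = 1.0
trajectory = [x]
for _ in range(400):
    x = logistic_step(x)
    trajectory.append(x)

# Early on growth is nearly exponential; near the limit K it stalls.
early_ratio = trajectory[10] / trajectory[0]    # large multiplicative gain
late_ratio = trajectory[400] / trajectory[390]  # ratio close to 1
print(f"early 10-step gain: {early_ratio:.2f}, late 10-step gain: {late_ratio:.4f}")
```

Each step of "continued exponential growth" narrows the remaining headroom `(1 - x/K)`, which is exactly the sense in which every day brings the system closer to the limits of that growth.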

"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric - they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Manager's Pocket Guide to Systems Thinking & Learning", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order." (Ray Kurzweil, "The Singularity is Near", 2005)

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017)
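Pinker's "least to most probable state" can be made concrete with a toy simulation (not from the quoted text): blind, undirected shuffling carries an ordered arrangement toward the mixed macrostate, because mixed arrangements vastly outnumber ordered ones. The entropy measure and block size here are illustrative choices.

```python
# Toy illustration of the Second Law: random, undirected swaps drive an
# initially ordered arrangement toward the most probable (mixed) macrostate.
import random
from math import log2

def mix_entropy(cells, k=10):
    """Shannon entropy (bits) of the fraction of 1s in each block of k cells."""
    h = 0.0
    for i in range(0, len(cells), k):
        p = sum(cells[i:i + k]) / k
        if 0 < p < 1:
            h += -(p * log2(p) + (1 - p) * log2(1 - p))
    return h

random.seed(0)
cells = [1] * 50 + [0] * 50   # perfectly ordered: all 1s left, all 0s right
h_start = mix_entropy(cells)  # 0.0 -- every block is pure

for _ in range(5000):         # blind pair swaps, no preferred direction
    i, j = random.randrange(100), random.randrange(100)
    cells[i], cells[j] = cells[j], cells[i]

h_end = mix_entropy(cells)
print(f"entropy: {h_start:.2f} -> {h_end:.2f} bits")
```

No swap prefers disorder; disorder wins purely because there are overwhelmingly more mixed arrangements than ordered ones.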

27 December 2025

❄️Systems Thinking: On Open Systems (Quotes)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"A state of equilibrium in a system does not mean, further, that the system is without tension. Systems can, on the contrary, also come to equilibrium in a state of tension (e.g., a spring under tension or a container with gas under pressure). The occurrence of this sort of system, however, presupposes a certain firmness of boundaries and actual segregation of the system from its environment (both of these in a functional, not a spatial, sense). If the different parts of the system are insufficiently cohesive to withstand the forces working toward displacement (i.e., if the system shows insufficient internal firmness, if it is fluid), or if the system is not segregated from its environment by sufficiently firm walls but is open to its neighboring systems, stationary tensions cannot occur. Instead, there occurs a process in the direction of the forces, which encroaches upon the neighboring regions with diffusion of energy and which goes in the direction of an equilibrium at a lower level of tension in the total region. The presupposition for the existence of a stationary state of tension is thus a certain firmness of the system in question, whether this be its own inner firmness or the firmness of its walls." (Kurt Lewin, "A Dynamic Theory of Personality", 1935)

"Cybernetics might, in fact, be defined as the study of systems that are open to energy but closed to information and control - systems that are 'information-tight'." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]  'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)
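Ashby's "changing from a bad organization to a good one" by flipping feedback from positive to negative can be sketched with the simplest possible loop (an illustration under assumed parameters, not Ashby's own example): the same first-order system diverges under positive feedback and settles under negative feedback.

```python
# Sketch of Ashby's point: the identical loop x' = gain * x runs away under
# positive feedback (gain > 0) and settles toward 0 under negative feedback
# (gain < 0). A 'self-organizing' change would be the system itself flipping
# the sign of the gain. Parameters are arbitrary illustrative choices.

def run(gain, x0=1.0, dt=0.1, steps=100):
    """Euler-integrate x' = gain * x for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x += dt * gain * x
    return x

diverging = run(gain=+0.5)  # positive feedback: |x| grows without bound
settling = run(gain=-0.5)   # negative feedback: x decays toward 0
print(f"positive feedback: {diverging:.1f}, negative feedback: {settling:.4f}")
```

The "bad" and "good" organizations differ only in the sign of one coupling; everything else about the structure is unchanged.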

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Most of our beliefs about complex organizations follow from one or the other of two distinct strategies. The closed-system strategy seeks certainty by incorporating only those variables positively associated with goal achievement and subjecting them to a monolithic control network. The open-system strategy shifts attention from goal achievement to survival and incorporates uncertainty by recognizing organizational interdependence with environment. A newer tradition enables us to conceive of the organization as an open system, indeterminate and faced with uncertainty, but subject to criteria of rationality and hence needing certainty." (James D Thompson, "Organizations in Action", 1967)

"That a system is open means, not simply that it engages in interchanges with the environment, but that this interchange is an essential factor underlying the system's viability, its reproductive ability or continuity, and its ability to change. [...] Openness is an essential factor underlying a system's viability, continuity, and its ability to change." (Walter F Buckley, "Sociology and Modern Systems Theory", 1967)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"Open systems, in contrast to closed systems, exhibit a principle of equifinality, that is, a tendency to achieve a final state independent of initial conditions. In other words, open systems tend to 'resist' perturbations that take them away from some steady state. They can exhibit homeostasis." (Anatol Rapoport, "The Uses of Mathematical Isomorphism in General System Theory", 1972)

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"The third model regards mind as an information processing system. This is the model of mind subscribed to by cognitive psychologists and also to some extent by the ego psychologists. Since an acquisition of information entails maximization of negative entropy and complexity, this model of mind assumes mind to be an open system." (Thaddus E Weckowicz, "Models of Mental Illness", 1984)

"Self-organization refers to the spontaneous formation of patterns and pattern change in open, nonequilibrium systems. […] Self-organization provides a paradigm for behavior and cognition, as well as the structure and function of the nervous system. In contrast to a computer, which requires particular programs to produce particular results, the tendency for self-organization is intrinsic to natural systems under certain conditions." (J A Scott Kelso, "Dynamic Patterns: The Self-Organization of Brain and Behavior", 1995)

"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Manager's Pocket Guide to Systems Thinking & Learning", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Strategy in complex systems must resemble strategy in board games. You develop a small and useful tree of options that is continuously revised based on the arrangement of pieces and the actions of your opponent. It is critical to keep the number of options open. It is important to develop a theory of what kinds of options you want to have open." (John H Holland, [presentation] 2000)

"Systems, and organizations as systems, can only be understood holistically. Try to understand the system and its environment first. Organizations are open systems and, as such, are viable only in interaction with and adaptation to the changing environment." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"This spontaneous emergence of order at critical points of instability is one of the most important concepts of the new understanding of life. It is technically known as self-organization and is often referred to simply as ‘emergence’. It has been recognized as the dynamic origin of development, learning and evolution. In other words, creativity - the generation of new forms - is a key property of all living systems. And since emergence is an integral part of the dynamics of open systems, we reach the important conclusion that open systems develop and evolve. Life constantly reaches out into novelty." (Fritjof Capra, "The Hidden Connections", 2002)

"Limiting factors in population dynamics play the role in ecology that friction does in physics. They stop exponential growth, not unlike the way in which friction stops uniform motion. Whether or not ecology is more like physics in a viscous liquid, when the growth-rate-based traditional view is sufficient, is an open question. We argue that this limit is an oversimplification, that populations do exhibit inertial properties that are noticeable. Note that the inclusion of inertia is a generalization - it does not exclude the regular rate-based, first-order theories. They may still be widely applicable under a strong immediate density dependence, acting like friction in physics." (Lev Ginzburg & Mark Colyvan, "Ecological Orbits: How Planets Move and Populations Grow", 2004)

"An ecology provides the special formations needed by organizations. Ecologies are: loose, free, dynamic, adaptable, messy, and chaotic. Innovation does not arise through hierarchies. As a function of creativity, innovation requires trust, openness, and a spirit of experimentation - where random ideas and thoughts can collide for re-creation." (George Siemens, "Knowing Knowledge", 2006)

"Complexity is a relative term. It depends on the number and the nature of interactions among the variables involved. Open loop systems with linear, independent variables are considered simpler than interdependent variables forming nonlinear closed loops with a delayed response." (Jamshid Gharajedaghi, "Systems Thinking: Managing Chaos and Complexity A Platform for Designing Business Architecture" 3rd Ed., 2011)

"Principle of Equifinality: If a steady state is reached in an open system, it is independent of the initial conditions, and determined only by the system parameters, i.e. rates of reaction and transport." (Kevin Adams & Charles Keating, "Systems of systems engineering", 2012)
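The Principle of Equifinality is easy to verify numerically. The sketch below (my own illustration, with arbitrary inflow and outflow rates) models an open-system stock with a constant inflow and a proportional outflow: every initial condition converges to the same steady state, determined only by the system parameters.

```python
# Equifinality sketch: an open-system stock with constant inflow and
# proportional outflow reaches the same steady state from any start.
# The rates (inflow, k) are arbitrary illustrative parameters.

def settle(x0, inflow=5.0, k=0.5, dt=0.1, steps=2000):
    """Euler-integrate dx/dt = inflow - k*x until the stock settles."""
    x = x0
    for _ in range(steps):
        x += dt * (inflow - k * x)
    return x

# Three very different initial conditions...
finals = [settle(x0) for x0 in (0.0, 3.0, 50.0)]
# ...all converge to the same steady state inflow/k = 10.0.
print([round(f, 3) for f in finals])
```

As the quote states, the final value depends only on the rates of inflow and outflow (here `inflow/k`), not on where the system started - the signature that distinguishes open systems from closed ones.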


24 December 2025

❄️Systems Thinking: On Systemic Analysis (Quotes)

"As the analysis of a substantial composite terminates only in a part which is not a whole, that is, in a simple part, so synthesis terminates only in a whole which is not a part, that is, the world." (Immanuel Kant, "Inaugural Dissertation", 1770)

"It is interesting thus to follow the intellectual truths of analysis in the phenomena of nature. This correspondence, of which the system of the world will offer us numerous examples, makes one of the greatest charms attached to mathematical speculations." (Pierre-Simon Laplace, "Exposition du système du monde", 1799)

"SYSTEM (to place together) - is a full and connected view of all the truths of some department of knowledge. An organized body of truth, or truths arranged under one and the same idea, which idea is as the life or soul which assimilates all those truths. No truth is altogether isolated. Every truth has relation to some other. And we should try to unite the facts of our knowledge so as to see them in their several bearings. This we do when we frame them into a system. To do so legitimately we must begin by analysis and end with synthesis. But system applies not only to our knowledge, but to the objects of our knowledge. Thus we speak of the planetary system, the muscular system, the nervous system. We believe that the order to which we would reduce our ideas has a foundation in the nature of things. And it is this belief that encourages us to reduce our knowledge of things into systematic order. The doing so is attended with many advantages. At the same time a spirit of systematizing may be carried too far. It is only in so far as it is in accordance with the order of nature that it can be useful or sound." (William Fleming, "Vocabulary of philosophy, mental, moral, and metaphysical; with quotations and references; for the use of students", 1857)

"The whole of the developments and operations of analysis are now capable of being executed by machinery. […] As soon as an Analytical Engine exists, it will necessarily guide the future course of science." (Charles Babbage, "Passages from the Life of a Philosopher", 1864)

"The analysis of Nature into its individual parts, the grouping of the different natural processes and natural objects in definite classes, the study of the internal anatomy of organic bodies in their manifold forms - these were the fundamental conditions of the gigantic strides in our knowledge of Nature which have been made during the last four hundred years. But this method of investigation has also left us as a legacy the habit of observing natural objects and natural processes in their isolation, detached from the whole vast interconnection of things; and therefore not in their motion, but in their repose; not as essentially changing, but fixed constants; not in their life, but in their death." (Friedrich Engels, "Herr Eugen Dühring's Revolution in Science", 1878)

"After this first approximation, the various aspects of the situation undergo a more and more detailed analysis. In contrast to this the second method [for analysis of life space] begins with the life space as a whole and defines its fundamental structure. The procedure in this case is not to add disconnected items but to make the original structure more specific and differentiated. This method therefore proceeds by steps from the more general to the particular and thereby avoids the danger of a wrong simplification by abstraction." (Kurt Lewin, "Principles of topological psychology", 1936)

"Analysis shows that closed systems cannot behave equifinally. This is the reason why equifinality is found in inanimate nature only in exceptional cases. However, in open systems, which are exchanging materials with the environment, in so far as they attain a steady state, the latter is independent of the initial conditions, or is equifinal. [...] Steady state systems show equifinality, in sharp contrast to closed systems in equilibrium where the final state depends on the components given at the beginning of the process." (Ludwig von Bertalanffy, "The Theory of Open Systems in Physics and Biology", Science Vol. 111, 1950)

"The analysis of engineering systems and the understanding of economic structure have advanced since then, and the time is now more ripe to bring these topics into a potentially fruitful marriage." (Arnold Tustin, "The Mechanism of Economic Systems", 1953)

"An engineering science aims to organize the design principles used in engineering practice into a discipline and thus to exhibit the similarities between different areas of engineering practice and to emphasize the power of fundamental concepts. In short, an engineering science is predominated by theoretical analysis and very often uses the tool of advanced mathematics." (Qian Xuesen, "Engineering cybernetics", 1954)

"The term 'systems engineering' is a term with an air of romance and of mystery. The romance and the mystery come from its use in the field of guided missiles, rockets, artificial satellites, and space flight. Much of the work being done in these areas is classified and hence much of it is not known to the general public or to this writer. […] From a business point of view, systems engineering is the creation of a deliberate combination of human services, material services, and machine service to accomplish an information processing job. But this is also very nearly a definition of business system analysis. The difference, from a business point of view, therefore, between business system analysis and systems engineering is only one of degree. In general, systems engineering is more total and more goal-oriented in its approach [...]." ("Computers and People" Vol. 5, 1956)

"There are two types of systems engineering - basic and applied. [...] Systems engineering is, obviously, the engineering of a system. It usually, but not always, includes dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, optimating, etc., etc. It connotes an optimum method, realized by modern engineering techniques. Basic systems engineering includes not only the control system but also all equipment within the system, including all host equipment for the control system. Applications engineering is - and always has been - all the engineering required to apply the hardware of a hardware manufacturer to the needs of the customer. Such applications engineering may include, and always has included where needed, dynamic analysis, mathematical models, simulation, linear programming, data logging, computing, and any technique needed to meet the end purpose - the fitting of an existing line of production hardware to a customer's needs. This is applied systems engineering." (Instruments and Control Systems Vol. 31, 1958)

"In a general way it may be said that to think in terms of systems seems the most appropriate conceptual response so far available when the phenomena under study - at any level and in any domain - display the character of being organized, and when understanding the nature of the interdependencies constitutes the research task. In the behavioral sciences, the first steps in building a systems theory were taken in connection with the analysis of internal processes in organisms, or organizations, when the parts had to be related to the whole." (Fred Emery, "The Causal Texture of Organizational Environments", 1963)

"Foundations and organization are similar in that both provide some sort of more systematic exposition. But a step in this direction may be crucial for organization, yet foundationally trivial, for instance a new choice of language when (i) old theorems are simpler to state but (ii) the primitive notions of the new language are defined in terms of the old, that is if they are logically dependent on the latter. Quite often, (i) will be achieved by using new notions with more ‘structure’, that is less analyzed notions, which is a step in the opposite direction to a foundational analysis. In short, foundational and organizational aims are liable to be actually contradictory." (Georg Kreisel & Jean-Louis Krivine, "Elements of Mathematical Logic: Model Theory", 1967)

"General systems theory (in the narrow sense of the term) is a discipline concerned with the general properties and laws of 'systems' . A system is defined as a complex of components in interaction, or by some similar proposition. Systems theory tries to develop those principles that apply to systems in general, irrespective of the nature of the system, of their components, and of the relations or 'forces' between them. The system components need not even be material, as, for example, in the system analysis of a commercial enterprise where components such as buildings, machines, personnel, money and 'good will' of customers enter." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967

"In the minds of many writers systems engineering is synonomous with component selection and interface design; that is, the systems engineer does not design hardware but decides what types of existing hardware shall be coupled and how they shall be coupled. Complete agreement that this function is the essence of systems engineering will not be found here, for, besides the very important function of systems engineering in systems analysis, there is the role played by systems engineering in providing boundary conditions for hardware design." (A Wayne Wymore, "A Mathematical Theory of Systems Engineering", 1967

"Now we are looking for another basic outlook on the world - the world as organization. Such a conception - if it can be substantiated - would indeed change the basic categories upon which scientific thought rests, and profoundly influence practical attitudes. This trend is marked by the emergence of a bundle of new disciplines such as cybernetics, information theory, general system theory, theories of games, of decisions, of queuing and others; in practical applications, systems analysis, systems engineering, operations research, etc. They are different in basic assumptions, mathematical techniques and aims, and they are often unsatisfactory and sometimes contradictory. They agree, however, in being concerned, in one way or another, with ‘systems’, ‘wholes’ or ‘organizations’; and in their totality, they herald a new approach." (Ludwig von Bertalanffy, "General System Theory", 1968)"

"In the selection of papers for this volume, two problems have arisen, namely what constitutes systems thinking and what systems thinking is relevant to the thinking required for organizational management. The first problem is obviously critical. Unless there were a meaningful answer there would be no sense in producing a volume of readings in systems thinking in any subject. A great many writers have manifestly believed that there is a way of considering phenomena which is sufficiently different from the well-established modes of scientific analysis to deserve the particular title of systems thinking." (Frederick E Emery" (ed.),"Systems thinking: selected readings", 1969

"My analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life." (James G Miller, "Living Systems: Basic Concepts", 1969)

"The notion of ‘system’ has gained central importance in contemporary science, society and life. In many fields of endeavor, the necessity of a ‘systems approach’ or ‘systems thinking’ is emphasized, new professions called ‘systems engineering’, ‘systems analysis’ and the like have come into being, and there can be little doubt that this this concept marks a genuine, necessary, and consequential development in science and world-view." (Ervin László,"Introduction to Systems Philosophy: Toward a New Paradigm of Contemporary Thought", 1972)

"System theory is a tool which engineers use to help them design the 'best' system to do the job that must be done. A dominant characteristic of system theory is the interest in the analysis and design (synthesis) of systems from an input-output point of view. System theory uses mathematical manipulation of a mathematical model to help design the actual system." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The cybernetics phase of cognitive science produced an amazing array of concrete results, in addition to its long-term (often underground) influence: the use of mathematical logic to understand the operation of the nervous system; the invention of information processing machines (as digital computers), thus laying the basis for artificial intelligence; the establishment of the metadiscipline of system theory, which has had an imprint in many branches of science, such as engineering (systems analysis, control theory), biology (regulatory physiology, ecology), social sciences (family therapy, structural anthropology, management, urban studies), and economics (game theory); information theory as a statistical theory of signal and communication channels; the first examples of self-organizing systems. This list is impressive: we tend to consider many of these notions and tools an integrative part of our life […]" (Francisco Varela, "The Embodied Mind", 1991)

"[…] it does not seem helpful just to say that all models are wrong. The very word model implies simplification and idealization. The idea that complex physical, biological or sociological systems can be exactly described by a few formulae is patently absurd. The construction of idealized representations that capture important stable aspects of such systems is, however, a vital part of general scientific analysis and statistical models, especially substantive ones, do not seem essentially different from other kinds of model." (Sir David Cox, "Comment on ‘Model uncertainty, data mining and statistical inference’", Journal of the Royal Statistical Society, Series A 158, 1995)

"In our analysis of complex systems (like the brain and language) we must avoid the trap of trying to find master keys. Because of the mechanisms by which complex systems structure themselves, single principles provide inadequate descriptions. We should rather be sensitive to complex and self-organizing interactions and appreciate the play of patterns that perpetually transforms the system itself as well as the environment in which it operates." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems" , 1998)

"Analysis of a system reveals its structure and how it works. It provides the knowledge required to make it work efficiently and to repair it when it stops working. Its product is know-how, knowledge, not understanding. To enable a system to perform effectively we must understand it - we must be able to explain its behavior - and this requires being aware of its functions in the larger systems of which it is a part." (Russell L Ackoff, "Re-Creating the Corporation", 1999

"Conventional mathematics and control theory exclude vagueness and contradictory conditions. As a consequence, conventional control systems theory does not attempt to study any formulation, analysis, and control of what has been called fuzzy systems, which may be vague, incomplete, linguistically described, or even inconsistent." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001))

"One of the factors that distinguishes engineering from science is that the engineer builds complex systems from simple bits, whereas the scientist breaks complex systems into hopefully comprehensible components. The first is called understanding by synthesis and the second is understanding by analysis." (Igor Aleksander, "How to Build a Mind: toward machines with imagination", 2001)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Systems thinking, in contrast, focuses on how the thing being studied interacts with the other constituents of the system - a set of elements that interact to produce behaviour - of which it is a part. This means that instead of isolating smaller and smaller parts of the system being studied, systems thinking works by expanding its view to take into account larger and larger numbers of interactions as an issue is being studied. This results in sometimes strikingly different conclusions than those generated by traditional forms of analysis, especially when what is being studied is dynamically complex or has a great deal of feedback from other sources, internal or external. Systems thinking allows people to make their understanding of social systems explicit and improve them in the same way that people can use engineering principles to make explicit and improve their understanding of mechanical systems." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics" [in "Handbook of Research on Discrete Event Simulation Environments: Technologies and Applications"], 2010)

"Understanding interdependency requires a way of thinking different from analysis. It requires systems thinking. And analytical thinking and systems thinking are quite distinct. [...] Systems thinking is the art of simplifying complexity. It is about seeing through chaos, managing interdependency, and understanding choice. We see the world as increasingly more complex and chaotic because we use inadequate concepts to explain it. When we understand something, we no longer see it as chaotic or complex." (Jamshid Gharajedaghi, "Systems Thinking: Managing Chaos and Complexity A Platform for Designing Business Architecture", 2011) 

"Catastrophe theory can be thought of as a link between classical analysis, dynamical systems, differential topology (including singularity theory), modern bifurcation theory and the theory of complex systems. [...] The name ‘catastrophe theory’ is used for a combination of singularity theory and its applications. [...] From the didactical point of view, there are two main positions for courses in catastrophe theory at university level: Trying to teach the theory as a perfect axiomatic system consisting of exact definitions, theorems and proofs or trying to teach mathematics as it can be developed from historical or from natural problems." (Werner Sanns, "Catastrophe Theory" [Mathematics of Complexity and Dynamical Systems, 2012]

"A model is a simplified representation of a system. It can be conceptual, verbal, diagrammatic, physical, or formal (mathematical)." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"A network (or graph) consists of a set of nodes" (or vertices, actors) and a set of edges" (or links, ties) that connect those nodes. [...] The size of a network is characterized by the numbers of nodes and edges in it." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"Chaos can be understood as a dynamical process in which microscopic information hidden in the details of a system’s state is dug out and expanded to a macroscopically visible scale" (stretching), while the macroscopic information visible in the current system’s state is continuously discarded" (folding)." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"Complex systems are networks made of a number of components that interact with each other, typically in a nonlinear fashion. Complex systems may arise and evolve through self-organization, such that they are neither completely regular nor completely random, permitting the development of emergent behavior at macroscopic scales." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"Dynamics of a linear system are decomposable into multiple independent one-dimensional exponential dynamics, each of which takes place along the direction given by an eigenvector. A general trajectory from an arbitrary initial condition can be obtained by a simple linear superposition of those independent dynamics." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"Emergence is a nontrivial relationship between the properties of a system at microscopic and macroscopic scales. Macroscopic properties are called emergent when it is hard to explain them simply from microscopic properties." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015

"Self-organization is a dynamical process by which a system spontaneously forms nontrivial macroscopic structures and/or behaviors over time." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"The work around the complex systems map supported a concentration on causal mechanisms. This enabled poor system responses to be diagnosed as the unanticipated effects of previous policies as well as identification of the drivers of the sector. Understanding the feedback mechanisms in play then allowed experimentation with possible future policies and the creation of a coherent and mutually supporting package of recommendations for change. " (David C Lane et al, "Blending systems thinking approaches for organisational analysis: reviewing child protection", 2015)

19 December 2025

❄️Systems Thinking: On Dimensions (Quotes)

"A fairly general procedure for mathematical study of a physical system with explication of the space of states of that system. Now this space of states could reasonably be one of a number of mathematical objects. However, in my mind, a principal candidate for the state space should be a differentiable manifold; and in case the has a finite number of degrees of freedom, then this will be a finite dimensional manifold. Usually associated with physical is the notion of how a state progresses in time. The corresponding object is a dynamical system or a first order ordinary differential equation on the manifold of states." (Stephen Smale, "Personal perspectives on mathematics and mechanics", 1971)

"The structure of a system is the arrangement of its subsystems and components in three-dimensional space at a given moment of time. This always changes over time. It may remain relatively fixed for a long period or it may change from moment to moment, depending upon the characteristics of the process in the system. This process halted at any given moment, as when motion is frozen by a high-speed photograph, reveals the three-dimensional spatial arrangement of the system's components as of that instant." (James G Miller, "Living systems", 1978)

"Cellular automata are discrete dynamical systems with simple construction but complex self-organizing behaviour. Evidence is presented that all one-dimensional cellular automata fall into four distinct universality classes. Characterizations of the structures generated in these classes are discussed. Three classes exhibit behaviour analogous to limit points, limit cycles and chaotic attractors. The fourth class is probably capable of universal computation, so that properties of its infinite time behaviour are undecidable." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"Linking topology and dynamical systems is the possibility of using a shape to help visualize the whole range of behaviors of a system. For a simple system, the shape might be some kind of curved surface; for a complicated system, a manifold of many dimensions. A single point on such a surface represents the state of a system at an instant frozen in time. As a system progresses through time, the point moves, tracing an orbit across this surface. Bending the shape a little corresponds to changing the system's parameters, making a fluid more visous or driving a pendulum a little harder. Shapes that look roughly the same give roughly the same kinds of behavior. If you can visualize the shape, you can understand the system." (James Gleick, "Chaos: Making a New Science", 1987)

"[…] physicists have come to appreciate a fourth kind of temporal behavior: deterministic chaos, which is aperiodic, just like random noise, but distinct from the latter because it is the result of deterministic equations. In dynamic systems such chaos is often characterized by small fractal dimensions because a chaotic process in phase space typically fills only a small part of the entire, energetically available space." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder" (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"What is an attractor? It is the set on which the point P, representing the system of interest, is moving at large times (i.e., after so-called transients have died out). For this definition to make sense it is important that the external forces acting on the system be time independent" (otherwise we could get the point P to move in any way we like). It is also important that we consider dissipative systems (viscous fluids dissipate energy by self-friction). Dissipation is the reason why transients die out. Dissipation is the reason why, in the infinite-dimensional space representing the system, only a small set" (the attractor) is really interesting." (David Ruelle, "Chance and Chaos", 1991)

"An attractor that consists of an infinite number of curves, surfaces, or higher-dimensional manifolds - generalizations of surfaces to multidimensional space - often occurring in parallel sets, with a gap between any two members of the set, is called a strange attractor." (Edward N Lorenz, "The Essence of Chaos", 1993)

"As with subtle bifurcations, catastrophes also involve a control parameter. When the value of that parameter is below a bifurcation point, the system is dominated by one attractor. When the value of that parameter is above the bifurcation point, another attractor dominates. Thus the fundamental characteristic of a catastrophe is the sudden disappearance of one attractor and its basin, combined with the dominant emergence of another attractor. Any type of attractor static, periodic, or chaotic can be involved in this. Elementary catastrophe theory involves static attractors, such as points. Because multidimensional surfaces can also attract" (together with attracting points on these surfaces), we refer to them more generally as attracting hypersurfaces, limit sets, or simply attractors." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"The dimensionality and nonlinearity requirements of chaos do not guarantee its appearance. At best, these conditions allow it to occur, and even then under limited conditions relating to particular parameter values. But this does not imply that chaos is rare in the real world. Indeed, discoveries are being made constantly of either the clearly identifiable or arguably persuasive appearance of chaos. Most of these discoveries are being made with regard to physical systems, but the lack of similar discoveries involving human behavior is almost certainly due to the still developing nature of nonlinear analyses in the social sciences rather than the absence of chaos in the human setting. " (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear, interconnected, and information on the system is uncertain such that classical techniques can not easily handle the problem." (M Jamshidi, "Autonomous Control on Complex Systems: Robotic Applications", Current Advances in Mechanical Design and Production VII, 2000)

"Science reveals complexity unfolding in all dimensions and novel features emerging at all scales and organizational levels of the universe. The more we know the more we become aware of how much we do not know. […] Complexity itself is understood as a particular dynamic or 'movement' in time that is simultaneously stable and unstable, predictable and unpredictable, known and unknown, certain and uncertain." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"In a chaotic system, there must be stretching to cause the exponential separation of initial conditions but also folding to keep the trajectories from moving off to infinity. The folding requires that the equations of motion contain at least one nonlinearity, leading to the important principle that chaos is a property unique to nonlinear dynamical systems. If a system of equations has only linear terms, it cannot exhibit chaos no matter how complicated or high-dimensional it may be." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"Strange attractors, unlike regular ones, are geometrically very complicated, as revealed by the evolution of a small phase-space volume. For instance, if the attractor is a limit cycle, a small two-dimensional volume does not change too much its shape: in a direction it maintains its size, while in the other it shrinks till becoming a 'very thin strand' with an almost constant length. In chaotic systems, instead, the dynamics continuously stretches and folds an initial small volume transforming it into a thinner and thinner 'ribbon' with an exponentially increasing length." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"Systems with dimension greater than four begin to lose their elegance unless they possess some kind of symmetry that reduces the number of parameters. One such symmetry has the variables arranged in a ring of many identical elements, each connected to its neighbors in an identical fashion. The symmetry of the equations is often broken in the solutions, giving rise to spatiotemporal chaotic patterns that are elegant in their own right." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"Dynamics of a linear system are decomposable into multiple independent one-dimensional exponential dynamics, each of which takes place along the direction given by an eigenvector. A general trajectory from an arbitrary initial condition can be obtained by a simple linear superposition of those independent dynamics." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

13 December 2025

❄️Systems Thinking: On Self-Similarity (Quotes)

 "[…] a pink (or white, or brown) noise is the very paradigm of a statistically self-similar process. Phenomena whose power spectra are homogeneous power functions lack inherent time and frequency scales; they are scale-free. There is no characteristic time or frequency -whatever happens in one time or frequency range happens on all time or frequency scales. If such noises are recorded on magnetic tape and played back at various speeds, they sound the same […]" (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"All physical objects that are 'self-similar' have limited self-similarity - just as there are no perfectly periodic functions, in the mathematical sense, in the real world: most oscillations have a beginning and an end (with the possible exception of our universe, if it is closed and begins a new life cycle after every 'big crunch' […]. Nevertheless, self-similarity is a useful  abstraction, just as periodicity is one of the most useful concepts in the sciences, any finite extent notwithstanding." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Percolation is a widespread paradigm. Percolation theory can therefore illuminate a great many seemingly diverse situations. Because of its basically geometric character, it facilitates the analysis of intricate patterns and textures without needless physical complications. And the self-similarity that prevails at critical points permits profitably mining the connection with scaling and fractals." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"[…] power laws, with integer or fractional exponents, are one of the most fertile fields and abundant sources of self-similarity." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"The only prerequisite for a self-similar law to prevail in a given size range is the absence of an inherent size scale." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"The unifying concept underlying fractals, chaos, and power laws is self-similarity. Self-similarity, or invariance against changes in scale or size, is an attribute of many laws of nature and innumerable phenomena in the world around us. Self-similarity is, in fact, one of the decisive symmetries that shape our universe and our efforts to comprehend it." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"[…] the world is not complete chaos. Strange attractors often do have structure: like the Sierpinski gasket, they are self-similar or approximately so. And they have fractal dimensions that hold important clues for our attempts to understand chaotic systems such as the weather." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic" (that is fixed) rules" (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order" (a pattern) within disorder" (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"What is renormalization? First of all, if scaling is present we can go to smaller scales and get exactly the same result. In a sense we are looking at the system with a microscope of increasing power. If you take the limit of such a process you get a stability that is not otherwise present. In short, in the renormalized system, the self-similarity is exact, not approximate as it usually is. So renormalization gives stability and exactness." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder. " (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"The self-similarity on different scales arises because growth often involves iteration of simple, discrete processes (e.g. branching). These repetitive processes can often be summarized as sets of simple rules." (David G Green, 2000)

"Fractals are self-similar objects. However, not every self-similar object is a fractal, with a scale-free form distribution. If we put identical cubes on top of each other, we get a self-similar object. However, this object will not have scale-free statistics: since it has only one measure of rectangular forms, it is single-scaled. We need a growing number of smaller and smaller self-similar objects to satisfy the scale-free distribution." (Péter Csermely, "Weak Links: The Universal Key to the Stabilityof Networks and Complex Systems", 2009)

"The concept of bifurcation, present in the context of non-linear dynamic systems and theory of chaos, refers to the transition between two dynamic modalities qualitatively distinct; both of them are exhibited by the same dynamic system, and the transition (bifurcation) is promoted by the change in value of a relevant numeric parameter of such system. Such parameter is named 'bifurcation parameter', and in highly non-linear dynamic systems, its change can produce a large number of bifurcations between distinct dynamic modalities, with self-similarity and fractal structure. In many of these systems, we have a cascade of numberless bifurcations, culminating with the production of chaotic dynamics." (Emilio Del-Moral-Hernandez, "Chaotic Neural Networks", Encyclopedia of Artificial Intelligence, 2009)

"In the telephone system a century ago, messages dispersed across the network in a pattern that mathematicians associate with randomness. But in the last decade, the flow of bits has become statistically more similar to the patterns found in self-organized systems. For one thing, the global network exhibits self-similarity, also known as a fractal pattern. We see this kind of fractal pattern in the way the jagged outline of tree branches look similar no matter whether we look at them up close or far away. Today messages disperse through the global telecommunications system in the fractal pattern of self-organization." (Kevin Kelly, "What Technology Wants", 2010)

"Cyberneticists argue that positive feedback may be useful, but it is inherently unstable, capable of causing loss of control and runaway. A higher level of control must therefore be imposed upon any positive feedback mechanism: self-stabilising properties of a negative feedback loop constrain the explosive tendencies of positive feedback. This is the starting point of our journey to explore the role of cybernetics in the control of biological growth. That is the assumption that the evolution of self-limitation has been an absolute necessity for life forms with exponential growth." (Tony Stebbing, "A Cybernetic View of Biological Growth: The Maia Hypothesis", 2011)

"Laws of complexity hold universally across hierarchical scales (scalar, self-similarity) and are not influenced by the detailed behavior of constituent parts." (Jamshid Gharajedaghi, "Systems Thinking: Managing Chaos and Complexity A Platform for Designing Business Architecture" 3rd Ed., 2011)

"Fractals are different from chaos. Fractals are self-similar geometric objects, while chaos is a type of deterministic yet unpredictable dynamical behavior. Nevertheless, the two ideas or areas of study have several interesting and important links. Fractal objects at first blush seem intricate and complex. However, they are often the product of very simple dynamical systems. So the two areas of study - chaos and fractals - are naturally paired, even though they are distinct concepts." (David P Feldman,"Chaos and Fractals: An Elementary Introduction", 2012)

"The study of chaos shows that simple systems can exhibit complex and unpredictable behavior. This realization both suggests limits on our ability to predict certain phenomena and that complex behavior may have a simple explanation. Fractals give scientists a simple and concise way to qualitatively and quantitatively understand self-similar objects or phenomena. More generally, the study of chaos and fractals hold many fun surprises; it challenges one’s intuition about simplicity and complexity, order and disorder." (David P Feldman,"Chaos and Fractals: An Elementary Introduction", 2012

"Chaos theory is a branch of mathematics focusing on the study of chaos - dynamical systems whose random states of disorder and irregularities are governed by underlying patterns and deterministic laws that are highly sensitive to initial conditions. Chaos theory is an interdisciplinary theory stating that, within the apparent randomness of complex, chaotic systems, there are underlying patterns, interconnectedness, constant feedback loops, repetition, self-similarity, fractals, and self-organization. The butterfly effect, an underlying principle of chaos, describes how a small change in one state of a deterministic nonlinear system can result in large differences in a later state (meaning that there is a sensitive dependence on initial conditions)." (Nima Norouzi, "Criminal Policy, Security, and Justice in the Time of COVID-19", 2022)

06 December 2025

❄️Systems Thinking: On Thresholds (Quotes)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25" (11), 1972)

"As the complexity of a system increases, our ability to make precise and yet significant statements about its behavior diminishes until a threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics." (Lotfi A Zadeh, 1973)

"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature. Thus, with the rapid increase in the complexity of the information processing tasks which the computers are called upon to perform, we are reaching a point where computers will have to be designed for processing of information in fuzzy form. In fact, it is the capability to manipulate fuzzy concepts that distinguishes human intelligence from the machine intelligence of current generation computers. Without such capability we cannot build machines that can summarize written text, translate well from one natural language to another, or perform many other tasks that humans can do with ease because of their ability to manipulate fuzzy concepts." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)

"In the realms of nature it is impossible to predict which way a bifurcation will cut. The outcome of a bifurcation is determined neither by the past history of a system nor by its environment, but only by the interplay of more or less random fluctuations in the chaos of critical destabilization. One or another of the fluctuations that rock such a system will suddenly 'nucleate'. The nucleating fluctuation will amplify with great rapidity and spread to the rest of the system. In a surprisingly short time, it dominates the system’s dynamics. The new order that is then born from the womb of chaos reflects the structural and functional characteristics of the nucleated fluctuation. [...] Bifurcations are more visible, more frequent, and more dramatic when the systems that exhibit them are close to their thresholds of stability - when they are all but choked out of existence." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"When a system is 'stressed' beyond certain threshold limits as, for example, when it is heated up, or its pressure is increased, it shifts from one set of attractors to another and then behaves differently. To use the language of the theory, the system 'settles into a new dynamic regime'. It is at the point of transition that a bifurcation takes place. The system no longer follows the trajectory of its initial attractors, but responds to new attractors that make the system appear to be behaving randomly. It is not behaving randomly, however, and this is the big shift in our understanding caused by dynamical systems theory. It is merely responding to a new set of attractors that give it a more complex trajectory. The term bifurcation, in its most significant sense, refers to the transition of a system from the dynamic regime of one set of attractors, generally more stable and simpler ones, to the dynamic regime of a set of more complex and 'chaotic' attractors." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"For any given population of susceptibles, there is some critical combination of contact frequency, infectivity, and disease duration just great enough for the positive loop to dominate the negative loops. That threshold is known as the tipping point. Below the tipping point, the system is stable: if the disease is introduced into the community, there may be a few new cases, but on average, people will recover faster than new cases are generated. Negative feedback dominates and the population is resistant to an epidemic. Past the tipping point, the positive loop dominates .The system is unstable and once a disease arrives, it can spread like wildfire that is, by positive feedback-limited only by the depletion of the susceptible population." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"In the case of a complex system, nonlinear behavior can happen as disturbances or changes in the system, each one relatively small by itself, accumulate. Outwardly, everything seems to be normal: the system doesn’t generate any surprises. At some point, though, the behavior of the whole system suddenly shifts to a radically new mode. This kind of behavior is often called a threshold effect, because the shift occurs when a critical threshold - usually unseen and often unexpected - is crossed." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Creative elements add random elements to network behavior, inducing an increase in noise. This is highly beneficial to a certain extent as we saw in the previous box, but becomes intolerable if it exceeds a certain threshold. This threshold is high if the hosting network lives an individual life and often meets unexpected situations. However, the same threshold becomes low if the hosting network is part of a higher level organization which provides a stable environment." (Péter Csermely, "Weak Links: The Universal Key to the Stabilityof Networks and Complex Systems", 2009)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary" (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Flaws can be found in any research design if you look hard enough. […] In our experience, it is good scientific practice to refine one's research hypotheses in light of the data. Working scientists are also keenly aware of the risks of data dredging, and they use confidence intervals and p-values as a tool to avoid getting fooled by noise. Unfortunately, a by-product of all this struggle and care is that when a statistically significant pattern does show up, it is natural to get excited and believe it. The very fact that scientists generally don't cheat, generally don't go fishing for statistical significance, makes them vulnerable to drawing strong conclusions when they encounter a pattern that is robust enough to cross the p < 0.05 threshold." (Andrew Gelman & Eric Loken, "The Statistical Crisis in Science", American Scientist Vol. 102(6), 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

29 November 2025

❄️Systems Thinking: On Shocks (Quotes)

"Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"If you want a system - economic, social, political, or otherwise - to operate at a high level of efficiency, then you have to optimize its operation in such a way that its resilience is dramatically reduced to unknown - and possibly unknowable - shocks and/or changes in its operating environment. In other words, there is an inescapable price to be paid in efficiency in order to gain the benefits of adaptability and survivability in a highly uncertain environment. There is no escape clause!" (John L Casti, "X-Events: The Collapse of Everything", 2012)

"Stability is often defined as a resilient system that keeps processing transactions, even if transient impulses" (rapid shocks to the system), persistent stresses" (force applied to the system over an extended period), or component failures disrupt normal processing." (Michael Hüttermann et al, "DevOps for Developers", 2013)

"Because we hate living with uncertainty, we often try to make complex systems, such as the economy, more predictable and less uncertain. However, making a complex system more predictable also makes it less resilient to shocks, and less creative. Lowering uncertainty reduces complexity, often with disastrous effects." (Edgar E Peters, "Patterns in the dark: understanding risk and financial crisis with complexity theory", 1999)

"Complex systems, then, have local uncertainty and global certainty. They generate change, and they are resilient to unexpected shocks. They turn uncertainty into order, and they reverse order back into uncertainty. They evolve and change through time, and they do so without a central planner. Complex systems are everywhere. In fact, real life is one huge complex system. How is such behavior possible? First, we need to understand the general class of complex systems. By understanding their nature, we will see the important role uncertainty plays in maintaining stability. When we understand natural systems, we will understand the role of uncertainty in a free society." (Edgar E Peters, "Patterns in the dark: understanding risk and financial crisis with complexity theory", 1999)

"In a free-market economy, then, uncertainty is a necessary element. Only when the economy is in a state of uncertainty can the participants efficiently search for solutions to problems and find creative answers. In addition, only a system that depends on uncertainty can survive unexpected shocks. A complex process can take multiple paths to an optimal solution. It does not require 'ideal' conditions; in fact, shocks often force it to find a better solution, a higher hill in the fitness landscape. The 'creative destruction' identified by the Austrian school suggests that a free-market economy is not only resilient to shocks, but is also creative and capable of generating innovation. It can only do so while in a high state of uncertainty." (Edgar E Peters, "Patterns in the dark: understanding risk and financial crisis with complexity theory", 1999)

"It seems obvious that uncertainty reigns during times of crisis. However, crisis itself can be a positive development. It is only negative in that it specifies that change is coming. Most people are uncomfortable with change and equate it with hard times. Mainstream economics tends to take the same view, calling such events 'shocks'. They are even referred to as 'exogenous' - outside of the system. If it were not for change, according to the mainstream school, everything would continue along in perfect balance, a 'circular flow'. Change is like an alien invasion. The mainstream view ignores the fact that change is necessary." (Edgar E Peters, "Patterns in the dark: understanding risk and financial crisis with complexity theory", 1999)

"Uncertainty is not necessarily bad or synonymous with risk. Complex systems use uncertainty to their advantage as they adapt to changes in their environment and learn to be resilient to unexpected shocks. Uncertainty then, rather than being the source of so many problems, becomes a necessary element if a market and a society are to remain free." (Edgar E Peters, "Patterns in the dark: understanding risk and financial crisis with complexity theory", 1999) 

22 November 2025

❄️Systems Thinking: On Paradoxes (Quotes)

"Every process, event, happening - call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy - or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"Scaling invariance results from the fact that homogeneous power laws lack natural scales; they do not harbor a characteristic unit (such as a unit length, a unit time, or a unit mass). Such laws are therefore also said to be scale-free or, somewhat paradoxically, 'true on all scales'. Of course, this is strictly true only for our mathematical models. A real spring will not expand linearly on all scales; it will eventually break, at some characteristic dilation length. And even Newton's law of gravitation, once properly quantized, will no doubt sprout a characteristic length." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"There is a new science of complexity which says that the link between cause and effect is increasingly difficult to trace; that change" (planned or otherwise) unfolds in non-linear ways; that paradoxes and contradictions abound; and that creative solutions arise out of diversity, uncertainty and chaos." (Andy P Hargreaves & Michael Fullan, "What’s Worth Fighting for Out There?", 1998)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization" (and an apparent reduction in entropy), and the micro level" (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"A sudden change in the evolutive dynamics of a system" (a ‘surprise’) can emerge, apparently violating a symmetrical law that was formulated by making a reduction on some" (or many) finite sequences of numerical data. This is the crucial point. As we have said on a number of occasions, complexity emerges as a breakdown of symmetry" (a system that, by evolving with continuity, suddenly passes from one attractor to another) in laws which, expressed in mathematical form, are symmetrical. Nonetheless, this breakdown happens. It is the surprise, the paradox, a sort of butterfly effect that can highlight small differences between numbers that are very close to one another in the continuum of real numbers; differences that may evade the experimental interpretation of data, but that may increasingly amplify in the system’s dynamics." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003)

"Chaos theory, for example, uses the metaphor of the ‘butterfly effect’. At critical times in the formation of Earth’s weather, even the fluttering of the wings of a butterfly sends ripples that can tip the balance of forces and set off a powerful storm. Even the smallest inanimate objects sent back into the past will inevitably change the past in unpredictable ways, resulting in a time paradox." (Michio Kaku, "Parallel Worlds: A journey through creation, higher dimensions, and the future of the cosmos", 2004)

"Network stability may be a key element in the development of multilevel, nested networks. The formation of nested networks obviously requires at least a few contacts between the bottom networks. However, evolutionary selection requires the independence and at least temporary isolation of the bottom networks themselves. Weak links are probably the only tools for solving this apparent paradox." (Péter Csermely, "Weak Links: The Universal Key to the Stabilityof Networks and Complex Systems", 2009)

"The paradox of artificial intelligence is that any system simple enough to be understandable is not complicated enough to behave intelligently, and any system complicated enough to behave intelligently is not simple enough to understand. The path to artificial intelligence, suggested Turing, is to construct a machine with the curiosity of a child, and let intelligence evolve." (George B Dyson, "Turing's Cathedral: The Origins of the Digital Universe", 2012)

"Political structures are excessively paternalistic, and to maintain them requires a high level of energy. The massive amounts of energy they consume are unsustainable and invite political meltdowns, bailouts, and fallout. On the other hand, proponents of complexity theory take the paradigm–shattering view that less is more. They understand that, paradoxically enough, the complexity of simplicity is the key to the emergence of systems, repeatable patterns and the social glue that holds community together and creates order. Anyone can make simplicity complicated; it takes a true genius to make the complicated simple." (Lawrence K Samuels, "Defense of Chaos", 2013)

"The principles of equifinality, multifinality, homeostasis and heterostasis have farstretching implications for the application of deductive reasoning The paradox of equifinality and multifinality means that when observing the behaviour of a system, it might be moving towards a final state irrespective of the initial state or moving away from an initial state without being able to predict the final outcome." (Rob Dekkers, "Applied Systems Theory", 2014)

"The basis of complex systems is actually quite simple (and this is not an attempt to be paradoxical, like an art critic who describes a sculpture as 'big yet small'. What makes a system unpredictable and thus nonlinear (which includes you and your perceptual process, or the process of making collective decisions) is that the components making up the system are interconnected." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

15 November 2025

❄️Systems Thinking: On Hubs (Quotes)

"There are a variety of swarm topologies, but the only organization that holds a genuine plurality of shapes is the grand mesh. In fact, a plurality of truly divergent components can only remain coherent in a network. No other arrangement-chain, pyramid, tree, circle, hub-can contain true diversity working as a whole. This is why the network is nearly synonymous with democracy or the market." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"In a random network the peak of the distribution implies that the vast majority of nodes have the same number of links and that nodes deviating from the average are extremely rare. Therefore, a random network has a characteristic scale in its node connectivity, embodied by the average node and fixed by the peak of the degree distribution. In contrast, the absence of a peak in a power-law degree distribution implies that in a real network there is no such thing as a characteristic node. We see a continuous hierarchy of nodes, spanning from rare hubs to the numerous tiny nodes. The largest hub is closely fol - lowed by two or three somewhat smaller hubs, followed by dozens that are even smaller, and so on, eventually arriving at the numerous small nodes." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"In networks belonging to the second category, the winner takes all, meaning that the fittest node grabs all links, leaving very little for the rest of the nodes. Such networks develop a star topology, in which all nodes are connected to a central hub. In such a hub-and-spokes network there is a huge gap between the lonely hub and everybody else in the system. Thus a winner-takes-all network is very different from the scale-free networks we encountered earlier, where there is a hierarchy of hubs whose size distribution follows a power law. A winner-takes-all network is not scale-free. Instead there is a single hub and many tiny nodes. This is a very important distinction." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"The first category includes all networks in which, despite the fierce competition for links, the scale-free topology survives. These networks display a fit-get-rich behavior, meaning that the fittest node will inevitably grow to become the biggest hub. The winner's lead is never significant, however. The largest hub is closely followed by a smaller one, which acquires almost as many links as the fittest node. At any moment we have a hierarchy of nodes whose degree distribution follows a power law. In most complex networks, the power law and the fight for links thus are not antagonistic but can coexist peacefully."(Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"[…] it is useful to note that there are three basic network topologies. First, there are line or chain networks with many nodes that are spread out in more or less linear fashion. Second, there are star or hub networks, where most important relationships move through a central hub or hubs. Third, there are all-channel networks, in which communications proceed in more or less all directions across the network simultaneously […]." (John Urry, "Global Complexity", 2003)

"In a random network the loss of a small number of nodes can cause the overall network to become incoherent - that is, to break into disconnected subnetworks. In a scale-free network, such an event usually won’t disrupt the overall network because most nodes don’t have many links. But there’s a big caveat to this general principle: if a scale-free network loses a hub, it can be disastrous, because many other nodes depend on thaot hub." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Scale-free networks are particularly vulnerable to intentional attack: if someone wants to wreck the whole network, he simply needs to identify and destroy some of its hubs. And here we see how our world’s increasing connectivity really matters. Scientists have found that as a scale-free network like the Internet or our food-distribution system grows- as it adds more nodes - the new nodes tend to hook up with already highly connected hubs." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"The scale-free distribution pattern has been most studied on the degree distribution of networks. What is a degree? The degree of a network. element is the number of connections it has. A scale-free degree distribution means that the network has a large number of elements with very few neighbors, but it has a non-zero number of elements with an extraordinarily large number of neighbors. These connection-rich elements are called hubs. If an element has just a few connections, it is often called a node." (Péter Csermely, "Weak Links: The Universal Key to the Stabilityof Networks and Complex Systems", 2009)

08 November 2025

❄️Systems Thinking: On Trajectories (Quotes)

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions." (Edward N Lorenz, Deterministic Nonperiodic Flow", Journal of the Atmospheric Science 20, 1963)

"Cellular automata may be considered as discrete dynamical systems. In almost all cases, cellular automaton evolution is irreversible. Trajectories in the configuration space for cellular automata therefore merge with time, and after many time steps, trajectories starting from almost all initial states become concentrated onto 'attractors'. These attractors typically contain only a very small fraction of possible states. Evolution to attractors from arbitrary initial states allows for 'self-organizing' behaviour, in which structure may evolve at large times from structureless initial states. The nature of the attractors determines the form and extent of such structures." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Regarding stability, the state trajectories of a system tend to equilibrium. In the simplest case they converge to one point (or different points from different initial states), more commonly to one (or several, according to initial state) fixed point or limit cycle(s) or even torus(es) of characteristic equilibrial behaviour. All this is, in a rigorous sense, contingent upon describing a potential, as a special summation of the multitude of forces acting upon the state in question, and finding the fixed points, cycles, etc., to be minima of the potential function. It is often more convenient to use the equivalent jargon of 'attractors' so that the state of a system is 'attracted' to an equilibrial behaviour. In any case, once in equilibrial conditions, the system returns to its limit, equilibrial behaviour after small, arbitrary, and random perturbations." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Systems, acting dynamically, produce (and incidentally, reproduce) their own boundaries, as structures which are complementary (necessarily so) to their motion and dynamics. They are liable, for all that, to instabilities chaos, as commonly interpreted of chaotic form, where nowadays, is remote from the random. Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain." (Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Chaos has three fundamental characteristics. They are (a) irregular periodicity, (b) sensitivity to initial conditions, and (c) a lack of predictability. These characteristics interact within any one chaotic setting to produce highly complex nonlinear variable trajectories." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"In classical catastrophe theory, the various attracting static hypersurfaces are actually connected. However, there are portions of the overall surface that are unstable, and thus repelling. Thus nearby trajectories tend to 'fly' quickly past these unstable regions as they move from one stable area to another. It is this relatively rapid snapping movement that is typical of nearly all catastrophe phenomena." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Small changes in the initial conditions in a chaotic system produce dramatically different evolutionary histories. It is because of this sensitivity to initial conditions that chaotic systems are inherently unpredictable. To predict a future state of a system, one has to be able to rely on numerical calculations and initial measurements of the state variables. Yet slight errors in measurement combined with extremely small computational errors (from roundoff or truncation) make prediction impossible from a practical perspective. Moreover, small initial errors in prediction grow exponentially in chaotic systems as the trajectories evolve. Thus, theoretically, prediction may be possible with some chaotic processes if one is interested only in the movement between two relatively close points on a trajectory. When longer time intervals are involved, the situation becomes hopeless.(Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"A typical control goal when controlling chaotic systems is to transform a chaotic trajectory into a periodic one. In terms of control theory it means stabilization of an unstable periodic orbit or equilibrium. A specific feature of this problem is the possibility of achieving the goal by means of an arbitrarily small control action. Other control goals like synchronization and chaotization can also be achieved by small control in many cases." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"Chaotic system is a deterministic dynamical system exhibiting irregular, seemingly random behavior. Two trajectories of a chaotic system starting close to each other will diverge after some time (such an unstable behavior is often called 'sensitive dependence on initial conditions'). Mathematically, chaotic systems are characterized by local instability and global boundedness of the trajectories. Since local instability of a linear system implies unboundedness (infinite growth) of its solutions, chaotic system should be necessarily nonlinear, i.e., should be described by a nonlinear mathematical model." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"Systematic usage of the methods of modern control theory to study physical systems is a key feature of a new research area in physics that may be called cybernetical physics. The subject of cybernetical physics is focused on studying physical systems by means of feedback interactions with the environment. Its methodology heavily relies on the design methods developed in cybernetics. However, the approach of cybernetical physics differs from the conventional use of feedback in control applications (e.g., robotics, mechatronics) aimed mainly at driving a system to a prespecified position or a given trajectory." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions" (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"In chaotic deterministic systems, the probabilistic description is not linked to the number of degrees of freedom (which can be just one as for the logistic map) but stems from the intrinsic erraticism of chaotic trajectories and the exponential amplification of small uncertainties, reducing the control on the system behavior." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"A limit cycle is an isolated closed trajectory. Isolated means that neighboring trajectories are not closed; they spiral either toward or away from the limit cycle. If all neighboring trajectories approach the limit cycle, we say the limit cycle is stable or attracting. Otherwise the limit cycle is unstable, or in exceptional cases, half-stable. Stable limit cycles are very important scientifically - they model systems that exhibit self-sustained oscillations. In other words, these systems oscillate even in the absence of external periodic forcing." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"Dynamics of a linear system are decomposable into multiple independent one-dimensional exponential dynamics, each of which takes place along the direction given by an eigenvector. A general trajectory from an arbitrary initial condition can be obtained by a simple linear superposition of those independent dynamics." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)
