23 September 2025

❄️Systems Thinking: On Expansion (Quotes)

"From that time, the universe has steadily become more complex and less reducible to a central control. With as much obstinacy as though it were human, it has insisted on expanding its parts; with as much elusiveness as though it were feminine, it has evaded the attempt to impose on it a single will. Modern science, like modern art, tends, in practice, to drop the dogma of organic unity. Some of the mediaeval habit of mind survives, but even that is said to be yielding before the daily evidence of increasing and extending complexity. The fault, then, was not in man, if he no longer looked at science or art as an organic whole or as the expression of unity. Unity turned itself into complexity, multiplicity, variety, and even contradiction." (Henry Adams, "Mont Saint Michel and Chartres", 1904)

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"For some years now the activity of the artist in our society has been trending more toward the function of the ecologist: one who deals with environmental relationships. Ecology is defined as the totality or pattern of relations between organisms and their environment. Thus the act of creation for the new artist is not so much the invention of new objects as the revelation of previously unrecognized relation- ships between existing phenomena, both physical and metaphysical. So we find that ecology is art in the most fundamental and pragmatic sense, expanding our apprehension of reality." (Gene Youngblood, "Expanded Cinema", 1970)

"In the Systems Age we tend to look at things as part of larger wholes rather than as wholes to be taken apart. This is the doctrine of expansionism. Expansionism brings with it the synthetic mode of thought much as reductionism brought with it." (Russell L Ackoff, "Redesigning the future", 1974)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra," The Web of Life: a new scientific understanding of living systems", 1996)

"Networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power, and culture. While the networking form of social organization has existed in other times and spaces, the new information technology paradigm provides the material basis for its pervasive expansion throughout the entire social structure." (Manuel Castells, "The Rise of the Network Society", 1996)

"A more extreme form of exponential growth was probably responsible for the start of the universe. Astronomer and physicists now generally accept the Big Bang theory, according to which the universe started at an unimaginably small size and then doubled in a split second 100 times, enough to make it the size of a small grapefruit. This period of 'inflation' or exponential growth then ended, and linear growth took over, with an expanding fireball creating the universe that we know today." (Richar Koch, "The Power Laws", 2000)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"This new model of development would be based clearly on the goal of sustainable human well-being. It would use measures of progress that clearly acknowledge this goal. It would acknowledge the importance of ecological sustainability, social fairness, and real economic efficiency. Ecological sustainability implies recognizing that natural and social capital are not infinitely substitutable for built and human capital, and that real biophysical limits exist to the expansion of the market economy." (Robert Costanza, "Toward a New Sustainable Economy", 2008)

"The self-organizing map is a subtype of artificial neural networks. It is trained using unsupervised learning to produce low dimensional representation of the training samples while preserving the topological properties of the input space. The self-organizing map is a single layer feed-forward network where the output syntaxes are arranged in low dimensional (usually 2D or 3D) grid. Each input is connected to all output neurons. Attached to every neuron there is a weight vector with the same dimensionality as the input vectors. The number of input dimensions is usually a lot higher than the output grid dimension. SOMs are mainly used for dimensionality reduction rather than expansion. (Larbi Esmahi et al, "Adaptive Neuro-Fuzzy Systems", 2009)

"[…] economics is a profession grounded in the belief that 'the economy' is a machine and a closed system. The more clearly that machine is understood, the more its variables are precisely measured, the more we will be able to manage and steer it as we choose, avoiding the frenetic expansions and sharp contractions. With better indicators would come better policy, and with better policy, states would be less likely to fall into depression and risk collapse." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)

21 September 2025

❄️Systems Thinking: On Reductionism (Quotes)

"Beauty had been born, not, as we so often conceive it nowadays, as an ideal of humanity, but as measure, as the reduction of the chaos of appearances to the precision of linear symbols. Symmetry, balance, harmonic division, mated and mensurated intervals - such were its abstract characteristics." (Herbert E Read, "Icon and Idea", 1955)

"Science is the reduction of the bewildering diversity of unique events to manageable uniformity within one of a number of symbol systems, and technology is the art of using these symbol systems so as to control and organize unique events. Scientific observation is always a viewing of things through the refracting medium of a symbol system, and technological praxis is always handling of things in ways that some symbol system has dictated. Education in science and technology is essentially education on the symbol level." (Aldous L Huxley, "Essay", Daedalus, 1962)

"Whereas traditional reductionism sought to find the commonality underlying diversity in reference to a shared substance, such as material atoms, contemporary systems theory seeks to find common features in terms of shared aspects of organization." (Ervin László, "The Systems View of the World: A Holistic Vision for Our Time", 1972)

"For any system the environment is always more complex than the system itself. No system can maintain itself by means of a point-for-point correlation with its environment, i.e., can summon enough 'requisite variety' to match its environment. So each one has to reduce environmental complexity - primarily by restricting the environment itself and perceiving it in a categorically preformed way. On the other hand, the difference of system and environment is a prerequisite for the reduction of complexity because reduction can be performed only within the system, both for the system itself and its environment." (Thomas Luckmann & Niklas Luhmann, "The Differentiation of Society", 1977)

"There is a strong current in contemporary culture advocating ‘holistic’ views as some sort of cure-all […] Reductionism implies attention to a lower level while holistic implies attention to higher level. These are intertwined in any satisfactory description: and each entails some loss relative to our cognitive preferences, as well as some gain [...] there is no whole system without an interconnection of its parts and there is no whole system without an environment." (Francisco Varela, "On being autonomous: The lessons of natural history for systems theory", 1977)

"Systems theory is antireductionist; it asserts that no system can be adequately understood or totally explained once it has been broken down into its component parts." (Charles Zastrow, "Introduction to Social Work and Social Welfare: Empowering People", 1993)

"In such systems, the whole is more than the sum of the parts, not in an ultimate, metaphysical sense, but in the important pragmatic sense that, given the properties of the parts and the laws of their interaction, it is not a trivial matter to infer the properties of the whole. In the face of complexity, an in-principle reductionist may be at the same time a pragmatic holist." (Charles François (Ed.) "International Encyclopedia of Cybernetics and Systems", 1997)

"[...] information feedback about the real world not only alters our decisions within the context of existing frames and decision rules but also feeds back to alter our mental models. As our mental models change we change the structure of our systems, creating different decision rules and new strategies. The same information, processed and interpreted by a different decision rule, now yields a different decision. Altering the structure of our systems then alters their patterns of behavior. The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view and then redesign our policies and institutions accordingly." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In particular, complexity examines how components of a system can through their dynamic interaction 'spontaneously' develop collective properties or patterns, such as colour, that do not seem implicit, or at least not implicit in the same way, within individual components.  Complexity investigates emergent properties, certain regularities of behaviour that somehow transcend the ingredients that make them up. Complexity argues against reductionism, against reducing the whole to the parts. And in so doing it transforms scientific understanding of far-from-equilibrium structures, of irreversible times and of non-Euclidean mobile spaces. It emphasizes how positive feedback loops can exacerbate initial stresses in the system and render it unable to absorb shocks to re-establish the original equilibrium. Positive feedback occurs when a change tendency is reinforced rather than dampened clown. Very strong interactions occur between the parts of such systems, with the absence of a central hierarchical structure that unambiguously' governs' and produces outcomes. These outcomes are to be seen as both uncertain and irreversible." (John Urry, "Global Complexity", 2003)

"There exists an alternative to reductionism for studying systems. This alternative is known as holism. Holism considers systems to be more than the sum of their parts. It is of course interested in the parts and particularly the networks of relationships between the parts, but primarily in terms of how they give rise to and sustain in existence the new entity that is the whole whether it be a river system, an automobile, a philosophical system or a quality system." (Michael C Jackson, "Systems Thinking: Creative Holism for Manager", 2003)

"In general [...] a dissipative structure may become unstable at a certain threshold and break down, enabling the emergence of a new structure. As the introduction of corresponding order parameters results from the elimination of a huge number of degrees of freedom, the emergence of dissipative order is combined with a drastic reduction of complexity." (Klaus Mainzer, "Thinking in Complexity: The Computational Dynamics of Matter, Mind, and Mankind", 2004)

"This reduction principle - the reduction of the behavior of a complex system to the behavior of its parts - is valid only if the level of complexity of the system is rather low." (Andrzej P Wierzbicki & Yoshiteru Nakamori, "Creative Space: Models of Creative Processes for the Knowledge Civilization Age", Studies in Computational Intelligence Vol.10, 2006)

"Mechanistic reductionism suggested that the universe, including life, were considered as 'mechanisms'. Consequently, understanding any system required the application of the mental strategy of engineering: the whole system should be reduced to its parts. Knowing the parts was thought to imply the complete understanding of the whole." (Péter Érdi, "Complexity Explained", 2008)

"Complex systems are not easily predictable, and the principles of reductionism do not bear fruit when laboring to understand them, as system behavior emerges on all levels of the system. Although they are not fully knowable, within reason there may be some prediction possible. (Andreas Tolk et al, "Epistemological Constraints When Evaluating Ontological Emergence with Computational Complex Adaptive Systems", [in Unifying Themes in Complex Systems IX, Eds. Alfredo J Morales et al], 2018)

"In principle, reduction works and reductive logic does not fail, despite practical difficulty conducting reduction in complicated contexts. Careful reductive analysis reveals even convoluted, non-linear, evolved, self-organized, hierarchically complex, emergent phenomena, such as consciousness, remain open to unyielding reduction. However, among the anomalies associated with reductive logic," (J Rowan Scott, "Descartes, Gödel and Kuhn: Epiphenomenalism Defines a Limit on Reductive Logic", [in Unifying Themes in Complex Systems IX, Eds. Alfredo J Morales et al], 2018)

"From a modeller’s perspective a dynamic hypothesis is a particularly important step of ‘complexity reduction’ - making sense of a messy situation in the real world. A feedback systems thinker has in mind a number of structure-behaviour pairs that give valuable clues or patterns to look for when explaining puzzling dynamics." (John Morecroft, "System Dynamics", [in "Systems Approaches to Making Change: A Practical Guide", Ed. Martin Reynolds & Sue Holwell] 2020)

19 September 2025

❄️Systems Thinking: On Phase Space (Quotes)

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions. (Edward N Lorenz, "Deterministic Nonperiodic Flow", Journal of the Atmospheric Science 20, 1963)

"The ‘eyes of the mind’ must be able to see in the phase space of mechanics, in the space of elementary events of probability theory, in the curved four-dimensional space-time of general relativity, in the complex infinite dimensional projective space of quantum theory. To comprehend what is visible to the ‘actual eyes’, we must understand that it is only the projection of an infinite dimensional world on the retina." (Yuri I Manin, "Mathematics and Physics", 1981)

"[…] physicists have come to appreciate a fourth kind of temporal behavior: deterministic chaos, which is aperiodic, just like random noise, but distinct from the latter because it is the result of deterministic equations. In dynamic systems such chaos is often characterized by small fractal dimensions because a chaotic process in phase space typically fills only a small part of the entire, energetically available space." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"When a system has more than one attractor, the points in phase space that are attracted to a particular attractor form the basin of attraction for that attractor. Each basin contains its attractor, but consists mostly of points that represent transient states. Two contiguous basins of attraction will be separated by a basin boundary." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"One of the reasons we deal with the pendulum is that it is easy to plot its motion in phase space. If the amplitude is small, it's a two-dimensional problem, so all we need to specify it completely is its position and its velocity. We can make a two-dimensional plot with one axis (the horizontal), position, and the other (the vertical), velocity." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing - seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing – seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"Roughly spoken, bifurcation theory describes the way in which dynamical system changes due to a small perturbation of the system-parameters. A qualitative change in the phase space of the dynamical system occurs at a bifurcation point, that means that the system is structural unstable against a small perturbation in the parameter space and the dynamic structure of the system has changed due to this slight variation in the parameter space." (Holger I Meinhardt, "Cooperative Decision Making in Common Pool Situations", 2012)

"The impossibility of predicting which point in phase space the trajectory of the Lorenz attractor will pass through at a certain time, even though the system is governed by deterministic equations, is a common feature of all chaotic systems. However, this does not mean that chaos theory is not capable of any predictions. We can still make very accurate predictions, but they concern the qualitative features of the system’s behavior rather than the precise values of its variables at a particular time. The new mathematics thus represents the shift from quantity to quality that is characteristic of systems thinking in general. Whereas conventional mathematics deals with quantities and formulas, nonlinear dynamics deals with qualities and patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

10 September 2025

❄️Systems Thinking: On Capacity (Quotes)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state, Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig on Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"By some definitions 'systems engineering' is suggested to be a new discovery. Actually it is a common engineering approach which has taken on a new and important meaning because of the greater complexity and scope of problems to be solved in industry, business, and the military. Newly discovered scientific phenomena, new machines and equipment, greater speed of communications, increased production capacity, the demand for control over ever-extending areas under constantly changing conditions, and the resultant complex interactions, all have created a tremendously accelerating need for improved systems engineering. Systems engineering can be complex, but is simply defined as 'logical engineering within physical, economic and technical limits' - bridging the gap from fundamental laws to a practical operating system." (Instrumentation Technology, 1957)

"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)

"According to the science of cybernetics, which deals with the topic of control in every kind of system" (mechanical, electronic, biological, human, economic, and so on), there is a natural law that governs the capacity of a control system to work. It says that the control must be capable of generating as much 'variety' as the situation to be controlled." (Anthony S Beer, Management Science", 1968)

"Learning is any change in a system that produces a more or less permanent change in its capacity for adapting to its environment. Understanding systems, especially systems capable of understanding problems in new task domains, are learning systems." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"The notion that the 'balance of nature' is delicately poised and easily upset is nonsense. Nature is extraordinarily tough and resilient, interlaced with checks and balances, with an astonishing capacity for recovering from disturbances in equilibrium. The formula for survival is not power; it is symbiosis." (Sir Eric Ashby, [Encounter] 1976)

"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (John K Galbraith, "Organization Design", 1977)

"Real learning gets to the heart of what it means to be human. Through learning we re-create ourselves. Through learning we become able to do something we never were able to do. Through learning we reperceive the world and our relationship to it. Through learning we extend our capacity to create, to be part of the generative process of life." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"The organizations that will truly excel in the future will be the organizations that discover how to tap people's commitment and capacity to learn at all levels in an organization." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)

"Neural networks conserve the complexity of the systems they model because they have complex structures themselves. Neural networks encode information about their environment in a distributed form. […] Neural networks have the capacity to self-organise their internal structure." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"The very essence of mass communication theory is a simple but all-embracing expression of technological determinism, since the essential features depend on what certain technologies have made possible, certain technologies have made possible, especially the following: communication at a distance, the multiplication and simultaneous distribution of diverse ‘messages’, the enormous capacity and speed of carriers, and the limitations on response. There is no escaping the implication that public communication as practised in modern societies is profoundly shaped by these general features." (Denis McQuail, "McQuail's Reader in Mass Communication Theory", 2002)

"In loosely coupled systems by contrast there is plenty of slack in terms of time, resources and organizational capacity. They are much less likely to produce normal accidents since incidents can be .coped with, so avoiding the interactive complexity found within the tightly coupled system. in the latter, moreover, the effects are non-linear. Up to a point, tightening the connections between elements in the system will increase efficiency when everything works smoothly. But, if one small item goes wrong, then that can have a  catastrophic knock-on effect throughout the system. The system literally switches over; from smooth functioning to interactively complex disaster. And sometimes this results from a supposed improvement in the system." (John Urry, "Global Complexity", 2003)

"Mutual information is the receiver's entropy minus the conditional entropy of what the receiver receives - given what message the sender sends through the noisy channel. Conditioning or getting data can only reduce uncertainty and so this gap is always positive or zero. It can never be negative. You can only learn from further experience. Information theorists capture this theorem in a slogan: Conditioning reduces entropy. The channel capacity itself is the largest gap given all possible probability descriptions of what [the sender] sent. It is the most information that on average you could ever get out of the noisy channel." (Bart Kosko, "Noise", 2006)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

07 September 2025

❄️Systems Thinking: On Thresholds (Quotes)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"As the complexity of a system increases, our ability to make precise and yet significant statements about its behavior diminishes until a threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics." (Lotfi A Zadeh, 1973)

"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature. Thus, with the rapid increase in the complexity of the information processing tasks which the computers are called upon to perform, we are reaching a point where computers will have to be designed for processing of information in fuzzy form. In fact, it is the capability to manipulate fuzzy concepts that distinguishes human intelligence from the machine intelligence of current generation computers. Without such capability we cannot build machines that can summarize written text, translate well from one natural language to another, or perform many other tasks that humans can do with ease because of their ability to manipulate fuzzy concepts." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)

"Threshold functions (are described) which facilitate the careful study of the structure of a graph as it grows and specifically reveal the mysterious circumstances surrounding the abrupt appearance of the Unique Giant Component which systematically absorbs its neighbours, devouring the larger first and ruthlessly continuing until the last Isolated Nodes have been swallowed up, whereupon the Giant is suddenly brought under control by a Spanning Cycle." (Edgar Palmer, "Graphical Evolution", 1985)

"[…] an epidemic does not always percolate through an entire population. There is a percolation threshold below which the epidemic has died out before most of the people have." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In the realms of nature it is impossible to predict which way a bifurcation will cut. The outcome of a bifurcation is determined neither by the past history of a system nor by its environment, but only by the interplay of more or less random fluctuations in the chaos of critical destabilization. One or another of the fluctuations that rock such a system will suddenly 'nucleate'. The nucleating fluctuation will amplify with great rapidity and spread to the rest of the system. In a surprisingly short time, it dominates the system’s dynamics. The new order that is then born from the womb of chaos reflects the structural and functional characteristics of the nucleated fluctuation. [...] Bifurcations are more visible, more frequent, and more dramatic when the systems that exhibit them are close to their thresholds of stability - when they are all but choked out of existence." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"When a system is 'stressed' beyond certain threshold limits as, for example, when it is heated up, or its pressure is increased, it shifts from one set of attractors to another and then behaves differently. To use the language of the theory, the system 'settles into a new dynamic regime'. It is at the point of transition that a bifurcation takes place. The system no longer follows the trajectory of its initial attractors, but responds to new attractors that make the system appear to be behaving randomly. It is not behaving randomly, however, and this is the big shift in our understanding caused by dynamical systems theory. It is merely responding to a new set of attractors that give it a more complex trajectory. The term bifurcation, in its most significant sense, refers to the transition of a system from the dynamic regime of one set of attractors, generally more stable and simpler ones, to the dynamic regime of a set of more complex and 'chaotic' attractors." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"For any given population of susceptibles, there is some critical combination of contact frequency, infectivity, and disease duration just great enough for the positive loop to dominate the negative loops. That threshold is known as the tipping point. Below the tipping point, the system is stable: if the disease is introduced into the community, there may be a few new cases, but on average, people will recover faster than new cases are generated. Negative feedback dominates and the population is resistant to an epidemic. Past the tipping point, the positive loop dominates .The system is unstable and once a disease arrives, it can spread like wildfire that is, by positive feedback-limited only by the depletion of the susceptible population." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"This possibility of sudden change is at the center of the idea of the Tipping Point and might well be the hardest of all to accept. [...] The Tipping Point is the moment of critical mass, the threshold, the boiling point." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"[…] real networks not only are connected but are well beyond the threshold of one. Random network theory tells us that as the average number of links per node increases beyond the critical one, the number of nodes left out of the giant cluster decreases exponentially. That is, the more links we add, the harder it is to find a node that remains isolated. Nature does not take risks by staying close to the threshold. It well surpasses it."  (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"In the case of a complex system, nonlinear behavior can happen as disturbances or changes in the system, each one relatively small by itself, accumulate. Outwardly, everything seems to be normal: the system doesn’t generate any surprises. At some point, though, the behavior of the whole system suddenly shifts to a radically new mode. This kind of behavior is often called a threshold effect, because the shift occurs when a critical threshold - usually unseen and often unexpected - is crossed." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"But in mathematics there is a kind of threshold effect, an intellectual tipping point. If a student can just get over the first few humps, negotiate the notational peculiarities of the subject, and grasp that the best way to make progress is to understand the ideas, not just learn them by rote, he or she can sail off merrily down the highway, heading for ever more abstruse and challenging ideas, while an only slightly duller student gets stuck at the geometry of isosceles triangles." (Ian Stewart, "Why Beauty is Truth: A history of symmetry", 2007)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Flaws can be found in any research design if you look hard enough. […] In our experience, it is good scientific practice to refine one's research hypotheses in light of the data. Working scientists are also keenly aware of the risks of data dredging, and they use confidence intervals and p-values as a tool to avoid getting fooled by noise. Unfortunately, a by-product of all this struggle and care is that when a statistically significant pattern does show up, it is natural to get excited and believe it. The very fact that scientists generally don't cheat, generally don't go fishing for statistical significance, makes them vulnerable to drawing strong conclusions when they encounter a pattern that is robust enough to cross the p < 0.05 threshold." (Andrew Gelman & Eric Loken, "The Statistical Crisis in Science", American Scientist Vol. 102(6), 2014)

"Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"[...] living organisms manifest deep new physical principles, and that we are on the threshold of uncovering and harnessing those principles. What is different this time, and why it has taken so many decades to discover the real secret of life, is that the new physics is not simply a matter of an additional type of force - a 'life force' - but something altogether more subtle, something that interweaves matter and information, wholes and parts, simplicity and complexity." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019) 

06 September 2025

🏷️Knowledge Representation: On Puzzles (Quotes)

“The discovery which has been pointed to by theory is always one of profound interest and importance, but it is usually the close and crown of a long and fruitful period, whereas the discovery which comes as a puzzle and surprise usually marks a fresh epoch and opens a new chapter in science.” (Sir Oliver J Lodge, [Becquerel Memorial Lecture] Journal of the Chemical Society, Transactions 101 (2), 1912)

”[while] the traditional way is to regard the facts of science as something like the parts of a jig-saw puzzle, which can be fitted together in one and only one way, I regard them rather as the tiny pieces of a mosaic, which can be fitted together in many ways. A new theory in an old subject is, for me, a new mosaic pattern made with the pieces taken from an older pattern. [...] Theories come into fashion and theories go out of fashion, but the facts connected with them stay.” (William H George, “The Scientist in Action”, 1936)

"The laws of science are the permanent contributions to knowledge - the individual pieces that are fitted together in an attempt to form a picture of the physical universe in action. As the pieces fall into place, we often catch glimpses of emerging patterns, called theories; they set us searching for the missing pieces that will fill in the gaps and complete the patterns. These theories, these provisional interpretations of the data in hand, are mere working hypotheses, and they are treated with scant respect until they can be tested by new pieces of the puzzle." (Edwin P Whipple, "Experiment and Experience", [Commencement Address, California Institute of Technology] 1938)

"Even if all parts of a problem seem to fit together like the pieces of a jigsaw puzzle, one has to remember that the probable need not necessarily be the truth and the truth not always probable." (Sigmund Freud, "Moses and Monotheism", 1939)

"The methods of science may be described as the discovery of laws, the explanation of laws by theories, and the testing of theories by new observations. A good analogy is that of the jigsaw puzzle, for which the laws are the individual pieces, the theories local patterns suggested by a few pieces, and the tests the completion of these patterns with pieces previously unconsidered." (Edwin P Hubble, "The Nature of Science and Other Lectures", 1954)

"One often hears that successive theories grow ever closer to, or approximate more and more closely to, the truth. Apparently, generalizations like that refer not to the puzzle-solutions and the concrete predictions derived from a theory but rather to its ontology, to the match, that is, between the entities with which the theory populates nature and what is ‘really there’." (Thomas S Kuhn, "The Structure of Scientific Revolutions", 1962)

“One often hears that successive theories grow ever closer to, or approximate more and more closely to, the truth. Apparently, generalizations like that refer not to the puzzle-solutions and the concrete predictions derived from a theory but rather to its ontology, to the match, that is, between the entities with which the theory populates nature and what is ‘really there’.” (Thomas S Kuhn, “The Structure of Scientific Revolutions”, 1970)

"Owing to his lack of knowledge, the ordinary man cannot attempt to resolve conflicting theories of conflicting advice into a single organized structure. He is likely to assume the information available to him is on the order of what we might think of as a few pieces of an enormous jigsaw puzzle. If a given piece fails to fit, it is not because it is fraudulent; more likely the contradictions and inconsistencies within his information are due to his lack of understanding and to the fact that he possesses only a few pieces of the puzzle. Differing statements about the nature of things […] are to be collected eagerly and be made a part of the individual's collection of puzzle pieces. Ultimately, after many lifetimes, the pieces will fit together and the individual will attain clear and certain knowledge." (Alan R Beals, “Strategies of Resort to Curers in South India” [contributed in Charles M. Leslie (ed.), "Asian Medical Systems: A Comparative Study", 1976]) 

"Data, seeming facts, apparent asso­ciations-these are not certain knowledge of something. They may be puzzles that can one day be explained; they may be trivia that need not be explained at all." (Kenneth Waltz, "Theory of International Politics", 1979)

"A vision is a clear mental picture of a desired future outcome. If you have ever put together a large 1,000-piece jigsaw puzzle, the chances are you used the picture on the top of the puzzle box to guide the placement of the pieces. That picture on the top of the box is the end result or the vision of what you are trying to turn into a reality. It is much more difficult - if not impossible - to put the jigsaw puzzle together without ever looking at the picture." (Jane Flaherty & Peter B Stark, "The Manager's Pocket Guide to Leadership Skills", 1999)

"[…] most earlier attempts to construct a theory of complexity have overlooked the deep link between it and networks. In most systems, complexity starts where networks turn nontrivial. No matter how puzzled we are by the behavior of an electron or an atom, we rarely call it complex, as quantum mechanics offers us the tools to describe them with remarkable accuracy. The demystification of crystals-highly regular networks of atoms and molecules-is one of the major success stories of twentieth-century physics, resulting in the development of the transistor and the discovery of superconductivity. Yet, we continue to struggle with systems for which the interaction map between the components is less ordered and rigid, hoping to give self-organization a chance." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)
