"Biological communities are systems of interacting components and thus display characteristic properties of systems, such as mutual interdependence, self-regulation, adaptation to disturbances, approach to states of equilibrium, etc." (Ludwig von Bertalanffy, "Problems of Life", 1952)
"Every isolated determinate dynamic system, obeying unchanging laws, will ultimately develop some sort of organisms that are adapted to their environments." (W Ross Ashby, "Principles of the self-organizing system", 1962)
"[...] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by ‘adaptation to environment’ […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, 'Living organisms', "The Nature of Psychology", 1966)
"A more viable model, one much more faithful to the kind of system that society is more and more recognized to be, is in process of developing out of, or is in keeping with, the modern systems perspective (which we use loosely here to refer to general systems research, cybernetics, information and communication theory, and related fields). Society, or the sociocultural system, is not, then, principally an equilibrium system or a homeostatic system, but what we shall simply refer to as a complex adaptive system." (Walter F Buckley, "Society as a complex adaptive system", 1968)
"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)
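The mechanism Heylighen describes, noise-driven variation that explores state space until the system falls into an attractor, can be sketched in a few lines. This is a minimal illustration, not from the quoted text: a hypothetical double-well potential stands in for the fitness landscape, and "selection" simply keeps any variation that does not worsen fitness.

```python
import random

def settle(potential, x0, steps=5000, noise=0.5, seed=42):
    """Noise-driven variation: perturb the state at random and keep any
    move that does not raise the potential. The trajectory wanders until
    it enters a basin of attraction; variation that would leave the
    attractor is then rejected, so the system stays there."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        candidate = x + rng.uniform(-noise, noise)
        if potential(candidate) <= potential(x):  # selection by 'fitness'
            x = candidate
    return x

# Illustrative double-well potential with attractors near x = -1 and x = +1.
V = lambda x: (x**2 - 1) ** 2

final = settle(V, x0=3.0)
print(final)  # settles near one of the two attractors
```

Once the state is inside a basin, every variation that would climb back out is rejected, which is the "restriction of freedom" and loss of statistical entropy the quote identifies with self-organization.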
"Whatever the system, adaptive change depends upon feedback loops, be it those provided by natural selection or those of individual reinforcement. In all cases, then, there must be a process of trial and error and a mechanism of comparison. […] By superposing and interconnecting many feedback loops, we (and all other biological systems) not only solve particular problems but also form habits which we apply to the solution of classes of problems." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)
"If all of the elements in a large system are loosely coupled to one another, then any one element can adjust to and modify a local unique contingency without affecting the whole system. These local adaptations can be swift, relatively economical, and substantial." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)
"The phenomenon of self-organization is not limited to living matter but occurs also in certain chemical systems […] [Ilya] Prigogine has called these systems 'dissipative structures' to express the fact that they maintain and develop structure by breaking down other structures in the process of metabolism, thus creating entropy - disorder - which is subsequently dissipated in the form of degraded waste products. Dissipative chemical structures display the dynamics of self-organization in its simplest form, exhibiting most of the phenomena characteristic of life - self-renewal, adaptation, evolution, and even primitive forms of 'mental' processes."
"Ultimately, uncontrolled escalation destroys a system. However, change in the direction of learning, adaptation, and evolution arises from the control of control, rather than unchecked change per se. In general, for the survival and co-evolution of any ecology of systems, feedback processes must be embodied by a recursive hierarchy of control circuits." (Bradford P Keeney, "Aesthetics of Change", 1983)
"Cybernetics is concerned with scientific investigation of systemic processes of a highly varied nature, including such phenomena as regulation, information processing, information storage, adaptation, self-organization, self-reproduction, and strategic behavior. Within the general cybernetic approach, the following theoretical fields have developed: systems theory (system), communication theory, game theory, and decision theory." (Fritz B Simon et al, "Language of Family Therapy: A Systemic Vocabulary and Source Book", 1985)
"Because the individual parts of a complex adaptive system are continually revising their ('conditioned') rules for interaction, each part is embedded in perpetually novel surroundings (the changing behavior of the other parts). As a result, the aggregate behavior of the system is usually far from optimal, if indeed optimality can even be defined for the system as a whole. For this reason, standard theories in physics, economics, and elsewhere, are of little help because they concentrate on optimal end-points, whereas complex adaptive systems 'never get there'. They continue to evolve, and they steadily exhibit new forms of emergent behavior." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)
"In short, complex adaptive systems are characterized by perpetual novelty." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)
"[...] it's essentially meaningless to talk about a complex adaptive system being in equilibrium: the system can never get there. It is always unfolding, always in transition. In fact, if the system ever does reach equilibrium, it isn't just stable. It's dead." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)
"Complex adaptive systems have the property that if you run them - by just letting the mathematical variable of 'time' go forward - they'll naturally progress from chaotic, disorganized, undifferentiated, independent states to organized, highly differentiated, and highly interdependent states. Organized structures emerge spontaneously. [...] A weak system gives rise only to simpler forms of self-organization; a strong one gives rise to more complex forms, like life." (J Doyne Farmer, "The Third Culture: Beyond the Scientific Revolution", 1995)
"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)
"Complexity theory began with an interest in how order springs from chaos. According to complexity theory, adaptation is most effective in systems that are only partially connected. The argument is that too much structure creates gridlock, while too little structure creates chaos. […] Consequently, the key to effective change is to stay poised on this edge of chaos. Complexity theory focuses managerial thinking on the interrelationships among different parts of an organization and on the trade-off of less control for greater adaptation." (Shona Brown, "Competing on the Edge", 1998)
"Distributed control means that the outcomes of a complex adaptive system emerge from a process of self-organization rather than being designed and controlled externally or by a centralized body." (Brenda Zimmerman et al, "A complexity science primer", 1998)
"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of 'collective intelligence' is coming more and more to the fore. The basic idea is that a group of individuals (e.g. people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)
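A classic illustration of group-level intelligence from simple individual rules is ant-style path selection. The toy model below is a hypothetical sketch (all names and parameters are illustrative, not from the quoted source): each agent only chooses a path in proportion to its pheromone and deposits pheromone inversely proportional to path length, yet the colony as a whole converges on the shorter path with no central controller.

```python
import random

def ant_paths(lengths=(1.0, 2.0), ants=200, rounds=50, evap=0.5, seed=0):
    """Minimal ant-colony sketch. Each round, every ant picks a path
    with probability proportional to that path's pheromone, then
    deposits pheromone inversely proportional to the path's length.
    Evaporation plus reinforcement concentrates the colony's choices
    on the shortest path."""
    rng = random.Random(seed)
    pher = [1.0] * len(lengths)
    for _ in range(rounds):
        deposits = [0.0] * len(lengths)
        total = sum(pher)
        for _ in range(ants):
            # roulette-wheel choice proportional to pheromone
            r = rng.uniform(0, total)
            i = 0
            while r > pher[i]:
                r -= pher[i]
                i += 1
            deposits[i] += 1.0 / lengths[i]
        pher = [evap * p + d for p, d in zip(pher, deposits)]
    return pher

pher = ant_paths()
print(pher)  # pheromone concentrates on the shorter path (index 0)
```

No single ant compares the two paths; the comparison is performed by the pheromone field itself, which is the sense in which the intelligence is collective.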
"Systems, and organizations as systems, can only be understood holistically. Try to understand the system and its environment first. Organizations are open systems and, as such, are viable only in interaction with and adaptation to the changing environment." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)
"A self-organizing system not only regulates or adapts its behavior, it creates its own organization. In that respect it differs fundamentally from our present systems, which are created by their designer. We define organization as structure with function. Structure means that the components of a system are arranged in a particular order. It requires both connections, that integrate the parts into a whole, and separations that differentiate subsystems, so as to avoid interference. Function means that this structure fulfils a purpose." (Francis Heylighen & Carlos Gershenson, "The Meaning of Self-organization in Computing", IEEE Intelligent Systems, 2003)
"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)
"Adaptive systems learn by enlightened trial and error. The system can take a long time to learn well just as it can take a human a long time to learn to properly swing a golf club even with the help of the best golf instructor. But this iterative learning can also produce solutions that we could not find or at least could not find easily by pure mathematical analysis." (Bart Kosko, "Noise", 2006)
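Kosko's "enlightened trial and error" can be illustrated with a standard epsilon-greedy bandit learner, a textbook technique used here as an assumed stand-in (the payoffs and names are illustrative, not from the quoted text): the system finds the best action purely through repeated noisy trials and running comparisons, with no closed-form analysis of the payoff structure.

```python
import random

def epsilon_greedy(payoffs, trials=5000, eps=0.1, seed=7):
    """Trial-and-error learning: mostly exploit the action with the best
    average reward observed so far, but explore at random a fraction eps
    of the time. Returns the learned average reward per action."""
    rng = random.Random(seed)
    counts = [0] * len(payoffs)
    means = [0.0] * len(payoffs)
    for _ in range(trials):
        if rng.random() < eps:
            a = rng.randrange(len(payoffs))                       # explore
        else:
            a = max(range(len(payoffs)), key=lambda i: means[i])  # exploit
        reward = payoffs[a] + rng.gauss(0, 1.0)  # noisy feedback per trial
        counts[a] += 1
        means[a] += (reward - means[a]) / counts[a]  # running average
    return means

means = epsilon_greedy([1.0, 2.0, 3.0])
best = max(range(3), key=lambda i: means[i])
print(best)  # the learner has identified the highest-payoff action
```

As the quote notes, this takes many iterations, but it needs only a mechanism of comparison (the running averages) and a source of trials, not a mathematical solution of the problem.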
"An ecology provides the special formations needed by organizations. Ecologies are: loose, free, dynamic, adaptable, messy, and chaotic. Innovation does not arise through hierarchies. As a function of creativity, innovation requires trust, openness, and a spirit of experimentation - where random ideas and thoughts can collide for re-creation." (George Siemens, "Knowing Knowledge", 2006)
"It is not only a metaphor to transform the Internet to a superbrain with self-organizing features of learning and adapting. Information retrieval is already realized by neural networks adapting to the information preferences of a human user with synaptic plasticity. In sociobiology, we can learn from populations of ants and termites how to organize traffic and information processing by swarm intelligence. From a technical point of view, we need intelligent programs distributed in the nets. There are already more or less intelligent virtual organisms ('agents'), learning, self-organizing and adapting to our individual preferences of information, to select our e-mails, to prepare economic transactions or to defend the attacks of hostile computer viruses, like the immune system of our body." (Klaus Mainzer, "Complexity Management in the Age of Globalization", 2006)
"How is it that an ant colony can organize itself to carry out the complex tasks of food gathering and nest building and at the same time exhibit an enormous degree of resilience if disrupted and forced to adapt to changing situations? Natural systems are able not only to survive, but also to adapt and become better suited to their environment, in effect optimizing their behavior over time. They seemingly exhibit collective intelligence, or swarm intelligence as it is called, even without the existence of or the direction provided by a central authority." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)
"In that sense, a self-organizing system is intrinsically adaptive: it maintains its basic organization in spite of continuing changes in its environment. As noted, perturbations may even make the system more robust, by helping it to discover a more stable organization." (Francis Heylighen, "Complexity and Self-Organization", 2008)
"[a complex system is] a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution." (Melanie Mitchell, "Complexity: A Guided Tour", 2009)
"The difference between complex adaptive systems and self-organizing systems is that the former have the capacity to learn from their experience, and thus to embody successful patterns into their repertoire, although there is actually quite a deep relationship between self-organizing systems and complex adaptive systems. Adaptive entities can emerge at high levels of description in simple self-organizing systems, i.e., adaptive systems are not necessarily self-organizing systems with something extra thrown in." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)