"A more viable model, one much more faithful to the kind of system that society is more and more recognized to be, is in process of developing out of, or is in keeping with, the modern systems perspective (which we use loosely here to refer to general systems research, cybernetics, information and communication theory, and related fields). Society, or the sociocultural system, is not, then, principally an equilibrium system or a homeostatic system, but what we shall simply refer to as a complex adaptive system." (Walter F Buckley, "Society as a complex adaptive system", 1968)
"Whatever the system, adaptive change depends upon feedback loops, be it those provided by natural selection or those of individual reinforcement. In all cases, then, there must be a process of trial and error and a mechanism of comparison. […] By superposing and interconnecting many feedback loops, we (and all other biological systems) not only solve particular problems but also form habits which we apply to the solution of classes of problems." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)
"Because the individual parts of a complex adaptive system are continually revising their ('conditioned') rules for interaction, each part is embedded in perpetually novel surroundings (the changing behavior of the other parts). As a result, the aggregate behavior of the system is usually far from optimal, if indeed optimality can even be defined for the system as a whole. For this reason, standard theories in physics, economics, and elsewhere, are of little help because they concentrate on optimal end-points, whereas complex adaptive systems 'never get there'. They continue to evolve, and they steadily exhibit new forms of emergent behavior." (John H Holland, "Complex Adaptive Systems", Daedalus Vol. 121 (1), 1992)
"In short, complex adaptive systems are characterized by perpetual novelty." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)
"[...] it's essentially meaningless to talk about a complex adaptive system being in equilibrium: the system can never get there. It is always unfolding, always in transition. In fact, if the system ever does reach equilibrium, it isn't just stable. It's dead." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)
"Complex adaptive systems have the property that if you run them - by just letting the mathematical variable of 'time' go forward - they'll naturally progress from chaotic, disorganized, undifferentiated, independent states to organized, highly differentiated, and highly interdependent states. Organized structures emerge spontaneously. [...]A weak system gives rise only to simpler forms of self-organization; a strong one gives rise to more complex forms, like life. (J Doyne Farmer, "The Third Culture: Beyond the Scientific Revolution", 1995)
"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)
"Distributed control means that the outcomes of a complex adaptive system emerge from a process of self-organization rather than being designed and controlled externally or by a centralized body." (Brenda Zimmerman et al, "A complexity science primer", 1998)
"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of 'collective intelligence' is coming more and more to the fore. The basic idea is that a group of individuals (e. g. people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)
"Adaptive systems learn by enlightened trial and error. The system can take a long time to learn well just as it can take a human a long time to learn to properly swing a golf club even with the help of the best golf instructor. But this iterative learning can also produce solutions that we could not find or at least could not find easily by pure mathematical analysis." (Bart Kosko, "Noise", 2006)
"In that sense, a self-organizing system is intrinsically adaptive: it maintains its basic organization in spite of continuing changes in its environment. As noted, perturbations may even make the system more robust, by helping it to discover a more stable organization." (Francis Heylighen, "Complexity and Self-Organization", 2008)
"The difference between complex adaptive systems and self-organizing systems is that the former have the capacity to learn from their experience, and thus to embody successful patterns into their repertoire, although there is actually quite a deep relationship between self-organizing systems and complex adaptive systems. Adaptive entities can emerge at high levels of description in simple self-organizing systems, i.e., adaptive systems are not necessarily self-organizing systems with something extra thrown in." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)