"The true nature of the universal principle of synergy pervading all nature and creating all the different kinds of structure that we observe to exist, must now be made clearer. Primarily and essentially it is a process of equilibration, i.e., the several forces are first brought into a state of partial equilibrium. It begins in collision, conflict, antagonism, and opposition, and then we have the milder phases of antithesis, competition, and interaction, passing next into a modus vivendi, or compromise, and ending in collaboration and cooperation. […] The entire drift is toward economy, conservatism, and the prevention of waste." (James Q Dealey & Lester F Ward, "A Text-book of Sociology", 1905)
"Nature prefers the more probable states to the less probable because in nature processes take place in the direction of greater probability. Heat goes from a body at higher temperature to a body at lower temperature because the state of equal temperature distribution is more probable than a state of unequal temperature distribution." (Max Planck, "The Atomic Theory of Matter", 1909)
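Planck's point, that equal temperature distribution wins simply because it is more probable, can be made concrete with the classic Ehrenfest urn model. The sketch below is illustrative and not from Planck (the function name and parameters are my own): one randomly chosen ball is moved between two urns at each step, and starting from the most lopsided state the count drifts toward an even split with nothing pushing it there except counting of microstates.

```python
import random

def ehrenfest(n_balls=100, steps=2000, seed=1):
    """Ehrenfest urn model: at each step pick one ball uniformly at random
    and move it to the other urn. No force favours the even split; it is
    simply realized by vastly more microstates than any lopsided split."""
    random.seed(seed)
    left = n_balls            # start in the least probable state: all balls left
    history = [left]
    for _ in range(steps):
        if random.random() < left / n_balls:   # the chosen ball is in the left urn
            left -= 1
        else:
            left += 1
        history.append(left)
    return history

hist = ehrenfest()
print(hist[0], hist[-1])      # drifts from 100 toward the neighbourhood of 50
```

Near the 50/50 split the walk merely fluctuates, which is the discrete analogue of thermal equilibrium.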
"True equilibria can occur only in closed systems; in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’, are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)
"A state of equilibrium in a system does not mean, further, that the system is without tension. Systems can, on the contrary, also come to equilibrium in a state of tension (e.g., a spring under tension or a container with gas under pressure). The occurrence of this sort of system, however, presupposes a certain firmness of boundaries and actual segregation of the system from its environment (both of these in a functional, not a spatial, sense). If the different parts of the system are insufficiently cohesive to withstand the forces working toward displacement (i.e., if the system shows insufficient internal firmness, if it is fluid), or if the system is not segregated from its environment by sufficiently firm walls but is open to its neighboring systems, stationary tensions cannot occur. Instead, there occurs a process in the direction of the forces, which encroaches upon the neighboring regions with diffusion of energy and which goes in the direction of an equilibrium at a lower level of tension in the total region. The presupposition for the existence of a stationary state of tension is thus a certain firmness of the system in question, whether this be its own inner firmness or the firmness of its walls." (Kurt Lewin, "A Dynamic Theory of Personality", 1935)
"The process moves in the direction of a state of equilibrium only for the system as a whole. Part processes may at the same time go on in opposed directions, a circumstance which is of the greatest significance for, for example, the theory of detour behavior. It is hence important to take the system whole which is dominant at the moment as basis." (Kurt Lewin, "A Dynamic Theory of Personality", 1935)
"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)" (Erwin Schrödinger, "What is Life?", 1944)
"Every process, event, happening - call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy - or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)
"The study of the conditions for change begins appropriately with an analysis of the conditions for no change, that is, for the state of equilibrium." (Kurt Lewin, "Quasi-Stationary Social Equilibria and the Problem of Permanent Change", 1947)
"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)
"Physical irreversibility manifests itself in the fact that, whenever the system is in a state far removed from equilibrium, it is much more likely to move toward equilibrium, than in the opposite direction." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)
"Every stable system has the property that if displaced from a state of equilibrium and released, the subsequent movement is so matched to the initial displacement that the system is brought back to the state of equilibrium. A variety of disturbances will therefore evoke a variety of matched reactions." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)
"The primary fact is that all isolated state-determined dynamic systems are selective: from whatever state they have initially, they go towards states of equilibrium. These states of equilibrium are always characterised, in their relation to the change-inducing laws of the system, by being exceptionally resistant." (W Ross Ashby, "Design for a Brain: The Origin of Adaptive Behavior", 1952)
"Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes." (Arnold Sommerfeld, "Thermodynamics and Statistical Mechanics", Lectures on Theoretical Physics Vol. V, 1956)
"Stability is commonly thought of as desirable, for its presence enables the system to combine flexibility and activity in performance with something of permanence. Behaviour that is goal-seeking is an example of behaviour that is stable around a state of equilibrium. Nevertheless, stability is not always good, for a system may persist in returning to some state that, for other reasons, is considered undesirable." (W Ross Ashby, "An Introduction to Cybernetics", 1956)
"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)
"Cybernetics is the general science of communication. But to refer to communication is consciously or otherwise to refer to distinguishable states of information inputs and outputs and/or to information being processed within some relatively isolated system." (Henryk Greniewski, "Cybernetics without Mathematics", 1960)
"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […] 'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)
"[…] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by ‘adaptation to environment’ […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, "Living organisms", "The Nature of Psychology", 1966)
"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)
"We may state as characteristic of modern science that this scheme of isolable units acting in one-way causality has proven to be insufficient. Hence the appearance, in all fields of science, of notions like wholeness, holistic, organismic, gestalt, etc., which all signify that, in the last resort, we must think in terms of systems of elements in mutual interaction […]." (Ludwig von Bertalanffy, "General System Theory", 1968)
"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)
"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)
"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)
"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime becomes unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Nicolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)
"General systems theory is the scientific exploration of 'wholes' and 'wholeness' which, not so long ago, were considered metaphysical notions transcending the boundaries of science. Hierarchic structure, stability, teleology, differentiation, approach to and maintenance of steady states, goal-directedness - these are a few of such general system properties." (Ervin László, "Introduction to Systems Philosophy", 1972)
"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)
"Open systems, in contrast to closed systems, exhibit a principle of equifinality, that is, a tendency to achieve a final state independent of initial conditions. In other words, open systems tend to 'resist' perturbations that take them away from some steady state. They can exhibit homeostasis." (Anatol Rapoport, "The Uses of Mathematical Isomorphism in General System Theory", 1972)
"There is nothing supernatural about the process of self-organization to states of higher organization; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)
"When a system is considered in two different states, the difference in volume or in any other property, between the two states, depends solely upon those states themselves and not upon the manner in which the system may pass from one state to the other." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974)
"A system may be specified in either of two ways. In the first, which we shall call a state description, sets of abstract inputs, outputs and states are given, together with the action of the inputs on the states and the assignments of outputs to states. In the second, which we shall call a coordinate description, certain input, output and state variables are given, together with a system of dynamical equations describing the relations among the variables as functions of time. Modern mathematical system theory is formulated in terms of state descriptions, whereas the classical formulation is typically a coordinate description, for example a system of differential equations." (E S Bainbridge, "The Fundamental Duality of System Theory", 1975)
"In any system governed by a potential, and in which the system's behavior is determined by no more than four different factors, only seven qualitatively different types of discontinuity are possible. In other words, while there are an infinite number of ways for such a system to change continuously (staying at or near equilibrium), there are only seven structurally stable ways for it to change discontinuously (passing through non-equilibrium states)." (Alexander Woodcock & Monte Davis, "Catastrophe Theory", 1978)
"Cellular automata may be considered as discrete dynamical systems. In almost all cases, cellular automaton evolution is irreversible. Trajectories in the configuration space for cellular automata therefore merge with time, and after many time steps, trajectories starting from almost all initial states become concentrated onto 'attractors'. These attractors typically contain only a very small fraction of possible states. Evolution to attractors from arbitrary initial states allows for 'self-organizing' behaviour, in which structure may evolve at large times from structureless initial states. The nature of the attractors determines the form and extent of such structures." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)
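Wolfram's observation that irreversible cellular automaton trajectories merge onto small attractors can be checked exhaustively for a tiny system. The following is a minimal sketch, not taken from the paper; the choice of rule 126 on a ring of 10 cells is mine. Applying the update to every one of the 2^10 configurations shows the set of reachable states shrinking as trajectories merge.

```python
def step(state, rule=126, width=10):
    """One synchronous update of an elementary cellular automaton on a ring.
    `state` is an integer whose bits are the cells; `rule` is the Wolfram
    rule number, whose bit at index (left*4 + centre*2 + right) gives the
    next value of the centre cell."""
    bits = [(state >> i) & 1 for i in range(width)]
    nxt = 0
    for i in range(width):
        neigh = (bits[(i + 1) % width] << 2) | (bits[i] << 1) | bits[(i - 1) % width]
        nxt |= ((rule >> neigh) & 1) << i
    return nxt

width = 10
states = set(range(2 ** width))   # every possible configuration: 1024 states
for _ in range(20):               # iterate the update on the whole state space
    states = {step(s) for s in states}
print(len(states))                # far fewer than 1024: trajectories have merged
```

The surviving set approximates the attractors: the "very small fraction of possible states" onto which almost all initial conditions are concentrated.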
"If a system is in a state of equilibrium (a steady state), then all sub-systems must be in equilibrium. If all sub-systems are in a state of equilibrium, then the system must be in equilibrium." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)
"When one combines the new insights gained from studying far-from-equilibrium states and nonlinear processes, along with these complicated feedback systems, a whole new approach is opened that makes it possible to relate the so-called hard sciences to the softer sciences of life - and perhaps even to social processes as well. […] It is these panoramic vistas that are opened to us by Order Out of Chaos." (Ilya Prigogine, "Order Out of Chaos: Man's New Dialogue with Nature", 1984)
"We will treat problem solving as a process of search through a state space. A problem is defined by an initial state, one or more goal states to be reached, a set of operators that can transform one state into another, and constraints that an acceptable solution must meet. Problem-solving methods are procedures for selecting an appropriate sequence of operators that will succeed in transforming the initial state into a goal state through a series of steps." (John H Holland et al, "Induction: Processes Of Inference, Learning, And Discovery", 1986)
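Holland's definition (initial state, goal states, operators, constraints) maps directly onto breadth-first search. As an illustrative sketch, not taken from the book, here is the classic two-jug puzzle: states are fill levels, the operators are fill, empty, or pour, and the search returns the shortest state sequence reaching a state holding 4 litres.

```python
from collections import deque

def solve_jugs(cap_a=3, cap_b=5, goal=4):
    """Breadth-first search through the state space of the two-jug puzzle.
    A state is a pair (a, b) of fill levels; the operators are: fill a jug,
    empty a jug, or pour one jug into the other. BFS returns the shortest
    sequence of states from (0, 0) to any state containing `goal` litres."""
    start = (0, 0)
    parent = {start: None}           # also serves as the visited set
    queue = deque([start])
    while queue:
        a, b = queue.popleft()
        if goal in (a, b):
            path, s = [], (a, b)     # walk parent links back to the start
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        pour_ab = min(a, cap_b - b)  # how much A can pour into B
        pour_ba = min(b, cap_a - a)  # how much B can pour into A
        for nxt in [(cap_a, b), (a, cap_b), (0, b), (a, 0),
                    (a - pour_ab, b + pour_ab), (a + pour_ba, b - pour_ba)]:
            if nxt not in parent:
                parent[nxt] = (a, b)
                queue.append(nxt)
    return None

path = solve_jugs()
print(path)   # shortest state sequence; 4 litres appears in the final state
```

Because BFS expands states level by level, the first goal state found lies at the end of a shortest operator sequence.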
"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric - they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)
"Regarding stability, the state trajectories of a system tend to equilibrium. In the simplest case they converge to one point (or different points from different initial states), more commonly to one (or several, according to initial state) fixed point or limit cycle(s) or even torus(es) of characteristic equilibrial behaviour. All this is, in a rigorous sense, contingent upon describing a potential, as a special summation of the multitude of forces acting upon the state in question, and finding the fixed points, cycles, etc., to be minima of the potential function. It is often more convenient to use the equivalent jargon of 'attractors' so that the state of a system is 'attracted' to an equilibrial behaviour. In any case, once in equilibrial conditions, the system returns to its limit, equilibrial behaviour after small, arbitrary, and random perturbations." (Gordon Pask, "Different Kinds of Cybernetics", 1992)
"A strange attractor, when it exists, is truly the heart of a chaotic system. If a concrete system has been in existence for some time, states other than those extremely close to the attractor might as well not exist; they will never occur. For one special complicated chaotic system - the global weather - the attractor is simply the climate, that is, the set of weather patterns that have at least some chance of occasionally occurring." (Edward N Lorenz, "The Essence of Chaos", 1993)
"How can deterministic behavior look random? If truly identical states do occur on two or more occasions, it is unlikely that the identical states that will necessarily follow will be perceived as being appreciably different. What can readily happen instead is that almost, but not quite, identical states occurring on two occasions will appear to be just alike, while the states that follow, which need not be even nearly alike, will be observably different. In fact, in some dynamical systems it is normal for two almost identical states to be followed, after a sufficient time lapse, by two states bearing no more resemblance than two states chosen at random from a long sequence. Systems in which this is the case are said to be sensitively dependent on initial conditions. With a few more qualifications, to be considered presently, sensitive dependence can serve as an acceptable definition of chaos [...]" (Edward N Lorenz, "The Essence of Chaos", 1993)
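Lorenz's "almost, but not quite, identical states" can be watched numerically. A minimal sketch using the logistic map at r = 4 (a standard textbook example of sensitive dependence, not Lorenz's own system): two orbits launched a millionth apart stay close for a while, then diverge until they are no more alike than two states chosen at random.

```python
def logistic_orbit(x0, r=4.0, n=50):
    """Iterate the logistic map x -> r*x*(1-x); at r = 4 the map is chaotic
    and nearby orbits separate roughly exponentially."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.400000)
b = logistic_orbit(0.400001)                     # almost identical initial state
print(abs(a[1] - b[1]))                          # still tiny after one step
print(max(abs(x - y) for x, y in zip(a, b)))     # grows to the size of the interval
```

The separation roughly doubles per step on average, so after a few dozen iterations the initial difference of 10^-6 has saturated at the size of the state space [0, 1].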
"When a system has more than one attractor, the points in phase space that are attracted to a particular attractor form the basin of attraction for that attractor. Each basin contains its attractor, but consists mostly of points that represent transient states. Two contiguous basins of attraction will be separated by a basin boundary." (Edward N Lorenz, "The Essence of Chaos", 1993)
"Complex adaptive systems have the property that if you run them - by just letting the mathematical variable of 'time' go forward - they'll naturally progress from chaotic, disorganized, undifferentiated, independent states to organized, highly differentiated, and highly interdependent states. Organized structures emerge spontaneously. [...] A weak system gives rise only to simpler forms of self-organization; a strong one gives rise to more complex forms, like life." (J Doyne Farmer, "The Third Culture: Beyond the Scientific Revolution", 1995)
"Cybernetics is a science of purposeful behavior. It helps us explain behavior as the continuous action of someone (or thing) in the process, as we see it, of maintaining certain conditions near a goal state, or purpose." (Jeff Dooley, "Thoughts on the Question: What is Cybernetics", 1995)
"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yam, "Dynamics of Complexity", 1997)
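Bar-Yam's "more formal definition" is the base-2 logarithm: specifying one state among n equally possible states takes the smallest number of bits k with 2^k ≥ n. A minimal sketch (the function name is my own):

```python
import math

def bits_to_specify(n_states):
    """Binary digits needed to single out one state among n_states equally
    possible states: the smallest k with 2**k >= n_states."""
    return math.ceil(math.log2(n_states))

print(bits_to_specify(256))   # 8: a bank of 8 on/off switches has 2**8 states
print(bits_to_specify(1000))  # 10, since 2**9 = 512 < 1000 <= 1024 = 2**10
```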
"Linear programming and its generalization, mathematical programming, can be viewed as part of a great revolutionary development that has given mankind the ability to state general goals and lay out a path of detailed decisions to be taken in order to 'best' achieve these goals when faced with practical situations of great complexity. The tools for accomplishing this are the models that formulate real-world problems in detailed mathematical terms, the algorithms that solve the models, and the software that executes the algorithms on computers based on the mathematical theory." (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)
"Something of the previous state, however, survives every change. This is called in the language of cybernetics (which took it from the language of machines) feedback, the advantages of learning from experience and of having developed reflexes." (Guy Davenport, "The Geography of the Imagination: Forty Essays", 1997)
"Self-organization is seen as the process by which systems of many components tend to reach a particular state, a set of cycling states, or a small volume of their state space (attractor basins), with no external interference." (Luis M Rocha, "Syntactic Autonomy", Proceedings of the Joint Conference on the Science and Technology of Intelligent Systems, 1998)
"Cybernetics is the science of effective organization, of control and communication in animals and machines. It is the art of steersmanship, of regulation and stability. The concern here is with function, not construction, in providing regular and reproducible behaviour in the presence of disturbances. Here the emphasis is on families of solutions, ways of arranging matters that can apply to all forms of systems, whatever the material or design employed. [...] This science concerns the effects of inputs on outputs, but in the sense that the output state is desired to be constant or predictable – we wish the system to maintain an equilibrium state. It is applicable mostly to complex systems and to coupled systems, and uses the concepts of feedback and transformations (mappings from input to output) to effect the desired invariance or stability in the result." (Chris Lucas, "Cybernetics and Stochastic Systems", 1999)
"From a functional point of view, mental models can be described as symbolic structures which permit people: to generate descriptions of the purpose of a system, to generate descriptions of the architecture of a system, to provide explanations of the state of a system, to provide explanations of the functioning of a system, to make predictions of future states of a system." (Gert Rickheit & Lorenz Sichelschmidt, "Mental Models: Some Answers, Some Questions, Some Suggestions", 1999)
"Just as dynamics arise from feedback, so too all learning depends on feedback. We make decisions that alter the real world; we gather information feedback about the real world, and using the new information we revise our understanding of the world and the decisions we make to bring our perception of the state of the system closer to our goals." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)
"True systems thinking, on the other hand, studies each problem as it relates to the organization’s objectives and interaction with its entire environment, looking at it as a whole within its universe. Taking your organization from a partial systems to a true systems state requires effective strategic management and backward thinking." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)
"The basic concept of complexity theory is that systems show patterns of organization without an organizer (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled become correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)
"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non-equilibrium Thermodynamics and the Production of Entropy"], 2005)
"Of course, the existence of an unknown butterfly flapping its wings has no direct bearing on weather forecasts, since it will take far too long for such a small perturbation to grow to a significant size, and we have many more immediate uncertainties to worry about. So, the direct impact of this phenomenon on weather prediction is often somewhat overstated." (James Annan & William Connolley, "Chaos and Climate", 2005)
"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)
"Complexity arises when emergent system-level phenomena are characterized by patterns in time or a given state space that have neither too much nor too little form. Neither in stasis nor changing randomly, these emergent phenomena are interesting, due to the coupling of individual and global behaviours as well as the difficulties they pose for prediction. Broad patterns of system behaviour may be predictable, but the system's specific path through a space of possible states is not." (Steve Maguire et al, "Complexity Science and Organization Studies", 2006)
"Physically, the stability of the dynamics is characterized by the sensitivity to initial conditions. This sensitivity can be determined for statistically stationary states, e.g. for the motion on an attractor. If this motion demonstrates sensitive dependence on initial conditions, then it is chaotic. In the popular literature this is often called the 'Butterfly Effect', after the famous 'gedankenexperiment' of Edward Lorenz: if a perturbation of the atmosphere due to a butterfly in Brazil induces a thunderstorm in Texas, then the dynamics of the atmosphere should be considered as an unpredictable and chaotic one. By contrast, stable dependence on initial conditions means that the dynamics is regular." (Ulrike Feudel et al, "Strange Nonchaotic Attractors", 2006)
"The methodology of feedback design is borrowed from cybernetics (control theory). It is based upon methods of controlled system model building, methods of system states and parameters estimation (identification), and methods of feedback synthesis. The models of controlled system used in cybernetics differ from conventional models of physics and mechanics in that they have explicitly specified inputs and outputs. Unlike conventional physics results, often formulated as conservation laws, the results of cybernetical physics are formulated in the form of transformation laws, establishing the possibilities and limits of changing properties of a physical system by means of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)
"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly. A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
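The amplification Gershenson describes can be sketched directly: feed each output back as the next input with a gain above one, and a microscopic initial variation grows exponentially. The gain and step count below are illustrative assumptions, not values from the quote.

```python
# Minimal sketch of a positive feedback loop: the output of each step is fed
# back as the input of the next, so any initial variation is amplified by a
# constant factor per step. Gain and step count are illustrative assumptions.

def feed_back(x, gain=1.2, steps=100):
    """Iterate a positive feedback loop with multiplicative gain."""
    for _ in range(steps):
        x *= gain  # the output is fed back into the input
    return x

# A microscopic 1e-9 difference between initial states becomes observable:
print(feed_back(1e-9) - feed_back(0.0))  # ~0.08 after 100 steps
```

A negative feedback loop would correspond to a gain below one, damping variations instead of amplifying them.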
"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)
"Two systems concepts lie at the disposal of the architect to reflect the beauty of harmony: parsimony and variety. The law of parsimony states that given several explanations of a specific phenomenon, the simplest is probably the best. […] On the other hand, the law of requisite variety states that for a system to survive in its environment the variety of choice that the system is able to make must equal or exceed the variety of influences that the environment can impose on the system." (John Boardman & Brian Sauser, "Systems Thinking: Coping with 21st Century Problems", 2008)
"Generally, these programs fall within the techniques of reinforcement learning and the majority use an algorithm of temporal difference learning. In essence, this computer learning paradigm approximates the future state of the system as a function of the present state. To reach that future state, it uses a neural network that changes the weight of its parameters as it learns." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)
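The temporal-difference paradigm Rasskin-Gutman summarizes can be shown without a neural network: in tabular TD(0), the value of the present state is nudged toward the reward plus the estimated value of the next state. The 5-state random walk, step size, and episode count below are illustrative assumptions, a minimal sketch rather than the programs the quote refers to.

```python
# Minimal sketch of tabular TD(0) learning on a 5-state random walk:
# states 0..4, terminals just outside both ends, reward 1 for exiting right.
# The environment and hyperparameters are illustrative assumptions.
import random

random.seed(0)
N = 5
V = [0.5] * N               # value estimates for the non-terminal states
alpha, gamma = 0.1, 1.0     # step size and discount factor

for _ in range(5000):       # episodes
    s = N // 2              # start in the middle
    while 0 <= s < N:
        s2 = s + random.choice((-1, 1))
        reward = 1.0 if s2 == N else 0.0
        v_next = V[s2] if 0 <= s2 < N else 0.0
        # TD(0): approximate the value of the present state as a function
        # of the (estimated) value of the next state.
        V[s] += alpha * (reward + gamma * v_next - V[s])
        s = s2

print(V)  # approaches the true values 1/6, 2/6, 3/6, 4/6, 5/6
```

Replacing the table `V` with a parameterized function whose weights are adjusted by the same TD error is exactly the neural-network variant the quote mentions.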
"If universality is one of the observed characteristics of complex dynamical systems in many fields of study, a second characteristic that flows from the study of these systems is that of emergence. As self-organizing systems go about their daily business, they are constantly exchanging matter and energy with their environment, and this allows them to remain in a state that is far from equilibrium. That allows spontaneous behavior to give rise to new patterns." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)
"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)
"[…] the law of requisite complexity […] states that in order to fully regulate/control a system, the complexity of the controller has to be at least as great as the complexity of the system that’s being controlled. To put it in even simpler terms, only complexity can destroy complexity." (John L Casti, "X-Events: The Collapse of Everything", 2012)
"Principle of Equifinality: If a steady state is reached in an open system, it is independent of the initial conditions, and determined only by the system parameters, i.e. rates of reaction and transport." (Kevin Adams & Charles Keating, "Systems of systems engineering", 2012)
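Equifinality is easy to exhibit numerically: for an open system with constant inflow and first-order outflow, dX/dt = inflow − k·X, the steady state X* = inflow/k depends only on the system parameters, not on where the system starts. The rate constants and the Euler integration step below are illustrative assumptions.

```python
# Minimal sketch of equifinality: an open system dX/dt = inflow - k*X reaches
# the same steady state X* = inflow/k from any initial condition. The rates
# and the Euler step are illustrative assumptions.

def steady_state(x0, inflow=2.0, k=0.5, dt=0.01, steps=5000):
    """Euler-integrate dX/dt = inflow - k*X from initial value x0."""
    x = x0
    for _ in range(steps):
        x += dt * (inflow - k * x)
    return x

# Very different initial conditions, same final state (inflow/k = 4.0):
print(steady_state(0.0), steady_state(100.0))  # both ≈ 4.0
```

A closed system (inflow = 0) shows the opposite behavior in its conserved quantities: where it ends up depends on where it began, which is why equifinality is taken as a signature of open systems.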