"All purposeful behavior may be considered to require negative feed-back. If a goal is to be attained, some signals from the goal are necessary at some time to direct the behavior. By non-feed-back behavior is meant that in which there are no signals from the goal which modify the activity of the object in the course of the behavior. Thus, a machine may be set to impinge upon a luminous object although the machine may be insensitive to light." (Arturo Rosenblueth, Norbert Wiener & Julian Bigelow, "Behavior, Purpose and Technology", Philosophy of Science Vol. 10 (1), 1943)
"Feedback is a method of controlling a system by reinserting into it the results of its past performance. If these results are merely used as numerical data for the criticism of the system and its regulation, we have the simple feedback of the control engineers. If, however, the information which proceeds backward from the performance is able to change the general method and pattern of performance, we have a process which may be called learning." (Norbert Wiener, 1954)
"[...] the concept of 'feedback', so simple and natural in certain elementary cases, becomes artificial and of little use when the interconnexions between the parts become more complex. When there are only two parts joined so that each affects the other, the properties of the feedback give important and useful information about the properties of the whole. But when the parts rise to even as few as four, if every one affects the other three, then twenty circuits can be traced through them; and knowing the properties of all the twenty circuits does not give complete information about the system. Such complex systems cannot be treated as an interlaced set of more or less independent feedback circuits, but only as a whole. For understanding the general principles of dynamic systems, therefore, the concept of feedback is inadequate in itself. What is important is that complex systems, richly cross-connected internally, have complex behaviours, and that these behaviours can be goal-seeking in complex patterns." (W Ross Ashby, "An Introduction to Cybernetics", 1956)
"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […] 'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)
"Traditional organizational theories have tended to view the human organization as a closed system. This tendency has led to a disregard of differing organizational environments and the nature of organizational dependency on environment. It has led also to an over-concentration on principles of internal organizational functioning, with consequent failure to develop and understand the processes of feedback which are essential to survival." (Daniel Katz, "The Social Psychology of Organizations", 1966)
"The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by non‐linear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive‐feedback loops describing growth processes as well as negative, goal‐seeking loops." (Jay F Forrester, "Urban Dynamics", 1969)
"To model the dynamic behavior of a system, four hierarchies of structure should be recognized: closed boundary around the system; feedback loops as the basic structural elements within the boundary; level variables representing accumulations within the feedback loops; rate variables representing activity within the feedback loops." (Jay W Forrester, "Urban Dynamics", 1969)
"A nonlinear relationship causes the feedback loop of which it is a part to vary in strength, depending on the state of the system. Linked nonlinear feedback loops thus form patterns of shifting loop dominance- under some conditions one part of the system is very active, and under other conditions another set of relationships takes control and shifts the entire system behavior. A model composed of several feedback loops linked nonlinearly can produce a wide variety of complex behavior patterns." (Jørgen Randers, "Elements of the System Dynamics Method", 1980)
"The autonomy of living systems is characterized by closed, recursive organization. [...] A system's highest order of recursion or feedback process defines, generates, and maintains the autonomy of a system. The range of deviation this feedback seeks to control concerns the organization of the whole system itself. If the system should move beyond the limits of its own range of organization it would cease to be a system. Thus, autonomy refers to the maintenance of a systems wholeness. In biology, it becomes a definition of what maintains the variable called living." (Bradford P Keeney, "Aesthetics of Change", 1983)
"Ultimately, uncontrolled escalation destroys a system. However, change in the direction of learning, adaptation, and evolution arises from the control of control, rather than unchecked change per se. In general, for the survival and co-evolution of any ecology of systems, feedback processes must be embodied by a recursive hierarchy of control circuits." (Bradford P Keeney, "Aesthetics of Change", 1983)
"What is sometimes called 'positive feedback' or 'amplified deviation' is therefore a partial arc or sequence of a more encompassing negative feedback process. The appearance of escalating runaways in systems is a consequence of the frame of reference an observer has punctuated. Enlarging one's frame of reference enables the 'runaway' to be seen as a variation subject to higher orders of control." (Bradford P Keeney, "Aesthetics of Change", 1983)
"Every system of whatever size must maintain its own structure and must deal with a dynamic environment, i.e., the system must strike a proper balance between stability and change. The cybernetic mechanisms for stability (i.e., homeostasis, negative feedback, autopoiesis, equifinality) and change (i.e., positive feedback, algedonodes, self-organization) are found in all viable systems." (Barry Clemson, "Cybernetics: A New Management Tool", 1984)
"The term closed loop-learning process refers to the idea that one learns by determining what s desired and comparing what is actually taking place as measured at the process and feedback for comparison. The difference between what is desired and what is taking place provides an error indication which is used to develop a signal to the process being controlled." (Harold Chestnut, 1984)
"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)
"In many parts of the economy, stabilizing forces appear not to operate. Instead, positive feedback magnifies the effects of small economic shifts; the economic models that describe such effects differ vastly from the conventional ones. Diminishing returns imply a single equilibrium point for the economy, but positive feedback – increasing returns – makes for many possible equilibrium points. There is no guarantee that the particular economic outcome selected from among the many alternatives will be the ‘best’ one." (W Brian Arthur, "Returns and Path Dependence in the Economy", 1994)
"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The web of life: a new scientific understanding of living systems" , 1996)
"[…] feedback is not necessarily transmitted and returned through the same system component - or even through the same system. It may travel through several intervening components within the system first, or return from an external system, before finally arriving again at the component where it started." (Virginia Anderson & Lauren Johnson, "Systems Thinking Basics: From Concepts to Causal Loops", 1997)
"Feedback is the transmission and return of information. […] A system has feedback within itself. But because all systems are part of larger systems, a system also has feedback between itself and external systems. In some systems, the feedback and adjustment processes happen so quickly that it is relatively easy for an observer to follow. In other systems, it may take a long time before the feedback is returned, so an observer would have trouble identifying the action that prompted the feedback." (Virginia Anderson & Lauren Johnson, "Systems Thinking Basics: From Concepts to Causal Loops", 1997)
"In a complex system, it is not uncommon for subsystems to have goals that compete directly with or diverge from the goals of the overall system. […] Feedback gathered from small, local subsystems for use by larger subsystems may be either inaccurately conveyed or inaccurately interpreted. Yet it is this very flexibility and looseness that allow large, complex systems to endure, although it can be hard to predict what these organizations are likely to do next." (Virginia Anderson & Lauren Johnson, "Systems Thinking Basics: From Concepts to Causal Loops", 1997)
"Reinforcing loops can be seen as the engines of growth and collapse. That is, they compound change in one direction with even more change in that direction. Many reinforcing loops have a quality of accelerating movement in a particular direction, a sense that the more one variable changes, the more another changes." (Virginia Anderson & Lauren Johnson, "Systems Thinking Basics: From Concepts to Causal Loops", 1997)
"Something of the previous state, however, survives every change. This is called in the language of cybernetics (which took it form the language of machines) feedback, the advantages of learning from experience and of having developed reflexes." (Guy Davenport, "The Geography of the Imagination: Forty Essays", 1997)"Cybernetics is the science of effective organization, of control and communication in animals and machines. It is the art of steersmanship, of regulation and stability. The concern here is with function, not construction, in providing regular and reproducible behaviour in the presence of disturbances. Here the emphasis is on families of solutions, ways of arranging matters that can apply to all forms of systems, whatever the material or design employed. [...] This science concerns the effects of inputs on outputs, but in the sense that the output state is desired to be constant or predictable – we wish the system to maintain an equilibrium state. It is applicable mostly to complex systems and to coupled systems, and uses the concepts of feedback and transformations (mappings from input to output) to effect the desired invariance or stability in the result." (Chris Lucas, "Cybernetics and Stochastic Systems", 1999)
"All dynamics arise from the interaction of just two types of feedback loops, positive (or self-reinforcing) and negative (or self-correcting) loops. Positive loops tend to reinforce or amplify whatever is happening in the system […] Negative loops counteract and oppose change." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)
"The self-reinforcing feedback between expectations and perceptions has been repeatedly demonstrated […]. Sometimes the positive feedback assists learning by sharpening our ability to perceive features of the environment, as when an experienced naturalist identifies a bird in a distant bush where the novice sees only a tangled thicket. Often, however, the mutual feedback of expectations and perception blinds us to the anomalies that might challenge our mental models and lead to deep insight." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)
"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)
"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)
"All models are mental projections of our understanding of processes and feedbacks of systems in the real world. The general approach is that models are as good as the system upon which they are based. Models should be designed to answer specific questions and only incorporate the necessary details that are required to provide an answer." (Hördur V Haraldsson & Harald U Sverdrup, "Finding Simplicity in Complexity in Biogeochemical Modelling", 2004)
"[…] some systems […] are very sensitive to their starting conditions, so that a tiny difference in the initial ‘push’ you give them causes a big difference in where they end up, and there is feedback, so that what a system does affects its own behavior." (John Gribbin, "Deep Simplicity", 2004)
"Feedback and its big brother, control theory, are such important concepts that it is odd that they usually find no formal place in the education of physicists. On the practical side, experimentalists often need to use feedback. Almost any experiment is subject to the vagaries of environmental perturbations. Usually, one wants to vary a parameter of interest while holding all others constant. How to do this properly is the subject of control theory. More fundamentally, feedback is one of the great ideas developed (mostly) in the last century, with particularly deep consequences for biological systems, and all physicists should have some understanding of such a basic concept." (John Bechhoefer, "Feedback for physicists: A tutorial essay on control". Reviews of Modern Physics Vol. 77, 2005)
"[…] our mental models fail to take into account the complications of the real world - at least those ways that one can see from a systems perspective. It is a warning list. Here is where hidden snags lie. You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)
"The work around the complex systems map supported a concentration on causal mechanisms. This enabled poor system responses to be diagnosed as the unanticipated effects of previous policies as well as identification of the drivers of the sector. Understanding the feedback mechanisms in play then allowed experimentation with possible future policies and the creation of a coherent and mutually supporting package of recommendations for change." (David C Lane et al, "Blending systems thinking approaches for organisational analysis: reviewing child protection", 2015)