"True equilibria can occur only in closed systems; in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’, are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state. Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig von Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)
"By some definitions 'systems engineering' is suggested to be a new discovery. Actually it is a common engineering approach which has taken on a new and important meaning because of the greater complexity and scope of problems to be solved in industry, business, and the military. Newly discovered scientific phenomena, new machines and equipment, greater speed of communications, increased production capacity, the demand for control over ever-extending areas under constantly changing conditions, and the resultant complex interactions, all have created a tremendously accelerating need for improved systems engineering. Systems engineering can be complex, but is simply defined as 'logical engineering within physical, economic and technical limits' - bridging the gap from fundamental laws to a practical operating system." (Instrumentation Technology, 1957)
"Clearly, if the state of the system is coupled to parameters of an environment and the state of the environment is made to modify parameters of the system, a learning process will occur. Such an arrangement will be called a Finite Learning Machine, since it has a definite capacity. It is, of course, an active learning mechanism which trades with its surroundings. Indeed it is the limit case of a self-organizing system which will appear in the network if the currency supply is generalized." (Gordon Pask, "The Natural History of Networks", 1960)
"According to the science of cybernetics, which deals with the topic of control in every kind of system (mechanical, electronic, biological, human, economic, and so on), there is a natural law that governs the capacity of a control system to work. It says that the control must be capable of generating as much 'variety' as the situation to be controlled." (Anthony Stafford Beer, "Management Science", 1968)
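The law Beer invokes here is Ashby's law of requisite variety: a regulator can hold its outcome steady only if it can generate at least one distinct response per distinct disturbance. A minimal sketch, using purely illustrative disturbance and response names:

```python
# Toy model of requisite variety: the regulator's response table must
# match the variety (number of distinct states) of the disturbances.

disturbances = ["heat", "cold", "wind"]  # variety of the situation: 3 states
responses = {"heat": "cool", "cold": "warm", "wind": "shield"}

def regulate(d, table):
    # A disturbance with no matching response passes through to the outcome.
    return "stable" if d in table else f"upset by {d}"

# Full variety in the regulator: every disturbance is absorbed.
assert all(regulate(d, responses) == "stable" for d in disturbances)

# Remove one response, so variety(regulator) < variety(disturbances):
# at least one disturbance must now reach the outcome.
reduced = {k: v for k, v in responses.items() if k != "wind"}
print([regulate(d, reduced) for d in disturbances])
```

With the reduced table the third disturbance is no longer controlled, which is exactly the shortfall the law predicts.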
"Learning is any change in a system that produces a more or less permanent change in its capacity for adapting to its environment. Understanding systems, especially systems capable of understanding problems in new task domains, are learning systems." (Herbert A Simon, "The Sciences of the Artificial", 1968)
"The notion that the 'balance of nature' is delicately poised and easily upset is nonsense. Nature is extraordinarily tough and resilient, interlaced with checks and balances, with an astonishing capacity for recovering from disturbances in equilibrium. The formula for survival is not power; it is symbiosis." (Sir Eric Ashby, [Encounter] 1976)
"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (Jay R Galbraith, "Organization Design", 1977)
"Real learning gets to the heart of what it means to be human. Through learning we re-create ourselves. Through learning we become able to do something we never were able to do. Through learning we reperceive the world and our relationship to it. Through learning we extend our capacity to create, to be part of the generative process of life." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)
"The organizations that will truly excel in the future will be the organizations that discover how to tap people's commitment and capacity to learn at all levels in an organization." (Peter M Senge, "The Fifth Discipline: The Art and Practice of the Learning Organization", 1990)
"Neural networks conserve the complexity of the systems they model because they have complex structures themselves. Neural networks encode information about their environment in a distributed form. […] Neural networks have the capacity to self-organise their internal structure." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)
"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)
"The very essence of mass communication theory is a simple but all-embracing expression of technological determinism, since the essential features depend on what certain technologies have made possible, especially the following: communication at a distance, the multiplication and simultaneous distribution of diverse ‘messages’, the enormous capacity and speed of carriers, and the limitations on response. There is no escaping the implication that public communication as practised in modern societies is profoundly shaped by these general features." (Denis McQuail, "McQuail's Reader in Mass Communication Theory", 2002)
"In loosely coupled systems, by contrast, there is plenty of slack in terms of time, resources and organizational capacity. They are much less likely to produce normal accidents, since incidents can be coped with, so avoiding the interactive complexity found within the tightly coupled system. In the latter, moreover, the effects are non-linear. Up to a point, tightening the connections between elements in the system will increase efficiency when everything works smoothly. But, if one small item goes wrong, then that can have a catastrophic knock-on effect throughout the system. The system literally switches over from smooth functioning to interactively complex disaster. And sometimes this results from a supposed improvement in the system." (John Urry, "Global Complexity", 2003)
"Mutual information is the receiver's entropy minus the conditional entropy of what the receiver receives - given what message the sender sends through the noisy channel. Conditioning or getting data can only reduce uncertainty and so this gap is always positive or zero. It can never be negative. You can only learn from further experience. Information theorists capture this theorem in a slogan: Conditioning reduces entropy. The channel capacity itself is the largest gap given all possible probability descriptions of what [the sender] sent. It is the most information that on average you could ever get out of the noisy channel." (Bart Kosko, "Noise", 2006)
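Kosko's "gap" can be computed directly for a simple noisy channel. The sketch below assumes a binary symmetric channel with an illustrative flip probability, and evaluates H(Y) − H(Y|X) to show the gap is a nonnegative number of bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Binary symmetric channel: each bit is flipped with probability eps.
eps = 0.1        # crossover (noise) probability -- illustrative value
px = [0.5, 0.5]  # sender's input distribution

# Output distribution P(Y) after the noise acts on the input.
py = [px[0] * (1 - eps) + px[1] * eps,
      px[0] * eps + px[1] * (1 - eps)]

# Receiver's entropy H(Y) minus conditional entropy H(Y|X):
# conditioning reduces entropy, so this difference is never negative.
h_y = entropy(py)
h_y_given_x = sum(p * entropy([1 - eps, eps]) for p in px)
mutual_info = h_y - h_y_given_x

print(round(mutual_info, 4))  # about 0.531 bits for these values
```

Here H(Y|X) is the same for either input symbol because the flip probability does not depend on what was sent.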
"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and helps determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)
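The channel capacity Grami refers to is the maximum of the mutual information over all input distributions. A minimal numerical sketch for a binary symmetric channel (flip probability chosen for illustration) sweeps the input distribution on a grid and recovers the closed form C = 1 − H₂(ε):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info_bsc(p1, eps):
    """I(X;Y) for a binary symmetric channel with P(X=1)=p1, flip prob eps."""
    q = p1 * (1 - eps) + (1 - p1) * eps  # P(Y=1)
    return h2(q) - h2(eps)               # H(Y) - H(Y|X)

eps = 0.1  # illustrative noise level
# Capacity = max over all input distributions; a grid sweep suffices here
# because I(X;Y) is concave in p1 and peaks at the uniform input.
capacity = max(mutual_info_bsc(p1 / 1000, eps) for p1 in range(1001))

print(round(capacity, 4), round(1 - h2(eps), 4))  # the two values agree
```

The sweep peaks at p1 = 0.5, matching the closed-form result that a uniform input achieves the capacity of a binary symmetric channel.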