05 December 2020

Systems Thinking: On Entropy (Quotes)

"If for the entire universe we conceive the same magnitude to be determined, consistently and with due regard to all circumstances, which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat: (1) The energy of the universe is constant. (2) The entropy of the universe tends to a maximum." (Rudolf Clausius, "The Mechanical Theory of Heat - With its Applications to the Steam Engine and to Physical Properties of Bodies", 1867)

"[…] the quantities of heat which must be imparted to, or withdrawn from a changeable body are not the same, when these changes occur in a non-reversible manner, as they are when the same changes occur reversibly. In the second place, with each non-reversible change is associated an uncompensated transformation […] I propose to call the magnitude S the entropy of the body […] I have intentionally formed the word entropy so as to be as similar as possible to the word energy […]" (Rudolf Clausius, "The Mechanical Theory of Heat", 1867)

"The second fundamental theorem [the second law of thermodynamics], in the form which I have given to it, asserts that all transformations occurring in nature may take place in a certain direction, which I have assumed as positive, by themselves, that is, without compensation […] the entire condition of the universe must always continue to change in that first direction, and the universe must consequently approach incessantly a limiting condition. […] For every body two magnitudes have thereby presented themselves - the transformation value of its thermal content [the amount of inputted energy that is converted to 'work'], and its disgregation [separation or disintegration]; the sum of which constitutes its entropy." (Rudolf Clausius, "The Mechanical Theory of Heat", 1867)

"Since a given system can never of its own accord go over into another equally probable state but into a more probable one, it is likewise impossible to construct a system of bodies that after traversing various states returns periodically to its original state, that is a perpetual motion machine." (Ludwig E Boltzmann, "The Second Law of Thermodynamics", [Address to a Formal meeting of the Imperial Academy of Science], 1886) 

"[…] only a part of the whole intrinsic energy of the system is capable of being converted into mechanical work by actions going on within the vessel, and without any communication with external space by the passage either of matter or of heat. This part is sometimes called the Available Energy of the system. Clausius has called the remainder of the energy, which cannot be converted into work, the Entropy of the system. We shall find it more convenient to adopt the suggestion of Professor Tait, and give the name of Entropy to the part which can be converted into mechanical work." (James C Maxwell, "Theory of Heat", 1899)

"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines. When the pressure and temperature of the system have become uniform the entropy is exhausted. The original energy of the system is equal to the sum of the entropy and the energy remaining in the state of uniform pressure and temperature. The entropy of a system consisting of several component systems is the same in whatever order the entropy of the parts is exhausted. It is therefore equal to the sum of the entropy of each component system, together with the entropy of the system consisting of the component systems, each with its own entropy exhausted." (James C Maxwell, "Theory of Heat", 1899)

"[…] the result of the conduction and radiation of heat from one part of a system to another is to diminish the entropy of the system, or the energy, available as work, which can be obtained from the system. The energy of the system, however, is indestructible, and as it has not been removed from the system, it must remain' in it. Hence the intrinsic energy of the system, when the entropy is exhausted by thermal communication, conduction, and radiation, is equal to its original energy, and is of course greater than in the case in which the entropy is exhausted by means of the reversible engine." (James C Maxwell, "Theory of Heat", 1899)

"Heretics are the only (bitter) remedy against the entropy of human thought." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"No revolution, no heresy is comfortable or easy. For it is a leap, it is a break in the smooth evolutionary curve, and a break is a wound, a pain. But the wound is necessary; most of mankind suffers from hereditary sleeping sickness, and victims of this sickness (entropy) must not be allowed to sleep, or it will be their final sleep, death."  (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"Revolution is everywhere, in everything. It is infinite. There is no final revolution, no final number. The social revolution is only one of an infinite number of numbers; the law of revolution is not a social law, but an immeasurably greater one. It is a cosmic, universal law - like the laws of the conservation of energy and of the dissipation of energy (entropy)." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"Let us draw an arrow arbitrarily. If as we follow the arrow[,] we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases[,] the arrow points towards the past. That is the only distinction known to physics. This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone. I shall use the phrase 'time's arrow' to express this one-way property of time which has no analogue in space." (Arthur S Eddington, "The Nature of the Physical World", 1928) 

"So far as physics is concerned, time's arrow is a property of entropy alone." (Arthur S Eddington, "The Nature of the Physical World", 1928)

"It was not easy for a person brought up in the ways of classical thermodynamics to come around to the idea that gain of entropy eventually is nothing more nor less than loss of information." (Gilbert N. Lewis, [Letter to Irving Langmuir] 1930)

"Thought interferes with the probability of events, and, in the long run therefore, with entropy." (David L Watson, 1930)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state, Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig on Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"[A living organism] feeds upon negative entropy […] Thus, the device by which an organism maintains itself stationary at a fairly high level of orderliness really consists in continually sucking orderliness from its environment." (Erwin Schrödinger, "What is Life? The Physical Aspect of the Living Cell", 1944)

"A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’. Practically, a state of this kind is usually reached very rapidly. Theoretically, it is very often not yet an absolute equilibrium, not yet the true maximum of entropy. But then the final approach to equilibrium is very slow. It could take anything between hours, years, centuries […]." (Erwin Schrödinger, "What is Life?", 1944)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)" (Erwin Schrödinger, "What is Life?", 1944)

"Every process, event, happening – call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"Hence the awkward expression ‘negative entropy’ can be replaced by a better one: entropy, taken with the negative sign, is itself a measure of order. Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness ( = fairly low level of entropy) really consists in continually sucking orderliness from its environment. (Erwin Schrödinger, "What is Life?", 1944)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, Heat and Sound", 1944)

"Time itself will come to an end. For entropy points the direction of time. Entropy is the measure of randomness. When all system and order in the universe have vanished, when randomness is at its maximum, and entropy cannot be increased, when there is no longer any sequence of cause and effect, in short when the universe has run down, there will be no direction to time - there will be no time." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"In classical physics, most of the fundamental laws of nature were concerned either with the stability of certain configurations of bodies, e.g. the solar system, or else with the conservation of certain properties of matter, e.g. mass, energy, angular momentum or spin. The outstanding exception was the famous Second Law of Thermodynamics, discovered by Clausius in 1850. This law, as usually stated, refers to an abstract concept called entropy, which for any enclosed or thermally isolated system tends to increase continually with lapse of time. In practice, the most familiar example of this law occurs when two bodies are in contact: in general, heat tends to flow from the hotter body to the cooler. Thus, while the First Law of Thermodynamics, viz. the conservation of energy, is concerned only with time as mere duration, the Second Law involves the idea of trend." (Gerald J Whitrow, "The Structure of the Universe: An Introduction to Cosmology", 1949)

"But in no case is there any question of time flowing backward, and in fact the concept of backward flow of time seems absolutely meaningless. […] If it were found that the entropy of the universe were decreasing, would one say that time was flowing backward, or would one say that it was a law of nature that entropy decreases with time?" (Percy W Bridgman, "Reflections of a Physicist", 1950)

"It is my thesis that the physical functioning of the living individual and the operation of some of the newer communication machines are precisely parallel in their analogous attempts to control entropy through feedback. Both of them have sensory receptors as one stage of their cycle of operation: that is, in both of them there exists a special apparatus for collecting information from the outer world at low energy levels, and for making it available in the operation of the individual or of the machine. In both cases these external messages are not taken neat, but through the internal transforming powers of the apparatus, whether it be alive or dead. The information is then turned into a new form available for the further stages of performance. In both the animal and the machine this performance is made to be effective on the outer world. In both of them, their performed action on the outer world, and not merely their intended action, is reported back to the central regulatory apparatus." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Progress imposes not only new possibilities for the future but new restrictions. It seems almost as if progress itself and our fight against the increase of entropy intrinsically must end in the downhill path from which we are trying to escape." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, heat and sound", 1950)

 "The powerful notion of entropy, which comes from a very special branch of physics […] is certainly useful in the study of communication and quite helpful when applied in the theory of language." (J Robert Oppenheimer, "The Growth of Science and the Structure of Culture", Daedalus 87 (1), 1958) 

"But in addition to what we decide to do by way of transformation, there are certain tendencies in the way systems behave of their own volition when left to their own devices. The convenient analogy for one of these processes is found in the second law of thermodynamics: an 'ordering' process goes on, for which the name is entropy. This can be explained without technicalities as the tendency of a system to settle down to a uniform distribution of its energy. The dissipation of local pockets of high energy is measured by an increase in entropy, until at maximum entropy all is uniform. According to this model, order is more 'natural' than chaos. This is the reason why it is convenient to discuss cybernetic systems, with their self-regulating tendency to attain stability or orderliness, in terms of entropy - a term which has been taken over to name a key tool of cybernetics." (Stafford Beer, "Cybernetics and Management", 1959)

"Science is usually understood to depict a universe of strict order and lawfulness, of rigorous economy - one whose currency is energy, convertible against a service charge into a growing common pool called entropy." (Paul A Weiss,"Organic Form: Scientific and Aesthetic Aspects", 1960)

"The basic objection to attempts to deduce the unidirectional nature of time from concepts such as entropy is that they are attempts to reduce a more fundamental concept to a less fundamental one." (Gerald J Whitrow, "The Natural Philosophy of Time", 1961)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure 'disorder' by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the 'disorder' is less." (Richard P Feynman, "Order And Entropy" ["The Feynman Lectures on Physics"], 1964)

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it." (Philip K Dick, "Galactic Pot-Healer", 1969)

"There is a kind of second law of cultural dynamics which states simply that when anything has been done, it cannot be done again. In other words, we start off any system with a potential for novelty which is gradually exhausted. We see this in every field of human life, in the arts as well as the sciences. Once Beethoven has written the Ninth Symphony, nobody else can do it. Consequently, we find that in any evolutionary process, even in the arts, the search for novelty becomes corrupting. The 'entropy trap' is perhaps the most subtle and the most fundamental of the obstacles toward realising the developed society." (Kenneth Boulding, "The Science Revelation", Bulletin of the Atomic Scientists Vol. 26 (7), 1970)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971]

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"Life, this anti-entropy, ceaselessly reloaded with energy, is a climbing force, toward order amidst chaos, toward light, among the darkness of the indefinite, toward the mystic dream of Love, between the fire which devours itself and the silence of the Cold. Such a Nature does not accept abdication, nor skepticism." (Albert Claude, [Nobel lecture for award received] 1974)

"Entropy theory is indeed a first attempt to deal with global form; but it has not been dealing with structure. All it says is that a large sum of elements may have properties not found in a smaller sample of them." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"Entropy theory, on the other hand, is not concerned with the probability of succession in a series of items but with the overall distribution of kinds of items in a given arrangement." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"If entropy must constantly and continuously increase, then the universe is remorselessly running down, thus setting a limit (a long one, to be sure) on the existence of humanity. To some human beings, this ultimate end poses itself almost as a threat to their personal immortality, or as a denial of the omnipotence of God. There is, therefore, a strong emotional urge to deny that entropy must increase." (Isaac Asimov, "Asimov on Physics", 1976)

"The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979) 

"The interaction between parts of mind is triggered by difference, and difference is a nonsubstantial phenomenon not located in space or time; difference is related to negentropy and entropy rather than energy." (Gregory Bateson, "Mind and Nature: A Necessary Unity", 1979)

"Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"Time goes forward because energy itself is always moving from an available to an unavailable state. Our consciousness is continually recording the entropy change in the world around us. [...] we experience the passage of time by the succession of one event after another. And every time an event occurs anywhere in this world energy is expended and the overall entropy is increased. To say the world is running out of time then, to say the world is running out of usable energy. In the words of Sir Arthur Eddington, 'Entropy is time's arrow'." (Jeremy Rifkin, "Entropy", 1980)

"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"In microscopic systems, consisting of only a few molecules, the second law is violated regularly, but in macroscopic systems, which consist of vast numbers of molecules, the probability that the total entropy of the system will increase becomes virtual certainty. Thus in any isolated system, made up of a large number of molecules, the entropy - or disorder -will keep increasing until, eventually, the system reaches a state of maximum entropy, also known as 'heat death'; in this state all activity has ceased, all material being evenly distributed and at the same temperature. According to classical physics, the universe as a whole is going toward such a state of maximum entropy; it is running down and will eventually grind to a halt." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The phenomenon of self-organization is not limited to living matter but occurs also in certain chemical systems […] [Ilya] Prigogine has called these systems 'dissipative structures' to express the fact that they maintain and develop structure by breaking down other structures in the process of metabolism, thus creating entropy­ disorder - which is subsequently dissipated in the form of degraded waste products. Dissipative chemical structures display the dynamics of self-organization in its simplest form, exhibiting most of the phenomena characteristic of life self-renewal, adaptation, evolution, and even primitive forms of 'mental' processes." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The third model regards mind as an information processing system. This is the model of mind subscribed to by cognitive psychologists and also to some extent by the ego psychologists. Since an acquisition of information entails maximization of negative entropy and complexity, this model of mind assumes mind to be an open system." (Thaddus E Weckowicz, "Models of Mental Illness", 1984) 

"Disorder increases with time because we measure time in the direction in which disorder increases." (Stephen W Hawking, "The Direction of Time", New Scientist 115 (1568), 1987)

"Somehow, after all, as the universe ebbs toward its final equilibrium in the featureless heat bath of maximum entropy, it manages to create interesting structures." (James Gleick, "Chaos: Making a New Science", 1987)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases."  (Stephen Hawking, "A Brief History of Time", 1988)

"The increase of disorder or entropy with time is one example of what is called an arrow of time something that gives a direction to time and distinguishes the past from the future. There are at least three different directions of time. First, there is the thermodynamic arrow of time - the direction of time in which disorder or entropy increases. Second, there is the psychological arrow of time. This is the direction in which we feel time passes - the direction of time in which we remember the past, but not the future. Third, there is the cosmological arrow of time. This is the direction of time in which the universe is expanding rather than contracting." (Stephen W. Hawking, "The Direction of Time", New Scientist 46, 1987)

"Complexity is not an objective factor but a subjective one. Supersignals reduce complexity, collapsing a number of features into one. Consequently, complexity must be understood in terms of a specific individual and his or her supply of supersignals. We learn supersignals from experience, and our supply can differ greatly from another individual's. Therefore there can be no objective measure of complexity." (Dietrich Dorner, "The Logic of Failure: Recognizing and Avoiding Error in Complex Situations", 1989)

"The view of science is that all processes ultimately run down, but entropy is maximized only in some far, far away future. The idea of entropy makes an assumption that the laws of the space-time continuum are infinitely and linearly extendable into the future. In the spiral time scheme of the timewave this assumption is not made. Rather, final time means passing out of one set of laws that are conditioning existence and into another radically different set of laws. The universe is seen as a series of compartmentalized eras or epochs whose laws are quite different from one another, with transitions from one epoch to another occurring with unexpected suddenness." (Terence McKenna, "True Hallucinations", 1989)

"The inflationary period of expansion does not smooth out irregularity by entropy-producing processes like those explored by the cosmologies of the seventies. Rather it sweeps the irregularity out beyond the Horizon of our visible Universe, where we cannot see it . The entire universe of stars and galaxies on view to us. […] on this hypothesis, is but the reflection of a minute, perhaps infinitesimal, portion of the universe's initial conditions, whose ultimate extent and structure must remain forever unknowable to us. A theory of everything does not help here. The information contained in the observable part of the universe derives from the evolution of a tiny part of the initial conditions for the entire universe. The sum total of all the observations we could possibly make can only tell us about a minuscule portion of the whole." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"The Law of Entropy Nonconservation required that life be lived forward, from birth to death. […] To wish for the reverse was to wish for the entropy of the universe to diminish with time, which was impossible. One might as well wish for autumn leaves to assemble themselves in neat stacks just as soon as they had fallen from trees or for water to freeze whenever it was heated." (Michael Guillen, "Five Equations That Changed the World", 1995)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Contrary to what happens at equilibrium, or near equilibrium, systems far from equilibrium do not conform to any minimum principle that is valid for functions of free energy or entropy production." (Ilya Prigogine, "The End of Certainty: Time, Chaos, and the New Laws of Nature", 1996) 

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"No one has yet succeeded in deriving the second law from any other law of nature. It stands on its own feet. It is the only law in our everyday world that gives a direction to time, which tells us that the universe is moving toward equilibrium and which gives us a criteria for that state, namely, the point of maximum entropy, of maximum probability. The second law involves no new forces. On the contrary, it says nothing about forces whatsoever." (Brian L Silver, "The Ascent of Science", 1998)

"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge-nature's unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system's behavior. They are the patent signatures of self-organization in complex systems." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"The best case that can be made for human sync to the environment (outside of circadian entrainment) has to do with the possibility that electrical rhythms in our brains can be influenced by external signals." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The principle of maximum entropy is employed for estimating unknown probabilities (which cannot be derived deductively) on the basis of the available information. According to this principle, the estimated probability distribution should be such that its entropy reaches maximum within the constraints of the situation, i.e., constraints that represent the available information. This principle thus guarantees that no more information is used in estimating the probabilities than available." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003) 

"The principle of minimum entropy is employed in the formulation of resolution forms and related problems. According to this principle, the entropy of the estimated probability distribution, conditioned by a particular classification of the given events (e.g., states of the variable involved), is minimum subject to the constraints of the situation. This principle thus guarantees that all available information is used, as much as possible within the given constraints (e.g., required number of states), in the estimation of the unknown probabilities." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman Dyson, [Page-Barbour lecture], 2004)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non- quilibrium Thermodynamics and the Production of Entropy"], 2005)

"However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order."  (Ray Kurzweil, "The Singularity is Near", 2005)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman J Dyson, "A Many-Colored Glass: Reflections on the Place of Life in the Universe", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Entropy is the crisp scientific name for waste, chaos, and disorder. As far as we know, the sole law of physics with no known exceptions anywhere in the universe is this: All creation is headed to the basement. Everything in the universe is steadily sliding down the slope toward the supreme equality of wasted heat and maximum entropy." (Kevin Kelly, "What Technology Wants", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010) 

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"The psychic entropy peculiar to the human condition involves seeing more to do than one can actually accomplish and feeling able to accomplish more than what conditions allow."(Mihaly Csikszentmihalyi, "Flow: The Psychology of Happiness", 2013)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A. Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

"The passage of time and the action of entropy bring about ever-greater complexity - a branching, blossoming tree of possibilities. Blossoming disorder (things getting worse), now unfolding within the constraints of the physics of our universe, creates novel opportunities for spontaneous ordered complexity to arise." (D J MacLennan, "Frozen to Life", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The natural effect of processes going on in the Universe is to move from a state of order to a state of disorder, unless there is an input of energy from outside." (John R Gribbin, "The Time Illusion", 2016) 

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n i items, then the entropy of such a partition is H = p 1 log( p 1 ) + … + p k log( p k ), where p i = n i / n . In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018) [where i is used as index]

"Our greatest enemies are ultimately not our political adversaries but entropy, evolution (in the form of pestilence and the flaws in human nature), and most of all ignorance - a shortfall of knowledge of how best to solve our problems." (Steven Pinker, "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress", 2018)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

"Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy." (Václav Havel, [Letter to Gustáv Husák]) 
