01 November 2023

❄️Systems Thinking: On Living Organisms (Just the Quotes)

"Constantly regard the universe as one living being, having one substance and one soul; and observe how all things have reference to one perception, the perception of this one living being; and how all things act with one movement; and how all things are the cooperating causes of all things which exist; observe too the continuous spinning of the thread and the contexture of the web." (Marcus Aurelius, "Meditations", cca. 121–180 AD)

"Living things have no inertia, and tend to no equilibrium." (Thomas H Huxley, "On the Educational Value of the Natural History Sciences", 1854)

"A vital phenomenon can only be regarded as explained if it has been proven that it appears as the result of the material components of living organisms interacting according to the laws which those same components follow in their interactions outside of living systems." (Adolf E Fick, "Gesammelte Schriften" Vol. 3, 1904)

"Since the fundamental character of the living thing is its organization, the customary investigation of the single parts and processes cannot provide a complete explanation of the vital phenomena. This investigation gives us no information about the coordination of parts and processes. Thus, the chief task of biology must be to discover the laws of biological systems (at all levels of organization). We believe that the attempts to find a foundation for theoretical biology point at a fundamental change in the world picture. This view, considered as a method of investigation, we shall call ‘organismic biology’ and, as an attempt at an explanation, ‘the system theory of the organism’" (Ludwig von Bertalanffy, "Kritische Theorie der Formbildung", 1928)

"[…] to the scientific mind the living and the non-living form one continuous series of systems of differing degrees of complexity […], while to the philosophic mind the whole universe, itself perhaps an organism, is composed of a vast number of interlacing organisms of all sizes." (James G Needham, "Developments in Philosophy of Biology", Quarterly Review of Biology Vol. 3 (1), 1928)

"[A living organism] feeds upon negative entropy […] Thus, the device by which an organism maintains itself stationary at a fairly high level of orderliness really consists in continually sucking orderliness from its environment." (Erwin Schrodinger, "What is Life? The Physical Aspect of the Living Cell", 1944)

"Cybernetics is a word invented to define a new field in science. It combines under one heading the study of what in a human context is sometimes loosely described as thinking and in engineering is known as control and communication. In other words, cybernetics attempts to find the common elements in the functioning of automatic machines and of the human nervous system, and to develop a theory which will cover the entire field of control and communication in machines and in living organisms." (Norbert Wiener, "Cybernetics", 1948) 

"General systems theory is a series of related definitions, assumptions, and postulates about all levels of systems from atomic particles through atoms, molecules, crystals, viruses, cells, organs, individuals, small groups, societies, planets, solar systems, and galaxies. General behavior systems theory is a subcategory of such theory, dealing with living systems, extending roughly from viruses through societies. A significant fact about living things is that they are open systems, with important inputs and outputs. Laws which apply to them differ from those applying to relatively closed systems." (James G Miller, "General behavior systems theory and summary", Journal of Counseling Psychology 3 (2), 1956)

"Many of the activities of living organisms permit this double aspect. On the one hand the observer can notice the great deal of actual movement and change that occurs, and on the other hand he can observe that throughout these activities, so far as they are coordinated or homeostatic, there are invariants and constancies that show the degree of regulation that is being achieved." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Learning is a property of all living organisms. […] Since organized groups can be looked upon as living entities, they can be expected to exhibit learning […]" (Winfred B. Hirschmann, "Profit from the Learning Curve", Harvard Business Review, 1964)

"[...] in a state of dynamic equilibrium with their environments. If they do not maintain this equilibrium they die; if they do maintain it they show a degree of spontaneity, variability, and purposiveness of response unknown in the non-living world. This is what is meant by ‘adaptation to environment’ […] [Its] essential feature […] is stability - that is, the ability to withstand disturbances." (Kenneth Craik, 'Living organisms', "The Nature of Psychology", 1966)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"Today our main problem is that of organized complexity. Concepts like those of organization, wholeness, directiveness, teleology, control, self-regulation, differentiation and the like are alien to conventional physics. However, they pop up everywhere in the biological, behavioural and social sciences, and are, in fact, indispensable for dealing with living organisms or social groups. Thus, a basic problem posed to modern science is a general theory of organization." (Ludwig von Bertalanff, "General System Theory, 1968)

"The fundamental problem today is that of organized complexity. Concepts like those of organization, wholeness, directiveness, teleology, and differentiation are alien to conventional physics. However, they pop up everywhere in the biological, behavioral and social sciences, and are, in fact, indispensable for dealing with living organisms or social groups. Thus a basic problem posed to modern science is a general theory of organization. General system theory is, in principle, capable of giving exact definitions for such concepts and, in suitable cases, of putting them to quantitative analysis." (Ludwig von Bertalanffy, "General System Theory", 1968)

"A living system, due to its circular organization, is an inductive system and functions always in a predictive manner: what happened once will occur again. Its organization, (genetic and otherwise) is conservative and repeats only that which works. For this same reason living systems are historical systems; the relevance of a given conduct or mode of behavior is always determined in the past." (Humberto Maturana, "Biology of Cognition", 1970)

"The words 'general systems theory' imply that some things can usefully be said about (living) systems in general, despite the immense diversity of their specific forms. One of these things should be a scheme of classification. Every science begins by classifying its subject matter, if only descriptively, and learns a lot about it in the process; and systems especially need this attention, because an adequate classification cuts across familiar boundaries and at the same time draws valid and important distinctions which have previously been sensed but not defined." (Geoffrey Vickers, 1970)

"Any living thing possesses an enormous amount of 'intelligence' [...] Today, this 'intelligence' is called 'information', but it is still the same thing. [...] This 'intelligence' is the sine qua non of life. If absent, no living being is imaginable. Where does it come from? This is a problem which concerns both biologists and philosophers, and, at present, science seems incapable of solving it." (Pierre P Grassé, "Evolution of Living Organisms: Evidence for a New Theory of Transformation", 1977)

"The most general form of systems theory is a set of logical or mathematical statements about all conceptual systems. A subset of this concerns all concrete systems. A subsubset concerns the very special and very important living systems, i. e., general living systems theory." (James G Miller, "Living systems", 1978)

"Systems theory looks at the world in terms of the interrelatedness and interdependence of all phenomena, and in this framework an integrated whole whose properties cannot be reduced to those of its parts is called a system. Living organisms, societies, and ecosystems are all systems." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The autonomy of living systems is characterized by closed, recursive organization. [...] A system's highest order of recursion or feedback process defines, generates, and maintains the autonomy of a system. The range of deviation this feedback seeks to control concerns the organization of the whole system itself. If the system should move beyond the limits of its own range of organization it would cease to be a system. Thus, autonomy refers to the maintenance of a systems wholeness. In biology, it becomes a definition of what maintains the variable called living." (Bradford P Keeney, "Aesthetics of Change", 1983)

"The dynamics of any system can be explained by showing the relations between its parts and the regularities of their interactions so as to reveal its organization. For us to fully understand it, however, we need not only to see it as a unity operating in its internal dynamics, but also to see it in its circumstances, i.e., in the context to which its operation connects it. This understanding requires that we adopt a certain distance for observation, a perspective that in the case of historical systems implies a reference to their origin. This can be easy, for instance, in the case of man-made machines, for we have access to every detail of their manufacture. The situation is not that easy, however, as regards living beings: their genesis and their history are never directly visible and can be reconstructed only by fragments."  (Humberto Maturana, "The Tree of Knowledge", 1987)

"Systems thinking is a discipline for seeing wholes. It is a framework for seeing interrelationships rather than things, for seeing patterns of change rather than static 'snapshots'. It is a set of general principles- distilled over the course of the twentieth century, spanning fields as diverse as the physical and social sciences, engineering, and management. [...] During the last thirty years, these tools have been applied to understand a wide range of corporate, urban, regional, economic, political, ecological, and even psychological systems. And systems thinking is a sensibility for the subtle interconnectedness that gives living systems their unique character." (Peter Senge, "The Fifth Discipline", 1990)

"We need to abandon the economist's notion of the economy as a machine, with its attendant concept of equilibrium. A more helpful way of thinking about the economy is to imagine it as a living organism." (Paul Ormerod, "The Death of Economics", 1994)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"All living organisms must feed on continual flows of matter and energy: from their environment to stay alive, and all living organisms continually produce waste. However, an ecosystem generates no net waste, one species' waste being another species' food. Thus, matter cycles continually through the web of life." (Fritjof Capra, "The Hidden Connections", 2002)

"The science of cybernetics is not about thermostats or machines; that characterization is a caricature. Cybernetics is about purposiveness, goals, information flows, decision-making control processes and feedback (properly defined) at all levels of living systems." (Peter Corning, "Synergy, Cybernetics, and the Evolution of Politics", 2005) 

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

"The living world can be viewed as a biological hierarchy that starts with subcellular particles, and continues up through cells, tissues and organs. Ecology deals with the next three levels: the individual organism, the population (consisting of individuals of the same species) and the community (consisting of a greater or lesser number of species populations). At the level of the organism, ecology deals with how individuals are affected by (and how they affect) their environment. At the level of the population, ecology is concerned with the presence or absence of particular species, their abundance or rarity, and with the trends and fluctuations in their numbers. Community ecology then deals with the composition and organization of ecological communities." (Michael Begon et al, "Ecology: From individuals to ecosystems", 2006)

"Organisms live and die by the amount of information they acquire about their environment [...]" (Andreas Wagner, "From bit to it: how a complex metabolic network transforms information into living matter", BMC Systems Biology, 2007)

"We can draw several general conclusions. First, because populations of living organisms tend to grow exponentially, numbers can rise very rapidly. This explains the inevitable population pressure that helped Darwin realize the role of natural selection, Second, exponential growth must always be a short-term, temporary phenomenon; for living organisms, the growth typically stops because of predation or a lack of sufficient nutrients or energy. Third, these laws about growth apply to all species- our intelligence cannot make us immune to simple mathematical laws. This is a critical lesson, because human population has been growing exponentially for the past few centuries. Of course, our intelligence gives us one option not available to bacteria. Exponential growth can stop only through some combination of an increase in the death rate and a decrease in the birth rate." (Jeffrey O Bennett & Seth Shostak, "Life in the universe" 3rd Ed., 2012)

"Deep ecology does not separate humans - or anything else-from the natural environment. It sees the world not as a collection of isolated objects, but as a network of phenomena that are fundamentally interconnected and interdependent. Deep ecology recognizes the intrinsic value of all living beings and views humans as just one particular strand in the web of life." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Shallow ecology is anthropocentric, or human-centered. It views humans as above or outside of nature, as the source of all value, and ascribes only instrumental, or ‘use’, value to nature. Deep ecology does not separate humans - or anything else-from the natural environment. It sees the world not as a collection of isolated objects, but as a network of phenomena that are fundamentally interconnected and interdependent. Deep ecology recognizes the intrinsic value of all living beings and views humans as just one particular strand in the web of life." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Cybernetics studies the concepts of control and communication in living organisms, machines and organizations including self-organization. It focuses on how a (digital, mechanical or biological) system processes information, responds to it and changes or being changed for better functioning (including control and communication)." (Dmitry A Novikov, "Cybernetics 2.0", 2016)

"Living organisms are not just bags of information: they are computers. It follows that a full understanding of life will come only from unravelling its computational mechanisms. And that requires an excursion into the esoteric but fascinating foundations of logic, mathematics and computing." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

"[...] living organisms manifest deep new physical principles, and that we are on the threshold of uncovering and harnessing those principles. What is different this time, and why it has taken so many decades to discover the real secret of life, is that the new physics is not simply a matter of an additional type of force – a 'life force' – but something altogether more subtle, something that interweaves matter and information, wholes and parts, simplicity and complexity." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

04 January 2021

❄️🧬Systems Thinking: On Automata (Quotes)

"Ninety-nine [students] out of a hundred are automata, careful to walk in prescribed paths, careful to follow the prescribed custom. This is not an accident but the result of substantial education, which, scientifically defined, is the subsumption of the individual." (William T Harris, "The Philosophy of Education", 1889)

"We are automata entirely controlled by the forces of the medium being tossed about like corks on the surface of the water, but mistaking the resultant of the impulses from the outside for free will. The movements and other actions we perform are always life preservative and tho seemingly quite independent from one another, we are connected by invisible links." (Nikola Tesla, "My Inventions", 1919)

"Besides electrical engineering theory of the transmission of messages, there is a larger field [cybernetics] which includes not only the study of language but the study of messages as a means of controlling machinery and society, the development of computing machines and other such automata, certain reflections upon psychology and the nervous system, and a tentative new theory of scientific method." (Norbert Wiener, "Cybernetics", 1948)

"Automata have begun to invade certain parts of mathematics too, particularly but not exclusively mathematical physics or applied mathematics. The natural systems (e.g., central nervous system) are of enormous complexity and it is clearly necessary first to subdivide what they represent into several parts that to a certain extent are independent, elementary units. The problem then consists of understanding how these elements are organized as a whole. It is the latter problem which is likely to attract those who have the background and tastes of the mathematician or a logician. With this attitude, he will be inclined to forget the origins and then, after the process of axiomatization is complete, concentrate on the mathematical aspects." (John Von Neumann, "The General and Logical Theory of Automata", 1951)

"A world of automata - of creatures that worked like machines - would hardly be worth creating." (Clive S Lewis, Mere Christianity, 1952)

"It bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what a tiny piece of space-time is going to do? So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed and the laws will turn out to be simple, like the checker board with all its apparent complexities." (Richard P Feynman, "The Character of Physical Law", 1965)

"Cellular automata are discrete dynamical systems with simple construction but complex self-organizing behaviour. Evidence is presented that all one-dimensional cellular automata fall into four distinct universality classes. Characterizations of the structures generated in these classes are discussed. Three classes exhibit behaviour analogous to limit points, limit cycles and chaotic attractors. The fourth class is probably capable of universal computation, so that properties of its infinite time behaviour are undecidable." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"Cellular automata are mathematical models for complex natural systems containing large numbers of simple identical components with local interactions. They consist of a lattice of sites, each with a finite set of possible values. The value of the sites evolve synchronously in discrete time steps according to identical rules. The value of a particular site is determined by the previous values of a neighbourhood of sites around it." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica 10D, 1984)

"Cellular automata may be considered as discrete dynamical systems. In almost all cases, cellular automaton evolution is irreversible. Trajectories in the configuration space for cellular automata therefore merge with time, and after many time steps, trajectories starting from almost all initial states become concentrated onto 'attractors'. These attractors typically contain only a very small fraction of possible states. Evolution to attractors from arbitrary initial states allows for 'self-organizing' behaviour, in which structure may evolve at large times from structureless initial states. The nature of the attractors determines the form and extent of such structures." (Stephen Wolfram, "Nonlinear Phenomena, Universality and complexity in cellular automata", Physica D (10), 1984)

"Perhaps the most exciting implication [of CA representation of biological phenomena] is the possibility that life had its origin in the vicinity of a phase transition and that evolution reflects the process by which life has gained local control over a successively greater number of environmental parameters affecting its ability to maintain itself at a critical balance point between order and chaos." ( Chris G Langton, "Computation at the Edge of Chaos: Phase Transitions and Emergent Computation", Physica D (42), 1990)

"Cellular automata are now being used to model varied physical phenomena normally modelled by wave equations, fluid dynamics, Ising models, etc. We hypothesize that there will be found a single cellular automaton rule that models all of microscopic physics; and models it exactly." (Edward Fredkin, "Nonlinear Phenomena", Physica D (45), 1990)

"Finite Nature is a hypothesis that ultimately every quantity of physics, including space and time, will turn out to be discrete and finite; that the amount of information in any small volume of space-time will be finite and equal to one of a small number of possibilities. [...] We take the position that Finite Nature implies that the basic substrate of physics operates in a manner similar to the workings of certain specialized computers called cellular automata." (Edward Fredkin, "A New Cosmogony", PhysComp ’92: Proceedings of the Workshop on Physics and Computation, 1993)

"Over and over again we will see the same kind of thing: that even though the underlying rules for a system are simple, and even though the system is started from simple initial conditions, the behavior that the system shows can nevertheless be highly complex." (Stephen Wolfram, "A New Kind of Science", 2002)

"Cellular Automata (CA) are discrete, spatially explicit extended dynamic systems composed of adjacent cells characterized by an internal state whose value belongs to a finite set. The updating of these states is made simultaneously according to a common local transition rule involving only a neighborhood of each cell." (Ramon Alonso-Sanz, "Cellular Automata with Memory", 2009) 

"Cellular automata are dynamical systems that are discrete in space, time, and value. A state of a cellular automaton is a spatial array of discrete cells, each containing a value chosen from a finite alphabet. The state space for a cellular automaton is the set of all such configurations." (Burton Voorhees, "Additive Cellular Automata", 2009)

"Discrete dynamic systems that evolve in space and time. A cellular automaton is composed of a set of discrete elements – the cells – connected with other cells of the automaton, and in each time unit each cell receives information about the current state of the cells to which it is connected. The cellular automaton evolve according a transition rule that specifies the current possible states of each cell as a function of the preceding state of the cell and the states of the connected cells." (Francesc S. Beltran et al, "A Language Shift Simulation Based on Cellular Automata", 2011)

"Cellular automata (henceforth: CA) are discrete, abstract computational systems that have proved useful both as general models of complexity and as more specific representations of non-linear dynamics in a variety of scientific fields. Firstly, CA are (typically) spatially and temporally discrete: they are composed of a finite or denumerable set of homogenous, simple units, the atoms or cells. [...] Secondly, CA are abstract: they can be specified in purely mathematical terms and physical structures can implement them. Thirdly, CA are computational systems: they can compute functions and solve algorithmic problems." (Francesco Berto & Jacopo Tagliabue, "Cellular Automata", Stanford Encyclopedia of Philosophy, 2012) [source]

"One of the unique features of typical CA [ cellular automata] models is that time, space, and states of cells are all discrete. Because of such discreteness, the number of all possible state-transition functions is finite, i.e., there are only a finite number of “universes” possible in a given CA setting. Moreover, if the space is finite, all possible configurations of the entire system are also enumerable. This means that, for reasonably small CA settings, one can conduct an exhaustive search of the entire rule space or phase space to study the properties of all the 'parallel universes'." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)


05 December 2020

❄️Systems Thinking: On Entropy (Quotes)

"If for the entire universe we conceive the same magnitude to be determined, consistently and with due regard to all circumstances, which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat: (1) The energy of the universe is constant. (2) The entropy of the universe tends to a maximum." (Rudolf Clausius, "The Mechanical Theory of Heat - With its Applications to the Steam Engine and to Physical Properties of Bodies", 1867)

"[…] the quantities of heat which must be imparted to, or withdrawn from a changeable body are not the same, when these changes occur in a non-reversible manner, as they are when the same changes occur reversibly. In the second place, with each non-reversible change is associated an uncompensated transformation […] I propose to call the magnitude S the entropy of the body […] I have intentionally formed the word entropy so as to be as similar as possible to the word energy […]" (Rudolf Clausius, "The Mechanical Theory of Heat", 1867)

"The second fundamental theorem [the second law of thermodynamics], in the form which I have given to it, asserts that all transformations occurring in nature may take place in a certain direction, which I have assumed as positive, by themselves, that is, without compensation […] the entire condition of the universe must always continue to change in that first direction, and the universe must consequently approach incessantly a limiting condition. […] For every body two magnitudes have thereby presented themselves - the transformation value of its thermal content [the amount of inputted energy that is converted to 'work'], and its disgregation [separation or disintegration]; the sum of which constitutes its entropy." (Rudolf Clausius, "The Mechanical Theory of Heat", 1867)

"Since a given system can never of its own accord go over into another equally probable state but into a more probable one, it is likewise impossible to construct a system of bodies that after traversing various states returns periodically to its original state, that is a perpetual motion machine." (Ludwig E Boltzmann, "The Second Law of Thermodynamics", [Address to a Formal meeting of the Imperial Academy of Science], 1886) 

"[…] only a part of the whole intrinsic energy of the system is capable of being converted into mechanical work by actions going on within the vessel, and without any communication with external space by the passage either of matter or of heat. This part is sometimes called the Available Energy of the system. Clausius has called the remainder of the energy, which cannot be converted into work, the Entropy of the system. We shall find it more convenient to adopt the suggestion of Professor Tait, and give the name of Entropy to the part which can be converted into mechanical work." (James C Maxwell, "Theory of Heat", 1899)

"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines. When the pressure and temperature of the system have become uniform the entropy is exhausted. The original energy of the system is equal to the sum of the entropy and the energy remaining in the state of uniform pressure and temperature. The entropy of a system consisting of several component systems is the same in whatever order the entropy of the parts is exhausted. It is therefore equal to the sum of the entropy of each component system, together with the entropy of the system consisting of the component systems, each with its own entropy exhausted." (James C Maxwell, "Theory of Heat", 1899)

"[…] the result of the conduction and radiation of heat from one part of a system to another is to diminish the entropy of the system, or the energy, available as work, which can be obtained from the system. The energy of the system, however, is indestructible, and as it has not been removed from the system, it must remain' in it. Hence the intrinsic energy of the system, when the entropy is exhausted by thermal communication, conduction, and radiation, is equal to its original energy, and is of course greater than in the case in which the entropy is exhausted by means of the reversible engine." (James C Maxwell, "Theory of Heat", 1899)

"Heretics are the only (bitter) remedy against the entropy of human thought." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"No revolution, no heresy is comfortable or easy. For it is a leap, it is a break in the smooth evolutionary curve, and a break is a wound, a pain. But the wound is necessary; most of mankind suffers from hereditary sleeping sickness, and victims of this sickness (entropy) must not be allowed to sleep, or it will be their final sleep, death."  (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"Revolution is everywhere, in everything. It is infinite. There is no final revolution, no final number. The social revolution is only one of an infinite number of numbers; the law of revolution is not a social law, but an immeasurably greater one. It is a cosmic, universal law - like the laws of the conservation of energy and of the dissipation of energy (entropy)." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"Let us draw an arrow arbitrarily. If as we follow the arrow[,] we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases[,] the arrow points towards the past. That is the only distinction known to physics. This follows at once if our fundamental contention is admitted that the introduction of randomness is the only thing which cannot be undone. I shall use the phrase 'time's arrow' to express this one-way property of time which has no analogue in space." (Arthur S Eddington, "The Nature of the Physical World", 1928) 

"So far as physics is concerned, time's arrow is a property of entropy alone." (Arthur S Eddington, "The Nature of the Physical World", 1928)

"It was not easy for a person brought up in the ways of classical thermodynamics to come around to the idea that gain of entropy eventually is nothing more nor less than loss of information." (Gilbert N. Lewis, [Letter to Irving Langmuir] 1930)

"Thought interferes with the probability of events, and, in the long run therefore, with entropy." (David L Watson, 1930)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state, Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig on Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"[A living organism] feeds upon negative entropy […] Thus, the device by which an organism maintains itself stationary at a fairly high level of orderliness really consists in continually sucking orderliness from its environment." (Erwin Schrödinger, "What is Life? The Physical Aspect of the Living Cell", 1944)

"A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’. Practically, a state of this kind is usually reached very rapidly. Theoretically, it is very often not yet an absolute equilibrium, not yet the true maximum of entropy. But then the final approach to equilibrium is very slow. It could take anything between hours, years, centuries […]." (Erwin Schrödinger, "What is Life?", 1944)

"An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)" (Erwin Schrödinger, "What is Life?", 1944)

"Every process, event, happening – call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"Hence the awkward expression ‘negative entropy’ can be replaced by a better one: entropy, taken with the negative sign, is itself a measure of order. Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness ( = fairly low level of entropy) really consists in continually sucking orderliness from its environment. (Erwin Schrödinger, "What is Life?", 1944)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, Heat and Sound", 1944)

"Time itself will come to an end. For entropy points the direction of time. Entropy is the measure of randomness. When all system and order in the universe have vanished, when randomness is at its maximum, and entropy cannot be increased, when there is no longer any sequence of cause and effect, in short when the universe has run down, there will be no direction to time - there will be no time." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"In classical physics, most of the fundamental laws of nature were concerned either with the stability of certain configurations of bodies, e.g. the solar system, or else with the conservation of certain properties of matter, e.g. mass, energy, angular momentum or spin. The outstanding exception was the famous Second Law of Thermodynamics, discovered by Clausius in 1850. This law, as usually stated, refers to an abstract concept called entropy, which for any enclosed or thermally isolated system tends to increase continually with lapse of time. In practice, the most familiar example of this law occurs when two bodies are in contact: in general, heat tends to flow from the hotter body to the cooler. Thus, while the First Law of Thermodynamics, viz. the conservation of energy, is concerned only with time as mere duration, the Second Law involves the idea of trend." (Gerald J Whitrow, "The Structure of the Universe: An Introduction to Cosmology", 1949)

"But in no case is there any question of time flowing backward, and in fact the concept of backward flow of time seems absolutely meaningless. […] If it were found that the entropy of the universe were decreasing, would one say that time was flowing backward, or would one say that it was a law of nature that entropy decreases with time?" (Percy W Bridgman, "Reflections of a Physicist", 1950)

"It is my thesis that the physical functioning of the living individual and the operation of some of the newer communication machines are precisely parallel in their analogous attempts to control entropy through feedback. Both of them have sensory receptors as one stage of their cycle of operation: that is, in both of them there exists a special apparatus for collecting information from the outer world at low energy levels, and for making it available in the operation of the individual or of the machine. In both cases these external messages are not taken neat, but through the internal transforming powers of the apparatus, whether it be alive or dead. The information is then turned into a new form available for the further stages of performance. In both the animal and the machine this performance is made to be effective on the outer world. In both of them, their performed action on the outer world, and not merely their intended action, is reported back to the central regulatory apparatus." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Progress imposes not only new possibilities for the future but new restrictions. It seems almost as if progress itself and our fight against the increase of entropy intrinsically must end in the downhill path from which we are trying to escape." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, heat and sound", 1950)

 "The powerful notion of entropy, which comes from a very special branch of physics […] is certainly useful in the study of communication and quite helpful when applied in the theory of language." (J Robert Oppenheimer, "The Growth of Science and the Structure of Culture", Daedalus 87 (1), 1958) 

"But in addition to what we decide to do by way of transformation, there are certain tendencies in the way systems behave of their own volition when left to their own devices. The convenient analogy for one of these processes is found in the second law of thermodynamics: an 'ordering' process goes on, for which the name is entropy. This can be explained without technicalities as the tendency of a system to settle down to a uniform distribution of its energy. The dissipation of local pockets of high energy is measured by an increase in entropy, until at maximum entropy all is uniform. According to this model, order is more 'natural' than chaos. This is the reason why it is convenient to discuss cybernetic systems, with their self-regulating tendency to attain stability or orderliness, in terms of entropy - a term which has been taken over to name a key tool of cybernetics." (Stafford Beer, "Cybernetics and Management", 1959)

"Science is usually understood to depict a universe of strict order and lawfulness, of rigorous economy - one whose currency is energy, convertible against a service charge into a growing common pool called entropy." (Paul A Weiss,"Organic Form: Scientific and Aesthetic Aspects", 1960)

"The basic objection to attempts to deduce the unidirectional nature of time from concepts such as entropy is that they are attempts to reduce a more fundamental concept to a less fundamental one." (Gerald J Whitrow, "The Natural Philosophy of Time", 1961)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure 'disorder' by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the 'disorder' is less." (Richard P Feynman, "Order And Entropy" ["The Feynman Lectures on Physics"], 1964)

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it." (Philip K Dick, "Galactic Pot-Healer", 1969)

"There is a kind of second law of cultural dynamics which states simply that when anything has been done, it cannot be done again. In other words, we start off any system with a potential for novelty which is gradually exhausted. We see this in every field of human life, in the arts as well as the sciences. Once Beethoven has written the Ninth Symphony, nobody else can do it. Consequently, we find that in any evolutionary process, even in the arts, the search for novelty becomes corrupting. The 'entropy trap' is perhaps the most subtle and the most fundamental of the obstacles toward realising the developed society." (Kenneth Boulding, "The Science Revelation", Bulletin of the Atomic Scientists Vol. 26 (7), 1970)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971]

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"Life, this anti-entropy, ceaselessly reloaded with energy, is a climbing force, toward order amidst chaos, toward light, among the darkness of the indefinite, toward the mystic dream of Love, between the fire which devours itself and the silence of the Cold. Such a Nature does not accept abdication, nor skepticism." (Albert Claude, [Nobel lecture for award received] 1974)

"Entropy theory is indeed a first attempt to deal with global form; but it has not been dealing with structure. All it says is that a large sum of elements may have properties not found in a smaller sample of them." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"Entropy theory, on the other hand, is not concerned with the probability of succession in a series of items but with the overall distribution of kinds of items in a given arrangement." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"If entropy must constantly and continuously increase, then the universe is remorselessly running down, thus setting a limit (a long one, to be sure) on the existence of humanity. To some human beings, this ultimate end poses itself almost as a threat to their personal immortality, or as a denial of the omnipotence of God. There is, therefore, a strong emotional urge to deny that entropy must increase." (Isaac Asimov, "Asimov on Physics", 1976)

"The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979) 

"The interaction between parts of mind is triggered by difference, and difference is a nonsubstantial phenomenon not located in space or time; difference is related to negentropy and entropy rather than energy." (Gregory Bateson, "Mind and Nature: A Necessary Unity", 1979)

"Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"Time goes forward because energy itself is always moving from an available to an unavailable state. Our consciousness is continually recording the entropy change in the world around us. [...] we experience the passage of time by the succession of one event after another. And every time an event occurs anywhere in this world energy is expended and the overall entropy is increased. To say the world is running out of time then, to say the world is running out of usable energy. In the words of Sir Arthur Eddington, 'Entropy is time's arrow'." (Jeremy Rifkin, "Entropy", 1980)

"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"In microscopic systems, consisting of only a few molecules, the second law is violated regularly, but in macroscopic systems, which consist of vast numbers of molecules, the probability that the total entropy of the system will increase becomes virtual certainty. Thus in any isolated system, made up of a large number of molecules, the entropy - or disorder -will keep increasing until, eventually, the system reaches a state of maximum entropy, also known as 'heat death'; in this state all activity has ceased, all material being evenly distributed and at the same temperature. According to classical physics, the universe as a whole is going toward such a state of maximum entropy; it is running down and will eventually grind to a halt." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The phenomenon of self-organization is not limited to living matter but occurs also in certain chemical systems […] [Ilya] Prigogine has called these systems 'dissipative structures' to express the fact that they maintain and develop structure by breaking down other structures in the process of metabolism, thus creating entropy­ disorder - which is subsequently dissipated in the form of degraded waste products. Dissipative chemical structures display the dynamics of self-organization in its simplest form, exhibiting most of the phenomena characteristic of life self-renewal, adaptation, evolution, and even primitive forms of 'mental' processes." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The third model regards mind as an information processing system. This is the model of mind subscribed to by cognitive psychologists and also to some extent by the ego psychologists. Since an acquisition of information entails maximization of negative entropy and complexity, this model of mind assumes mind to be an open system." (Thaddus E Weckowicz, "Models of Mental Illness", 1984) 

"Disorder increases with time because we measure time in the direction in which disorder increases." (Stephen W Hawking, "The Direction of Time", New Scientist 115 (1568), 1987)

"Somehow, after all, as the universe ebbs toward its final equilibrium in the featureless heat bath of maximum entropy, it manages to create interesting structures." (James Gleick, "Chaos: Making a New Science", 1987)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases."  (Stephen Hawking, "A Brief History of Time", 1988)

"The increase of disorder or entropy with time is one example of what is called an arrow of time something that gives a direction to time and distinguishes the past from the future. There are at least three different directions of time. First, there is the thermodynamic arrow of time - the direction of time in which disorder or entropy increases. Second, there is the psychological arrow of time. This is the direction in which we feel time passes - the direction of time in which we remember the past, but not the future. Third, there is the cosmological arrow of time. This is the direction of time in which the universe is expanding rather than contracting." (Stephen W. Hawking, "The Direction of Time", New Scientist 46, 1987)

"Complexity is not an objective factor but a subjective one. Supersignals reduce complexity, collapsing a number of features into one. Consequently, complexity must be understood in terms of a specific individual and his or her supply of supersignals. We learn supersignals from experience, and our supply can differ greatly from another individual's. Therefore there can be no objective measure of complexity." (Dietrich Dorner, "The Logic of Failure: Recognizing and Avoiding Error in Complex Situations", 1989)

"The view of science is that all processes ultimately run down, but entropy is maximized only in some far, far away future. The idea of entropy makes an assumption that the laws of the space-time continuum are infinitely and linearly extendable into the future. In the spiral time scheme of the timewave this assumption is not made. Rather, final time means passing out of one set of laws that are conditioning existence and into another radically different set of laws. The universe is seen as a series of compartmentalized eras or epochs whose laws are quite different from one another, with transitions from one epoch to another occurring with unexpected suddenness." (Terence McKenna, "True Hallucinations", 1989)

"The inflationary period of expansion does not smooth out irregularity by entropy-producing processes like those explored by the cosmologies of the seventies. Rather it sweeps the irregularity out beyond the Horizon of our visible Universe, where we cannot see it . The entire universe of stars and galaxies on view to us. […] on this hypothesis, is but the reflection of a minute, perhaps infinitesimal, portion of the universe's initial conditions, whose ultimate extent and structure must remain forever unknowable to us. A theory of everything does not help here. The information contained in the observable part of the universe derives from the evolution of a tiny part of the initial conditions for the entire universe. The sum total of all the observations we could possibly make can only tell us about a minuscule portion of the whole." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"The Law of Entropy Nonconservation required that life be lived forward, from birth to death. […] To wish for the reverse was to wish for the entropy of the universe to diminish with time, which was impossible. One might as well wish for autumn leaves to assemble themselves in neat stacks just as soon as they had fallen from trees or for water to freeze whenever it was heated." (Michael Guillen, "Five Equations That Changed the World", 1995)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Contrary to what happens at equilibrium, or near equilibrium, systems far from equilibrium do not conform to any minimum principle that is valid for functions of free energy or entropy production." (Ilya Prigogine, "The End of Certainty: Time, Chaos, and the New Laws of Nature", 1996) 

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"No one has yet succeeded in deriving the second law from any other law of nature. It stands on its own feet. It is the only law in our everyday world that gives a direction to time, which tells us that the universe is moving toward equilibrium and which gives us a criteria for that state, namely, the point of maximum entropy, of maximum probability. The second law involves no new forces. On the contrary, it says nothing about forces whatsoever." (Brian L Silver, "The Ascent of Science", 1998)

"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge-nature's unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system's behavior. They are the patent signatures of self-organization in complex systems." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"The best case that can be made for human sync to the environment (outside of circadian entrainment) has to do with the possibility that electrical rhythms in our brains can be influenced by external signals." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The principle of maximum entropy is employed for estimating unknown probabilities (which cannot be derived deductively) on the basis of the available information. According to this principle, the estimated probability distribution should be such that its entropy reaches maximum within the constraints of the situation, i.e., constraints that represent the available information. This principle thus guarantees that no more information is used in estimating the probabilities than available." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003) 

"The principle of minimum entropy is employed in the formulation of resolution forms and related problems. According to this principle, the entropy of the estimated probability distribution, conditioned by a particular classification of the given events (e.g., states of the variable involved), is minimum subject to the constraints of the situation. This principle thus guarantees that all available information is used, as much as possible within the given constraints (e.g., required number of states), in the estimation of the unknown probabilities." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non- quilibrium Thermodynamics and the Production of Entropy"], 2005)

"However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order."  (Ray Kurzweil, "The Singularity is Near", 2005)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman J Dyson, "A Many-Colored Glass: Reflections on the Place of Life in the Universe", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Entropy is the crisp scientific name for waste, chaos, and disorder. As far as we know, the sole law of physics with no known exceptions anywhere in the universe is this: All creation is headed to the basement. Everything in the universe is steadily sliding down the slope toward the supreme equality of wasted heat and maximum entropy." (Kevin Kelly, "What Technology Wants", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010) 

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"The psychic entropy peculiar to the human condition involves seeing more to do than one can actually accomplish and feeling able to accomplish more than what conditions allow."(Mihaly Csikszentmihalyi, "Flow: The Psychology of Happiness", 2013)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A. Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

"The passage of time and the action of entropy bring about ever-greater complexity - a branching, blossoming tree of possibilities. Blossoming disorder (things getting worse), now unfolding within the constraints of the physics of our universe, creates novel opportunities for spontaneous ordered complexity to arise." (D J MacLennan, "Frozen to Life", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The natural effect of processes going on in the Universe is to move from a state of order to a state of disorder, unless there is an input of energy from outside." (John R Gribbin, "The Time Illusion", 2016) 

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n i items, then the entropy of such a partition is H = p 1 log( p 1 ) + … + p k log( p k ), where p i = n i / n . In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018) [where i is used as index]

"Our greatest enemies are ultimately not our political adversaries but entropy, evolution (in the form of pestilence and the flaws in human nature), and most of all ignorance - a shortfall of knowledge of how best to solve our problems." (Steven Pinker, "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress", 2018)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

"Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy." (Václav Havel, [Letter to Gustáv Husák]) 

28 November 2020

❄️Systems Thinking: On Living Systems Theory (Quotes)

"A vital phenomenon can only be regarded as explained if it has been proven that it appears as the result of the material components of living organisms interacting according to the laws which those same components follow in their interactions outside of living systems." (Adolf E Fick, "Gesammelte Schriften" Vol. 3, 1904)

"Since the fundamental character of the living thing is its organization, the customary investigation of the single parts and processes cannot provide a complete explanation of the vital phenomena. This investigation gives us no information about the coordination of parts and processes. Thus, the chief task of biology must be to discover the laws of biological systems (at all levels of organization). We believe that the attempts to find a foundation for theoretical biology point at a fundamental change in the world picture. This view, considered as a method of investigation, we shall call ‘organismic biology’ and, as an attempt at an explanation, ‘the system theory of the organism’." (Ludwig von Bertalanffy, “Kritische Theorie der Formbildung”, 1928)

"[…] to the scientific mind the living and the non-living form one continuous series of systems of differing degrees of complexity […], while to the philosophic mind the whole universe, itself perhaps an organism, is composed of a vast number of interlacing organisms of all sizes." (James G Needham, "Developments in Philosophy of Biology", Quarterly Review of Biology Vol. 3 (1), 1928)

"General systems theory is a series of related definitions, assumptions, and postulates about all levels of systems from atomic particles through atoms, molecules, crystals, viruses, cells, organs, individuals, small groups, societies, planets, solar systems, and galaxies. General behavior systems theory is a subcategory of such theory, dealing with living systems, extending roughly from viruses through societies. A significant fact about living things is that they are open systems, with important inputs and outputs. Laws which apply to them differ from those applying to relatively closed systems." (James G Miller, "General behavior systems theory and summary", Journal of Counseling Psychology 3 (2), 1956)

"A system is primarily a living system, and the process which defines it is the maintenance of an organization which we know as life." (Ralph W Gerard, "Units and Concepts of Biology", 1958)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"The homeostatic principle does not apply literally to the functioning of all complex living systems, in that in counteracting entropy they move toward growth and expansion." (Daniel Katz, "The Social Psychology of Organizations", 1966) 

"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)

"My analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life." (James G Miller, "Living Systems: Basic Concepts", 1969)

"A cognitive system is a system whose organization defines a domain of interactions in which it can act with relevance to the maintenance of itself, and the process of cognition is the actual (inductive) acting or behaving in this domain. Living systems are cognitive systems, and living as a process is a process of cognition. This statement is valid for all organisms, with and without a nervous system." (Humberto R Maturana, "Biology of Cognition", 1970)

"A living system, due to its circular organization, is an inductive system and functions always in a predictive manner: what happened once will occur again. Its organization, (genetic and otherwise) is conservative and repeats only that which works. For this same reason living systems are historical systems; the relevance of a given conduct or mode of behavior is always determined in the past." (Humberto Maturana, "Biology of Cognition", 1970)

"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"All nature is a continuum. The endless complexity of life is organized into patterns which repeat themselves - theme and variations - at each level of system. These similarities and differences are proper concerns for science. From the ceaseless streaming of protoplasm to the many-vectored activities of supranational systems, there are continuous flows through living systems as they maintain their highly organized steady states." (James G Miller, "Living Systems", 1978)

"Living systems are units of interactions; they exist in an ambience. From a purely biological point of view they cannot be understood independently of that part of the ambience with which they interact: the niche; nor can the niche be defined independently of the living system that specifies it." (Humberto Maturana, "Biology of Cognition", 1970)

"Information is carried by physical entities, such as books or sound waves or brains, but it is not itself material. Information in a living system is a feature of the order and arrangement of its parts, which arrangement provides the signs that constitute a ‘code’ or ‘language’." (John Z Young, "Programs of the Brain", 1978)

"In a biological or social system each holon must assert its individuality in order to maintain the system's stratified order, but it must also submit to the demands of the whole in order to make the system viable. These two tendencies are opposite but complementary. In a healthy system - an individual, a society, or an ecosystem - there is a balance between integration and self-assertion. This balance is not static but consists of a dynamic interplay between the two complementary tendencies, which makes the whole system flexible and open to change." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"Living systems are organized in such a way that they form multileveled structures, each level consisting of subsystems which are wholes in regard to their parts, and parts with respect to the larger wholes." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The autonomy of living systems is characterized by closed, recursive organization. [...] A system's highest order of recursion or feedback process defines, generates, and maintains the autonomy of a system. The range of deviation this feedback seeks to control concerns the organization of the whole system itself. If the system should move beyond the limits of its own range of organization it would cease to be a system. Thus, autonomy refers to the maintenance of a systems wholeness. In biology, it becomes a definition of what maintains the variable called living." (Bradford P Keeney, "Aesthetics of Change", 1983)

"Living systems are never in equilibrium. They are inherently unstable. They may seem stable, but they're not. Everything is moving and changing. In a sense, everything is on the edge of collapse. Michael Crichton, "Jurassic Park", 1990)

"Systems thinking is a discipline for seeing wholes. It is a framework for seeing interrelationships rather than things, for seeing patterns of change rather than static 'snapshots'. It is a set of general principles- distilled over the course of the twentieth century, spanning fields as diverse as the physical and social sciences, engineering, and management. [...] During the last thirty years, these tools have been applied to understand a wide range of corporate, urban, regional, economic, political, ecological, and even psychological systems. And systems thinking is a sensibility for the subtle interconnectedness that gives living systems their unique character." (Peter Senge, "The Fifth Discipline", 1990)

"Living systems exist in the solid regime near the edge of chaos, and natural selection achieves and sustains such a poised state." (Stuart Kauffman, "The Origins of Order: Self-organization and selection in evolution", 1993) 

"It [Living Systems Theory (LST)] involves observing and measuring important relationships between inputs and outputs of the total system and identifying the structures that perform each of the sub‐system processes. […] The flows of relevant matter, energy, and information through the system and the adjustment processes of subsystems and the total system are also examined. The status and function of the system are analyzed and compared with what is average or normal for that type of system. If the system is experiencing a disturbance in some steady state, an effort is made to discover the source of the strain and correct it." (James G Miller & Jessie L Miller, "Applications of living systems theory", Systemic Practice and Action Research 8, 1995)

"According to the systems view, the essential properties of an organism, or living system, are properties of the whole, which none of the parts have. They arise from the interactions and relationships among the parts. These properties are destroyed when the system is dissected, either physically or theoretically, into isolated elements. Although we can discern individual parts in any system, these parts are not isolated, and the nature of the whole is always different from the mere sum of its parts." (Fritjof Capra, "The Web of Life", 1996)

"This spontaneous emergence of order at critical points of instability is one of the most important concepts of the new understanding of life. It is technically known as self-organization and is often referred to simply as ‘emergence’. It has been recognized as the dynamic origin of development, learning and evolution. In other words, creativity-the generation of new forms-is a key property of all living systems. And since emergence is an integral part of the dynamics of open systems, we reach the important conclusion that open systems develop and evolve. Life constantly reaches out into novelty." (Fritjof  Capra, "The Hidden Connections", 2002)

"The science of cybernetics is not about thermostats or machines; that characterization is a caricature. Cybernetics is about purposiveness, goals, information flows, decision-making control processes and feedback (properly defined) at all levels of living systems." (Peter Corning, "Synergy, Cybernetics, and the Evolution of Politics", 2005) 

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

"The universe of all things that exist may be understood as a universe of systems where a system is defined as any set of related and interacting elements. This concept is primitive and powerful and has been used increasingly over the last half-century to organize knowledge in virtually all domains of interest to investigators. As human inventions and social interactions grow more complex, general conceptual frameworks that integrate knowledge among different disciplines studying those emerging systems grow more important. Living systems theory (LST) instructs integrative research among biological and social sciences and related academic disciplines." (G A Swanson & James G Miller, "Living Systems Theory", 2013)

"All living systems are networks of smaller components, and the web of life as a whole is a multilayered structure of living systems nesting within other living systems - networks within networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Deep ecology does not separate humans - or anything else-from the natural environment. It sees the world not as a collection of isolated objects, but as a network of phenomena that are fundamentally interconnected and interdependent. Deep ecology recognizes the intrinsic value of all living beings and views humans as just one particular strand in the web of life." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"This spontaneous emergence of order at critical points of instability, which is often referred to simply as "emergence," is one of the hallmarks of life. It has been recognized as the dynamic origin of development, learning, and evolution. In other words, creativity-the generation of new forms-is a key property of all living systems." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)
