08 August 2021

Science: On Events (Quotes)

"[…] as it is thus demonstrable that there are, in the constitution of things, certain Laws according to which Events happen, it is no less evident from Observation, that these Laws serve to wise, useful and beneficent purposes, to preserve the steadfast Order of the Universe, to propagate the several Species of Beings, and furnish to the sentient Kind such degrees of happiness as are suited to their State." (Abraham de Moivre, "The Doctrine of Chances: or, A Method of Calculating the Probabilities of Events in Play", 1718)

"Further, the same Arguments which explode the Notion of Luck, may, on the other side, be useful in some Cases to establish a due comparison between Chance and Design: We may imagine Chance and Design to be, as it were, in Competition with each other, for the production of some sorts of Events, and may calculate what Probability there is, that those Events should be rather owing to the one than to the other." (Abraham de Moivre, "The Doctrine of Chances", 1718)

"[…] chance, that is, an infinite number of events, with respect to which our ignorance will not permit us to perceive their causes, and the chain that connects them together. Now, this chance has a greater share in our education than is imagined. It is this that places certain objects before us and, in consequence of this, occasions more happy ideas, and sometimes leads us to the greatest discoveries […]" (Claude A Helvetius, "On Mind", 1751)

"But ignorance of the different causes involved in the production of events, as well as their complexity, taken together with the imperfection of analysis, prevents our reaching the same certainty about the vast majority of phenomena. Thus there are things that are uncertain for us, things more or less probable, and we seek to compensate for the impossibility of knowing them by determining their different degrees of likelihood. So it was that we owe to the weakness of the human mind one of the most delicate and ingenious of mathematical theories, the science of chance or probability." (Pierre-Simon Laplace, "Recherches, 1º, sur l'Intégration des Équations Différentielles aux Différences Finies, et sur leur Usage dans la Théorie des Hasards", 1773)

"[…] determine the probability of a future or unknown event not on the basis of the number of possible combinations resulting in this event or in its complementary event, but only on the basis of the knowledge of order of familiar previous events of this kind" (Marquis de Condorcet, "Essai sur l'application de l'analyse à la probabilité des décisions rendues à la pluralité des voix", 1785)

"It is contrary to the usual order of things, that events so harmonious as those of the system of the world, should depend on such diversified agents as are supposed to exist in our artificial arrangements; and there is reason to anticipate a great reduction in the number of undecompounded bodies, and to expect that the analogies of nature will be found conformable to the refined operations of art. The more the phenomena of the universe are studied, the more distinct their connection appears, and the more simple their causes, the more magnificent their design, and the more wonderful the wisdom and power of their Author." (Sir Humphry Davy, "Elements of Chemical Philosophy", 1812)

"Probability has reference partly to our ignorance, partly to our knowledge […] The theory of chance consists in reducing all the events of the same kind to a certain number of cases equally possible, that is to say, to such as we may be equally undecided about in regard to their existence, and in determining the number of cases favorable to the event whose probability is sought. The ratio of this number to that of all cases possible is the measure of this probability, which is thus simply a fraction whose numerator is the number of favorable cases and whose denominator is the number of all cases possible." (Pierre-Simon Laplace, "Philosophical Essay on Probabilities", 1814)
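Laplace's "ratio of favorable to possible cases" can be made concrete with a short sketch (Python; the two-dice example is an illustration of mine, not from the quoted text):

```python
from fractions import Fraction
from itertools import product

# Classical (Laplace) probability: the ratio of the number of favorable
# cases to the number of all equally possible cases.
# Example event: two fair dice summing to 7.
outcomes = list(product(range(1, 7), repeat=2))   # all 36 equally possible cases
favorable = [o for o in outcomes if sum(o) == 7]  # cases favorable to the event
probability = Fraction(len(favorable), len(outcomes))
print(probability)  # 1/6
```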

"Things of all kinds are subject to a universal law which may be called the law of large numbers. It consists in the fact that, if one observes very considerable numbers of events of the same nature, dependent on constant causes and causes which vary irregularly, sometimes in one direction, sometimes in the other, that is to say without their variation being progressive in any definite direction, one shall find, between these numbers, relations which are almost constant." (Siméon-Denis Poisson, "Poisson’s Law of Large Numbers", 1837)
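Poisson's law of large numbers is easy to watch in miniature: repeat the same chance event many times and the relative frequency becomes almost constant (a Python sketch of mine, assuming a fair coin):

```python
import random

# Flip a fair coin many times; the observed frequency of heads settles
# near the constant underlying probability of 0.5.
random.seed(1)  # fixed seed so the run is repeatable
n = 100_000
heads = sum(random.random() < 0.5 for _ in range(n))
frequency = heads / n
print(round(frequency, 3))  # close to 0.500
```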

"The very events which in their own nature appear most capricious and uncertain, and which in any individual case no attainable degree of knowledge would enable us to foresee, occur, when considerable numbers are taken into account, with a degree of regularity approaching to mathematical." (John S Mill, "A System of Logic", 1862)

"When we merely note and record the phenomena which occur around us in the ordinary course of nature we are said to observe. When we change the course of nature by the intervention of our will and muscular powers, and thus produce unusual combinations and conditions of phenomena, we are said to experiment. […] an experiment differs from a mere observation in the fact that we more or less influence the character of the events which we observe." (William S Jevons, "The Principles of Science: A Treatise on Logic and Scientific Method", 1874)

"Some of the common ways of producing a false statistical argument are to quote figures without their context, omitting the cautions as to their incompleteness, or to apply them to a group of phenomena quite different to that to which they in reality relate; to take these estimates referring to only part of a group as complete; to enumerate the events favorable to an argument, omitting the other side; and to argue hastily from effect to cause, this last error being the one most often fathered on to statistics. For all these elementary mistakes in logic, statistics is held responsible." (Sir Arthur L Bowley, "Elements of Statistics", 1901)

"Now a system is nothing but a mental connexion applied to a number of isolated events." (William Smith, The Quarterly review, 1906)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"Every theory of the course of events in nature is necessarily based on some process of simplification and is to some extent, therefore, a fairy tale." (Sir Napier Shaw, "Manual of Meteorology", 1932)

"The most important application of the theory of probability is to what we may call 'chance-like' or 'random' events, or occurrences. These seem to be characterized by a peculiar kind of incalculability which makes one disposed to believe - after many unsuccessful attempts - that all known rational methods of prediction must fail in their case. We have, as it were, the feeling that not a scientist but only a prophet could predict them. And yet, it is just this incalculability that makes us conclude that the calculus of probability can be applied to these events." (Karl R Popper, "The Logic of Scientific Discovery", 1934)

"A permanent state is reached, in which no observable events occur. The physicist calls this the state of thermodynamical equilibrium, or of ‘maximum entropy’. Practically, a state of this kind is usually reached very rapidly. Theoretically, it is very often not yet an absolute equilibrium, not yet the true maximum of entropy. But then the final approach to equilibrium is very slow. It could take anything between hours, years, centuries […]." (Erwin Schrödinger, "What is Life?", 1944)

"A fundamental value in the scientific outlook is concern with the best available map of reality. The scientist will always seek a description of events which enables him to predict most by assuming least. He thus already prefers a particular form of behavior. If moralities are systems of preferences, here is at least one point at which science cannot be said to be completely without preferences. Science prefers good maps." (Anatol Rapoport, "Science and the goals of man: a study in semantic orientation", 1950)

"It is indeed wrong to think that the poetry of Nature’s moods in all their infinite variety is lost on one who observes them scientifically, for the habit of observation refines our sense of beauty and adds a brighter hue to the richly coloured background against which each separate fact is outlined. The connection between events, the relation of cause and effect in different parts of a landscape, unite harmoniously what would otherwise be merely a series of detached sciences." (Marcel Minnaert, "The Nature of Light and Colour in the Open Air", 1954)

"Multiple equilibria are not necessarily useless, but from the standpoint of any exact science the existence of a uniquely determined equilibrium is, of course, of the utmost importance, even if proof has to be purchased at the price of very restrictive assumptions; without any possibility of proving the existence of (a) uniquely determined equilibrium - or at all events, of a small number of possible equilibria - at however high a level of abstraction, a field of phenomena is really a chaos that is not under analytical control." (Joseph A Schumpeter, "History of Economic Analysis", 1954)

"In fact, it is empirically ascertainable that every event is actually produced by a number of factors, or is at least accompanied by numerous other events that are somehow connected with it, so that the singling out involved in the picture of the causal chain is an extreme abstraction. Just as ideal objects cannot be isolated from their proper context, material existents exhibit multiple interconnections; therefore the universe is not a heap of things but a system of interacting systems." (Mario Bunge, "Causality: The Place of the Causal Principle in Modern Science", 1959)

"Certain properties are necessary or sufficient conditions for other properties, and the network of causal relations thus established will make the occurrence of one property at least tend, subject to the presence of other properties, to promote or inhibit the occurrence of another. Arguments from models involve those analogies which can be used to predict the occurrence of certain properties or events, and hence the relevant relations are causal, at least in the sense of implying a tendency to co-occur." (Mary B Hesse, "Models and Analogies in Science", 1963)

"For Science in its totality, the ultimate goal is the creation of a monistic system in which - on the symbolic level and in terms of the inferred components of invisibly and intangibly fine structure - the world’s enormous multiplicity is reduced to something like unity, and the endless successions of unique events of a great many different kinds get tidied and simplified into a single rational order. Whether this goal will ever be reached remains to be seen. Meanwhile we have the various sciences, each with its own system of coordinating concepts, its own criterion of explanation." (Aldous Huxley, "Literature and Science", 1963)

"The subject matter of the scientist is a crowd of natural events at all times; he presupposes that this crowd is not real but apparent, and seeks to discover the true place of events in the system of nature. The subject matter of the poet is a crowd of historical occasions of feeling recollected from the past; he presupposes that this crowd is real but should not be, and seeks to transform it into a community. Both science and art are primarily spiritual activities, whatever practical applications may be derived from their results. Disorder, lack of meaning, are spiritual not physical discomforts, order and sense spiritual not physical satisfactions." (Wystan H Auden, "The Dyer’s Hand and Other Essays", 1965)

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay Wright Forrester, "Urban dynamics", 1969)

"There are different levels of organization in the occurrence of events. You cannot explain the events of one level in terms of the events of another. For example, you cannot explain life in terms of mechanical concepts, nor society in terms of individual psychology. Analysis can only take you down the scale of organization. It cannot reveal the workings of things on a higher level. To some extent the holistic philosophers are right." (Anatol Rapoport, "General Systems" Vol. 14, 1969)

"[I]n probability theory we are faced with situations in which our intuition or some physical experiments we have carried out suggest certain results. Intuition and experience lead us to an assignment of probabilities to events. As far as the mathematics is concerned, any assignment of probabilities will do, subject to the rules of mathematical consistency." (Robert Ash, "Basic Probability Theory", 1970)

"This parallel, between cybernetic explanation and the tactics of logical or mathematical proof, is of more than trivial interest. Outside of cybernetics, we look for explanation, but not for anything which would simulate logical proof. This simulation of proof is something new. We can say, however, with hindsight wisdom, that explanation by simulation of logical or mathematical proof was expectable. After all, the subject matter of cybernetics is not events and objects but the information 'carried' by events and objects. We consider the objects or events only as proposing facts, propositions, messages, percepts, and the like. The subject matter being propositional, it is expectable that explanation would simulate the logical." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)

"General systems theory deals with the most fundamental concepts and aspects of systems. Many theories dealing with more specific types of systems (e.g., dynamical systems, automata, control systems, game-theoretic systems, among many others) have been under development for quite some time. General systems theory is concerned with the basic issues common to all these specialized treatments. Also, for truly complex phenomena, such as those found predominantly in the social and biological sciences, the specialized descriptions used in classical theories (which are based on special mathematical structures such as differential or difference equations, numerical or abstract algebras, etc.) do not adequately and properly represent the actual events. Either because of this inadequate match between the events and types of descriptions available or because of the pure lack of knowledge, for many truly complex problems one can give only the most general statements, which are qualitative and too often even only verbal. General systems theory is aimed at providing a description and explanation for such complex phenomena." (Mihajlo D. Mesarovic & Yasuhiko Takahara, "General Systems Theory: Mathematical foundations", 1975)

"[…] there is an external world which can in principle be exhaustively described in scientific language. The scientist, as both observer and language-user, can capture the external facts of the world in propositions that are true if they correspond to the facts and false if they do not. Science is ideally a linguistic system in which true propositions are in one-to-one relation to facts, including facts that are not directly observed because they involve hidden entities or properties, or past events or far distant events. These hidden events are described in theories, and theories can be inferred from observation, that is, the hidden explanatory mechanism of the world can be discovered from what is open to observation. Man as scientist is regarded as standing apart from the world and able to experiment and theorize about it objectively and dispassionately." (Mary B Hesse, "Revolutions and Reconstructions in the Philosophy of Science", 1980)

"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word." (Stephen Jay Gould,"Hen's Teeth and Horse's Toes", 1983)

"If you perceive the world as some place where things happen at random - random events over which you have sometimes very little control, sometimes fairly good control, but still random events - well, one has to be able to have some idea of how these things behave. […] People who are not used to statistics tend to see things in data - there are random fluctuations which can sometimes delude them - so you have to understand what can happen randomly and try to control whatever can be controlled. You have to expect that you are not going to get a clean-cut answer. So how do you interpret what you get? You do it by statistics." (Lucien Le Cam, [interview] 1988)

"In the end, each life is no more than the sum of contingent facts, a chronicle of chance intersections, of flukes, of random events that divulge nothing but their own lack of purpose." (Paul Auster, "The Locked Room", 1988)

"This transition from uncertainty to near certainty when we observe long series of events, or large systems, is an essential theme in the study of chance." (David Ruelle, "Chance and Chaos", 1991)

"According to the narrower definition of randomness, a random sequence of events is one in which anything that can ever happen can happen next. Usually it is also understood that the probability that a given event will happen next is the same as the probability that a like event will happen at any later time. [...] According to the broader definition of randomness, a random sequence is simply one in which any one of several things can happen next, even though not necessarily anything that can ever happen can happen next." (Edward N Lorenz, "The Essence of Chaos", 1993)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician's trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. [...] It is in those outliers and imperfections that the wildness lurks." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Paradigms are the most general - rather like a philosophical or ideological framework. Theories are more specific, based on the paradigm and designed to describe what happens in one of the many realms of events encompassed by the paradigm. Models are even more specific, providing the mechanisms by which events occur in a particular part of the theory's realm. Of all three, models are most affected by empirical data - models come and go, theories only give way when evidence is overwhelmingly against them and paradigms stay put until a radically better idea comes along." (Lee R Beach, "The Psychology of Decision Making: People in Organizations", 1997)

"Is a random outcome completely determined, and random only by virtue of our ignorance of the most minute contributing factors? Or are the contributing factors unknowable, and therefore render as random an outcome that can never be determined? Are seemingly random events merely the result of fluctuations superimposed on a determinate system, masking its predictability, or is there some disorderliness built into the system itself?" (Deborah J Bennett, "Randomness", 1998)

"Events may appear to us to be random, but this could be attributed to human ignorance about the details of the processes involved." (Brian S Everitt, "Chance Rules", 1999)

"It is in the nature of exponential growth that events develop extremely slowly for extremely long periods of time, but as one glides through the knee of the curve, events erupt at an increasingly furious pace. And that is what we will experience as we enter the twenty-first century." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"Randomness is the very stuff of life, looming large in our everyday experience. […] The fascination of randomness is that it is pervasive, providing the surprising coincidences, bizarre luck, and unexpected twists that color our perception of everyday events." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"The Law of Accelerating Returns: As order exponentially increases, time exponentially speeds up (that is, the time interval between salient events grows shorter as time passes)." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"The subject of probability begins by assuming that some mechanism of uncertainty is at work giving rise to what is called randomness, but it is not necessary to distinguish between chance that occurs because of some hidden order that may exist and chance that is the result of blind lawlessness. This mechanism, figuratively speaking, churns out a succession of events, each individually unpredictable, or it conspires to produce an unforeseeable outcome each time a large ensemble of possibilities is sampled." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"One can be highly functionally numerate without being a mathematician or a quantitative analyst. It is not the mathematical manipulation of numbers (or symbols representing numbers) that is central to the notion of numeracy. Rather, it is the ability to draw correct meaning from a logical argument couched in numbers. When such a logical argument relates to events in our uncertain real world, the element of uncertainty makes it, in fact, a statistical argument." (Eric R Sowey, "The Getting of Wisdom: Educating Statisticians to Enhance Their Clients' Numeracy", The American Statistician 57(2), 2003)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"What appear to be the most valuable aspects of the theoretical physics we have are the mathematical descriptions which enable us to predict events. These equations are, we would argue, the only realities we can be certain of in physics; any other ways we have of thinking about the situation are visual aids or mnemonics which make it easier for beings with our sort of macroscopic experience to use and remember the equations." (Celia Green, "The Lost Cause", 2003)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a die depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Bialynicki-Birula & Iwona Bialynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004)

"The basic concept of complexity theory is that systems show patterns of organization without organizer (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled became correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)

"Prior to the discovery of the butterfly effect it was generally believed that small differences averaged out and were of no real significance. The butterfly effect showed that small things do matter. This has major implications for our notions of predictability, as over time these small differences can lead to quite unpredictable outcomes. For example, first of all, can we be sure that we are aware of all the small things that affect any given system or situation? Second, how do we know how these will affect the long-term outcome of the system or situation under study? The butterfly effect demonstrates the near impossibility of determining with any real degree of accuracy the long term outcomes of a series of events." (Elizabeth McMillan, "Complexity, Management and the Dynamics of Change: Challenges for practice", 2008)

"Regression toward the mean. That is, in any series of random events an extraordinary event is most likely to be followed, due purely to chance, by a more ordinary one." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)
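Mlodinow's description of regression toward the mean can be checked with a small simulation (Python; the choice of 2 standard deviations as the bar for "extraordinary" is mine):

```python
import random

# In a series of independent random values, the value that follows an
# extraordinary one (here, > 2 standard deviations above the mean) is,
# on average, ordinary: its mean sits near the overall mean of 0,
# not near the extreme value that preceded it.
random.seed(2)  # fixed seed so the run is repeatable
series = [random.gauss(0, 1) for _ in range(200_000)]
followers = [series[i + 1] for i in range(len(series) - 1) if series[i] > 2]
mean_follower = sum(followers) / len(followers)
print(round(mean_follower, 2))  # near 0.0, far below the 2.0 threshold
```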

"You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"In the network society, the space of flows dissolves time by disordering the sequence of events and making them simultaneous in the communication networks, thus installing society in structural ephemerality: being cancels becoming." (Manuel Castells, "Communication Power", 2009)

"Because the question for me was always whether that shape we see in our lives was there from the beginning or whether these random events are only called a pattern after the fact. Because otherwise we are nothing." (Cormac McCarthy, "All the Pretty Horses", 2010)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"The problem of complexity is at the heart of mankind’s inability to predict future events with any accuracy. Complexity science has demonstrated that the more factors found within a complex system, the more chances of unpredictable behavior. And without predictability, any meaningful control is nearly impossible. Obviously, this means that you cannot control what you cannot predict. The ability ever to predict long-term events is a pipedream. Mankind has little to do with changing climate; complexity does." (Lawrence K Samuels, "The Real Science Behind Changing Climate", 2014)
