"All human actions have one or more of these seven causes: chance, nature, compulsions, habit, reason, passion, desire." (Aristotle, 4th century BC)
"In all disciplines in which there is systematic knowledge of things with principles, causes, or elements, it arises from a grasp of those: we think we have knowledge of a thing when we have found its primary causes and principles, and followed it back to its elements." (Aristotle, "Physics", cca. 350 BC)
"Constantly regard the universe as one living being, having one substance and one soul; and observe how all things have reference to one perception, the perception of this one living being; and how all things act with one movement; and how all things are the cooperating causes of all things which exist; observe too the continuous spinning of the thread and the contexture of the web." (Marcus Aurelius, "Meditations". cca. 121–180 AD)
"The universal cause is one thing, a particular cause another. An effect can be haphazard with respect to the plan of the second, but not of the first. For an effect is not taken out of the scope of one particular cause save by another particular cause which prevents it, as when wood dowsed with water, will not catch fire. The first cause, however, cannot have a random effect in its own order, since all particular causes are comprehended in its causality. When an effect does escape from a system of particular causality, we speak of it as fortuitous or a chance happening […]" (Thomas Aquinas, “Summa Theologica”, cca. 1266-1273)
"In the discovery of hidden things and the investigation of hidden causes, stronger reasons are obtained from sure experiments and demonstrated arguments than from probable conjectures and the opinions of philosophical speculators of the common sort […]" (William Gilbert, "De Magnete", 1600)
"The art of discovering the causes of phenomena, or true hypothesis, is like the art of decyphering, in which an ingenious conjecture greatly shortens the road." (Gottfried W Leibniz, "New Essays Concerning Human Understanding", 1704) [published 1765]
"All effects follow not with like certainty from their supposed causes." (David Hume, "An Enquiry Concerning Human Understanding", 1748)
"[…] chance, that is, an infinite number of events, with respect to which our ignorance will not permit us to perceive their causes, and the chain that connects them together. Now, this chance has a greater share in our education than is imagined. It is this that places certain objects before us and, in consequence of this, occasions more happy ideas, and sometimes leads us to the greatest discoveries […]" (Claude Adrien Helvetius, "On Mind", 1751)
"One of the most intimate of all associations in the human mind is that of cause and effect. They suggest one another with the utmost readiness upon all occasions; so that it is almost impossible to contemplate the one, without having some idea of, or forming some conjecture about the other." (Joseph Priestley, "The History and Present State of Electricity", 1767)
"But ignorance of the different causes involved in the production of events, as well as their complexity, taken together with the imperfection of analysis, prevents our reaching the same certainty about the vast majority of phenomena. Thus there are things that are uncertain for us, things more or less probable, and we seek to compensate for the impossibility of knowing them by determining their different degrees of likelihood. So it was that we owe to the weakness of the human mind one of the most delicate and ingenious of mathematical theories, the science of chance or probability." (Pierre-Simon Laplace, "Recherches, 1º, sur l'Intégration des Équations Différentielles aux Différences Finies, et sur leur Usage dans la Théorie des Hasards", 1773)
"If an event can be produced by a number n of different causes, the probabilities of the existence of these causes, given the event (prises de l'événement), are to each other as the probabilities of the event, given the causes: and the probability of each cause is equal to the probability of the event, given that cause, divided by the sum of all the probabilities of the event, given each of the causes.” (Pierre-Simon Laplace, "Mémoire sur la Probabilité des Causes par les Événements", 1774)
"The word ‘chance’ then expresses only our ignorance of the causes of the phenomena that we observe to occur and to succeed one another in no apparent order. Probability is relative in part to this ignorance, and in part to our knowledge." (Pierre-Simon Laplace, "Mémoire sur les Approximations des Formules qui sont Fonctions de Très Grands Nombres", 1783)
"We know the effects of many things, but the causes of few; experience, therefore, is a surer guide than imagination, and inquiry than conjecture." (Charles C Colton, "Lacon", 1820)
"Things of all kinds are subject to a universal law which may be called the law of large numbers. It consists in the fact that, if one observes very considerable numbers of events of the same nature, dependent on constant causes and causes which vary irregularly, sometimes in one direction, sometimes in the other, it is to say without their variation being progressive in any definite direction, one shall find, between these numbers, relations which are almost constant." (Siméon-Denis Poisson, "Poisson’s Law of Large Numbers", 1837)
"Man’s mind cannot grasp the causes of events in their completeness, but the desire to find those causes is implanted in man’s soul. And without considering the multiplicity and complexity of the conditions any one of which taken separately may seem to be the cause, he snatches at the first approximation to a cause that seems to him intelligible and says: ‘This is the cause!’" (Leo Tolstoy, "War and Peace", 1867)
"It is surprising to learn the number of causes of error which enter into the simplest experiment, when we strive to attain rigid accuracy." (William S Jevons, "The Principles of Science: A Treatise on Logic and Scientific Method", 1874)
"There is a maxim which is often quoted, that ‘The same causes will always produce the same effects.’ To make this maxim intelligible we must define what we mean by the same causes and the same effects, since it is manifest that no event ever happens more that once, so that the causes and effects cannot be the same in all respects. [...] There is another maxim which must not be confounded with that quoted at the beginning of this article, which asserts ‘That like causes produce like effects’. This is only true when small variations in the initial circumstances produce only small variations in the final state of the system. In a great many physical phenomena this condition is satisfied; but there are other cases in which a small initial variation may produce a great change in the final state of the system, as when the displacement of the ‘points’ causes a railway train to run into another instead of keeping its proper course." (James C Maxwell, "Matter and Motion", 1876)
"If statistical graphics, although born just yesterday, extends its reach every day, it is because it replaces long tables of numbers and it allows one not only to embrace at glance the series of phenomena, but also to signal the correspondences or anomalies, to find the causes, to identify the laws." (Émile Cheysson, cca. 1877)
"Before we can completely explain a phenomenon we require not only to find its true cause, its chief relations to other causes, and all the conditions which determine how the cause operates, and what its effect and amount of effect are, but also all the coincidences." (George Gore, "The Art of Scientific Discovery", 1878)
"Some of the common ways of producing a false statistical argument are to quote figures without their context, omitting the cautions as to their incompleteness, or to apply them to a group of phenomena quite different to that to which they in reality relate; to take these estimates referring to only part of a group as complete; to enumerate the events favorable to an argument, omitting the other side; and to argue hastily from effect to cause, this last error being the one most often fathered on to statistics. For all these elementary mistakes in logic, statistics is held responsible." (Sir Arthur L Bowley, "Elements of Statistics", 1901)
"An exceedingly small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation 'approximately'. If that enabled us to predict the succeeding situation with 'the same approximation', that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon. (Jules H Poincaré, "Science and Method", 1908)
"To speak of the cause of an event is therefore misleading. Any set of antecedents from which the event can theoretically be inferred by means of correlations might be called a cause of the event. But to speak of the cause is to imply a uniqueness [...]." (Bertrand Russell, "Mysticism and Logic: And Other Essays", 1910)
"'Causation' has been popularly used to express the condition of association, when applied to natural phenomena. There is no philosophical basis for giving it a wider meaning than partial or absolute association. In no case has it been proved that there is an inherent necessity in the laws of nature. Causation is correlation. [...] perfect correlation, when based upon sufficient experience, is causation in the scientific sense." (Henry E. Niles, "Correlation, Causation and Wright's Theory of 'Path Coefficients'", Genetics, 1922)
"What in the whole denotes a causal equilibrium process, appears for the part as a teleological event." (Ludwig von Bertalanffy, 1929)
"Postulate 1. All chance systems of causes are not alike in the sense that they enable us to predict the future in terms of the past. Postulate 2. Constant systems of chance causes do exist in nature. Postulate 3. Assignable causes of variation may be found and eliminated."(Walter A Shewhart, "Economic Control of Quality of Manufactured Product", 1931)
"To apply the category of cause and effect means to find out which parts of nature stand in this relation. Similarly, to apply the gestalt category means to find out which parts of nature belong as parts to functional wholes, to discover their position in these wholes, their degree of relative independence, and the articulation of larger wholes into sub-wholes." (Kurt Koffka, 1931)
"When the number of factors coming into play in a phenomenological complex is too large, scientific method in most cases fails us. One need only think of the weather, in which case prediction even for a few days ahead is impossible. Nevertheless no one doubts that we are confronted with a causal connection whose causal components are in the main known to us. Occurrences in this domain are beyond the reach of exact prediction because of the variety of factors in operation, not because of any lack of order in nature." (Albert Einstein, "Science and Religion", 1941)
"[...] the conception of chance enters in the very first steps of scientific activity in virtue of the fact that no observation is absolutely correct. I think chance is a more fundamental conception that causality; for whether in a concrete case, a cause-effect relation holds or not can only be judged by applying the laws of chance to the observation." (Max Born, 1949)
"Keep in mind that a correlation may be real and based on real cause and effect, and still be almost worthless in determining action in any single case." (Darell Huff, "How to Lie with Statistics", 1954)
"There is no correlation between the cause and the effect. The events reveal only an aleatory determination, connected not so much with the imperfection of our knowledge as with the structure of the human world." (Raymond Aron, "The Opium of the Intellectuals", 1955)
"The well-known virtue of the experimental method is that it brings situational variables under tight control. It thus permits rigorous tests of hypotheses and confidential statements about causation. The correlational method, for its part, can study what man has not learned to control. Nature has been experimenting since the beginning of time, with a boldness and complexity far beyond the resources of science. The correlator’s mission is to observe and organize the data of nature’s experiments." (Lee J Cronbach, "The Two Disciplines of Scientific Psychology", The American Psychologist Vol. 12, 1957)
"Nature is pleased with simplicity, and affects not the pomp of superfluous causes." (Morris Kline, "Mathematics and the Physical World", 1959)
"Can there be laws of chance? The answer, it would seem should be negative, since chance is in fact defined as the characteristic of the phenomena which follow no law, phenomena whose causes are too complex to permit prediction." (Félix E Borel, "Probabilities and Life", 1962)
"Every part of the system is so related to every other part that a change in a particular part causes a changes in all other parts and in the total system." (Arthur D Hall, "A methodology for systems engineering", 1962)
"Certain properties are necessary or sufficient conditions for other properties, and the network of causal relations thus established will make the occurrence of one property at least tend, subject to the presence of other properties, to promote or inhibit the occurrence of another. Arguments from models involve those analogies which can be used to predict the occurrence of certain properties or events, and hence the relevant relations are causal, at least in the sense of implying a tendency to co-occur." (Mary B Hesse," Models and Analogies in Science", 1963)
"Today we preach that science is not science unless it is quantitative. We substitute correlation for causal studies, and physical equations for organic reasoning. Measurements and equations are supposed to sharpen thinking, but [...] they more often tend to make the thinking non-causal and fuzzy." (John R Platt, "Strong Inference", Science Vol. 146 (3641), 1964)
"Cybernetics, based upon the principle of feedback or circular causal trains providing mechanisms for goal-seeking and self-controlling behavior." (Ludwig von Bertalanffy, "General System Theory", 1968)
"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay W Forrester, "Urban dynamics", 1969)
"Technology can relieve the symptoms of a problem without affecting the underlying causes. Faith in technology as the ultimate solution to all problems can thus divert our attention from the most fundamental problem - the problem of growth in a finite system - and prevent us from taking effective action to solve it." (Donella H Meadows, "The Limits to Growth", 1972)
"The invalid assumption that correlation implies cause is probably among the two or three most serious and common errors of human reasoning." (Stephen J Gould, "The Mismeasure of Man", 1980)
"Correlation and causation are two quite different words, and the innumerate are more prone to mistake them than most." (John A Paulos, "Innumeracy: Mathematical Illiteracy and its Consequences", 1988)
"We use mathematics and statistics to describe the diverse realms of randomness. From these descriptions, we attempt to glean insights into the workings of chance and to search for hidden causes. With such tools in hand, we seek patterns and relationships and propose predictions that help us make sense of the world." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)
"The complexities of cause and effect defy analysis." (Douglas Adams, "Dirk Gently's Holistic Detective Agency", 1987)
"Systems philosophy brings forth a reorganization of ways of thinking. It creates a new worldview, a new paradigm of perception and explanation, which is manifested in integration, holistic thinking, purpose-seeking, mutual causality, and process-focused inquiry." (Béla H Bánáthy, "Systems Design of Education", 1991)
"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)
"Symmetry breaking in psychology is governed by the nonlinear causality of complex systems (the 'butterfly effect'), which roughly means that a small cause can have a big effect. Tiny details of initial individual perspectives, but also cognitive prejudices, may 'enslave' the other modes and lead to one dominant view." (Klaus Mainzer, "Thinking in Complexity", 1994)
"First, social systems are inherently insensitive to most policy changes that people choose in an effort to alter the behavior of systems. In fact, social systems draw attention to the very points at which an attempt to intervene will fail. Human intuition develops from exposure to simple systems. In simple systems, the cause of a trouble is close in both time and space to symptoms of the trouble. If one touches a hot stove, the burn occurs here and now; the cause is obvious. However, in complex dynamic systems, causes are often far removed in both time and space from the symptoms. True causes may lie far back in time and arise from an entirely different part of the system from when and where the symptoms occur. However, the complex system can mislead in devious ways by presenting an apparent cause that meets the expectations derived from simple systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)
"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)
"A good model makes the right strategic simplifications. In fact, a really good model is one that generates a lot of understanding from focusing on a very small number of causal arrows." (Robert M Solow, "How Did Economics Get That Way and What Way Did It Get?", Daedalus, Vol. 126, No. 1, 1997)
"A model is a deliberately simplified representation of a much more complicated situation. […] The idea is to focus on one or two causal or conditioning factors, exclude everything else, and hope to understand how just these aspects of reality work and interact." (Robert M Solow, "How Did Economics Get That Way and What Way Did It Get?", Daedalus, Vol. 126, No. 1, 1997)
"Delay time, the time between causes and their impacts, can highly influence systems. Yet the concept of delayed effect is often missed in our impatient society, and when it is recognized, it’s almost always underestimated. Such oversight and devaluation can lead to poor decision making as well as poor problem solving, for decisions often have consequences that don’t show up until years later. Fortunately, mind mapping, fishbone diagrams, and creativity/brainstorming tools can be quite useful here." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)
"Our simplistic cause-effect analyses, especially when coupled with the desire for quick fixes, usually lead to far more problems than they solve - impatience and knee-jerk reactions included. If we stop for a moment and take a good look our world and its seven levels of complex and interdependent systems, we begin to understand that multiple causes with multiple effects are the true reality, as are circles of causality-effects." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)
"There is a new science of complexity which says that the link between cause and effect is increasingly difficult to trace; that change (planned or otherwise) unfolds in non-linear ways; that paradoxes and contradictions abound; and that creative solutions arise out of diversity, uncertainty and chaos." (Andy P Hargreaves & Michael Fullan, "What’s Worth Fighting for Out There?", 1998)
"We use mathematics and statistics to describe the diverse realms of randomness. From these descriptions, we attempt to glean insights into the workings of chance and to search for hidden causes. With such tools in hand, we seek patterns and relationships and propose predictions that help us make sense of the world." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)
"What it means for a mental model to be a structural analog is that it embodies a representation of the spatial and temporal relations among, and the causal structures connecting the events and entities depicted and whatever other information that is relevant to the problem-solving talks. […] The essential points are that a mental model can be nonlinguistic in form and the mental mechanisms are such that they can satisfy the model-building and simulative constraints necessary for the activity of mental modeling." (Nancy J Nersessian, "Model-based reasoning in conceptual change", 1999)
"Even if our cognitive maps of causal structure were perfect, learning, especially double-loop learning, would still be difficult. To use a mental model to design a new strategy or organization we must make inferences about the consequences of decision rules that have never been tried and for which we have no data. To do so requires intuitive solution of high-order nonlinear differential equations, a task far exceeding human cognitive capabilities in all but the simplest systems." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)
"A model isolates one or a few causal connections, mechanisms, or processes, to the exclusion of other contributing or interfering factors - while in the actual world, those other factors make their effects felt in what actually happens. Models may seem true in the abstract, and are false in the concrete. The key issue is about whether there is a bridge between the two, the abstract and the concrete, such that a simple model can be relied on as a source of relevantly truthful information about the complex reality." (Uskali Mäki, "Fact and Fiction in Economics: Models, Realism and Social Construction", 2002)
"In a complex system, there is no such thing as simple cause and effect." (Margaret J Wheatley, "It's An Interconnected World", 2002)
"Nonetheless, the basic principles regarding correlations between variables are not that diffcult to understand. We must look for patterns that reveal potential relationships and for evidence that variables are actually related. But when we do spot those relationships, we should not jump to conclusions about causality. Instead, we need to weigh the strength of the relationship and the plausibility of our theory, and we must always try to discount the possibility of spuriousness." (Joel Best, "More Damned Lies and Statistics: How numbers confuse public issues", 2004)
"Chance is just as real as causation; both are modes of becoming. The way to model a random process is to enrich the mathematical theory of probability with a model of a random mechanism. In the sciences, probabilities are never made up or 'elicited' by observing the choices people make, or the bets they are willing to place. The reason is that, in science and technology, interpreted probability exactifies objective chance, not gut feeling or intuition. No randomness, no probability." (Mario Bunge, "Chasing Reality: Strife over Realism", 2006)
"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly. A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
"A system is a set of things – people, cells, molecules, or whatever – interconnected in such a way that they produce their own pattern of behavior over time. […] The system, to a large extent, causes its own behavior." (Donella H Meadows, “Thinking in Systems: A Primer”, 2008)
"Traditional statistics is strong in devising ways of describing data and inferring distributional parameters from sample. Causal inference requires two additional ingredients: a science-friendly language for articulating causal knowledge, and a mathematical machinery for processing that knowledge, combining it with data and drawing new causal conclusions about a phenomenon." (Judea Pearl, "Causal inference in statistics: An overview", Statistics Surveys 3, 2009)
"All forms of complex causation, and especially nonlinear transformations, admittedly stack the deck against prediction. Linear describes an outcome produced by one or more variables where the effect is additive. Any other interaction is nonlinear. This would include outcomes that involve step functions or phase transitions. The hard sciences routinely describe nonlinear phenomena. Making predictions about them becomes increasingly problematic when multiple variables are involved that have complex interactions. Some simple nonlinear systems can quickly become unpredictable when small variations in their inputs are introduced." (Richard N Lebow, "Forbidden Fruit: Counterfactuals and International Relations", 2010)
"Cybernetics is the art of creating equilibrium in a world of possibilities and constraints. This is not just a romantic description, it portrays the new way of thinking quite accurately. Cybernetics differs from the traditional scientific procedure, because it does not try to explain phenomena by searching for their causes, but rather by specifying the constraints that determine the direction of their development." (Ernst von Glasersfeld, "Partial Memories: Sketches from an Improbable Life", 2010)
"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)
"System dynamics is an approach to understanding the behaviour of over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. It also helps the decision maker untangle the complexity of the connections between various policy variables by providing a new language and set of tools to describe. Then it does this by modeling the cause and effect relationships among these variables." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics", 2010)
"In dynamical systems, a bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. Generally, at a bifurcation, the local stability properties of equilibria, periodic orbits or other invariant sets changes." (Gregory Faye, "An introduction to bifurcation theory", 2011)
"Each systems archetype embodies a particular theory about dynamic behavior that can serve as a starting point for selecting and formulating raw data into a coherent set of interrelationships. Once those relationships are made explicit and precise, the 'theory' of the archetype can then further guide us in our data-gathering process to test the causal relationships through direct observation, data analysis, or group deliberation." (Daniel H Kim, "Systems Archetypes as Dynamic Theories", The Systems Thinker Vol. 24 (1), 2013)
"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos", 2013)
"A basic problem with MRA is that it typically assumes that the independent variables can be regarded as building blocks, with each variable taken by itself being logically independent of all the others. This is usually not the case, at least for behavioral data. […] Just as correlation doesn’t prove causation, absence of correlation fails to prove absence of causation. False-negative findings can occur using MRA just as false-positive findings do - because of the hidden web of causation that we’ve failed to identify." (Richard E Nisbett, "Mindware: Tools for Smart Thinking", 2015)
"The theory behind multiple regression analysis is that if you control for everything that is related to the independent variable and the dependent variable by pulling their correlations out of the mix, you can get at the true causal relation between the predictor variable and the outcome variable. That’s the theory. In practice, many things prevent this ideal case from being the norm." (Richard E Nisbett, "Mindware: Tools for Smart Thinking", 2015)
"The work around the complex systems map supported a concentration on causal mechanisms. This enabled poor system responses to be diagnosed as the unanticipated effects of previous policies as well as identification of the drivers of the sector. Understanding the feedback mechanisms in play then allowed experimentation with possible future policies and the creation of a coherent and mutually supporting package of recommendations for change." (David C Lane et al, "Blending systems thinking approaches for organisational analysis: reviewing child protection", 2015)
"Correlation is not equivalent to cause for one major reason. Correlation is well defined in terms of a mathematical formula. Cause is not well defined." (David S Salsburg, "Errors, Blunders, and Lies: How to Tell the Difference", 2017)
"Effects without an understanding of the causes behind them, on the other hand, are just bunches of data points floating in the ether, offering nothing useful by themselves. Big Data is information, equivalent to the patterns of light that fall onto the eye. Big Data is like the history of stimuli that our eyes have responded to. And as we discussed earlier, stimuli are themselves meaningless because they could mean anything. The same is true for Big Data, unless something transformative is brought to all those data sets… understanding." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)
"Again, classical statistics only summarizes data, so it does not provide even a language for asking [a counterfactual] question. Causal inference provides a notation and, more importantly, offers a solution. As with predicting the effect of interventions [...], in many cases we can emulate human retrospective thinking with an algorithm that takes what we know about the observed world and produces an answer about the counterfactual world." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)
"Bayesian networks inhabit a world where all questions are reducible to probabilities, or (in the terminology of this chapter) degrees of association between variables; they could not ascend to the second or third rungs of the Ladder of Causation. Fortunately, they required only two slight twists to climb to the top." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)
"Some scientists (e.g., econometricians) like to work with mathematical equations; others (e.g., hard-core statisticians) prefer a list of assumptions that ostensibly summarizes the structure of the diagram. Regardless of language, the model should depict, however qualitatively, the process that generates the data - in other words, the cause-effect forces that operate in the environment and shape the data generated." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)
"The calculus of causation consists of two languages: causal diagrams, to express what we know, and a symbolic language, resembling algebra, to express what we want to know. The causal diagrams are simply dot-and-arrow pictures that summarize our existing scientific knowledge. The dots represent quantities of interest, called 'variables', and the arrows represent known or suspected causal relationships between those variables - namely, which variable 'listens' to which others." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)
"The main differences between Bayesian networks and causal diagrams lie in how they are constructed and the uses to which they are put. A Bayesian network is literally nothing more than a compact representation of a huge probability table. The arrows mean only that the probabilities of child nodes are related to the values of parent nodes by a certain formula (the conditional probability tables) and that this relation is sufficient. That is, knowing additional ancestors of the child will not change the formula. Likewise, a missing arrow between any two nodes means that they are independent, once we know the values of their parents. [...] If, however, the same diagram has been constructed as a causal diagram, then both the thinking that goes into the construction and the interpretation of the final diagram change." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)
"Although to penetrate into the intimate mysteries of nature and hence to learn the true causes of phenomena is not allowed to us, nevertheless it can happen that a certain fictive hypothesis may suffice for explaining many phenomena." (Leonhard Euler)
"Nature is pleased with simplicity, and affects not the pomp of superfluous causes." (Sir Issac Newton)