20 November 2024

Science: On Risk (Quotes)

"A deterministic system is one in which the parts interact in a perfectly predictable way. There is never any room for doubt: given a last state of the system and the programme of information by defining its dynamic network, it is always possible to predict, without any risk of error, its succeeding state. A probabilistic system, on the other hand, is one about which no precisely detailed prediction can be given. The system may be studied intently, and it may become more and more possible to say what it is likely to do in any given circumstances. But the system simply is not predetermined, and a prediction affecting it can never escape from the logical limitations of the probabilities in which terms alone its behaviour can be described." (Stafford Beer, "Cybernetics and Management", 1959)

"It is easy to obtain confirmations, or verifications, for nearly every theory - if we look for confirmations. Confirmations should count only if they are the result of risky predictions. […] A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice. Every genuine test of a theory is an attempt to falsify it, or refute it." (Karl R Popper, "Conjectures and Refutations: The Growth of Scientific Knowledge", 1963)

"Statistical hypothesis testing is commonly used inappropriately to analyze data, determine causality, and make decisions about significance in ecological risk assessment,[...] It discourages good toxicity testing and field studies, it provides less protection to ecosystems or their components that are difficult to sample or replicate, and it provides less protection when more treatments or responses are used. It provides a poor basis for decision-making because it does not generate a conclusion of no effect, it does not indicate the nature or magnitude of effects, it does address effects at untested exposure levels, and it confounds effects and uncertainty[...]. Risk assessors should focus on analyzing the relationship between exposure and effects[...]."  (Glenn W Suter, "Abuse of hypothesis testing statistics in ecological risk assessment", Human and Ecological Risk Assessment 2, 1996)

"Until we can distinguish between an event that is truly random and an event that is the result of cause and effect, we will never know whether what we see is what we'll get, nor how we got what we got. When we take a risk, we are betting on an outcome that will result from a decision we have made, though we do not know for certain what the outcome will be. The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Overcoming innumeracy is like completing a three-step program to statistical literacy. The first step is to defeat the illusion of certainty. The second step is to learn about the actual risks of relevant events and actions. The third step is to communicate the risks in an understandable way and to draw inferences without falling prey to clouded thinking. The general point is this: Innumeracy does not simply reside in our minds but in the representations of risk that we choose." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"The goal of random sampling is to produce a sample that is likely to be representative of the population. Although random sampling does not guarantee that the sample will be representative, it does allow us to assess the risk of an unrepresentative sample. It is the ability to quantify this risk that will enable us to generalize with confidence from a random sample to the corresponding population." (Roxy Peck et al, "Introduction to Statistics and Data Analysis" 4th Ed., 2012)

"Decision trees are an important tool for decision making and risk analysis, and are usually represented in the form of a graph or list of rules. One of the most important features of decision trees is the ease of their application. Being visual in nature, they are readily comprehensible and applicable. Even if users are not familiar with the way that a decision tree is constructed, they can still successfully implement it. Most often decision trees are used to predict future scenarios, based on previous experience, and to support rational decision making." (Jelena Djuris et al, "Neural computing in pharmaceutical products and process development", Computer-Aided Applications in Pharmaceutical Technology, 2013)

"Without context, data is useless, and any visualization you create with it will also be useless. Using data without knowing anything about it, other than the values themselves, is like hearing an abridged quote secondhand and then citing it as a main discussion point in an essay. It might be okay, but you risk finding out later that the speaker meant the opposite of what you thought." (Nathan Yau, "Data Points: Visualization That Means Something", 2013)

"The more complex the system, the more variable (risky) the outcomes. The profound implications of this essential feature of reality still elude us in all the practical disciplines. Sometimes variance averages out, but more often fat-tail events beget more fat-tail events because of interdependencies. If there are multiple projects running, outlier (fat-tail) events may also be positively correlated - one IT project falling behind will stretch resources and increase the likelihood that others will be compromised." (Paul Gibbons, "The Science of Successful Organizational Change",  2015)

"Roughly stated, the No Free Lunch theorem states that in the lack of prior knowledge (i.e. inductive bias) on average all predictive algorithms that search for the minimum classification error (or extremum over any risk metric) have identical performance according to any measure." (N D Lewis, "Deep Learning Made Easy with R: A Gentle Introduction for Data Science", 2016)

"Premature enumeration is an equal-opportunity blunder: the most numerate among us may be just as much at risk as those who find their heads spinning at the first mention of a fraction. Indeed, if you’re confident with numbers you may be more prone than most to slicing and dicing, correlating and regressing, normalizing and rebasing, effortlessly manipulating the numbers on the spreadsheet or in the statistical package - without ever realizing that you don’t fully understand what these abstract quantities refer to. Arguably this temptation lay at the root of the last financial crisis: the sophistication of mathematical risk models obscured the question of how, exactly, risks were being measured, and whether those measurements were something you’d really want to bet your global banking system on." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Behavioral finance so far makes conclusions from statics not dynamics, hence misses the picture. It applies trade-offs out of context and develops the consensus that people irrationally overestimate tail risk (hence need to be 'nudged' into taking more of these exposures). But the catastrophic event is an absorbing barrier. No risky exposure can be analyzed in isolation: risks accumulate. If we ride a motorcycle, smoke, fly our own propeller plane, and join the mafia, these risks add up to a near-certain premature death. Tail risks are not a renewable resource." (Nassim N Taleb, "Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications" 2nd Ed., 2022)

"Any time you run regression analysis on arbitrary real-world observational data, there’s a significant risk that there’s hidden confounding in your dataset and so causal conclusions from such analysis are likely to be (causally) biased." (Aleksander Molak, "Causal Inference and Discovery in Python", 2023)

"[Making reasoned macro calls] starts with having the best and longest-time-series data you can find. You may have to take some risks in terms of the quality of data sources, but it amazes me how people are often more willing to act based on little or no data than to use data that is a challenge to assemble." (Robert J Shiller)

19 November 2024

Science: On Evidence (Quotes)

"Rule 1. Original data should be presented in a way that will preserve the evidence in the original data for all the predictions assumed to be useful." (Walter A Shewhart, "Economic Control of Quality of Manufactured Product", 1931)

"Rule 2. Any summary of a distribution of numbers in terms of symmetric functions should not give an objective degree of belief in any one of the inferences or predictions to be made therefrom that would cause human action significantly different from what this action would be if the original distributions had been taken as evidence." (Walter A Shewhart, "Economic Control of Quality of Manufactured Product", 1931)

"[...] there is evidence that significance tests have been a genuine block to achieving [...] knowledge." (Denton E Morrison & Ramon E Henkel, "Significance tests reconsidered", The American Sociologist 4, 1969)

"Confidence intervals give a feeling of the uncertainty of experimental evidence, and (very important) give it in the same units [...] as the original observations." (Mary G Natrella, "The relation between confidence intervals and tests of significance", American Statistician 14, 1960)

"The null-hypothesis significance test treats ‘acceptance’ or ‘rejection’ of a hypothesis as though these were decisions one makes. But a hypothesis is not something, like a piece of pie offered for dessert, which can be accepted or rejected by a voluntary physical action. Acceptance or rejection of a hypothesis is a cognitive process, a degree of believing or disbelieving which, if rational, is not a matter of choice but determined solely by how likely it is, given the evidence, that the hypothesis is true." (William W Rozeboom, "The fallacy of the null–hypothesis significance test", Psychological Bulletin 57, 1960)

"Scientific discovery, or the formulation of scientific theory, starts in with the unvarnished and unembroidered evidence of the senses. It starts with simple observation - simple, unbiased, unprejudiced, naive, or innocent observation - and out of this sensory evidence, embodied in the form of simple propositions or declarations of fact, generalizations will grow up and take shape, almost as if some process of crystallization or condensation were taking place. Out of a disorderly array of facts, an orderly theory, an orderly general statement, will somehow emerge." (Sir Peter B Medawar, "Is the Scientific Paper Fraudulent?", The Saturday Review, 1964)

"We have overwhelming evidence that available information plus analysis does not lead to knowledge. The management science team can properly analyse a situation and present recommendations to the manager, but no change occurs. The situation is so familiar to those of us who try to practice management science that I hardly need to describe the cases." (C West Churchman, "Managerial acceptance of scientific recommendations", California Management Review Vol 7, 1964)

"Science consists simply of the formulation and testing of hypotheses based on observational evidence; experiments are important where applicable, but their function is merely to simplify observation by imposing controlled conditions." (Henry L Batten, "Evolution of the Earth", 1971)

"Statistics is a body of methods and theory applied to numerical evidence in making decisions in the face of uncertainty." (Lawrence Lapin, "Statistics for Modern Business Decisions", 1973)

"The language of association and prediction is probably most often used because the evidence seems insufficient to justify a direct causal statement. A better practice is to state the causal hypothesis and then to present the evidence along with an assessment with respect to the causal hypothesis - instead of letting the quality of the data determine the language of the explanation." (Edward R Tufte, "Data Analysis for Politics and Policy", 1974)

"All interpretations made by a scientist are hypotheses, and all hypotheses are tentative. They must forever be tested and they must be revised if found to be unsatisfactory. Hence, a change of mind in a scientist, and particularly in a great scientist, is not only not a sign of weakness but rather evidence for continuing attention to the respective problem and an ability to test the hypothesis again and again." (Ernst Mayr, "The Growth of Biological Thought: Diversity, Evolution and Inheritance", 1982)

"It has been widely felt, probably for thirty years and more, that significance tests are overemphasized and often misused and that more emphasis should be put on estimation and prediction. While such a shift of emphasis does seem to be occurring, for example in medical statistics, the continued very extensive use of significance tests is on the one hand alarming and on the other evidence that they are aimed, even if imperfectly, at some widely felt need." (David R Cox, "Some general aspects of the theory of statistics", International Statistical Review 54, 1986)

"Like a detective, a data analyst will experience many dead ends, retrace his steps, and explore many alternatives before settling on a single description of the evidence in front of him." (David Lubinsky & Daryl Pregibon , "Data analysis as search", Journal of Econometrics Vol. 38 (1–2), 1988)

"Subjective probability, also known as Bayesian statistics, pushes Bayes' theorem further by applying it to statements of the type described as 'unscientific' in the frequency definition. The probability of a theory (e.g. that it will rain tomorrow or that parity is not violated) is considered to be a subjective 'degree of belief - it can perhaps be measured by seeing what odds the person concerned will offer as a bet. Subsequent experimental evidence then modifies the initial degree of belief, making it stronger or weaker according to whether the results agree or disagree with the predictions of the theory in question." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Probability theory is an ideal tool for formalizing uncertainty in situations where class frequencies are known or where evidence is based on outcomes of a sufficiently long series of independent random experiments. Possibility theory, on the other hand, is ideal for formalizing incomplete information expressed in terms of fuzzy propositions." (George Klir, "Fuzzy sets and fuzzy logic", 1995)

"[…] the simplest hypothesis proposed as an explanation of phenomena is more likely to be the true one than is any other available hypothesis, that its predictions are more likely to be true than those of any other available hypothesis, and that it is an ultimate a priori epistemic principle that simplicity is evidence for truth." (Richard Swinburne, "Simplicity as Evidence for Truth", 1997)

"When significance tests are used and a null hypothesis is not rejected, a major problem often arises - namely, the result may be interpreted, without a logical basis, as providing evidence for the null hypothesis." (David F Parkhurst, "Statistical Significance Tests: Equivalence and Reverse Tests Should Reduce Misinterpretation", BioScience Vol. 51 (12), 2001)

"One cautious approach is represented by Bernoulli’s more conservative outlook. If there are very strong reasons for believing that an observation has suffered an accident that made the value in the data-file thoroughly untrustworthy, then reject it; in the absence of clear evidence that an observation, identified by formal rule as an outlier, is unacceptable then retain it unless there is lack of trust that the laboratory obtaining it is conscientiously operated by able persons who have [...] taken every care.'" (David Finney, "Calibration Guidelines Challenge Outlier Practices", The American Statistician Vol 60 (4), 2006)

"Scholars feel the need to present tables of model parameters in academic articles (perhaps just as evidence that they ran the analysis they claimed to have run), but these tables are rarely interpreted other than for their sign and statistical significance. Most of the numbers in these tables are never even discussed in the text. From the perspective of the applied data analyst, R packages without procedures to compute quantities of scientific interest are woefully incomplete. A better approach focuses on quantities of direct scientific interest rather than uninterpretable model parameters. [...] For each quantity of interest, the user needs some summary that includes a point estimate and a measure of uncertainty such as a standard error, confidence interval, or a distribution. The methods of calculating these differ greatly across theories of inference and methods of analysis. However, from the user’s perspective, the result is almost always the same: the point estimate and uncertainty of some quantity of interest." (Kousuke Imai et al, "Toward a Common Framework for Statistical Analysis and Development", Journal of Computational and Graphical Statistics vol. 17, 2008)

"Data analysis is careful thinking about evidence." (Michael Milton, "Head First Data Analysis", 2009)

"Data clusters are everywhere, even in random data. Someone who looks for an explanation will inevitably find one, but a theory that fits a data cluster is not persuasive evidence. The found explanation needs to make sense and it needs to be tested with uncontaminated data." (Gary Smith, "Standard Deviations", 2014)

"In general, when building statistical models, we must not forget that the aim is to understand something about the real world. Or predict, choose an action, make a decision, summarize evidence, and so on, but always about the real world, not an abstract mathematical world: our models are not the reality - a point well made by George Box in his oft-cited remark that "all models are wrong, but some are useful". (David Hand, "Wonderful examples, but let's not close our eyes", Statistical Science 29, 2014)

"The dialectical interplay of experiment and theory is a key driving force of modern science. Experimental data do only have meaning in the light of a particular model or at least a theoretical background. Reversely theoretical considerations may be logically consistent as well as intellectually elegant: Without experimental evidence they are a mere exercise of thought no matter how difficult they are. Data analysis is a connector between experiment and theory: Its techniques advise possibilities of model extraction as well as model testing with experimental data." (Achim Zielesny, "From Curve Fitting to Machine Learning" 2nd Ed., 2016)

"In terms of characteristics, a data scientist has an inquisitive mind and is prepared to explore and ask questions, examine assumptions and analyse processes, test hypotheses and try out solutions and, based on evidence, communicate informed conclusions, recommendations and caveats to stakeholders and decision makers." (Jesús Rogel-Salazar, "Data Science and Analytics with Python", 2017)

"With the growing availability of massive data sets and user-friendly analysis software, it might be thought that there is less need for training in statistical methods. This would be naïve in the extreme. Far from freeing us from the need for statistical skills, bigger data and the rise in the number and complexity of scientific studies makes it even more difficult to draw appropriate conclusions. More data means that we need to be even more aware of what the evidence is actually worth." (David Spiegelhalter, "The Art of Statistics: Learning from Data", 2019)

"The general principles of starting with a well-defined question, engaging in careful observation, and then formulating hypotheses and assessing the strength of evidence for and against them became known as the scientific method." (Michael Friendly & Howard Wainer, "A History of Data Visualization and Graphic Communication", 2021)

"Absence of evidence is not evidence of absence." (Martin Rees)

"The deepest sin of the human mind is to believe things without evidence." (Thomas H Huxley)

18 November 2024

Science: On Truth in Models (Quotes)

"A model, like a novel, may resonate with nature, but it is not a ‘real’ thing. Like a novel, a model may be convincing - it may ‘ring true’ if it is consistent with our experience of the natural world. But just as we may wonder how much the characters in a novel are drawn from real life and how much is artifice, we might ask the same of a model: How much is based on observation and measurement of accessible phenomena, how much is convenience? Fundamentally, the reason for modeling is a lack of full access, either in time or space, to the phenomena of interest." (Kenneth Belitz, Science, Vol. 263, 1944)

"Exact truth of a null hypothesis is very unlikely except in a genuine uniformity trial." (David R Cox, "Some problems connected with statistical inference", Annals of Mathematical Statistics 29, 1958)

"[…] no models are [true] = not even the Newtonian laws. When you construct a model you leave out all the details which you, with the knowledge at your disposal, consider inessential. […] Models should not be true, but it is important that they are applicable, and whether they are applicable for any given purpose must of course be investigated. This also means that a model is never accepted finally, only on trial." (Georg Rasch, "Probabilistic Models for Some Intelligence and Attainment Tests", 1960)

"The validation of a model is not that it is 'true' but that it generates good testable hypotheses relevant to important problems." (Richard Levins, "The Strategy of Model Building in Population Biology", 1966)

"A theory has only the alternative of being right or wrong. A model has a third possibility: it may be right, but irrelevant." (Manfred Eigen, 1973)

"Models, of course, are never true, but fortunately it is only necessary that they be useful. For this it is usually needful only that they not be grossly wrong. I think rather simple modifications of our present models will prove adequate to take account of most realities of the outside world. The difficulties of computation which would have been a barrier in the past need not deter us now." (George E P Box, "Some Problems of Statistics and Everyday Life", Journal of the American Statistical Association, Vol. 74 (365), 1979)

"The purpose of an experiment is to answer questions. The truth of this seems so obvious, that it would not be worth emphasizing were it not for the fact that the results of many experiments are interpreted and presented with little or no reference to the questions that were asked in the first place."  (Thomas M Little, "Interpretation and presentation of results", Hortscience 16, 1981)

"The fact that [the model] is an approximation does not necessarily detract from its usefulness because models are approximations. All models are wrong, but some are useful." (George Box, 1987)

"A null hypothesis that yields under two different treatments have identical expectations is scarcely very plausible, and its rejection by a significance test is more dependent upon the size of an experiment than upon its untruth." (David J Finney, "Was this in your statistics textbook?", Experimental Agriculture 24, 1988)

"The motivation for any action on outliers must be to improve interpretation of data without ignoring unwelcome truth. To remove bad and untrustworthy data is a laudable ambition, but naive and untested rules may bring harm rather than benefit." (David Finney, "Calibration Guidelines Challenge Outlier Practices", The American Statistician Vol 60 (4), 2006) 

"You might say that there’s no reason to bother with model checking since all models are false anyway. I do believe that all models are false, but for me the purpose of model checking is not to accept or reject a model, but to reveal aspects of the data that are not captured by the fitted model." (Andrew Gelman, "Some thoughts on the sociology of statistics", 2007)

"If students have students have no experience with hands-on [telescope] observing, they may take all data as ‘truth’ without having an understanding of how the data are obtained and what could potentially go wrong in that process, so I think it becomes crucially important to give a glimpse of what’s happening behind the scenes at telescopes, so they can be appropriately skeptical users of data in the future." (Colette Salyk, Sky & Telescope, 2022)

"On a final note, we would like to stress the importance of design, which often does not receive the attention it deserves. Sometimes, the large number of modeling options for spatial analysis may raise the false impression that design does not matter, and that a sophisticated analysis takes care of everything. Nothing could be further from the truth." (Hans-Peter Piepho et al, "Two-dimensional P-spline smoothing for spatial analysis of plant breeding trials", “Biometrical Journal”, 2022)

17 November 2024

Science: On Confidence Intervals (Quotes)

"Confidence intervals give a feeling of the uncertainty of experimental evidence, and (very important) give it in the same units [...] as the original observations." (Mary G Natrella, "The relation between confidence intervals and tests of significance", American Statistician 14, 1960)

"I do not think that significance testing should be completely abandoned [...] and I don’t expect that it will be. But I urge researchers to provide estimates, with confidence intervals: scientific advance requires parameters with known reliability estimates. Classical confidence intervals are formally equivalent to a significance test, but they convey more information." (Nigel G Yoccoz, "Use, Overuse, and Misuse of Significance Tests in Evolutionary Biology and Ecology", Bulletin of the Ecological Society of America Vol. 72 (2), 1991)

"Whereas hypothesis testing emphasizes a very narrow question (‘Do the population means fail to conform to a specific pattern?’), the use of confidence intervals emphasizes a much broader question (‘What are the population means?’). Knowing what the means are, of course, implies knowing whether they fail to conform to a specific pattern, although the reverse is not true. In this sense, use of confidence intervals subsumes the process of hypothesis testing." (Geoffrey R Loftus, "On the tyranny of hypothesis testing in the social sciences", Contemporary Psychology 36, 1991)

"Probabilistic inference is the classical paradigm for data analysis in science and technology. It rests on a foundation of randomness; variation in data is ascribed to a random process in which nature generates data according to a probability distribution. This leads to a codification of uncertainly by confidence intervals and hypothesis tests." (William S Cleveland, "Visualizing Data", 1993)

"We should push for de-emphasizing some topics, such as statistical significance tests - an unfortunate carry-over from the traditional elementary statistics course. We would suggest a greater focus on confidence intervals - these achieve the aim of formal hypothesis testing, often provide additional useful information, and are not as easily misinterpreted." (Gerry Hahn et al, "The Impact of Six Sigma Improvement: A Glimpse Into the Future of Statistics", The American Statistician, 1999)

"Precision does not vary linearly with increasing sample size. As is well known, the width of a confidence interval is a function of the square root of the number of observations. But it is more complicate than that. The basic elements determining a confidence interval are the sample size, an estimate of variability, and a pivotal variable associated with the estimate of variability." (Gerald van Belle, "Statistical Rules of Thumb", 2002)

"The important thing is to understand that frequentist and Bayesian methods are answering different questions. To combine prior beliefs with data in a principled way, use Bayesian inference. To construct procedures with guaranteed long run performance, such as confidence intervals, use frequentist methods. Generally, Bayesian methods run into problems when the parameter space is high dimensional." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004) 

"There is a tendency to use hypothesis testing methods even when they are not appropriate. Often, estimation and confidence intervals are better tools. Use hypothesis testing only when you want to test a well-defined hypothesis." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Scholars feel the need to present tables of model parameters in academic articles (perhaps just as evidence that they ran the analysis they claimed to have run), but these tables are rarely interpreted other than for their sign and statistical significance. Most of the numbers in these tables are never even discussed in the text. From the perspective of the applied data analyst, R packages without procedures to compute quantities of scientific interest are woefully incomplete. A better approach focuses on quantities of direct scientific interest rather than uninterpretable model parameters. [...] For each quantity of interest, the user needs some summary that includes a point estimate and a measure of uncertainty such as a standard error, confidence interval, or a distribution. The methods of calculating these differ greatly across theories of inference and methods of analysis. However, from the user’s perspective, the result is almost always the same: the point estimate and uncertainty of some quantity of interest." (Kousuke Imai et al, "Toward a Common Framework for Statistical Analysis and Development", Journal of Computational and Graphical Statistics vol. 17, 2008)

"Given the important role that correlation plays in structural equation modeling, we need to understand the factors that affect establishing relationships among multivariable data points. The key factors are the level of measurement, restriction of range in data values (variability, skewness, kurtosis), missing data, nonlinearity, outliers, correction for attenuation, and issues related to sampling variation, confidence intervals, effect size, significance, sample size, and power." (Randall E Schumacker & Richard G Lomax, "A Beginner’s Guide to Structural Equation Modeling" 3rd Ed., 2010)

"A complete data analysis will involve the following steps: (i) Finding a good model to fit the signal based on the data. (ii) Finding a good model to fit the noise, based on the residuals from the model. (iii) Adjusting variances, test statistics, confidence intervals, and predictions, based on the model for the noise.(DeWayne R Derryberry, "Basic data analysis for time series with R", 2014)

"For a confidence interval, the central limit theorem plays a role in the reliability of the interval because the sample mean is often approximately normal even when the underlying data is not. A prediction interval has no such protection. The shape of the interval reflects the shape of the underlying distribution. It is more important to examine carefully the normality assumption by checking the residuals […].(DeWayne R Derryberry, "Basic data analysis for time series with R", 2014)

"Samples give us estimates of something, and they will almost always deviate from the true number by some amount, large or small, and that is the margin of error. […] The margin of error does not address underlying flaws in the research, only the degree of error in the sampling procedure. But ignoring those deeper possible flaws for the moment, there is another measurement or statistic that accompanies any rigorously defined sample: the confidence interval." (Daniel J Levitin, "Weaponized Lies", 2017)

"The margin of error is how accurate the results are, and the confidence interval is how confident you are that your estimate falls within the margin of error." (Daniel J Levitin, "Weaponized Lies", 2017)

"[...] a hypothesis test tells us whether the observed data are consistent with the null hypothesis, and a confidence interval tells us which hypotheses are consistent with the data." (William C Blackwelder)

12 October 2024

Mind: On Neuro-Linguistic Programming [NLP] (Quotes)

"Neuro-Linguistic programming is the study of our subjective experience; how we create what passes for reality in our minds. [...] NLP also studies brilliance and quality - how outstanding individuals and organisations get their outstanding results." (Joseph O’Connor, "Leading With NLP: Essential Leadership Skills for Influencing and Managing People", 1998)

"NLP contains a set of principles and distinctions which are uniquely suited to analyze and identify crucial patterns of values, behavior and interrelationships so that they may be put into pragmatic and testable implementations." (Robert B Dilts, "Modeling with NLP", 1998)

"NLP is the process by which the relevant pieces of these people's behavior was discovered and then organized into a working model." (Robert B Dilts, "Modeling with NLP", 1998)

"NLP operates from the assumption that the map is not the territory. As human beings, we can never know reality, in the sense that we have to experience reality through our senses and our senses are limited. [...] We can only make maps of the reality around us through the information that we receive through our senses and the connection of that information to our own personal memories and other experiences. Therefore, we don't tend to respond to reality itself, but rather to our own maps of reality." (Robert B Dilts, "Modeling with NLP", 1998)

"One of the goals of NLP is to identify problematic generalizations, deletions or distortions through the analysis of the 'syntax' or form of the surface structure and provide a system of tools so that a more enriched representation of the deep structure may be attained. Another goal of NLP, represented by the modeling process, is to be able to create better links and pathways between surface structures and deep structures." (Robert B Dilts, "Modeling with NLP", 1998)

"The focus of most NLP modeling processes is at the level of capabilities, the how to level. Capabilities connect beliefs and values to specific behaviors. Without the how, knowing what one is supposed to do, and even why to do it, is largely ineffective. Capabilities and skills provide the links and leverage to manifest our vision, identity, values and beliefs as actions in a particular environment." (Robert B Dilts, "Modeling with NLP", 1998)

"The objective of the NLP modeling process is not to end up with the one 'right' or 'true' description of a particular person's thinking process, but rather to make an instrumental map that allows us to apply the strategies that we have modeled in some useful way. An 'instrumental map' is one that allows us to act more effectively - the 'accuracy' or 'reality' of the map is less important than its 'usefulness'." (Robert B Dilts, "Modeling with NLP", 1998)

"[...] the philosophy of NLP is that effective learning and change involves initially setting goals, evidence and evidence procedures to reach a particular desired state. A wide coverage of strategies and activities are then provided in order to be able to vary the operations applied to reach goals." (Robert B Dilts, "Modeling with NLP", 1998)

"The primary function of NLP tools and techniques is to help to widen, enrich or add to our maps of the world. The basic presupposition of NLP is that the richer your map of the world is, the more possibilities that you have of dealing with whatever challenges occur in reality." (Robert B Dilts, "Modeling with NLP", 1998)

31 December 2023

Systems Thinking: On Iteration (Quotes)

"Statistical methods are tools of scientific investigation. Scientific investigation is a controlled learning process in which various aspects of a problem are illuminated as the study proceeds. It can be thought of as a major iteration within which secondary iterations occur. The major iteration is that in which a tentative conjecture suggests an experiment, appropriate analysis of the data so generated leads to a modified conjecture, and this in turn leads to a new experiment, and so on." (George E P Box & George C Tjao, "Bayesian Inference in Statistical Analysis", 1973)

"Apart from power laws, iteration is one of the prime sources of self-similarity. Iteration here means the repeated application of some rule or operation - doing the same thing over and over again. […] A concept closely related to iteration is recursion. In an age of increasing automation and computation, many processes and calculations are recursive, and if a recursive algorithm is in fact repetitious, self-similarity is waiting in the wings."(Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Understandably, invariant sets (and their complements) play a crucial role in dynamic systems in general because they tell the most important fact about any initial condition, namely, its eventual fate: will the iterates be bounded, or will they be unstable and diverge? Or will the orbit be periodic or aperiodic?" (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Chaos has three primary features: unpredictability, boundedness, and sensitivity to initial conditions. Unpredictability means that a sequence of numbers that is generated from a chaotic function does not repeat. This principle is perhaps a matter of degree, because some of the numbers could look as though they are recurring only because they are rounded to a convenient number of decimal points. [...] Boundedness means that, for all the unpredictability of motion, all points remain within certain boundaries. The principle of sensitivity to initial conditions means that two points that start off as arbitrarily close together become exponentially farther away from each other as the iteration process proceeds. This is a clear case of small differences producing a huge effect." (Stephen J Guastello & Larry S Liebovitch, "Introduction to Nonlinear Dynamics and Complexity" [in "Chaos and Complexity in Psychology"], 2009)

"Nature's tendency for iteration, pattern formation, and creation of order out of chaos creates expectations of predictability. It seems, however, that nature, because of varying degrees of interaction between chance and choice, and the nonlinearity of systems, escapes the boredom of predictability." (Jamshid Gharajedaghi, "Systems Thinking: Managing Chaos and Complexity A Platform for Designing Business Architecture" 3rd Ed., 2011)

"Geometric pattern repeated at progressively smaller scales, where each iteration is about a reproduction of the image to produce completely irregular shapes and surfaces that can not be represented by classical geometry. Fractals are generally self-similar (each section looks at all) and are not subordinated to a specific scale. They are used especially in the digital modeling of irregular patterns and structures in nature." (Mauro Chiarella, "Folds and Refolds: Space Generation, Shapes, and Complex Components", 2016)

"[...] perhaps one of the most important features of complex systems, which is a key differentiator when comparing with chaotic systems, is the concept of emergence. Emergence 'breaks' the notion of determinism and linearity because it means that the outcome of these interactions is naturally unpredictable. In large systems, macro features often emerge in ways that cannot be traced back to any particular event or agent. Therefore, complexity theory is based on interaction, emergence and iterations." (Luis Tomé & Şuay Nilhan Açıkalın, "Complexity Theory as a New Lens in IR: System and Change" [in "Chaos, Complexity and Leadership 2017", Şefika Şule Erçetin & Nihan Potas], 2019)

26 December 2023

Systems Thinking: On Periodicity (Quotes)

"Since a given system can never of its own accord go over into another equally probable state but into a more probable one, it is likewise impossible to construct a system of bodies that after traversing various states returns periodically to its original state, that is a perpetual motion machine." (Ludwig E Boltzmann, "The Second Law of Thermodynamics", [Address to a Formal meeting of the Imperial Academy of Science], 1886)

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions. (Edward N Lorenz, "Deterministic Nonperiodic Flow", Journal of the Atmospheric Science 20, 1963)

"Now, the main problem with a quasiperiodic theory of turbulence (putting several oscillators together) is the following: when there is a nonlinear coupling between the oscillators, it very often happens that the time evolution does not remain quasiperiodic. As a matter of fact, in this latter situation, one can observe the appearance of a feature which makes the motion completely different from a quasiperiodic one. This feature is called sensitive dependence on initial conditions and turns out to be the conceptual key to reformulating the problem of turbulence." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"All physical objects that are 'self-similar' have limited self-similarity - just as there are no perfectly periodic functions, in the mathematical sense, in the real world: most oscillations have a beginning and an end (with the possible exception of our universe, if it is closed and begins a new life cycle after every 'big crunch' […]. Nevertheless, self-similarity is a useful  abstraction, just as periodicity is one of the most useful concepts in the sciences, any finite extent notwithstanding." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Clearly, however, a zero probability is not the same thing as an impossibility; […] In systems that are now called chaotic, most initial states are followed by nonperiodic behavior, and only a special few lead to periodicity. […] In limited chaos, encountering nonperiodic behavior is analogous to striking a point on the diagonal of the square; although it is possible, its probability is zero. In full chaos, the probability of encountering periodic behavior is zero." (Edward N Lorenz, "The Essence of Chaos", 1993)

"The description of the evolutionary trajectory of dynamical systems as irreversible, periodically chaotic, and strongly nonlinear fits certain features of the historical development of human societies. But the description of evolutionary processes, whether in nature or in history, has additional elements. These elements include such factors as the convergence of existing systems on progressively higher organizational levels, the increasingly efficient exploitation by systems of the sources of free energy in their environment, and the complexification of systems structure in states progressively further removed from thermodynamic equilibrium." (Ervin László et al, "The Evolution of Cognitive Maps: New Paradigms for the Twenty-first Century", 1993) 

"There is no question but that the chains of events through which chaos can develop out of regularity, or regularity out of chaos, are essential aspects of families of dynamical systems [...]  Sometimes [...] a nearly imperceptible change in a constant will produce a qualitative change in the system’s behaviour: from steady to periodic, from steady or periodic to almost periodic, or from steady, periodic, or almost periodic to chaotic. Even chaos can change abruptly to more complicated chaos, and, of course, each of these changes can proceed in the opposite direction. Such changes are called bifurcations." (Edward Lorenz, "The Essence of Chaos", 1993)

"As with subtle bifurcations, catastrophes also involve a control parameter. When the value of that parameter is below a bifurcation point, the system is dominated by one attractor. When the value of that parameter is above the bifurcation point, another attractor dominates. Thus the fundamental characteristic of a catastrophe is the sudden disappearance of one attractor and its basin, combined with the dominant emergence of another attractor. Any type of attractor static, periodic, or chaotic can be involved in this. Elementary catastrophe theory involves static attractors, such as points. Because multidimensional surfaces can also attract (together with attracting points on these surfaces), we refer to them more generally as attracting hypersurfaces, limit sets, or simply attractors." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"In addition to dimensionality requirements, chaos can occur only in nonlinear situations. In multidimensional settings, this means that at least one term in one equation must be nonlinear while also involving several of the variables. With all linear models, solutions can be expressed as combinations of regular and linear periodic processes, but nonlinearities in a model allow for instabilities in such periodic solutions within certain value ranges for some of the parameters." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"In colloquial usage, chaos means a state of total disorder. In its technical sense, however, chaos refers to a state that only appears random, but is actually generated by nonrandom laws. As such, it occupies an unfamiliar middle ground between order and disorder. It looks erratic superficially, yet it contains cryptic patterns and is governed by rigid rules. It's predictable in the short run but unpredictable in the long run. And it never repeats itself: Its behavior is nonperiodic." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The existence of equilibria or steady periodic solutions is not sufficient to determine if a system will actually behave that way. The stability of these solutions must also be checked. As parameters are changed, a stable motion can become unstable and new solutions may appear. The study of the changes in the dynamic behavior of systems as parameters are varied is the subject of bifurcation theory. Values of the parameters at which the qualitative or topological nature of the motion changes are known as critical or bifurcation values." (Francis C Moona, "Nonlinear Dynamics", 2003)

"A moderate amount of noise leads to enhanced order in excitable systems, manifesting itself in a nearly periodic spiking of single excitable systems, enhancement of synchronized oscillations in coupled systems, and noise-induced stability of spatial pattens in reaction-diffusion systems." (Benjamin Lindner et al, "Effects of Noise in Excitable Systems", Physical Reports. vol. 392, 2004)

"A typical control goal when controlling chaotic systems is to transform a chaotic trajectory into a periodic one. In terms of control theory it means stabilization of an unstable periodic orbit or equilibrium. A specific feature of this problem is the possibility of achieving the goal by means of an arbitrarily small control action. Other control goals like synchronization and chaotization can also be achieved by small control in many cases." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"In parametrized dynamical systems a bifurcation occurs when a qualitative change is invoked by a change of parameters. In models such a qualitative change corresponds to transition between dynamical regimes. In the generic theory a finite list of cases is obtained, containing elements like ‘saddle-node’, ‘period doubling’, ‘Hopf bifurcation’ and many others." (Henk W Broer & Heinz Hanssmann, "Hamiltonian Perturbation Theory (and Transition to Chaos)", 2009)

"In fact, contrary to intuition, some of the most complicated dynamics arise from the simplest equations, while complicated equations often produce very simple and uninteresting dynamics. It is nearly impossible to look at a nonlinear equation and predict whether the solution will be chaotic or otherwise complicated. Small variations of a parameter can change a chaotic system into a periodic one, and vice versa." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"The main defining feature of chaos is the sensitive dependence on initial conditions. Two nearby initial conditions on the attractor or in the chaotic sea separate by a distance that grows exponentially in time when averaged along the trajectory, leading to long-term unpredictability. The Lyapunov exponent is the average rate of growth of this distance, with a positive value signifying sensitive dependence (chaos), a zero value signifying periodicity (or quasiperiodicity), and a negative value signifying a stable equilibrium." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"In dynamical systems, a bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. Generally, at a bifurcation, the local stability properties of equilibria, periodic orbits or other invariant sets changes." (Gregory Faye, "An introduction to bifurcation theory", 2011)

"Chaos is just one phenomenon out of many that are encountered in the study of dynamical systems. In addition to behaving chaotically, systems may show fixed equilibria, simple periodic cycles, and more complicated behaviors that defy easy categorization. The study of dynamical systems holds many surprises and shows that the relationships between order and disorder, simplicity and complexity, can be subtle, and counterintuitive." (David P Feldman, "Chaos and Fractals: An Elementary Introduction", 2012)

"A limit cycle is an isolated closed trajectory. Isolated means that neighboring trajectories are not closed; they spiral either toward or away from the limit cycle. If all neighboring trajectories approach the limit cycle, we say the limit cycle is stable or attracting. Otherwise the limit cycle is unstable, or in exceptional cases, half-stable. Stable limit cycles are very important scientifically - they model systems that exhibit self-sustained oscillations. In other words, these systems oscillate even in the absence of external periodic forcing." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

19 December 2023

Systems Thinking: On Robustness (Quotes)

"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"Through self-organization, the behavior of the group emerges from the collective interactions of all the individuals. In fact, a major recurring theme in swarm intelligence (and of complexity science in general) is that even if individuals follow simple rules, the resulting group behavior can be surprisingly complex - and remarkably effective. And, to a large extent, flexibility and robustness result from self-organization." (Eric Bonabeau & Christopher Meyer, "Swarm Intelligence: A Whole New Way to Think About Business", Harvard Business Review, 2001)

"Most systems displaying a high degree of tolerance against failures are a common feature: Their functionality is guaranteed by a highly interconnected complex network. A cell's robustness is hidden in its intricate regulatory and metabolic network; society's resilience is rooted in the interwoven social web; the economy's stability is maintained by a delicate network of financial and regulator organizations; an ecosystem's survivability is encoded in a carefully crafted web of species interactions. It seems that nature strives to achieve robustness through interconnectivity. Such universal choice of a network architecture is perhaps more than mere coincidences." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Swarm Intelligence can be defined more precisely as: Any attempt to design algorithms or distributed problem-solving methods inspired by the collective behavior of the social insect colonies or other animal societies. The main properties of such systems are flexibility, robustness, decentralization and self-organization." ("Swarm Intelligence in Data Mining", Ed. Ajith Abraham et al, 2006)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach [...]. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed." (Michael J North & Charles M Macal, Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation, 2007)

"In that sense, a self-organizing system is intrinsically adaptive: it maintains its basic organization in spite of continuing changes in its environment. As noted, perturbations may even make the system more robust, by helping it to discover a more stable organization." (Francis Heylighen, "Complexity and Self-Organization", 2008)

"The concept of stress tests is derived from the procedures used to ensure the robustness of complex engineering structures. There are three stages. You begin by testing each component in conditions considerably more demanding than it is likely to encounter. Then, you review system design to ensure that, even if several elements break down simultaneously, this does not jeopardise the integrity of the whole structure. Third, and most importantly, you test the total system for outcomes far outside the range of experience. You do not ask, 'Will the bridge survive a strong gust of wind?' You ask, 'Will it survive a gale worse than any at this site in the last century?'" (John Kay, The Financial Times, 2010)

"Chaos provides order. Chaotic agitation and motion are needed to create overall, repetitive order. This ‘order through fluctuations’ keeps dynamic markets stable and evolutionary processes robust. In essence, chaos is a phase transition that gives spontaneous energy the means to achieve repetitive and structural order." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)

"The model should be robust under extreme conditions. There is an important direct structure test to the robustness of the model under direct extreme conditions, and it evaluates the validity of the equations under extreme conditions by assessing the plausibility of the resulting values against knowledge/anticipation of what would happen under similar conditions in real life." (Bilash K Bala et al, "System Dynamics: Modelling and Simulation", 2017)

17 December 2023

Systems Thinking: On Sensitivity (Quotes)

"Although a system may exhibit sensitive dependence on initial condition, this does not mean that everything is unpredictable about it. In fact, finding what is predictable in a background of chaos is a deep and important problem. (Which means that, regrettably, it is unsolved.) In dealing with this deep and important problem, and for want of a better approach, we shall use common sense." (David Ruelle, "Chance and Chaos", 1991)

"[…] the standard theory of chaos deals with time evolutions that come back again and again close to where they were earlier. Systems that exhibit this eternal return" are in general only moderately complex. The historical evolution of very complex systems, by contrast, is typically one way: history does not repeat itself. For these very complex systems with one-way evolution it is usually clear that sensitive dependence on initial condition is present. The question is then whether it is restricted by regulation mechanisms, or whether it leads to long-term important consequences." (David Ruelle, "Chance and Chaos", 1991)

"How can deterministic behavior look random? If truly identical states do occur on two or more occasions, it is unlikely that the identical states that will necessarily follow will be perceived as being appreciably different. What can readily happen instead is that almost, but not quite, identical states occurring on two occasions will appear to be just alike, while the states that follow, which need not be even nearly alike, will be observably different. In fact, in some dynamical systems it is normal for two almost identical states to be followed, after a sufficient time lapse, by two states bearing no more resemblance than two states chosen at random from a long sequence. Systems in which this is the case are said to be sensitively dependent on initial conditions. With a few more qualifications, to be considered presently, sensitive dependence can serve as an acceptable definition of chaos [...]" (Edward N Lorenz, "The Essence of Chaos", 1993)

"Chaos has three fundamental characteristics. They are (a) irregular periodicity, (b) sensitivity to initial conditions, and (c) a lack of predictability. These characteristics interact within any one chaotic setting to produce highly complex nonlinear variable trajectories."(Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"First, social systems are inherently insensitive to most policy changes that people choose in an effort to alter the behavior of systems. In fact, social systems draw attention to the very points at which an attempt to intervene will fail. Human intuition develops from exposure to simple systems. In simple systems, the cause of a trouble is close in both time and space to symptoms of the trouble. If one touches a hot stove, the burn occurs here and now; the cause is obvious. However, in complex dynamic systems, causes are often far removed in both time and space from the symptoms. True causes may lie far back in time and arise from an entirely different part of the system from when and where the symptoms occur. However, the complex system can mislead in devious ways by presenting an apparent cause that meets the expectations derived from simple systems." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Second, social systems seem to have a few sensitive influence points through which behavior can be changed. These high-influence points are not where most people expect. Furthermore, when a high-influence policy is identified, the chances are great that a person guided by intuition and judgment will alter the system in the wrong direction." (Jay W Forrester, "Counterintuitive Behavior of Social Systems", 1995)

"Small changes in the initial conditions in a chaotic system produce dramatically different evolutionary histories. It is because of this sensitivity to initial conditions that chaotic systems are inherently unpredictable. To predict a future state of a system, one has to be able to rely on numerical calculations and initial measurements of the state variables. Yet slight errors in measurement combined with extremely small computational errors (from roundoff or truncation) make prediction impossible from a practical perspective. Moreover, small initial errors in prediction grow exponentially in chaotic systems as the trajectories evolve. Thus, theoretically, prediction may be possible with some chaotic processes if one is interested only in the movement between two relatively close points on a trajectory. When longer time intervals are involved, the situation becomes hopeless." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Another implication of the Law of Requisite Variety is that the member of a system that has the most flexibility also tends to be the catalytic member of that system. This is a significant principle for leadership in particular. The ability to be flexible and sensitive to variation is important in terms of managing the system itself." (Robert B Dilts, "Modeling with NLP", 1998)

"The mental models people use to guide their decisions are dynamically deficient. […] people generally adopt an event-based, open-loop view of causality, ignore feedback processes, fail to appreciate time delays between action and response and in the reporting of information, do not understand stocks and flows and are insensitive to nonlinearities that may alter the strengths of different feedback loops as a system evolves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"In chaos theory this 'butterfly effect' highlights the extreme sensitivity of nonlinear systems at their bifurcation points. There the slightest perturbation can push them into chaos, or into some quite different form of ordered behavior. Because we can never have total information or work to an infinite number of decimal places, there will always be a tiny level of uncertainty that can magnify to the point where it begins to dominate the system. It is for this reason that chaos theory reminds us that uncertainty can always subvert our attempts to encompass the cosmos with our schemes and mathematical reasoning." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] some systems (system is just a jargon for anything, like the swinging pendulum or the Solar System, or water dripping from a tap) are very sensitive to their starting conditions, so that a tiny difference in the initial ‘push’ you give them causes a big difference in where they end up, and there is feedback, so that what a system does affects its own behavior." (John Gribbin, "Deep Simplicity", 2004)


"Two things explain the importance of the normal distribution: (1) The central limit effect that produces a tendency for real error distributions to be 'normal like'. (2) The robustness to nonnormality of some common statistical procedures, where 'robustness' means insensitivity to deviations from theoretical normality." (George E P Box et al, "Statistics for Experimenters: Design, Innovation, and Discovery" 2nd Ed., 2005)
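Box's central limit effect is easy to see numerically. This sketch is my own illustration, not from the book; the uniform distribution and the sample sizes are arbitrary choices. Averages of strongly non-normal draws come out looking normal-like:

```python
# A small illustration of the central limit effect: means of decidedly
# non-normal (here, uniform) errors tend toward a normal-like
# distribution. Sample sizes below are arbitrary illustrative choices.
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

# 10,000 means, each over 30 uniform(0, 1) draws.
means = [statistics.fmean(random.random() for _ in range(30))
         for _ in range(10_000)]

mu = statistics.fmean(means)
sigma = statistics.stdev(means)

# Theory: mean near 0.5, sd near sqrt(1/12)/sqrt(30) ~ 0.053, and
# roughly 68% of the means within one sd of the mean, as for a normal.
within_1sd = sum(abs(m - mu) < sigma for m in means) / len(means)
print(f"mean={mu:.3f}, sd={sigma:.4f}, within 1 sd: {within_1sd:.0%}")
```

The "within one standard deviation" fraction lands close to the normal distribution's 68%, even though each underlying draw is flat, not bell-shaped.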

"Physically, the stability of the dynamics is characterized by the sensitivity to initial conditions. This sensitivity can be determined for statistically stationary states, e.g. for the motion on an attractor. If this motion demonstrates sensitive dependence on initial conditions, then it is chaotic. In the popular literature this is often called the 'Butterfly Effect', after the famous 'gedankenexperiment' of Edward Lorenz: if a perturbation of the atmosphere due to a butterfly in Brazil induces a thunderstorm in Texas, then the dynamics of the atmosphere should be considered as an unpredictable and chaotic one. By contrast, stable dependence on initial conditions means that the dynamics is regular." (Ulrike Feudel et al, "Strange Nonchaotic Attractors", 2006)

"This phenomenon, common to chaos theory, is also known as sensitive dependence on initial conditions. Just a small change in the initial conditions can drastically change the long-term behavior of a system. Such a small amount of difference in a measurement might be considered experimental noise, background noise, or an inaccuracy of the equipment." (Greg Rae, "Chaos Theory: A Brief Introduction", 2006)

"Sensitive dependence on initial conditions is one of the criteria necessary for showing a solution to a difference equation exhibits chaotic behavior." (Linda J S Allen, "An Introduction to Mathematical Biology", 2007)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)
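The "exponential separation of neighboring trajectories" Korsch mentions can be quantified by a Lyapunov exponent. The sketch below is my own illustration, not from the book: it estimates the exponent of the logistic map x → 4x(1 − x) by averaging log|f′(x)| along a trajectory, where the known value is ln 2 ≈ 0.693; the starting point and trajectory length are arbitrary choices.

```python
# Rough numerical estimate of the Lyapunov exponent of the logistic map
# x -> r*x*(1 - x) with r = 4, by averaging log|f'(x)| = log|r(1 - 2x)|
# along a trajectory. Positive exponent = exponential separation.
import math

def lyapunov_logistic(x0=0.3, r=4.0, n=20_000):
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # log of |f'(x)|
        x = r * x * (1.0 - x)
    return total / n

lam = lyapunov_logistic()
print(f"estimated Lyapunov exponent: {lam:.3f} (theory: ln 2 ~ 0.693)")
```

A positive exponent of about 0.69 per step means nearby states separate roughly twofold per iteration, which is the quantitative content of the butterfly effect.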

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

16 December 2023

Systems Thinking: On Resilience (Quotes)

"The notion that the 'balance of nature' is delicately poised and easily upset is nonsense. Nature is extraordinarily tough and resilient, interlaced with checks and balances, with an astonishing capacity for recovering from disturbances in equilibrium. The formula for survival is not power; it is symbiosis." (Sir Eric Ashby, [Encounter] 1976)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"This is a general characteristic of self-organizing systems: they are robust or resilient. This means that they are relatively insensitive to perturbations or errors, and have a strong capacity to restore themselves, unlike most human designed systems." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"Most systems displaying a high degree of tolerance against failures are a common feature: Their functionality is guaranteed by a highly interconnected complex network. A cell's robustness is hidden in its intricate regulatory and metabolic network; society's resilience is rooted in the interwoven social web; the economy's stability is maintained by a delicate network of financial and regulator organizations; an ecosystem's survivability is encoded in a carefully crafted web of species interactions. It seems that nature strives to achieve robustness through interconnectivity. Such universal choice of a network architecture is perhaps more than mere coincidences." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"How is it that an ant colony can organize itself to carry out the complex tasks of food gathering and nest building and at the same time exhibit an enormous degree of resilience if disrupted and forced to adapt to changing situations? Natural systems are able not only to survive, but also to adapt and become better suited to their environment, in effect optimizing their behavior over time. They seemingly exhibit collective intelligence, or swarm intelligence as it is called, even without the existence of or the direction provided by a central authority." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

"Like resilience, self-organizazion is often sacrificed for purposes of short-term productivity and stability." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"[…] our mental models fail to take into account the complications of the real world - at least those ways that one can see from a systems perspective. It is a warning list. Here is where hidden snags lie. You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy." (Donella H Meadows,"Thinking in Systems: A Primer", 2008)

"Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"Complexity demands resilience, and that's what panarchy offers. Resilience in the face of complexity is a challenge even when you apply rigorous intelligence and integrity to develop a coherent and flexible strategy." (Robert D Steele, "The Open-Source Everything Manifesto: Transparency, Truth, and Trust", 2012)

"Stability is often defined as a resilient system that keeps processing transactions, even if transient impulses (rapid shocks to the system), persistent stresses (force applied to the system over an extended period), or component failures disrupt normal processing." (Michael Hüttermann et al, "DevOps for Developers", 2013)
