"Probability, however, is not something absolute, [it is] drawn from certain information which, although it does not suffice to resolve the problem, nevertheless ensures that we judge correctly which of the two opposites is the easiest given the conditions known to us." (Gottfried W Leibniz, "Forethoughts for an encyclopaedia or universal science", ca. 1679)
"Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it." (Samuel Johnson, 1775)
"What is called science today consists of a haphazard heap of information, united by nothing, often utterly unnecessary, and not only failing to present one unquestionable truth, but as often as not containing the grossest errors, today put forward as truths, and tomorrow overthrown." (Leo Tolstoy, "What Is Art?", 1897)
"There can be no unique probability attached to any event or behaviour: we can only speak of ‘probability in the light of certain given information’, and the probability alters according to the extent of the information." (Sir Arthur S Eddington, "The Nature of the Physical World", 1928)
"As words are not the things we speak about, and structure is the only link between them, structure becomes the only content of knowledge. If we gamble on verbal structures that have no observable empirical structures, such gambling can never give us any structural information about the world. Therefore such verbal structures are structurally obsolete, and if we believe in them, they induce delusions or other semantic disturbances." (Alfred Korzybski, "Science and Sanity", 1933)
"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)
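Wiener's "negative logarithm of its probability" is what information theory now calls self-information: a highly probable message (a cliché) carries few bits, a rare one carries many. A minimal sketch of that quantity (function name and example probabilities are illustrative, not from the quote):

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: -log2(p).

    The more probable the message, the less information it gives,
    matching Wiener's 'negative logarithm of its probability'.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A near-certain message is nearly uninformative; a rare one is not.
print(self_information(0.5))       # 1.0 bit
print(self_information(1 / 1024))  # 10.0 bits
print(self_information(1.0))       # 0.0 bits: a certainty tells us nothing
```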
"Knowledge is not something which exists and grows in the abstract. It is a function of human organisms and of social organization. Knowledge, that is to say, is always what somebody knows: the most perfect transcript of knowledge in writing is not knowledge if nobody knows it. Knowledge however grows by the receipt of meaningful information - that is, by the intake of messages by a knower which are capable of reorganising his knowledge." (Kenneth E Boulding, "General Systems Theory - The Skeleton of Science", Management Science Vol. 2 (3), 1956)
"This is the key of modern science and it was the beginning of the true understanding of Nature - this idea to look at the thing, to record the details, and to hope that in the information thus obtained might lie a clue to one or another theoretical interpretation." (Richard P Feynman, "The Character of Physical Law", 1965)
"[...] 'information' is not a substance or concrete entity but rather a relationship between sets or ensembles of structured variety." (Walter F Buckley, "Sociology and modern systems theory", 1967)
"There are as many types of questions as components in the information." (Jacques Bertin, "Semiology of Graphics" [Sémiologie Graphique], 1967)
"The idea of knowledge as an improbable structure is still a good place to start. Knowledge, however, has a dimension which goes beyond that of mere information or improbability. This is a dimension of significance which is very hard to reduce to quantitative form. Two knowledge structures might be equally improbable but one might be much more significant than the other." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)
"When action grows unprofitable, gather information; when information grows unprofitable, sleep." (Ursula K Le Guin, "The Left Hand of Darkness", 1969)
"What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it." (Herbert Simon, "Computers, Communications and the Public Interest", 1971)
"What we mean by information - the elementary unit of information - is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy. The pathways are ready to be triggered. We may even say that the question is already implicit in them." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)
"Science gets most of its information by the process of reductionism, exploring the details, then the details of the details, until all the smallest bits of the structure, or the smallest parts of the mechanism, are laid out for counting and scrutiny. Only when this is done can the investigation be extended to encompass the whole organism or the entire system. So we say. Sometimes it seems that we take a loss, working this way." (Lewis Thomas, "The Medusa and the Snail: More Notes of a Biology Watcher", 1974)
"Science is not a heartless pursuit of objective information. It is a creative human activity, its geniuses acting more as artists than information processors. Changes in theory are not simply the derivative results of the new discoveries but the work of creative imagination influenced by contemporary social and political forces." (Stephen J Gould, "Ever Since Darwin: Reflections in Natural History", 1977)
"Data, seeming facts, apparent associations - these are not certain knowledge of something. They may be puzzles that can one day be explained; they may be trivia that need not be explained at all." (Kenneth Waltz, "Theory of International Politics", 1979)
"To a considerable degree science consists in originating the maximum amount of information with the minimum expenditure of energy. Beauty is the cleanness of line in such formulations along with symmetry, surprise, and congruence with other prevailing beliefs." (Edward O Wilson, "Biophilia", 1984)
"Knowledge is the appropriate collection of information, such that its intent is to be useful. Knowledge is a deterministic process. When someone 'memorizes' information (as less-aspiring test-bound students often do), then they have amassed knowledge. This knowledge has useful meaning to them, but it does not provide for, in and of itself, an integration such as would infer further knowledge." (Russell L Ackoff, "Towards a Systems Theory of Organization", 1985)
"Information is data that has been given meaning by way of relational connection. This 'meaning' can be useful, but does not have to be. In computer parlance, a relational database makes information from the data stored within it." (Russell L Ackoff, "Towards a Systems Theory of Organization", 1985)
"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'noise' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard W Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)
"Probabilities are summaries of knowledge that is left behind when information is transferred to a higher level of abstraction." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988)
"Information exists. It does not need to be perceived to exist. It does not need to be understood to exist. It requires no intelligence to interpret it. It does not have to have meaning to exist. It exists." (Tom Stonier, "Information and the Internal Structure of the Universe: An Exploration into Information Physics", 1990)
"What about confusing clutter? Information overload? Doesn't data have to be ‘boiled down’ and ‘simplified’? These common questions miss the point, for the quantity of detail is an issue completely separate from the difficulty of reading. Clutter and confusion are failures of design, not attributes of information." (Edward R Tufte, "Envisioning Information", 1990)
"Knowledge is theory. We should be thankful if action of management is based on theory. Knowledge has temporal spread. Information is not knowledge. The world is drowning in information but is slow in acquisition of knowledge. There is no substitute for knowledge." (William E Deming, "The New Economics for Industry, Government, Education", 1993)
"The science of statistics may be described as exploring, analyzing and summarizing data; designing or choosing appropriate ways of collecting data and extracting information from them; and communicating that information. Statistics also involves constructing and testing models for describing chance phenomena. These models can be used as a basis for making inferences and drawing conclusions and, finally, perhaps for making decisions." (Fergus Daly et al, "Elements of Statistics", 1995)
"[Schemata are] knowledge structures that represent objects or events and provide default assumptions about their characteristics, relationships, and entailments under conditions of incomplete information." (Paul J DiMaggio, "Culture and Cognition", Annual Review of Sociology No. 23, 1997)
"Each element in the system is ignorant of the behavior of the system as a whole, it responds only to information that is available to it locally. This point is vitally important. If each element ‘knew’ what was happening to the system as a whole, all of the complexity would have to be present in that element." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems" , 1998)
"Complexity is that property of a model which makes it difficult to formulate its overall behaviour in a given language, even when given reasonably complete information about its atomic components and their inter-relations." (Bruce Edmonds, "Syntactic Measures of Complexity", 1999)
"It [the Internet] is inherently destructive of memory. You think you’re getting lots more [information] until you’ve found out you’ve made a bargain with the Devil. You’ve slowly mutated, and have become an extension of the machine." (James Billington, [interview] 1999)
"It would be a serious intellectual mistake to confuse information that functions as entertainment with actual, or knowledge-based, information. It would be a mistake as well to simply ignore the cognitive implications of information processing as entertainment." (Joseph Urgo, "In the Age of Distraction", 2000)
"There is a strong tendency today to narrow specialization. Because of the exponential growth of information, we can afford (in terms of both economics and time) preparation of specialists in extremely narrow fields, the various branches of science and engineering having their own particular realms. As the knowledge in these fields grows deeper and broader, the individual's field of expertise has necessarily become narrower. One result is that handling information has become more difficult and even ineffective." (Semyon D Savransky, "Engineering of Creativity", 2000)
"A model isolates one or a few causal connections, mechanisms, or processes, to the exclusion of other contributing or interfering factors - while in the actual world, those other factors make their effects felt in what actually happens. Models may seem true in the abstract, and are false in the concrete. The key issue is about whether there is a bridge between the two, the abstract and the concrete, such that a simple model can be relied on as a source of relevantly truthful information about the complex reality." (Uskali Mäki, "Fact and Fiction in Economics: Models, Realism and Social Construction", 2002)
"Information needs representation. The idea that it is possible to communicate information in a 'pure' form is fiction. Successful risk communication requires intuitively clear representations. Playing with representations can help us not only to understand numbers (describe phenomena) but also to draw conclusions from numbers (make inferences). There is no single best representation, because what is needed always depends on the minds that are doing the communicating." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)
"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer, "Information: The New Language of Science", 2003)
"The use of computers shouldn't ignore the objectives of graphics, which are:
1) Treating data to get information.
2) Communicating, when necessary, the information obtained." (Jacques Bertin, [interview] 2003)
"While in theory randomness is an intrinsic property, in practice, randomness is incomplete information." (Nassim N Taleb, "The Black Swan", 2007)
"Put simply, statistics is a range of procedures for gathering, organizing, analyzing and presenting quantitative data. […] Essentially […], statistics is a scientific approach to analyzing numerical data in order to enable us to maximize our interpretation, understanding and use. This means that statistics helps us turn data into information; that is, data that have been interpreted, understood and are useful to the recipient. Put formally, for your project, statistics is the systematic collection and analysis of numerical data, in order to investigate or discover relationships among phenomena so as to explain, predict and control their occurrence." (Reva B Brown & Mark Saunders, "Dealing with Statistics: What You Need to Know", 2008)
"Complexity has the propensity to overload systems, making the relevance of a particular piece of information not statistically significant. And when an array of mind-numbing factors is added into the equation, theory and models rarely conform to reality." (Lawrence K Samuels, "In Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)
"Complexity scientists concluded that there are just too many factors - both concordant and contrarian - to understand. And with so many potential gaps in information, almost nobody can see the whole picture. Complex systems have severe limits, not only to predictability but also to measurability. Some complexity theorists argue that modelling, while useful for thinking and for studying the complexities of the world, is a particularly poor tool for predicting what will happen." (Lawrence K Samuels, "In Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)
"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)
"One of the most powerful transformational catalysts is knowledge, new information, or logic that defies old mental models and ways of thinking." (Elizabeth Thornton, "The Objective Leader", 2015)
"The term data, unlike the related terms facts and evidence, does not connote truth. Data is descriptive, but data can be erroneous. We tend to distinguish data from information. Data is a primitive or atomic state (as in ‘raw data’). It becomes information only when it is presented in context, in a way that informs. This progression from data to information is not the only direction in which the relationship flows, however; information can also be broken down into pieces, stripped of context, and stored as data. This is the case with most of the data that’s stored in computer systems. Data that’s collected and stored directly by machines, such as sensors, becomes information only when it’s reconnected to its context." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)