04 December 2020

Systems Thinking: Information Theory (Quotes)

"[…] information theory is characterised essentially by its dealing always with a set of possibilities; both its primary data and its final statements are almost always about the set as such, and not about some individual element in the set." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Cybernetics is concerned primarily with the construction of theories and models in science, without making a hard and fast distinction between the physical and the biological sciences. The theories and models occur both in symbols and in hardware, and by 'hardware’ we shall mean a machine or computer built in terms of physical or chemical, or indeed any handleable parts. Most usually we shall think of hardware as meaning electronic parts such as valves and relays. Cybernetics insists, also, on a further and rather special condition that distinguishes it from ordinary scientific theorizing: it demands a certain standard of effectiveness. In this respect it has acquired some of the same motive power that has driven research on modern logic, and this is especially true in the construction and application of artificial languages and the use of operational definitions. Always the search is for precision and effectiveness, and we must now discuss the question of effectiveness in some detail. It should be noted that when we talk in these terms we are giving pride of place to the theory of automata at the expense, at least to some extent, of feedback and information theory." (Frank H George, "The Brain As A Computer", 1962)

"The general notion in communication theory is that of information. In many cases, the flow of information corresponds to a flow of energy, e. g. if light waves emitted by some objects reach the eye or a photoelectric cell, elicit some reaction of the organism or some machinery, and thus convey information." (Ludwig von Bertalanffy, "General System Theory", 1968) 

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of "noise" is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point." (Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, 1988)

"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation- it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

