"If the channel is noisy it is not in general possible to reconstruct the original message or the transmitted signal with certainty by any operation on the received signal. There are ways, however, of transmitting the information which are optimal in combating noise." (Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, 1948)
"An adaptive organism is connected with its environment by two kinds of channels. Afferent channels give it information about the state of the environment; efferent channels cause action on the environment. Problem statements define solutions in terms of afferent information to the organism; the organism's task is to discover a set of efferent signals which, changing the state of the environment, will produce the appropriate afferent. But, ab initio, the mapping of efferents on afferents is entirely arbitrary; the relations can only be discovered by experiment, by acting and observing the consequences of action." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)
"In the language of cybernetics, maintaining reactions can be outlined as follows: the sensing material receives information about the external environment in the form of coded signals. This information is reprocessed and sent in the form of new signals through defined channels, or networks. This new information brings about an internal reorganization of the system which contributes to the preservation of its integrity. The mechanism which reprocesses the information is called the control system. It consists of a vast number of input and output elements, connected by channels through which the signals are transmitted. The information can be stored in a recall or memory system, which may consist of separate elements, each of which can be in one of several stable states. The particular state of the element varies, under the influence of the input signals. When a number of such elements are in certain specified states, information is, in effect, recorded in the form of a text of finite length, using an alphabet with a finite number of characters. These processes underlie contemporary electronic computing machines and are, in a number of respects, strongly analogous to biological memory systems." (Carl Sagan, "Intelligent Life in the Universe", 1966)
"Experiments usually are looking for 'signals' of truth, and the search is always hampered by 'noise' of one kind or another. In judging someone else's experimental results it's important to find out whether they represent a true signal or whether they are just so much noise." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)
"The term closed-loop learning process refers to the idea that one learns by determining what is desired and comparing what is actually taking place as measured at the process and feedback for comparison. The difference between what is desired and what is taking place provides an error indication which is used to develop a signal to the process being controlled." (Harold Chestnut, 1984)
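Chestnut's closed loop can be sketched in a few lines. This is a minimal illustration, not taken from his text: the names `closed_loop_step`, `setpoint`, and the gain of 0.5 are assumptions chosen for the example.

```python
def closed_loop_step(desired, measured, gain=0.5):
    # Error indication: the difference between what is desired
    # and what is actually taking place, fed back for comparison
    error = desired - measured
    # Signal developed from the error and sent to the process being controlled
    return gain * error

# Each iteration the process output moves toward the setpoint as the error shrinks
setpoint, output = 10.0, 0.0
for _ in range(20):
    output += closed_loop_step(setpoint, output)
```

With a gain of 0.5 the error halves on every pass, so after twenty iterations the output has effectively converged on the setpoint.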
"In a real experiment the noise present in a signal is usually considered to be the result of the interplay of a large number of degrees of freedom over which one has no control. This type of noise can be reduced by improving the experimental apparatus. But we have seen that another type of noise, which is not removable by any refinement of technique, can be present. This is what we have called the deterministic noise. Despite its intractability it provides us with a way to describe noisy signals by simple mathematical models, making possible a dynamical system approach to the problem of turbulence." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)
"An artificial neural network is an information-processing system that has certain performance characteristics in common with biological neural networks. Artificial neural networks have been developed as generalizations of mathematical models of human cognition or neural biology, based on the assumptions that: 1. Information processing occurs at many simple elements called neurons. 2. Signals are passed between neurons over connection links. 3. Each connection link has an associated weight, which, in a typical neural net, multiplies the signal transmitted. 4. Each neuron applies an activation function (usually nonlinear) to its net input (sum of weighted input signals) to determine its output signal." (Laurene Fausett, "Fundamentals of Neural Networks", 1994)
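Fausett's four assumptions can be written out directly. The sketch below is an illustration of that description, not code from her book; the function name `neuron` and the choice of `tanh` as the activation are assumptions for the example.

```python
import math

def neuron(inputs, weights, activation=math.tanh):
    # Assumptions 2 and 3: signals arrive over connection links,
    # each multiplied by its associated weight
    net = sum(x * w for x, w in zip(inputs, weights))
    # Assumption 4: a (usually nonlinear) activation function applied
    # to the net input determines the output signal
    return activation(net)

out = neuron([1.0, 0.5, -0.25], [0.4, -0.6, 0.2])
```

A network is then just many such simple elements (assumption 1) wired together, with learning amounting to adjustment of the weights.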
"Engineers have sought to minimize the effects of noise in electronic circuits and communication systems. But recent research has established that noise can play a constructive role in the detection of weak periodic signals." (Kurt Wiesenfeld & Frank Moss, "Stochastic Resonance and the Benefits of Noise: From Ice Ages to Crayfish and SQUIDs", Nature vol. 373, 1995)
"Most engineering systems in communication, control, and signal processing are developed under the often erroneous assumption that the interfering noise is Gaussian. Many physical environments are more accurately modeled as impulsive, characterized by heavy-tailed non-Gaussian distributions. The performances of systems developed under the assumption of Gaussian noise can be severely degraded by the non-Gaussian noise due to potent deviation from normality in the tails." (Seong Rag Kim & Adam Efron, "Adaptive Robust Impulse Noise Filtering", IEEE Transactions on Signal Processing vol. 43 (8), 1995)
"Mathematics can function as a telescope, a microscope, a sieve for sorting out the signal from the noise, a template for pattern perception, a way of seeking and validating truth. […] A knowledge of the mathematics behind our ideas can help us to fool ourselves a little less often, with less drastic consequences." (K C Cole, "The Universe and the Teacup: The Mathematics of Truth and Beauty", 1997)
"Data are generally collected as a basis for action. However, unless potential signals are separated from probable noise, the actions taken may be totally inconsistent with the data. Thus, the proper use of data requires that you have simple and effective methods of analysis which will properly separate potential signals from probable noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)
"A symbol is a mental representation regarding the internal reality referring to its object by a convention and produced by the conscious interpretation of a sign. In contrast to signals, symbols may be used every time if the receiver has the corresponding representation. Symbols also relate to feelings and thus give access not only to information but also to the communicator’s motivational and emotional state. The use of symbols makes it possible for the organism using it to evoke in the receiver the same response it evokes in himself. To communicate with symbols is to use a language." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)
"Apart from intrinsic noise sources at the level of an individual neuron there are also sources of noise that are due to signal transmission and network effects. Synaptic transmission failures, for instance, seem to impose a substantial limitation within a neuronal network." (Wulfram Gerstner & Werner Kistler, "Spiking Neuron Models: Single Neurons, Population, Plasticity", 2002)
"The most familiar example of swarm intelligence is the human brain. Memory, perception and thought all arise out of the net actions of billions of individual neurons. As we saw earlier, artificial neural networks (ANNs) try to mimic this idea. Signals from the outside world enter via an input layer of neurons. These pass the signal through a series of hidden layers, until the result emerges from an output layer. Each neuron modifies the signal in some simple way. It might, for instance, convert the inputs by plugging them into a polynomial, or some other simple function. Also, the network can learn by modifying the strength of the connections between neurons in different layers." (David G Green, "The Serendipity Machine: A voyage of discovery through the unexpected world of computers", 2004)
"A signal has a finite-length frequency spectrum only if it lasts infinitely long in time. So a finite spectrum implies infinite time and vice versa. The reverse also holds in the ideal world of mathematics: A signal is finite in time only if it has a frequency spectrum that is infinite in extent." (Bart Kosko, "Noise", 2006)
"Any technical discussion of noise begins with white noise because white noise is pure or ideal noise. White noise serves as the gold standard of noise. Scientists and engineers have explored hundreds of other noise types but most of these deviate from white noise in some specific way. White noise is noisy because it has a wide and flat band of frequencies if one looks at its spectrum. This reflects the common working definition of noise as a so-called wideband signal. Good signals or wanted signals concentrate their energy on a comparatively narrow band of the frequency spectrum. Hence good signals tend to be so-called narrowband signals at least relative to the wide band of white noise. White noise is so noisy because its spectrum is as wide as possible - it runs the whole infinite length of the frequency spectrum itself. So pure or ideal white noise exists only as a mathematical abstraction. It cannot exist physically because it would require infinite energy." (Bart Kosko, "Noise", 2006)
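Kosko's "wide and flat band of frequencies" can be checked numerically. The sketch below is an illustration under stated assumptions: it uses a naive DFT over Gaussian samples as a stand-in for white noise, and the helper name `power_spectrum` is invented for the example.

```python
import cmath
import random

def power_spectrum(x):
    # Naive discrete Fourier transform; returns |X[k]|^2 / n per frequency bin
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2 / n for k in range(n)]

# A single realization of white noise has a ragged spectrum, but averaging
# over many realizations reveals the flat band: roughly equal power
# in every frequency bin
random.seed(0)
n, trials = 16, 400
avg = [0.0] * n
for _ in range(trials):
    x = [random.gauss(0, 1) for _ in range(n)]
    for k, p in enumerate(power_spectrum(x)):
        avg[k] += p / trials
```

Each bin of `avg` comes out close to 1.0, the signature of a spectrum that is (up to the finite sample length) as wide and flat as possible.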
"I wage war on noise every day as part of my work as a scientist and engineer. We try to maximize signal-to-noise ratios. We try to filter noise out of measurements of sounds or images or anything else that conveys information from the world around us. We code the transmission of digital messages with extra 0s and 1s to defeat line noise and burst noise and any other form of interference. We design sophisticated algorithms to track noise and then cancel it in headphones or in a sonogram. Some of us even teach classes on how to defeat this nemesis of the digital age. Such action further conditions our anti-noise reflexes." (Bart Kosko, "Noise", 2006)
"Is the universe noise? That question is not as strange as it sounds. Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)
"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)
"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)
"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)
"[…] the term 'information', as used in information theory, has nothing to do with meaning. It is a measure of the order, or nonrandomness, of a signal; and the main concern of information theory is the problem of how to get a message, coded as a signal, through a noisy channel." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)
"Data contain descriptions. Some are true, some are not. Some are useful, most are not. Skillful use of data requires that we learn to pick out the pieces that are true and useful. [...] To find signals in data, we must learn to reduce the noise - not just the noise that resides in the data, but also the noise that resides in us. It is nearly impossible for noisy minds to perceive anything but noise in data. […] Signals always point to something. In this sense, a signal is not a thing but a relationship. Data becomes useful knowledge of something that matters when it builds a bridge between a question and an answer. This connection is the signal." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)
"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and helps determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)
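The two quantities Grami names have standard formulas: Shannon entropy for the source, and, for the simplest noisy channel (the binary symmetric channel), a capacity expressed in terms of that same entropy. The sketch below illustrates both; the function names are assumptions for the example.

```python
import math

def entropy(probs):
    # H(X) = -sum p_i * log2(p_i): information content of the source, in bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p:
    # C = 1 - H(p), in bits per channel use
    return 1.0 - entropy([p, 1 - p])

# A fair binary source carries exactly 1 bit per symbol
print(entropy([0.5, 0.5]))  # 1.0
```

A noiseless channel (`p = 0`) has capacity 1 bit per use, while a channel that flips bits half the time (`p = 0.5`) has capacity 0: no information gets through at all.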
"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)
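Skiena's signal-to-noise ratio can be estimated directly from repeated observations. The sketch below is one common convention, not taken from his book: it treats the mean of the observations as the signal power and their variance as the noise power, and the name `snr_db` and the sample values are assumptions for the example.

```python
import math
import statistics

def snr_db(observations):
    # Treat the squared mean as signal power and the variance as noise power,
    # and report the ratio on the usual decibel scale
    mean = statistics.fmean(observations)
    var = statistics.pvariance(observations)
    return 10 * math.log10(mean ** 2 / var)

# Repeated measurements of the same quantity scatter around the true value
samples = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2]
ratio = snr_db(samples)
```

Here the observations cluster tightly around 10, so the ratio comes out high; the noisier the measurements, the larger the variance and the lower the SNR.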
"In mathematical modeling, as in all of science, we always have to make choices about what to stress and what to ignore. The art of abstraction lies in knowing what is essential and what is minutia, what is signal and what is noise, what is trend and what is wiggle. It’s an art because such choices always involve an element of danger; they come close to wishful thinking and intellectual dishonesty." (Steven Strogatz, "Infinite Powers", 2019)