"A machine can handle information; it can calculate, conclude, and choose; it can perform reasonable operations with information. A machine. therefore, can think." (Edmund C Berkeley, "Giant Brains or Machines that Think", 1949)
"From a narrow point of view, a machine that only thinks produces only information. It takes in information in one state, and it puts out information in another state. From this viewpoint, information in itself is harmless; it is just an arrangement of marks; and accordingly, a machine that thinks is harmless, and no control is necessary." (Edmund C Berkeley, "Giant Brains or Machines that Think", 1949)
"Now when we speak of a machine that thinks, or a mechanical brain, what do we mean? Essentially, a mechanical brain is a machine that handles information, transfers information automatically from one part of the machine to another, and has a flexible control over the sequence of its operations. No human being is needed around such a machine to pick up a physical piece of information produced in one part of the machine, personally move it to another part of the machine, and there put it in again. Nor is any human being needed to give the machine instructions from minute to minute. Instead, we can write out the whole program to solve a problem, translate the program into machine language, and put the program into the machine." (Edmund C Berkeley, "Giant Brains or Machines that Think", 1949)
"Feedback is a method of controlling a system by reinserting into it the results of its past performance. If these results are merely used as numerical data for the criticism of the system and its regulation, we have the simple feedback of the control engineers. If, however, the information which proceeds backward from the performance is able to change the general method and pattern of performance, we have a process which may be called learning." (Norbert Wiener, 1954)
"Cybernetics is the science of the process of transmission, processing and storage of information." (Sergei Sobolew, Woprosy Psychology, 1958)
"The term 'systems engineering' is a term with an air of romance and of mystery. The romance and the mystery come from its use in the field of guided missiles, rockets, artificial satellites, and space flight. Much of the work being done in these areas is classified and hence much of it is not known to the general public or to this writer. […] From a business point of view, systems engineering is the creation of a deliberate combination of human services, material services, and machine service to accomplish an information processing job. But this is also very nearly a definition of business system analysis. The difference, from a business point of view, therefore, between business system analysis and systems engineering is only one of degree. In general, systems engineering is more total and more goal-oriented in its approach [...]." ("Computers and People" Vol. 5, 1956)
"In the language of cybernetics, maintaining reactions can be outlined as follows: the sensing material receives information about the external environment in the form of coded signals. This information is reprocessed and sent in the form of new signals through defined channels, or networks. This new information brings about an internal reorganization of the system which contributes to the preservation of its integrity. The mechanism which reprocesses the information is called the control system. It consists of a vast number of input and output elements, connected by channels through which the signals are transmitted. The information can be stored in a recall or memory system, which may consist of separate elements, each of which can be in one of several stable states. The particular state of the element varies, under the influence of the input signals. When a number of such elements are in certain specified states, information is, in effect, recorded in the form of a text of finite length, using an alphabet with a finite number of characters. These processes underlie contemporary electronic computing machines and are, in a number of respects, strongly analogous to biological memory systems." (Carl Sagan, "Intelligent Life in the Universe", 1966)
"The subject of study in systems theory is not a 'physical object', a chemical or social phenomenon, for example, but a 'system': a formal relationship between observed features or attributes. For conceptual reasons, the language used in describing the behavior of systems is that of information processing and goal seeking (decision making control)." (Mihajlo D Mesarovic & Y Takahara, "Foundations for the mathematical theory of general systems", 1975)
"Man is not a machine, [...] although man most certainly processes information, he does not necessarily process it in the way computers do. Computers and men are not species of the same genus. [...] No other organism, and certainly no computer, can be made to confront genuine human problems in human terms. [...] However much intelligence computers may attain, now or in the future, theirs must always be an intelligence alien to genuine human problems and concerns." (Joesph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation, 1976)
"Owing to his lack of knowledge, the ordinary man cannot attempt to resolve conflicting theories of conflicting advice into a single organized structure. He is likely to assume the information available to him is on the order of what we might think of as a few pieces of an enormous jigsaw puzzle. If a given piece fails to fit, it is not because it is fraudulent; more likely the contradictions and inconsistencies within his information are due to his lack of understanding and to the fact that he possesses only a few pieces of the puzzle. Differing statements about the nature of things […] are to be collected eagerly and be made a part of the individual's collection of puzzle pieces. Ultimately, after many lifetimes, the pieces will fit together and the individual will attain clear and certain knowledge." (Alan R Beals, "Strategies of Resort to Curers in South India" [contributed in Charles M. Leslie (ed.), "Asian Medical Systems: A Comparative Study", 1976])
"Science is not a heartless pursuit of objective information. It is a creative human activity, its geniuses acting more as artists than information processors. Changes in theory are not simply the derivative results of the new discoveries but the work of creative imagination influenced by contemporary social and political forces. " (Stephen J Gould, "Ever Since Darwin: Reflections in Natural History", 1977)
"The greater the uncertainty, the greater the amount of decision making and information processing. It is hypothesized that organizations have limited capacities to process information and adopt different organizing modes to deal with task uncertainty. Therefore, variations in organizing modes are actually variations in the capacity of organizations to process information and make decisions about events which cannot be anticipated in advance." (John K Galbraith, "Organization Design", 1977)
"Effect spreads its 'tentacles' not only forwards (as a new cause giving rise to a new effect) but also backwards, to the cause which gave rise to it, thus modifying, exhausting or intensifying its force. This interaction of cause and effect is known as the principle of feedback. It operates everywhere, particularly in all self-organising systems where perception, storing, processing and use of information take place, as for example, in the organism, in a cybernetic device, and in society. The stability, control and progress of a system are inconceivable without feedback." (Alexander Spirkin, "Dialectical Materialism", 1983)
"The purpose of a mental model is to allow the person to understand and to anticipate the behavior of a physical system. This means that the model must have predictive power, either by applying rules of inference or by procedural derivation (in whatever manner these properties may be realized in a person); in other words, it should be possible for people to ' run' their models mentally. This means that the conceptual mental model must also include a model of the relevant human information processing and knowledge structures that make it possible for the person to use a mental model to predict and understand the physical system." (Donald A Norman, "Some Observations on Mental Models" [in "Mental Models"], 1983)
"The third model regards mind as an information processing system. This is the model of mind subscribed to by cognitive psychologists and also to some extent by the ego psychologists. Since an acquisition of information entails maximization of negative entropy and complexity, this model of mind assumes mind to be an open system." (Thaddus E Weckowicz, "Models of Mental Illness", 1984)
"An artificial neural network is an information-processing system that has certain performance characteristics in common with biological neural networks. Artificial neural networks have been developed as generalizations of mathematical models of human cognition or neural biology, based on the assumptions that: (1) Information processing occurs at many simple elements called neurons. (2) Signals are passed between neurons over connection links. (3) Each connection link has an associated weight, which, in a typical neural net, multiplies the signal transmitted. (4) Each neuron applies an activation function (usually nonlinear) to its net input (sum of weighted input signals) to determine its output signal." (Laurene Fausett, "Fundamentals of Neural Networks", 1994)
"Cybernetics is concerned with scientific investigation of systemic processes of a highly varied nature, including such phenomena as regulation, information processing, information storage, adaptation, self-organization, self-reproduction, and strategic behavior. Within the general cybernetic approach, the following theoretical fields have developed: systems theory (system), communication theory, game theory, and decision theory." (Fritz B Simon et al, "Language of Family Therapy: A Systemic Vocabulary and Source Book", 1985)
"The basic idea is that schemata are data structures for representing the generic concepts stored in memory. There are schemata for generalized concepts underlying objects, situations, events, sequences of events, actions, and sequences of actions. Roughly, schemata are like models of the outside world. To process information with the use of a schema is to determine which model best fits the incoming information. Ultimately, consistent configurations of schemata are discovered which, in concert, offer the best account for the input. This configuration of schemata together constitutes the interpretation of the input." (David E Rumelhart, Paul Smolensky, James L McClelland & Geoffrey E Hinton, "Schemata and sequential thought processes in PDP models", 1986)
"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature. Thus, with the rapid increase in the complexity of the information processing tasks which the computers are called upon to perform, we are reaching a point where computers will have to be designed for processing of information in fuzzy form. In fact, it is the capability to manipulate fuzzy concepts that distinguishes human intelligence from the machine intelligence of current generation computers. Without such capability we cannot build machines that can summarize written text, translate well from one natural language to another, or perform many other tasks that humans can do with ease because of their ability to manipulate fuzzy concepts." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)
"If we want to solve problems effectively [...] we must keep in mind not only many features but also the influences among them. Complexity is the label we will give to the existence of many interdependent variables in a given system. The more variables and the greater their interdependence, the greater the system's complexity. Great complexity places high demands on a planner's capacity to gather information, integrate findings, and design effective actions. The links between the variables oblige us to attend to a great many features simultaneously, and that, concomitantly, makes it impossible for us to undertake only one action in a complex system." (Dietrich Dorner, "The Logic of Failure: Recognizing and Avoiding Error in Complex Situations", 1989)
"It is important to observe that there is an intimate connection between fuzziness and complexity. Thus, a basic characteristic of the human brain, a characteristic shared in varying degrees with all information processing systems, is its limited capacity to handle classes of high cardinality, that is, classes having a large number of members. Consequently, when we are presented with a class of very high cardinality, we tend to group its elements together into subclasses in such a way as to reduce the complexity of the information processing task involved. When a point is reached where the cardinality of the class of subclasses exceeds the information handling capacity of the human brain, the boundaries of the subclasses are forced to become imprecise and fuzziness becomes a manifestation of this imprecision." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)
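The imprecise subclass boundaries Zadeh describes are what fuzzy membership functions model: instead of a crisp in/out test, an element belongs to a class to a degree between 0 and 1. The class "tall" and the 160-190 cm boundaries in this sketch are illustrative assumptions, not from the text.

```python
def tall_membership(height_cm):
    """Degree (0.0 to 1.0) to which a height belongs to the fuzzy class 'tall'.

    The boundary is a ramp rather than a sharp cutoff: heights between
    160 cm and 190 cm belong to the class only partially.
    """
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0
```

A height of 175 cm is thus "tall" to degree 0.5, capturing the imprecision that a two-valued class boundary cannot express.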
"The cybernetics phase of cognitive science produced an amazing array of concrete results, in addition to its long-term (often underground) influence: the use of mathematical logic to understand the operation of the nervous system; the invention of information processing machines (as digital computers), thus laying the basis for artificial intelligence; the establishment of the metadiscipline of system theory, which has had an imprint in many branches of science, such as engineering (systems analysis, control theory), biology (regulatory physiology, ecology), social sciences (family therapy, structural anthropology, management, urban studies), and economics (game theory); information theory as a statistical theory of signal and communication channels; the first examples of self-organizing systems. This list is impressive: we tend to consider many of these notions and tools an integrative part of our life […]" (Francisco Varela, "The Embodied Mind", 1991)
"On the other hand, those who design and build computers know exactly how the machines are working down in the hidden depths of their semiconductors. Computers can be taken apart, scrutinized, and put back together. Their activities can be tracked, analyzed, measured, and thus clearly understood - which is far from possible with the brain. This gives rise to the tempting assumption on the part of the builders and designers that computers can tell us something about brains, indeed, that the computer can serve as a model of the mind, which then comes to be seen as some manner of information processing machine, and possibly not as good at the job as the machine. (Theodore Roszak, "The Cult of Information", 1994)
"When we visually perceive the world, we do not just process information; we have a subjective experience of color, shape, and depth. We have experiences associated with other senses (think of auditory experiences of music, or the ineffable nature of smell experiences), with bodily sensations (e.g., pains, tickles, and orgasms), with mental imagery (e.g., the colored shapes that appear when one tubs one's eyes), with emotion (the sparkle of happiness, the intensity of anger, the weight of despair), and with the stream of conscious thought." (David Chalmers, "The Puzzle of Conscious Experience", Scientific American, 1995)
"[...] information feedback about the real world not only alters our decisions within the context of existing frames and decision rules but also feeds back to alter our mental models. As our mental models change we change the structure of our systems, creating different decision rules and new strategies. The same information, processed and interpreted by a different decision rule, now yields a different decision. Altering the structure of our systems then alters their patterns of behavior. The development of systems thinking is a double-loop learning process in which we replace a reductionist, narrow, short-run, static view of the world with a holistic, broad, long-term, dynamic view and then redesign our policies and institutions accordingly." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000)
"Agent subroutines may pass information back and forth, but subroutines are not changed as a result of the interaction, as people are. In real social interaction, information is exchanged, but also something else, perhaps more important: individuals exchange rules, tips, beliefs about how to process the information. Thus a social interaction typically results in a change in the thinking processes - not just the contents - of the participants." (James F Kennedy et al, "Swarm Intelligence", 2001)
"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)
"Self-organization can be seen as a spontaneous coordination of the interactions between the components of the system, so as to maximize their synergy. This requires the propagation and processing of information, as different components perceive different aspects of the situation, while their shared goal requires this information to be integrated. The resulting process is characterized by distributed cognition: different components participate in different ways to the overall gathering and processing of information, thus collectively solving the problems posed by any perceived deviation between the present situation and the desired situation." (Carlos Gershenson & Francis Heylighen, "How can we think the complex?", 2004)
"[a complex system is] a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution." (Melanie Mitchell, "Complexity: A Guided Tour", 2009)
"An artificial neural network, often just called a 'neural network' (NN), is an interconnected group of artificial neurons that uses a mathematical model or computational model for information processing based on a connectionist approach to computation. Knowledge is acquired by the network from its environment through a learning process, and interneuron connection strengths (synaptic weighs) are used to store the acquired knowledge." (Larbi Esmahi et al, "Adaptive Neuro-Fuzzy Systems", 2009)
"[...] we also distinguish knowledge from information, because some pieces of information, such as questions, orders, and absurdities do not constitute knowledge. And also because computers process information but, since they lack minds, they cannot be said to know anything." (Mario Bunge, "Matter and Mind: A Philosophical Inquiry", 2010)
"Intelligent systems that mimic processing of information by human brain neurons. They are capable of learning attributes, generalizing, parallel processing of information and error minimization. As a result, they are capable to model and solve complex systems." (Salim Lahmiri , "Modeling Stock Market Industrial Sectors as Dynamic Systems and Forecasting", 2015)
"Cybernetics studies the concepts of control and communication in living organisms, machines and organizations including self-organization. It focuses on how a (digital, mechanical or biological) system processes information, responds to it and changes or being changed for better functioning (including control and communication)." (Dmitry A Novikov, "Cybernetics 2.0", 2016)