13 September 2021

Knowledge Representation: The Filtering Mind (Quotes)

 "Because of the extended time image and the extended relationship images, man is capable of ‘rational behavior,’ that is to say, his response is not to an immediate stimulus but to an image of the future filtered through an elaborate value system.  His image contains not only what is, but what might be." (Kenneth E Boulding, "The Image: Knowledge in life and society", 1956)

"We say the map is different from the territory. But what is the territory? Operationally, somebody went out with a retina or a measuring stick and made representations which were then put on paper. What is on the paper map is a representation of what was in the retinal representation of the man who made the map; and as you push the question back, what you find is an infinite regress, an infinite series of maps. The territory never gets in at all. […] Always, the process of representation will filter it out so that the mental world is only maps of maps, ad infinitum." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)

"Nature is not ‘given’ to us - our minds are never virgin in front of reality. Whatever we say we see or observe is biased by what we already know, think, believe, or wish to see. Some of these thoughts, beliefs and knowledge can function as an obstacle to our understanding of the phenomena." (Anna Sierpinska, "Understanding in Mathematics", 1994)

"The abstractions of science are stereotypes, as two-dimensional and as potentially misleading as everyday stereotypes. And yet they are as necessary to the process of understanding as filtering is to the process of perception." (K C Cole, "First You Build a Cloud and Other Reflections on Physics as a Way of Life", 1999)

"We all would like to know more and, at the same time, to receive less information. In fact, the problem of a worker in today's knowledge industry is not the scarcity of information but its excess. The same holds for professionals: just think of a physician or an executive, constantly bombarded by information that is at best irrelevant. In order to learn anything we need time. And to make time we must use information filters allowing us to ignore most of the information aimed at us. We must ignore much to learn a little." (Mario Bunge, "Philosophy in Crisis: The Need for Reconstruction", 2001)

"The receiver decodes the symbols to interpret the meaning of the message. Encoding and decoding are potential sources for communication errors because knowledge, attitudes, and context act as filters and create noise when translating from symbols to meaning. Finally, feedback occurs when the receiver responds to the sender’s communication with a return message. Without feedback, the communication is one-way; with feedback, it is two-way. Feedback is a powerful aid to communication effectiveness because it enables the sender to determine whether the receiver correctly interpreted the message." (Richard L Daft & Dorothy Marcic, "Understanding Management" 5th Ed., 2006)

"Your mental models shape the way you see the world. They help you to quickly make sense of the noises that filter in from outside, but they can also limit your ability to see the true picture." (Colin Cook & Yoram R Wind, "The Power of Impossible Thinking: Transform the Business of Your Life and the Life of Your Business", 2006)

"Actually, around 80% of the data we use to make decisions is already in our heads before we engage with a situation. Our power to perceive is governed and limited by cognitive filters, sometimes termed our ‘mental model’. Mental models are formed as a result of past experience, knowledge and attitudes. They are deeply ingrained, often subconscious, structures that limit what we perceive and also colour our interpretation of supposed facts." (Robina Chatham & Brian Sutton, "Changing the IT Leader’s Mindset", 2010) 

"[…] our strong mental models tend to make us blind to certain possibilities, and therefore we unknowingly engage in biased listening. Whenever we interpret information, we subconsciously access three filters based upon how we feel about the content, the information source and situation (or context) in which we receive the information." (Robina Chatham & Brian Sutton, "Changing the IT Leader’s Mindset", 2010)

"Perception and memory are imprecise filters of information, and the way in which information is presented, that is, the frame, influences how it is received. Because too much information is difficult to deal with, people have developed shortcuts or heuristics in order to come up with reasonable decisions. Unfortunately, sometimes these heuristics lead to bias, especially when used outside their natural domains." (Lucy F Ackert & Richard Deaves, "Behavioral Finance: Psychology, Decision-Making, and Markets", 2010)

"Mental models bind our awareness within a particular scaffold and then selectively can filter the content we subsequently receive. Through recalibration using revised mental models, we argue, we cultivate strategies anew, creating new habits, and galvanizing more intentional and evolved mental models. This recalibration often entails developing a strong sense of self and self-worth, realizing that each of us has a range of moral choices that may deviate from those in authority, and moral imagination." (Patricia H Werhane et al, "Obstacles to Ethical: Decision-Making Mental Models, Milgram and the Problem of Obedience", 2013)

"In the absence of clear information - in the absence of reliable statistics - people did what they had always done: filtered available information through the lens of their worldview." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)

"Images are generally resistant to change and ignore messages that do not conform to their internal settings. Sometimes, however, they do react and can alter in an incremental or even revolutionary manner. Humans can talk about and share their images and, in the symbolic universe they create, reflect upon what is and what might be." (Michael C Jackson, "Critical Systems Thinking and the Management of Complexity", 2019)

"We filter new information. If it accords with what we expect, we’ll be more likely to accept it. […] Our brains are always trying to make sense of the world around us based on incomplete information. The brain makes predictions about what it expects, and tends to fill in the gaps, often based on surprisingly sparse data." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)
