Can We Refrain from Jingle Fallacies for a Minuet?
Intellectually, rationally, we know that the people with whom we speak don’t always understand us and that ascribing fault is counterproductive. It makes a lot more sense to figure out where the misunderstanding is occurring and to repair it with sense and compassion. Easier said than done? You bet. Fred Robinson suggests that we start by examining why we cross wires, however inadvertently.—Editor
Author: Fred Robinson
Can those of us who support systems definition and development be more in tune with stakeholders for harmonious acquisitions?
Puns can be fun, but they are also useful: they expose a human weakness, namely our tendency to apply our own conceptions of meaning to words before pausing to consider that those words may hold different meanings for others.
When our attempts at clear communication result in people exchanging a shared set of words or a concept description without also sharing the same meaning or understanding, this cognitive gap is called, from a logic perspective, a jingle fallacy [1, 2, 3]. Have you ever found yourself working with others when they or you exclaim, well into the effort, “Oh, that’s what you meant by [jingled term here]!” This might be followed by someone’s rueful reply, “I wish I’d known sooner that we interpreted that definition differently!”
If so, there’s a good chance you have been party to a jingle fallacy—a misunderstanding of a concept that all parties were certain they had understood the same way.
As adults, we generally feel we’ve mastered the English language. It is so easy for us to translate our words into thoughts and our thoughts back into words in our own heads that we take for granted that others are doing it exactly the same way. Yet no matter how much evidence to the contrary—evidence that we are actually ineffective communicators, and that words do not carry consistent meaning—we still trust and believe that we are sharing understanding merely by sharing our words. This confirmation bias—hearing only what one wants to hear—is just one of many contributors to the challenge of effective communication.
According to Rebecca Saxe of MIT’s SaxeLab, when children are somewhere between 3 and 5 years old, they develop the ability to realize that other people can hold thoughts in their heads that differ from their own. Yet Sir Ken Robinson (no relation to me) suggests that primary school education in the United States slowly saps the propensity for diverse thinking out of our school-aged children. Indeed, enforcing conformance to standards of learning and thinking can lead us to assume that everyone must be thinking the same things when we use our words. In effect, we revert to the preschool mindset in which other minds simply mirror our own.
In the many domains in which knowledge workers specialize, our words are explicit representations of what we believe we know—and of what we hope others will associate with the same knowledge. Yet each of us also holds tacit, internal representations: the thoughts and mental models of the world that we perceive and use words to explain. Michael Polanyi [4] described this tacit dimension as that which we know but cannot say, our silent knowledge. This dichotomy regularly challenges us as human-to-human communicators, whether we are word-playing with puns or working with words for which we share definitions but not understanding. The latter case is where jingle fallacies are born.
Working in the public interest, as we do at MITRE, we try to refrain from assuming that the words alone accurately convey our thoughts or expectations. When we hear something that sounds good to us, like a jingle, we know it’s important to challenge preconceived notions and to reaffirm perceptions so that we improve shared understanding. We can’t just rely on making better dictionaries and glossaries.
How we might do that—through systems thinking and design thinking—is the subject of my next post.
[1] Aikins, H. A. (1902). The principles of logic. New York: Holt.
[2] Larsen, K., & Bong, C. H. (2016). A tool for addressing construct identity in literature reviews and meta-analyses. MIS Quarterly, 40(3), 529-551.
[3] Thorndike, E. L. (1904). An introduction to the theory of mental and social measurements. New York: Science Press.
[4] Polanyi, M. (1966). The tacit dimension. Chicago: University of Chicago Press.
Fred Robinson has worked as both a systems engineer and a system architect within multiple business and technology domains, including but not limited to census data capture, electronic digital archives, physical security, biometrics, healthcare, and defense IT systems. His prior experience includes test engineering and execution for spacecraft and aviation systems. He holds a Master of Science in Systems Engineering from Johns Hopkins, and he completed his Doctor of Science in Information Systems and Communications at Robert Morris University in April 2018.
© 2018 The MITRE Corporation. All rights reserved. Approved for public release. Distribution unlimited. Case Number 18-0794
The MITRE Corporation is a not-for-profit organization that operates research and development centers sponsored by the federal government.