We come across various sources of information in our day-to-day lives that help us establish our beliefs and opinions. However, do we always stop to assess the accuracy of the information provided to us? More often, we simply make sense of it and assume it to be true. When we receive a piece of information, we build a cause-and-effect structure to process it and make sense of it. Recent research has revealed that once this connection is made, we may not change our views even after learning of evidence that discredits the information.
Hamby, Ecker, & Brinberg (2020) have recently provided a unique take on the subject of false information in their paper from the Journal of Consumer Psychology. They used the principle of the ‘continued influence effect’ to suggest that the likelihood of our belief in discredited or retracted information depends on its ability to support the cause-and-effect structure of an event (Hamby, Ecker, & Brinberg, 2020). The ‘continued influence effect’ can be described as the continued reliance on retracted information in reasoning despite acknowledgment of the retraction (Chang, Ecker, & Page, 2018). Hamby, Ecker, & Brinberg (2020) argue that we would rather believe false information if it gives us a more complete understanding of the cause-and-effect structure than an account that is evidently accurate but less complete and fails to fill in the gaps.
The research consisted of three studies. In Study 1, participants viewed a story in which an unwell character takes a drug that fails to cure him. The participants were divided into two groups: one was given an explanation (he took the medicine at the wrong time) and the other was given no explanation. Both groups then received an additional piece of information: the character took the medication with a glass of lemonade, and taking this medication with citrus-based drinks renders the drug ineffective (Hamby, Ecker, & Brinberg, 2020). Later, all participants were told that this last piece of information was retracted as it was inaccurate. When participants were subsequently asked why the drug was ineffective, the group given no explanation was more likely to incorrectly blame the lemonade for the drug’s failure (2020). This observation can be attributed to the ability of the false information to support the cause-and-effect construct created by the group without an explanation. The group given the explanation that the drug was consumed at the wrong time arguably found it easier to let go of the false information upon its retraction, as it did not support their cause-and-effect construct.
Anne Hamby and her colleagues supported their argument through a second study, in which participants read an extract narrated by a poker player: “I reach down and pull out my bottle of kombucha, which I like to drink at poker matches. I take a long deep swig from the bottle. And I have clarity of mind. I fold” (Hamby, Ecker, & Brinberg, 2020, p. 21). The passage allowed the causal inference that the player’s mental clarity was due to drinking kombucha, although this was never stated explicitly (2020). The participants were divided into two groups. One group was informed that kombucha supports mental performance, while the other was informed that the drink supports muscular function. This additional piece of information helped participants in the mental-performance group establish a cause-and-effect relationship between the consumption of kombucha and the poker player’s mental clarity in folding the hand, whereas the muscular-function group could not draw any clear link between the two pieces of information. Later, all participants were informed that the statement about kombucha was retracted as it was false. When the participants’ reasoning about the poker player’s win was assessed, those in the mental-performance group were more likely to attribute the win to kombucha, despite having learned that the information provided to them was false (2020). This study also supported the authors’ argument, as the group with a clear cause-and-effect construct linked to the false information tended to ignore its inaccuracy.
Study 3 tested whether the continued influence effect depends on a story having a positive or negative ending. The premise of Study 1 (the failing medication) was repeated, and participants were divided into two groups. One group was given a positive ending, in which the character changes his routine and gets better, while the other was given a negative ending, in which “the medication’s window of effectiveness has elapsed” and he does not get better (Hamby, Ecker, & Brinberg, 2020, p. 27). Both groups received the additional piece of information naming citrus-based drinks as a cause of the failed medication, which was later retracted. The results showed that the continued influence of the false information was weaker in the group that had received the negative ending (2020). Hamby and her colleagues suggest that this is due to our motivation to accurately understand the cause of a negative outcome, in order to avoid repeating the same mistake in the future.
Hamby, Ecker, & Brinberg (2020) also argue that we can guard against the continued influence effect by prioritizing accuracy over completeness. They identified our bias toward building a complete cause-and-effect construct in order to understand a piece of information, irrespective of its accuracy. We live in an age in which we are subject to an unlimited supply of information from multiple sources. In our attempt to comprehend this information, we must consciously prioritize accuracy rather than fitting together half the pieces of the jigsaw simply because they justify the cause-and-effect constructs in our minds. Such research sheds light on how vulnerable our perception of incoming information can be. This novel take on influence opens up various research possibilities in cognition, social influence, and attribution.
Chang, E. P., Ecker, U., & Page, A. (2018). Continued Influence Effect of Misinformation in Rumination.
Hamby, A., Ecker, U., & Brinberg, D. (2020). How Stories in Memory Perpetuate the Continued Influence of False Information. Journal of Consumer Psychology, 30(2), 240-259.
Manav Agarwal recently completed an undergraduate degree in Liberal Arts and Humanities with a major in Psychology at the Jindal School of Liberal Arts & Humanities (O.P. Jindal Global University). His interests lie in the area of organizational & business psychology, human resources and community development.