Publications

  • It Might Become True: How Prefactual Thinking Licenses Dishonesty (Helgason & Effron, JPSP 2022)

    In our “post-truth” era, misinformation spreads not only because people believe falsehoods, but also because people sometimes give dishonesty a moral pass. The present research examines how the moral judgments that people form about dishonesty depend not only on what they know to be true, but also on what they imagine might become true. In six studies (N = 3,607), people judged a falsehood as less unethical to tell in the present when we randomly assigned them to entertain prefactual thoughts about how it might become true in the future. This effect emerged with participants from 59 nations judging falsehoods about consumer products, professional skills, and controversial political issues – and the effect was particularly pronounced when participants were inclined to accept that the falsehood might become true. Moreover, thinking prefactually about how a falsehood might become true made people more inclined to share the falsehood on social media. We theorized that, even when people recognize a falsehood as factually incorrect, these prefactual thoughts reduce how unethical the falsehood seems by making the broader meaning that the statement communicates, its gist, seem truer. Mediational evidence was consistent with this theorizing. We argue that prefactual thinking offers people a degree of freedom they can use to excuse lies, and we discuss implications for theories of mental simulation and moral judgment. (Paper link)

  • Reflecting on Identity Change Facilitates Confession of Past Misdeeds (Helgason & Berman, JEP:G 2022)

    Across four studies (N = 3,351), we demonstrate that reflecting on identity change increases confession and decreases justification of past misdeeds. Moreover, publicly communicating one’s identity change to others increases confession above and beyond privately reflecting on identity change. By severing their connection with their past self, individuals can admit to a past misdeed (“I did it”) while reducing their fear that doing so will implicate their present moral character (“But that’s not who I am anymore”). (Paper link)

  • From Critical to Hypocritical: Counterfactual Thinking Increases Partisan Disagreement About Media Hypocrisy (Helgason & Effron, JESP 2022)

    Partisans on both sides of the political aisle complain that the mainstream media is hypocritical, but they disagree about whom that hypocrisy benefits. In the present research, we examine how counterfactual thinking contributes to this partisan disagreement about media hypocrisy. In three studies (two pre-registered, N = 1,342) of people’s reactions to media criticism of politicians, we find that people judged the media’s criticism of politicians they support as more hypocritical when they imagined whether the media would have criticized a politician from a different party for the same behavior if given the chance. Because this effect only emerged when people judged the media’s criticism of politicians they supported, and not politicians they opposed, counterfactual thinking increased partisan division in perceptions of media hypocrisy. We discuss implications for how counterfactual thinking facilitates motivated moral reasoning, contributes to bias in social judgment, and amplifies political polarization. (Paper link)

  • The Moral Psychology of Misinformation: Why We Excuse Dishonesty in a Post-Truth World (Effron & Helgason, Current Opinion in Psych 2022)

    Commentators say we have entered a “post-truth” era. As political lies and “fake news” flourish, citizens appear not only to believe misinformation, but also to condone misinformation they do not believe. The present article reviews recent research on three psychological factors that encourage people to condone misinformation: partisanship, imagination, and repetition. Each factor relates to a hallmark of “post-truth” society: political polarization, leaders who push “alternative facts,” and technology that amplifies disinformation. By lowering moral standards, convincing people that a lie’s “gist” is true, or dulling affective reactions, these factors not only reduce moral condemnation of misinformation, but can also amplify partisan disagreement. We discuss implications for reducing the spread of misinformation. (Paper link)

  • Moral Inconsistency (Effron & Helgason, AESP 2023)

    We review a program of research examining three questions. First, why is the morality of people’s behavior inconsistent across time and situations? We point to people’s ability to convince themselves they have a license to sin, and we demonstrate various ways people use their behavioral history and others – individuals, groups, and society – to feel licensed. Second, why are people’s moral judgments of others’ behavior inconsistent? We highlight three factors: motivation, imagination, and repetition. Third, when do people tolerate others who fail to practice what they preach? We argue that people only condemn others’ inconsistency as hypocrisy if they think the others are enjoying an “undeserved moral benefit.” Altogether, this program of research suggests that people are surprisingly willing to enact and excuse inconsistency in their moral lives. We discuss how to reconcile this observation with the foundational social psychological principle that people hate inconsistency. (Paper link)

  • “It’s Not Literally True, But You Get the Gist:” How Nuanced Understandings of Truth Encourage People to Condone and Spread Misinformation (Langdon, Helgason, Qiu, & Effron, Current Opinion in Psych 2024)

    People have a more nuanced view of misinformation than the binary distinction between “fake news” and “real news” implies. We distinguish between the truth of a statement’s verbatim details (i.e., the specific, literal information) and its gist (i.e., the general, overarching meaning), and suggest that people tolerate and intentionally spread misinformation in part because they believe its gist. That is, even when they recognize a claim as literally false, they may judge it as morally acceptable to spread because they believe it is true “in spirit.” Prior knowledge, partisanship, and imagination increase belief in the gist. We argue that partisan conflict about the morality of spreading misinformation hinges on disagreements not only about facts but also about gists. (Paper link)

  • Moral or Lawful? When Legal Constraints Reverse the Motivational Benefits of Moral Considerations (Kundro, Croitoru, & Helgason, Org Sci 2024)

    Nearly every employee is subject to some form of legal requirement as a function of their work. Laws are often implemented by authorities to ensure that employees and organizations engage in ethical and moral conduct at work. Importantly, acting in a moral manner is linked to benefits for employees, increasing intrinsic motivation, which in turn facilitates high levels of proactive behavior. Yet employees increasingly face situations where laws or regulations conflict with what they perceive as morally appropriate (i.e., legal constraints on moral behavior), which we argue instead has negative consequences for employees. Combining insights from the literature on motivation and moral foundations theory, we propose that when employees face legal constraints on moral behavior, they feel less intrinsically motivated, leading them to engage in less proactive behavior. We further predict that legal constraints are less damaging when employees perceive them as necessary, rather than unnecessary, evils. We test our model across three complementary studies: a field study of employees from a company in a heavily regulated industry and two pre-registered experiments. (Paper link)