
Science proves people lie for selfish reasons

If the lie's outcome was delayed or unclear, people lied less often.


How honest are you, really? Would you be open to a brain treatment that might make you act a little more truthful? I'm asking because researchers from the ever-productive University of Zurich ran an experiment to see if they could influence behavior with "transcranial direct current stimulation (tDCS)," a noninvasive technique that passes a weak electrical current through the scalp and makes the underlying brain cells more excitable. The scientists targeted the right dorsolateral prefrontal cortex (a region involved in risky and moral decision-making) with tDCS to see how honest people would be when reporting dice rolls.

Before each roll, a computer would tell the participants which types of rolls would get them the most money. There were ten rolls in the "sham" session (where the stimulation wasn't actually applied), and the maximum earning potential was around $90. So there was an incentive to lie, since participants reported the outcomes themselves. As the paper explains:

"Although this prevented us from identifying individual acts of cheating with certainty, we can determine the degree of cheating with each tDCS intervention by comparing the mean percentage of reported successful die rolls against the 50% benchmark implied by fully honest reporting."

That led to participants reporting successful die rolls 68 percent of the time in the sham condition, versus the 50 percent expected from fully honest reporting. "Subjects who claimed nine, eight and seven successful die rolls were also significantly overrepresented, suggesting that many of them cheated on some, but not all possible occasions."
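The paper's exact test isn't spelled out here, but the statistical logic behind "overrepresented" is easy to illustrate: if every report were honest, the number of claimed successes in ten fair rolls would follow a binomial distribution, so claims of seven or more should be fairly rare. A minimal sketch in Python, assuming ten reports per person and a 50 percent chance of success on each roll (the figures mentioned above):

```python
from math import comb

def prob_at_least(k, n=10, p=0.5):
    """Probability of at least k successes in n honest reports,
    each roll successful with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

for k in (7, 8, 9, 10):
    print(f"P(>= {k} successes out of 10 honest rolls) = {prob_at_least(k):.3f}")
```

Under honest reporting, only about 17 percent of participants should land on seven or more successes, roughly 5 percent on eight or more, and about 1 percent on nine or more, which is why an excess of those claims points to cheating.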

"Cheating was substantial in a control condition, but decreased dramatically when neural excitability was enhanced with tDCS," the paper reads.

In the non-sham sessions, the electrical stimulation cut the misreporting "significantly": participants reported successful die rolls only 58 percent of the time. "This result corresponds to an implied cheating rate of 15 percent, a figure that is nearly 60 percent lower than that observed in the sham condition."
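To make those quoted figures concrete, the "implied cheating rate" appears to be the excess of reported successes over the 50 percent benchmark, expressed as a share of the rolls an honest reporter would have called unsuccessful. A rough sketch using the rounded percentages quoted above (the paper's own 15 percent and "nearly 60 percent" presumably come from unrounded rates):

```python
def implied_cheating_rate(reported, honest=0.5):
    """Share of honestly-unsuccessful rolls misreported as successes,
    inferred from the mean reported success rate."""
    return (reported - honest) / (1 - honest)

sham = implied_cheating_rate(0.68)  # sham/control condition, rounded figure from the article
tdcs = implied_cheating_rate(0.58)  # with stimulation over the right DLPFC

print(f"implied cheating, sham: {sham:.0%}")                  # ~36%
print(f"implied cheating, tDCS: {tdcs:.0%}")                  # ~16%
print(f"relative reduction:     {(sham - tdcs) / sham:.0%}")  # ~56% with these rounded inputs
```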

The key finding here is that the brain stimulation only had an effect when material motives were pitted against moral ones. To rule out other explanations, the researchers also measured selfish behavior with a dictator game, appetite for risky and ambiguous outcomes with an investment experiment, and impulsivity with a third task. That's in addition to a die-rolling experiment where participants didn't earn any money themselves, but were "earning it" for an anonymous participant.

Basically, the stimulation only made participants less likely to lie when the lying specifically benefited them; when the cheating helped someone else, it had no effect.

"This finding suggest that the stimulation mainly reduced cheating in participants who actually experienced a moral conflict ["to cheat or not to cheat," basically], but did not influence the decision-making process" in people who were trying to make as much money as possible. Now we know of another use for tDCS aside from shocking away motion sickness: keeping folks honest.