The Art of Lying

Lying has gotten a bad rap. In fact, it is among the most sophisticated accomplishments of the human mind. But how can one tell if a person is fibbing?

Liars tend to appear more tense, and their lackluster stories are often thin on detail.

A 51-year-old man I will call “Mr. Pinocchio” had a strange problem. When he tried to tell a lie, he often passed out and had convulsions. In essence, he became a kind of Pinocchio, the fictional puppet whose nose grew with every fib. For the patient, the consequences were all too real: he was a high-ranking official in the European Economic Community (since replaced by the European Union), and his negotiating partners could tell immediately when he was bending the truth. His condition, a symptom of a rare form of epilepsy, was not only dangerous, it was bad for his career.

Doctors at the University Hospitals of Strasbourg in France discovered that the root of the problem was a tumor about the size of a walnut. The tumor was probably increasing the excitability of a brain region involved in emotions; when Mr. Pinocchio lied, this excitability caused a structure called the amygdala to trigger seizures. Once the tumor was removed, the fits stopped, and he was able to resume his duties. The doctors, who described the case in 1993, dubbed the condition the “Pinocchio syndrome.”

Mr. Pinocchio’s plight demonstrates the far-reaching consequences of even minor changes in the structure of the brain. But perhaps just as important, it shows that lying is a major component of the human behavioral repertoire; without it, we would have a hard time coping. When people speak unvarnished truth all the time—as can happen when Parkinson’s disease or certain injuries to the brain’s frontal lobe disrupt people’s ability to lie—they tend to be judged tactless and hurtful. In everyday life, we tell little white lies all the time, if only out of politeness: Your homemade pie is awesome (it’s awful). No, Grandma, you’re not interrupting anything (she is). A little bit of pretense seems to smooth out human relationships without doing lasting harm.



Yet how much do researchers know about lying in our daily existence? How ubiquitous is it? When do children usually start engaging in it? Does it take more brainpower to lie or to tell the truth? Are most people good at detecting untruths? And are we better at it than tools designed for the purpose? Scientists exploring such questions have made good progress—including discovering that lying in young children is a sign that they have mastered some important cognitive skills.

To Lie or Not to Lie

Of course, not everyone agrees that some lying is necessary. Generations of thinkers have lined up against this perspective. The Ten Commandments admonish us to tell the truth. The Pentateuch is explicit: “Thou shalt not bear false witness against thy neighbor.” Islam and Buddhism also condemn lying. For 18th-century philosopher Immanuel Kant, the lie was the “radical innate evil in human nature” and was to be shunned even when it was a matter of life and death.

Today many philosophers take a more nuanced view. German philosopher Bettina Stangneth argues that lying should be an exception to the rule because, in the final analysis, people rely on being told the truth in most aspects of life. Among the reasons they lie, she notes in her 2017 book Deciphering Lies, is that it can enable them to conceal themselves, hiding and withdrawing from people who intrude on their comfort zone. It is also unwise, Stangneth says, to release children into the world unaware that others might lie to them.

It is not only humans who practice deception. Trickery and deceit of various kinds have also been observed in higher mammals, especially primates. The neocortex—the part of the brain that evolved most recently—is critical to this ability. Its volume predicts the extent to which various primates are able to trick and manipulate, as primatologist Richard Byrne of the University of St. Andrews in Scotland showed in 2004.

Children Have to Learn How to Lie

In our own kind, small children love to make up stories, but they generally tell their first purposeful lies at about age four or five. Before starting their careers as con artists, children must first acquire two important cognitive skills. One is deontic reasoning: the ability to recognize and understand social rules and what happens when the rules are transgressed. For instance, if you confess, you may be punished; if you lie, you might get away with it. The other is theory of mind: the ability to imagine what another person is thinking. I need to realize that my mother will not believe that the dog snagged the last burger if she saw me scarf down the food. As a step to developing a theory of mind, children also need to perceive that they know some things their parents do not, and vice versa—an awareness usually acquired by age three or four.

People cook up about two stories a day on average, according to social psychologist Bella M. DePaulo, of the University of California, Santa Barbara, who conducted a 1996 study in which participants filled out “lie diaries.” It takes time, however, to become skilled. A 2015 study with more than 1,000 participants looked at lying in volunteers in the Netherlands aged six to 77. Children, the analysis found, initially have difficulty formulating believable lies, but proficiency improves with age. Young adults between 18 and 29 do it best. After about the age of 45, we begin to lose this ability.

A similar inverted U-shaped curve over the life span is also seen with a phenomenon known as response inhibition: the ability to suppress one’s initial response to something. It is what keeps us from blurting out our anger at our boss when we are better off keeping silent. The pattern suggests that this regulatory process, which, like deception, is managed by the neocortex, may be a prerequisite for successful lying.

Current thinking about the psychological processes involved in deception holds that people typically tell the truth more easily than they tell a lie and that lying requires far more cognitive resources. First, we must become aware of the truth; then we have to invent a plausible scenario that is consistent and does not contradict the observable facts. At the same time, we must suppress the truth so that we do not spill the beans—that is, we must engage in response inhibition. What is more, we must be able to assess accurately the reactions of the listener so that, if necessary, we can deftly adapt our original story line. And there is the ethical dimension, whereby we have to make a conscious decision to transgress a social norm. All this deciding and self-control implies that lying is managed by the prefrontal cortex—the region at the front of the brain responsible for executive control, which includes such processes as planning and regulating emotions and behavior.

Under the Hood

Brain-imaging studies have contributed to the view that lying generally requires more effort than telling the truth and involves the prefrontal cortex. In a pioneering 2001 study, the late neuroscientist Sean Spence, then at the University of Sheffield in England, tested this idea using a rather rudimentary experimental setup. While Spence’s participants lay in a functional magnetic resonance imaging (fMRI) brain scanner, they answered questions about their daily routine by pressing a yes or no button on a screen. Depending on the color of the writing, they were to answer either truthfully or with a lie. (The researchers knew the correct answers from earlier interviews.) The results showed that the participants needed appreciably more time to formulate a dishonest answer than an honest one. In addition, certain parts of the prefrontal cortex were more active during lying (that is, they had more blood flowing in them). Together the findings indicated that the executive part of the brain was doing more processing during lying.

Several follow-up studies have confirmed the role of the prefrontal cortex in lying. Merely pointing to a particular region of the brain that is active when we tell an untruth does not, however, reveal what is going on up there. Moreover, the situations in these early experiments were so artificial that they had hardly anything in common with people’s everyday lives: the subjects probably could not have cared less whether they were dishonest about what they ate for breakfast.

To counter this last problem, in 2009 psychologist Joshua Greene of Harvard University conducted an ingenious experiment in which the participants had a monetary incentive to behave dishonestly. As subjects lay in an fMRI scanner, they were asked to predict the results of a computer-generated coin toss. (The cover story was that this study was testing their paranormal abilities. Even neuroscientists sometimes have to employ misdirection in the name of a higher scientific goal!)

If the volunteers typed the correct response, they were given up to $7. They lost money for wrong answers. They had to reveal their prediction beforehand for half of the test runs. In all the other runs, they merely disclosed after the coin toss whether they had predicted correctly. Subjects were paid even if they lied about their advance predictions, but not everyone exploited the situation. Greene was able to read the honesty of the participants simply by looking at the hit rates: the honest subjects predicted correctly half the time, whereas the cheaters claimed to have come up with the correct answers in more than three quarters of the runs—a rate too high to be believed. After the study was over, a few liars were bothered by a bad conscience and admitted that they had cheated.
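
Greene’s inference from the hit rates is easy to verify with a little arithmetic. The Python sketch below (the trial count and hit count are illustrative assumptions, not the study’s actual design) computes how likely an honest guesser would be to match 75 percent of fair coin tosses:

    from math import comb

    def prob_at_least(k, n, p=0.5):
        # Probability of at least k successes in n independent trials,
        # each with success probability p (exact binomial tail).
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Hypothetical numbers: 100 coin tosses, 75 self-reported hits.
    n, k = 100, 75
    print(f"P(at least {k}/{n} correct by luck) = {prob_at_least(k, n):.1e}")
    # Output is on the order of 10^-7: honest guessing essentially never
    # produces a hit rate this high, which is why self-reported accuracy
    # alone could flag the probable cheaters.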

Greene asked himself what distinguished the honest from the dishonest participants. Analysis of the fMRI data showed that when honest subjects gave their answers, they had no increased activity in certain areas of the prefrontal cortex known to be involved in self-control. In contrast, those control regions did become perfused with blood when the cheaters responded. The analysis of reaction times told much the same story. The honest participants did not hesitate even when they were given the opportunity to cheat. Apparently they never even considered lying. In the dishonest subjects, by contrast, response times were prolonged.

Particularly interesting was that the cheaters showed increased activity in the control regions of the prefrontal cortex not only when they chose to behave dishonestly but also when they threw in occasional truths to distract from the lies. Greene suggests that activity in the control regions of the prefrontal cortex in the cheaters may reflect the process of deciding whether to lie, regardless of the decisions those cheaters finally made.

Instead of measuring the activity of individual brain regions while someone told the truth or a lie, psychologist Ahmed Karim of the University of Tübingen in Germany and his colleagues influenced brain activity from the outside, using a method known as transcranial direct-current stimulation—which is safe and painless. In this method, two electrodes are attached to the scalp and positioned so that a weak current hits a selected brain area.

To make the experimental situation as lifelike as possible, the team invented a role-playing game. The test subjects were to pretend they were robbers, sneak into an unobserved room and steal a €20 note from a wallet in a jacket pocket. They were told that some participants in the study would be innocent. After the theft, they were subjected to an interrogation. If they got through the interrogation without getting tangled up in contradictions, they could keep the money. They were advised to answer as many trivial questions as possible truthfully (for example, giving the correct color of the jacket), because innocent people might remember such details just as easily as thieves did, and to lie only at the decisive moments (for example, when questioned about the color of the wallet). The electrodes were applied to everyone before questioning, but electrical impulses were administered to only half of the participants (the “test” subjects); the other half served as the control group.

More Effective Deception, Thanks to Brain Stimulation

In Karim’s study, the electrodes were arranged to minimize the excitability of the anterior prefrontal cortex, a brain area that earlier studies had associated with moral and ethical decision making. With this region inhibited, the ability to deceive improved markedly. Subjects in the test and control groups lied about as frequently, but those who received the stimulation were simply better at it; their mix of truthful answers and lies made them less likely to get found out. Their response times were also considerably faster.

The researchers ruled out the possibility that brain stimulation had elevated the cognitive efficiency of the participants more generally. In a complicated test of attention, the test subjects did no better than the control group. Apparently Karim’s team had specifically improved its test subjects’ ability to lie.

One possible interpretation of the findings is that the electric current temporarily interrupted the functioning of the anterior prefrontal cortex, leaving participants with fewer cognitive resources for evaluating the ethical implications of their actions; the interruption allowed them to concentrate on their deceptions. Two follow-up studies conducted by other teams were also able to influence lying using direct current, although they used different experimental setups and target brain regions. But all the test subjects in these studies lied at essentially the press of a button. Whether electrically stimulating selected brain areas would work outside the laboratory is unknown. In any case, no instrument has yet been developed that can test such a hypothesis.

Challenges of Lie Detection

On the other hand, devices that supposedly measure whether a person is telling the truth—polygraphs—have been in use for decades. Such tools are desirable in part because humans turn out to be terrible lie detectors.

In 2003 DePaulo and her colleagues summarized 120 behavior studies, concluding that liars tend to seem more tense and that their stories lack vividness, leaving out the unusual details that would generally be included in honest descriptions. Liars also correct themselves less; in other words, their stories are often too smooth. Yet such characteristics do not suffice to identify a liar conclusively; at most, they serve as clues. In another analysis of multiple studies, DePaulo and a co-author found that people can distinguish a lie from the truth about 54 percent of the time, just slightly better than if they had guessed. But even those who encounter liars frequently—such as the police, judges and psychologists—can have trouble recognizing a con artist.

Polygraphs are meant to do better by measuring a variety of biological signs (such as skin conductance and pulse) that supposedly track with lying. Gestalt psychologist Vittorio Benussi of the University of Graz in Austria presented a prototype based on respiration in the early 1910s, and detectors have been refined and improved ever since. Even so, their value continues to be a matter of contention. In 1954 the West German Federal Court of Justice banned polygraph use in criminal trials on the grounds that such “insight into the soul of the accused” (as a 1957 paper on the ruling put it) would undermine defendants’ freedom to make decisions and act. From today’s perspective, this reasoning seems a bit overdramatic; even the latest lie detectors do not have that ability. More recent criticisms have been leveled at their unreliability.

Courts in other countries do accept results from lie-detector tests as evidence. The case of George Zimmerman, a neighborhood-watch volunteer who, in 2012, shot a black teenager—Trayvon Martin—supposedly in self-defense, is well known. Zimmerman’s acquittal triggered a debate about racism across the U.S. The police interrogation involved a particular variant of a lie-detector test that includes what is called computer voice-stress analysis. This analysis was later placed in evidence to prove the innocence of the accused, despite vehement scientific criticism of the method.

Polygraphs do detect lying at a rate better than chance, although they are also frequently wrong. A questioning technique known as the guilty knowledge test has been found to work well in conjunction with a polygraph. The suspect is asked multiple-choice questions, the answers to which only a guilty party would know (a setup very similar to the theft role-playing study described earlier). The theory behind it holds that when asked questions that could reveal guilt (“Was the wallet red?”), a guilty person exhibits more pronounced physiological excitation, as indicated by elevated skin conductance and delayed response time. This method has an accuracy of up to 95 percent, with the innocent almost always identified as such. Although this test is by far the most precise technique available, even it is not perfect.
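
The scoring logic of such a test can be sketched in a few lines of code. This is a simplified illustration, not an operational protocol: the items, the skin-conductance readings and the interpretation threshold are all invented, and real examinations rely on standardized scores across many more questions.

    # Each item is a multiple-choice question; only a guilty party knows
    # which option is "critical." We record one (made-up) skin-conductance
    # reading per option and count how often the critical option drew
    # the strongest response.
    items = [
        {"critical": "brown",            # color of the stolen wallet
         "responses": {"brown": 0.82, "red": 0.31, "blue": 0.28, "black": 0.35}},
        {"critical": "jacket pocket",    # where the money was hidden
         "responses": {"jacket pocket": 0.77, "drawer": 0.30,
                       "bag": 0.33, "car": 0.29}},
    ]

    hits = sum(
        1 for item in items
        if max(item["responses"], key=item["responses"].get) == item["critical"]
    )
    print(f"critical-option hit rate: {hits / len(items):.2f}")

    # With four options per item, chance alone makes the critical option
    # "win" about a quarter of the time, so a rate near 1.0 over many items
    # suggests guilty knowledge; an innocent person should hover near 0.25.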

Recently experiments have been conducted to evaluate whether imaging techniques such as fMRI might be useful for detecting lies. The proposed tests mostly look at different activation patterns of the prefrontal cortex in response to true and false statements. In the U.S., a number of companies are marketing fMRI lie detection. One advertises itself as useful to insurance companies, government agencies and others. It even claims to provide information relating to “risk reduction in dating,” “trust issues in interpersonal relationships,” and “issues concerning the underlying topics of sex, power, and money.”

But fMRI approaches still have shortcomings. For one thing, differences in responses to lies and truths that become evident when calculating the average results of a group do not necessarily show up in each individual. Moreover, researchers have not yet been able to identify a brain region that is activated more intensely when we tell the truth than when we lie. As a result, a person’s honesty can be revealed only indirectly, by the absence of indications of lying. Another problem is Greene’s finding that elevated blood perfusion in parts of the prefrontal cortex might indicate that a person is deciding whether to lie and not necessarily that the person is lying. That ambiguity can make it difficult to interpret fMRI readings.

So far courts have rejected fMRI lie detectors as evidence. The efficacy of the method has simply not been adequately documented. A machine that reads thoughts and catches the brain in the act of lying is not on the near horizon.

This article is reproduced with permission and was first published in Gehirn&Geist on April 3, 2018.

MORE TO EXPLORE

Cues to Deception. B. M. DePaulo et al. in Psychological Bulletin, Vol. 129, No. 1, pages 74–118; January 2003.

Patterns of Neural Activity Associated with Honest and Dishonest Moral Decisions. Joshua D. Greene and Joseph M. Paxton in Proceedings of the National Academy of Sciences USA, Vol. 106, No. 30, pages 12,506–12,511; July 28, 2009.

From Junior to Senior Pinocchio: A Cross-Sectional Lifespan Investigation of Deception. Evelyne Debey et al. in Acta Psychologica, Vol. 160, pages 58–68; September 2015.

Lying Takes Time: A Meta-analysis on Reaction Time Measures of Deception. Kristina Suchotzki et al. in Psychological Bulletin, Vol. 143, No. 4, pages 428–453; April 2017.

Theodor Schaarschmidt is a psychologist who earns his living honestly—as a science journalist.

This article was originally published with the title “The Art of Lying” in SA Mind Vol. 29 No. 5 (September 2018), p. 18
doi:10.1038/scientificamericanmind0918-18