“We have all been hurt and experienced pain at some point in our lives. That pain compromises our trust and can transform our perspective on life. It is psychologically natural to defend ourselves when we feel vulnerability would be dangerous, but trust is as much a blessing for our own mental health as it is a gift for those we choose to trust. When trauma or pain takes away our ability to trust others, it continues to hurt us, depriving us of deep, meaningful bonds.
Our spiritual, heartfelt side cannot thrive if we keep ourselves walled up. While we must be careful about whom we decide to open up to, it is not healthy to withdraw trust from everyone. Every relationship, whether intimate, professional, or family-based, requires a certain level of trust.
What Is Trust?
Trust refers to our ability to believe confidently that someone else’s intentions toward us are good. It is our ability to predict someone’s behavior and how they will respond to situations. Trust is just as much logical and based on evidence as it is emotional and instinctual. We FEEL trust, but we also calculate it.
Much of our social interaction is based on a give-and-take system, and trust is a crucial part of it. When we marry someone and choose to trust them with our well-being, we have certain expectations of what they will give to the relationship as well as what we will give. Even in the act of buying a car, it is natural to have more trust in a dealer selling you a certified used car with a warranty than in someone off the street who might give you a better deal but no warranty.
It comes down to this. If you believe someone will do right by you even in a difficult situation, you have trust in them. If you are unsure if someone will do right by you, then you don’t trust them.
It takes time to develop trust in someone; this is typically not an overnight process, although in some social situations, such as with a religious leader, we tend to expect trustworthiness of them from the start. As we share more social interactions and experiences, we start to notice patterns that indicate whether the person is dependable or untrustworthy.
In some situations, the other person is asked to sacrifice something, such as money or time, to meet our needs; those situations draw us closer to them and allow us to let our guard down. Still, it is inevitable that we will have to take a leap of faith at some point to develop deep and significant trust.
Trust in Relationships
The depth of trust we develop in a relationship is important because it determines the extent to which we commit ourselves and invest. Considering the give-and-take social system, we give much more of ourselves to someone when we trust them, and in return we hope to receive the same. Insecurity about whether someone will act in our best interest causes us to withdraw emotionally, spiritually, and often physically from that person. We create a psychological distance from the other person as a means of defense.
Think of it like building a castle around our heart: we allow them to roam outside of our castle, but we won’t let down the drawbridge so easily. It is impossible to be close to someone if we won’t let them inside. Naturally, the person roaming the castle will grow tired and eventually withdraw, thus ending the relationship. This applies to business partnerships and friendships just as much as to intimate relationships.
Can You Trust Again?
Even if you have been badly hurt and betrayed, perhaps in a very traumatic situation, you can learn to trust people again. You have the power to decide if you will let their actions continue to hurt you and impact your ability to trust others or if you will make the choice to move forward, heal and work on trusting others.
4 Steps Towards Learning to Trust Again
Trust yourself. You cannot expect to trust others if you don’t trust yourself. Do not blame yourself for the past pain that robbed you of trust. Remember you are making the choice to stop giving power to that pain. Have faith in your judgment and don’t doubt yourself based upon past experiences.
Forgiveness. This doesn’t mean you are forgetting or condoning what the other person did, but you are choosing to be the better person and extend forgiveness to them as well as yourself. You are refusing to let their bad choices dictate your future. Every major religion in the world promotes forgiveness and mercy. Not just as an act of charity, but as a means of healing your own heart.
…you do not do evil to those who do evil to you, but you deal with them with forgiveness and kindness…
Stop victimizing yourself. We always have a choice when we are hurt: to remain the victim or to become stronger. No matter how harsh the pain you endured, it is your choice to use it as a crutch and stay withdrawn OR to take the steps forward toward healing. I have often heard the expression that what does not kill you only makes you stronger; it is true if you allow it to be. Stop being the victim, start being the victor. No one will hand you the ability to trust again; you must work toward it.
Accept vulnerability. Trust requires being vulnerable, which means you must accept the risk that you might get hurt. Every time we trust someone, it is a careful risk calculation. Without the occasional leap of faith, you will never know the extent of trust and love you can experience.
Trust is a critical component of our mental well-being; if we cannot trust anyone else, then we lack trust in our own judgment. To achieve our happiest and most positive state of mind, we must allow ourselves to be vulnerable. That doesn’t mean we never have our guard up; of course, we must be mindful of who has access to our heart and the ability to harm us. Trust is a careful calculation of risk and reward. You have the ability to learn how to trust again; I did.”
People resort to lying for so many different reasons that it’d be impossible to list them all. However, of the most common motives for telling lies, avoiding punishment is the primary motivator for both children and adults. Other typical reasons include protecting ourselves or others from harm, maintaining privacy, and avoiding embarrassment, to name a few.
Learning to spot micro expressions is an important key to detecting deception as micro expressions often reveal hidden emotions.
Avoiding Punishment “I thought I was only going 55 miles an hour, officer,” claims the driver speeding at 70 mph. “My wristwatch stopped, so I had no idea that I got home two hours after my curfew,” says the teenager. Avoiding punishment is the most frequent reason people tell serious lies, regardless of their age, whether it be to avoid a speeding ticket or being grounded. In serious lies there is a threat of significant damage if the lie is discovered: loss of freedom, money, job, relationship, reputation, or even life itself.
It is only in such serious lies, in which the liar would be punished if detected, that lies are detectable from demeanor – facial expression, body movements, gaze, voice, or words. The threat imposes an emotional load, generating involuntary changes that can betray the lie. The lies of everyday life, where it doesn’t matter if they are detected – there is no punishment or reward – are easily told, flawlessly.
Concealing Reward or Benefit In serious lies the falsehood is usually told to conceal the reward or benefit the liar obtained by breaking a rule or explicit expectation. The curfew violator was able to stay longer at the party; the speeding driver is rushing because he pushed the snooze button when the alarm went off. The husband who claims the ringer on the telephone in his office must have been turned off when he was ‘working’ late – in a hotel room with his girlfriend – will pay no price if his lie succeeds. In each of these examples, the rule breaker decides before breaking a rule that he or she will, if questioned, lie to cover the cheating. Sometimes the reward could have been achieved – a high mark on an exam – without cheating, but not as easily; it would have taken more effort (hours of study in this example).
Protecting Someone from Harm Protecting someone else from harm is the next most important reason why people tell serious lies. You don’t want your friend, your fellow worker, your sibling, your spouse – anyone you care about – to get punished, even if you don’t agree with what the person you are protecting did that put him or her in danger. It is not certain whether society approves of these lies. When policemen refuse to testify against a fellow officer they know has broken the law, we respect their motives, but many people believe they should be truthful. Yet the terms we use for informers – rat, fink, snitch – are derogatory. Anonymous call-in lines exist so those who volunteer information can avoid any loss of reputation or danger by informing. Do we have different standards for people who take the initiative to inform as compared to those who inform when directly asked to reveal information? I will reconsider this issue in a later newsletter when I write about children’s lies and why we don’t want them to tattle.
Self-Protection To protect yourself from being harmed even when you have not broken any rule is still another motive. The child home alone who tells the stranger knocking on the door, “my father is taking a nap, come back later,” has committed no misdeed that he or she is concealing; it is a self-protection lie.
Winning Admiration Some lies are told to win admiration from others. Boasting about something untrue is an obvious instance. It is common in children, some adolescents, and even adults. If discovered it harms the reputation of the boaster, but not much more than that. Claiming falsely to have earned money for previous investors moves into the criminal realm.
Maintaining Privacy To maintain privacy, without asserting that right, is another reason why people may lie. A daughter who answers her mother’s question “who were you talking to on the phone just now?” by naming a girlfriend, not the boy who is asking her out on a date, is an example. It is only when there is a strong, trusting relationship that a child would feel brave enough to say “that’s private,” announcing the right to have a secret. This is another topic I will return to in my newsletter about trust.
The Thrill of it All Some people lie for the sheer thrill of getting away with it, testing their unsuspected power. Many children will at some point lie to their parents simply to see if they can do it. Some people do this all the time, enjoying the power they obtain by controlling the information available to the target.
Avoiding Embarrassment Avoiding embarrassment is still another motive for some serious and many trivial lies. The child who claims the wet seat resulted from spilling a glass of water, not from wetting her pants, is an example – provided the child did not fear punishment for her failure, just embarrassment.
Avoiding embarrassment is relevant to many less serious lies that come under the rubric of lies-of-everyday-life. Very often people lie to get out of an awkward social situation. They may not know how to do so truthfully: “can’t get a babysitter” is offered to avoid another dull evening and food. “Sorry, I am on my way out the door” is an excuse given by people who do not feel brave enough to be truthful even to a totally unknown telephone solicitor.
Being Polite Then there are the deceptions that are required by politeness – “thanks so much for the lovely party” or “that color really looks good on you.” I don’t consider these to be lies, any more than bluffing in poker is a lie, acting in a play is lying, or an asking price that is not the selling price is a lie. In all of these instances the target does not expect to be told the truth; there is notification. But the impostor is a liar, as is the con man, because they are taking advantage of our expectation that we will be told the truth. More about this will appear in my newsletter about the different techniques for lying.
Do we really want to know if someone is lying? In most cases, there’s no quick or easy way to detect deception and, even if there were, we might not like what we discover.
So, while people often claim to want to know the truth, there are many instances in which it is more comforting to believe the lies. In these circumstances, we tend to ignore deception clues and excuse otherwise suspicious behaviors to avoid the potentially negative consequences of uncovering the lies we’re told.”
“Lying is among the most sophisticated and demanding accomplishments of the human brain. Children have to learn how to lie; people with certain types of frontal lobe injuries may not be able to do it.
Electrical stimulation of the prefrontal cortex appears to improve our ability to deceive. This region of the brain may, among other things, be responsible for the decision to lie or tell the truth.
Most people have trouble recognizing false statements. Some polygraph tests are better at it yet are far from perfect. Researchers are trying to use imaging methods to distinguish truth from lies. Intensified activity in the prefrontal cortex may be an indicator of the process by which we decide to lie or not—but it tells us nothing about the lie itself.
A 51-year-old man I will call “Mr. Pinocchio” had a strange problem. When he tried to tell a lie, he often passed out and had convulsions. In essence, he became a kind of Pinocchio, the fictional puppet whose nose grew with every fib. For the patient, the consequences were all too real: he was a high-ranking official in the European Economic Community (since replaced by the European Union), and his negotiating partners could tell immediately when he was bending the truth. His condition, a symptom of a rare form of epilepsy, was not only dangerous, it was bad for his career.
Doctors at the University Hospitals of Strasbourg in France discovered that the root of the problem was a tumor about the size of a walnut. The tumor was probably increasing the excitability of a brain region involved in emotions; when Mr. Pinocchio lied, this excitability caused a structure called the amygdala to trigger seizures. Once the tumor was removed, the fits stopped, and he was able to resume his duties. The doctors, who described the case in 1993, dubbed the condition the “Pinocchio syndrome.”
Mr. Pinocchio’s plight demonstrates the far-reaching consequences of even minor changes in the structure of the brain. But perhaps just as important, it shows that lying is a major component of the human behavioral repertoire; without it, we would have a hard time coping. When people speak unvarnished truth all the time—as can happen when Parkinson’s disease or certain injuries to the brain’s frontal lobe disrupt people’s ability to lie—they tend to be judged tactless and hurtful. In everyday life, we tell little white lies all the time, if only out of politeness: Your homemade pie is awesome (it’s awful). No, Grandma, you’re not interrupting anything (she is). A little bit of pretense seems to smooth out human relationships without doing lasting harm.
Yet how much do researchers know about lying in our daily existence? How ubiquitous is it? When do children usually start engaging in it? Does it take more brainpower to lie or to tell the truth? Are most people good at detecting untruths? And are we better at it than tools designed for the purpose? Scientists exploring such questions have made good progress—including discovering that lying in young children is a sign that they have mastered some important cognitive skills.
TO LIE OR NOT TO LIE Of course, not everyone agrees that some lying is necessary. Generations of thinkers have lined up against this perspective. The Ten Commandments admonish us to tell the truth. The Pentateuch is explicit: “Thou shalt not bear false witness against thy neighbor.” Islam and Buddhism also condemn lying. For 18th-century philosopher Immanuel Kant, the lie was the “radical innate evil in human nature” and was to be shunned even when it was a matter of life and death.
Today many philosophers take a more nuanced view. German philosopher Bettina Stangneth argues that lying should be an exception to the rule because, in the final analysis, people rely on being told the truth in most aspects of life. Among the reasons they lie, she notes in her 2017 book Deciphering Lies, is that it can enable them to conceal themselves, hiding and withdrawing from people who intrude on their comfort zone. It is also unwise, Stangneth says, to release children into the world unaware that others might lie to them.
It is not only humans who practice deception. Trickery and deceit of various kinds have also been observed in higher mammals, especially primates. The neocortex—the part of the brain that evolved most recently—is critical to this ability. Its volume predicts the extent to which various primates are able to trick and manipulate, as primatologist Richard Byrne of the University of St. Andrews in Scotland showed in 2004.
CHILDREN HAVE TO LEARN HOW TO LIE In our own kind, small children love to make up stories, but they generally tell their first purposeful lies at about age four or five. Before starting their careers as con artists, children must first acquire two important cognitive skills. One is deontic reasoning: the ability to recognize and understand social rules and what happens when the rules are transgressed. For instance, if you confess, you may be punished; if you lie, you might get away with it. The other is theory of mind: the ability to imagine what another person is thinking. I need to realize that my mother will not believe that the dog snagged the last burger if she saw me scarf down the food. As a step to developing a theory of mind, children also need to perceive that they know some things their parents do not, and vice versa—an awareness usually acquired by age three or four.
People cook up about two stories a day on average, according to social psychologist Bella M. DePaulo, of the University of California, Santa Barbara, who conducted a 2003 study in which participants filled out “lie diaries.” It takes time, however, to become skilled. A 2015 study with more than 1,000 participants looked at lying in volunteers in the Netherlands aged six to 77. Children, the analysis found, initially have difficulty formulating believable lies, but proficiency improves with age. Young adults between 18 and 29 do it best. After about the age of 45, we begin to lose this ability.
A similar inverted U-shaped curve over the life span is also seen with a phenomenon known as response inhibition—the ability to suppress one’s initial response to something. It is what keeps us from blurting out our anger at our boss when we are better off keeping silent. The pattern suggests that this regulatory process, which, like deception, is managed by the neocortex, may be a prerequisite for successful lying.
Current thinking about the psychological processes involved in deception holds that people typically tell the truth more easily than they tell a lie and that lying requires far more cognitive resources. First, we must become aware of the truth; then we have to invent a plausible scenario that is consistent and does not contradict the observable facts. At the same time, we must suppress the truth so that we do not spill the beans—that is, we must engage in response inhibition. What is more, we must be able to assess accurately the reactions of the listener so that, if necessary, we can deftly produce adaptations to our original story line. And there is the ethical dimension, whereby we have to make a conscious decision to transgress a social norm. All this deciding and self-control implies that lying is managed by the prefrontal cortex—the region at the front of the brain responsible for executive control, which includes such processes as planning and regulating emotions and behavior.
UNDER THE HOOD Brain-imaging studies have contributed to the view that lying generally requires more effort than telling the truth and involves the prefrontal cortex. In a pioneering 2001 study, the late neuroscientist Sean Spence, then at the University of Sheffield in England, tested this idea using a rather rudimentary experimental setup. While Spence’s participants lay in a functional magnetic resonance imaging (fMRI) brain scanner, they answered questions about their daily routine by pressing a yes or no button on a screen. Depending on the color of the writing, they were to answer either truthfully or with a lie. (The researchers knew the correct answers from earlier interviews.) The results showed that the participants needed appreciably more time to formulate a dishonest answer than an honest one. In addition, certain parts of the prefrontal cortex were more active during lying (that is, they had more blood flowing in them). Together the findings indicated that the executive part of the brain was doing more processing during lying.
Several follow-up studies have confirmed the role of the prefrontal cortex in lying. Merely pointing to a particular region of the brain that is active when we tell an untruth does not, however, reveal what is going on up there. Moreover, the situations in these early experiments were so artificial that they had hardly anything in common with people’s everyday lives: the subjects probably could not have cared less whether they were dishonest about what they ate for breakfast.
To counter this last problem, in 2009 psychologist Joshua Greene of Harvard University conducted an ingenious experiment in which the participants had a monetary incentive to behave dishonestly. As subjects lay in an fMRI scanner, they were asked to predict the results of a computer-generated coin toss. (The cover story was that this study was testing their paranormal abilities. Even neuroscientists sometimes have to employ misdirection in the name of a higher scientific goal!)
If the volunteers typed the correct response, they were given up to $7. They lost money for wrong answers. They had to reveal their prediction beforehand for half of the test runs. In all the other runs, they merely disclosed after the coin toss whether they had predicted correctly. Subjects were paid even if they lied about their advance conclusions, but not everyone exploited the situation. Greene was able to read the honesty of the participants simply by looking at the hit rates: the honest subjects predicted correctly half the time, whereas the cheaters claimed to have come up with the correct answers in more than three quarters of the runs—a rate too high to be believed. After the study was over, a few liars were bothered by a bad conscience and admitted that they had cheated.
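Greene’s “rate too high to be believed” claim can be made concrete with a quick binomial calculation. The sketch below (the article does not report the exact number of runs per subject, so 100 trials is an assumed figure for illustration) shows how astronomically unlikely it is for an honest guesser on fair coin tosses to hit 75 percent or better by luck:

```python
from math import comb

def prob_at_least(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of at least k
    correct guesses out of n independent trials."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# An honest guesser predicting fair coin tosses is right about half the time.
# The chance of honestly scoring 75 or more correct out of an assumed 100 runs:
p_lucky = prob_at_least(75, 100)
print(f"P(>= 75/100 by luck) = {p_lucky:.2e}")
```

With these assumptions the probability is well below one in a million, which is why a claimed three-quarters hit rate is diagnostic of cheating rather than clairvoyance.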
Greene asked himself what distinguished the honest from the dishonest participants. Analysis of the fMRI data showed that when honest subjects gave their answers, they had no increased activity in certain areas of the prefrontal cortex known to be involved in self-control. In contrast, those control regions did become perfused with blood when the cheaters responded. The analysis of reaction times told much the same story. The honest participants did not hesitate even when they were given the opportunity to cheat. Apparently they never even considered lying. Conversely, response time became more prolonged in the dishonest subjects.
Particularly interesting was that the cheaters showed increased activity in the control regions of the prefrontal cortex not only when they chose to behave dishonestly but also when they threw in occasional truths to distract from the lies. Greene suggests that activity in the control regions of the prefrontal cortex in the cheaters may reflect the process of deciding whether to lie, regardless of the decisions those cheaters finally made.
Instead of assessing individual brain regions at the same time as someone told the truth or a lie, psychologist Ahmed Karim of the University of Tübingen in Germany and his colleagues influenced brain activity from the outside, using a method known as transcranial direct-current stimulation—which is safe and painless. In this method, two electrodes are attached to the scalp and positioned so that a weak current hits a selected brain area.
To make the experimental situation as lifelike as possible, the team invented a role-playing game. The test subjects were to pretend they were robbers, sneak into an unobserved room and steal a €20 note from a wallet in a jacket pocket. They were told that some participants in the study would be innocent. After the theft, they were subjected to an interrogation. If they got through the interrogation without getting tangled up in contradictions, they could keep the money. They were advised to answer as many trivial questions as possible truthfully (for example, giving the correct color of the jacket) because nonguilty people might remember such details just as easily as thieves did but lie at decisive moments (for example, when questioned about the color of the wallet). The electrodes were applied to everyone before questioning, but electrical impulses were administered to only half of the participants (the “test” subjects); the other half served as the control group.
MORE EFFECTIVE DECEPTION, THANKS TO BRAIN STIMULATION In Karim’s study, the electrodes were arranged to minimize the excitability of the anterior prefrontal cortex, a brain area that earlier studies had associated with moral and ethical decision making. With this region inhibited, the ability to deceive improved markedly. Subjects in the test and control groups lied about as frequently, but those who received the stimulation were simply better at it; their mix of truthful answers and lies made them less likely to get found out. Their response times were also considerably faster.
The researchers ruled out the possibility that brain stimulation had elevated the cognitive efficiency of the participants more generally. In a complicated test of attention, the test subjects did no better than the control group. Apparently Karim’s team had specifically improved its test subjects’ ability to lie.
One possible interpretation of the findings is that the electric current temporarily interrupted the functioning of the anterior prefrontal cortex, leaving participants with fewer cognitive resources for evaluating the ethical implications of their actions; the interruption allowed them to concentrate on their deceptions. Two follow-up studies conducted by other teams were also able to influence lying using direct current, although they used different experimental setups and target brain regions. But all the test subjects in these studies lied at essentially the press of a button. Whether electrically stimulating selected brain areas would work outside the laboratory is unknown. In any case, no instrument has yet been developed that can test such a hypothesis.
CHALLENGES OF LIE DETECTION On the other hand, devices that supposedly measure whether a person is telling the truth—polygraphs—have been in use for decades. Such tools are desirable in part because humans turn out to be terrible lie detectors.
In 2003 DePaulo and her colleagues summarized 120 behavior studies, concluding that liars tend to seem more tense and that their stories lack vividness, leaving out the unusual details that would generally be included in honest descriptions. Liars also correct themselves less; in other words, their stories are often too smooth. Yet such characteristics do not suffice to identify a liar conclusively; at most, they serve as clues. In another analysis of multiple studies, DePaulo and a co-author found that people can distinguish a lie from the truth about 54 percent of the time, just slightly better than if they had guessed. But even those who encounter liars frequently—such as the police, judges and psychologists—can have trouble recognizing a con artist.
Polygraphs are meant to do better by measuring a variety of biological signs (such as skin conductance and pulse) that supposedly track with lying. Gestalt psychologist Vittorio Benussi of the University of Graz in Austria presented a prototype based on respiration in the early 1910s, and detectors have been refined and improved ever since. Even so, the value continues to be a matter of contention. In 1954 the West German Federal Court of Justice banned polygraph use in criminal trials on the grounds that such “insight into the soul of the accused” (as a 1957 paper on the ruling put it) would undermine defendants’ freedom to make decisions and act. From today’s perspective, this reasoning seems a bit overdramatic; even the latest lie detectors do not have that ability. More recent criticisms have been leveled at their unreliability.
Courts in other countries do accept results from lie-detector tests as evidence. The case of George Zimmerman, a neighborhood-watch volunteer who, in 2012, shot a black teenager—Trayvon Martin—supposedly in self-defense, is well known. Zimmerman’s acquittal triggered a debate about racism across the U.S. The police interrogation involved a particular variant of a lie-detector test that includes what is called computer voice-stress analysis. This analysis was later placed in evidence to prove the innocence of the accused, despite vehement scientific criticism of the method.
Polygraphs do detect lying at a rate better than chance, although they are also frequently wrong. A questioning technique known as the guilty knowledge test has been found to work well in conjunction with a polygraph. The suspect is asked multiple-choice questions, the answers to which only a guilty party would know (a technique very similar to the study involving the pickpocket role-playing described earlier). The theory behind it holds that when asked questions that could reveal guilt (“Was the wallet red?”), a guilty person exhibits more pronounced physiological excitation, as indicated by elevated skin conductance and delayed response time. This method has an accuracy of up to 95 percent, with the innocent almost always identified as such. Although this test is by far the most precise technique available, even it is not perfect.
Recently experiments have been conducted to evaluate whether imaging techniques such as fMRI might be useful for detecting lies. The proposed tests mostly look at different activation patterns of the prefrontal cortex in response to true and false statements. In the U.S., a number of companies are marketing fMRI lie detection. One advertises itself as useful to insurance companies, government agencies and others. It even claims to provide information relating to “risk reduction in dating,” “trust issues in interpersonal relationships,” and “issues concerning the underlying topics of sex, power, and money.”
But fMRI approaches still have shortcomings. For one thing, differences in responses to lies and truths that become evident when calculating the average results of a group do not necessarily show up in each individual. Moreover, researchers have not yet been able to identify a brain region that is activated more intensely when we tell the truth than when we lie. As a result, a person’s honesty can be revealed only indirectly, by the absence of indications of lying. Another problem is Greene’s finding that elevated blood perfusion in parts of the prefrontal cortex might indicate that a person is deciding whether to lie and not necessarily that the person is lying. That ambiguity can make it difficult to interpret fMRI readings.
So far courts have rejected fMRI lie detectors as evidence. The efficacy of the method has simply not been adequately documented. A machine that reads thoughts and catches the brain in the act of lying is not yet on the near horizon.
This article is reproduced with permission and was first published in Gehirn&Geist on April 3, 2018.”
“When so many are struggling for connection, inspiration and hope, Fantastic Fungi brings us together as interconnected creators of our world.
Fantastic Fungi, directed by Louie Schwartzberg, is a consciousness-shifting film that takes us on an immersive journey through time and scale into the magical earth beneath our feet, an underground network that can heal and save our planet. Through the eyes of renowned scientists and mycologists like Paul Stamets, best-selling authors Michael Pollan, Eugenia Bone, Andrew Weil and others, we become aware of the beauty, intelligence and solutions the fungi kingdom offers us in response to some of our most pressing medical, therapeutic, and environmental challenges.”
Director: Louie Schwartzberg
Writer: Mark Monroe
Narrated by: Brie Larson
Producers: Louie Schwartzberg, Lyn Davis Lear, Elease Lui Stemp
Cast: Paul Stamets, Michael Pollan, Andrew Weil, Eugenia Bone, Suzanne Simard
Rating: Not Rated
Running Time: 81 Minutes
Distributor: Area 23a