“Truth, Lies, and Uncertainty”

Readings from the September 2019 Scientific American, by Dave Hall

In this age of high-powered lying, technology has greatly enhanced the power to sell ideas. What is it about humans that makes them so susceptible to unverified information?

Sermon: Truth, lies and uncertainty

This is a difficult, complex issue, but it’s very current. The intent today is to describe the ways human animals try to cope with the modern world they have created. We look at two concepts: first, humans evaluate what they see, feel, hear, and so on by shipping that data to the brain for translation; second, humans have trouble evaluating data that may well have been distorted before it reaches them, bearing in mind the influence of social media on the internet.
The following is based on the September 2019 issue of Scientific American; most of it consists of quotes from various articles in that issue.
During this reading, I’ll indicate the specialty of each writer who is quoted. At times I’ve modified things for clarity.

The “Like” button is a weapon.
After reading something online, do you hit the “Like” button because of who sent you the story, or do you read things very carefully before deciding whether to hit the button and send the message on? The Like button can spread propaganda far and wide with no effort. It is a propaganda cannon.

We will consider here some difficulties with truth, some aspects of lies, and the prevalence of uncertainty.
A careful study of all these issues could easily take up a semester course. We’ll try for twenty minutes.

Pg 27 An introduction (Written by the Editors of this issue)
On July 8, President Donald Trump stood in the East Room of the White House and delivered a speech celebrating his administration’s environmental leadership, its efforts to ensure “the cleanest air and cleanest water,” and its success in reducing carbon emissions.
The speech was surreal but apparently strategic: It came on the heels of polls showing that Americans are growing increasingly worried about the environment. It remains to be seen whether [he] will sway environmentally concerned voters, but clearly his team thinks that’s a possibility.
In this special issue of SA, we set out to explore how it is that we can all live in the same universe, yet see reality so differently.
By understanding how we instinctively deal with unknowns and how bad actors exploit the information ecosystem, we can mount defenses against weaponized narratives—and build mutual understanding to solve society’s most pressing challenges.

TRUTH

Pg 30. George Musser, science writer: Physics [is a science that tries to describe how everything in the universe works.] It seems to be one of the only domains of human life where truth is clear-cut.
Yet physics has just as many struggles with the notion of truth as any other discipline, perhaps more. Physicists can be full of themselves, but the most experienced and accomplished among them are usually [more modest]. They tend to be the first people to point out the problems with their own ideas, if only to avoid the embarrassment of someone else doing it for them. No one ever said that finding the truth would be easy.

Pg. 35. Math:
Philosophers study everything. They cannot agree on whether mathematical objects exist or are pure fiction. The number 7 might really exist as an abstract object, one which has certain features, such as being a prime number. Or it could be part of an elaborate game that mathematicians devised to seek the truth. One mathematician says, “Math is full of uncertainties—it just hides them well”.
Does that sound like math? Numbers are objects? Way too confusing.
Mathematicians have developed numbers to help evaluate things, and they have developed rules for how to work with numbers. When one follows the rules, useful but complex conclusions can be reached; these are called proofs.
But proofs confer only conditional truth, with the truth of the conclusion depending on the truth of the assumptions. [In that respect, it’s kind of like religion: If there is a god, then religious practices are valid. If you agree, then you follow rules of how to treat your friends and enemies, and how to pray.]
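[One small illustration of conditional truth, my own sketch rather than the article’s: a theorem is only as true as the assumptions it rests on.]

% Assumption (Euclid's parallel postulate): through a point not on a given line,
% exactly one parallel line can be drawn.
% Conclusion that then follows: the angles of any triangle sum to 180 degrees.
\[
\text{parallel postulate} \;\Rightarrow\; \alpha + \beta + \gamma = 180^{\circ}
\]
% Change the assumption, as on the surface of a sphere, and the conclusion fails:
% a spherical triangle's angles add up to more than 180 degrees.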
p 38 Where do these foundational math objects and ideas come from? Often from usefulness. We need numbers so that we can count (heads of cattle, say), and [we need] geometric objects such as rectangles to measure the areas of fields. We can’t do much without math.

p40. OUR INNER UNIVERSES (Neuroscience: how the brain evaluates reality)
Reality is constructed by the brain, and no two brains are exactly alike.
[The brain is amazingly flexible. You may know that in Spanish, queso means cheese. Taco Bell had a billboard which read,] “In queso emergency, pray to cheeses.” Our brains take in the message, realize that it’s illogical, and then create a logical message from it. In this case, the brain realizes that the statement is meant to get attention by using humor, and turns on the body’s chuckle response.

P 42. (neuroscience) The reality we perceive is not a direct reflection of the external world. Instead it is the product of the brain’s predictions about the causes of incoming sensory signals, such as vision, hearing, etc. The property of realness that accompanies our perceptions may serve to guide our behavior so that we respond appropriately to the sources of sensory signals.

THE BRAIN PREDICTS (neuroscience)
Two thousand years ago, in Plato’s Allegory of the Cave, prisoners are chained facing a blank wall all their lives, so that they see only the play of shadows cast by objects passing in front of a fire behind them, and they give the shadows names, because for them, the shadows are what is real.
One thousand years ago, an Arab scholar wrote that perception depends on processes of “judgment and inference” rather than on direct access to an objective reality. Hundreds of years later, Immanuel Kant realized that the chaos of unrestricted sensory data would always remain meaningless without being given structure by preexisting conceptions or “beliefs.”
[What is the color of a radio wave?] We have known since Isaac Newton that colors do not exist out there in the world. Instead they are cooked up by the brain from mixtures of different wavelengths of colorless electromagnetic radiation. Colors are a clever trick that evolution has hit on to help the brain keep track of things.
Today these ideas have gained new momentum [thanks to the] idea that the brain is a kind of prediction machine, and that perception of the world, and of the self within it, is a process of brain-based prediction about the causes of sensory signals.
Our perceptions come from the inside out just as much as … from the outside in. [In other words, we see what the brain expects us to see.]
In the late 1800s, a German scientist … proposed that perception is a process of unconscious inference. [The main idea] is that the brain is attempting to figure out what is out there in the world by continually making and updating guesses about the causes of its sensory inputs. [Perhaps this can explain public reaction to statements that come from a habitual liar: one expects them to be false and, in repeating them, passes them on as falsehoods. This process of building a reality from what the brain believes, THEN checking sensory inputs against it, has become known as a “controlled hallucination.”]

By now, I hope you can see how difficult it is to figure out what is true. Let’s consider lies.

LIES

P.50. DECEPTION IN THE WILD (Anthropology)
Humans are not alone in their deceitful ways. Animals may mislead members of their own species or of other species by using camouflage or mimicry. When the false signaling is done with intent, it is called tactical deception, a strategy deployed by creatures ranging from cuttlefish to dogs. [We watched a dog sniff a dead possum, then walk away. Then the possum got up and ran away. Either the possum was false signaling, or God spends a lot of time resurrecting possums.]

P.54 WHY WE TRUST LIES (Network Science)
The most effective misinformation starts with seeds of truth.
In the mid-1800s, a caterpillar the size of a human finger began spreading across the northeastern U.S. This appearance of the tomato hornworm was followed by terrifying reports of fatal poisonings and aggressive behavior toward people. In July 1869, newspapers across the region posted warnings about the insect, reporting that a girl in Red Creek, N.Y., had been “thrown into spasms, which ended in death” after a run-in with the creature. That fall the Syracuse Standard printed an account from one Dr. Fuller, who had collected a particularly enormous specimen. The physician warned that the caterpillar was “as poisonous as a rattlesnake” and said he knew of three deaths linked to its venom.
Entomologists had known the insect to be [harmless] for decades when Fuller published his dramatic account, and his claims were widely mocked by experts. So why did the rumors persist even though the truth was readily available? People are social learners. We develop most of our beliefs from the testimony of trusted others, such as our teachers, parents and friends. This social transmission of knowledge is at the heart of culture and science. But as the tomato hornworm story shows us, social learning has a gaping vulnerability: sometimes the ideas we spread are wrong.
P 56 Over the past five years, the ways in which the social transmission of knowledge can fail us have come into sharp focus. Misinformation shared on social media Web sites has fueled an epidemic of false belief, with widespread misconceptions concerning topics ranging from the prevalence of voter fraud to whether the Sandy Hook school shooting was staged, to whether vaccines are safe.
The same basic mechanisms that spread fear about the tomato hornworm have now intensified a profound public mistrust of basic societal institutions. One consequence is the largest measles outbreak in a generation.

P.88 A NEW WORLD DISORDER (Communication science)
Our willingness to share content without thinking is exploited to spread disinformation.
The author presents three categories of information disorder: [1] misinformation, unintentional mistakes by the sender; [2] disinformation, fabricated or deliberately manipulated content spread with the intent to deceive or harm; and [3] malinformation, the deliberate publication of information, often genuine or private, with the intent to harm. If someone knowingly lies to you (disinformation) and you believe it and pass it on, it has become misinformation.
There is no going back. Gossips of the past had local, limited influence. Modern gossips can spread their influence worldwide, and people hearing their stories pass them on without thinking.

[The word “meme” has been coined to describe a concept, often humorous, that could be true or false and is usually transmitted on social media.] The urge to conform [and the urge to be first with the news] is a profound part of the human psyche and one that can lead us to take actions we know to be harmful. People who [share a particular] meme simply trust the friend who sent it rather than checking for themselves. Figuring out what is true, and acting accordingly, matters deeply. [The anti-vaccination issue is a prime example.]

P. 60 Social trust and conformity can help explain why polarized beliefs can emerge in social networks. But at least in some cases, including the Somali community in Minnesota and Orthodox Jewish communities in New York, social conformity was only part of the story. Both groups were the targets of sophisticated misinformation campaigns designed by anti-vaxxers.
A classic example comes from the tobacco industry, which developed techniques in the 1950s to fight the idea that smoking kills.
P. 61. How do we protect public well-being when so many citizens are misled about matters of fact?

p.63 CONTAGIOUS DISHONESTY (Behavioral economics)
Dishonesty begets dishonesty, rapidly spreading unethical behavior through a society.
Imagine that you go to City Hall for a construction permit to renovate your house. The employee who receives your form says that, because of the great number of applications the office has received, the staff will take up to nine months to issue the permit. But if you give her $100, your form will make it to the top of the pile. A number of questions go through your head, [such as]: Will I pay to speed things up? Would any of my friends or relatives do the same? You would probably not wonder whether being exposed to the request would, in and of itself, affect a subsequent ethical decision.
[The researchers] observed that in the games they [had volunteers] play, bribe-exposed participants cheated more than those who did not receive a [bribe] request.
Results suggested that receiving a bribe request erodes individuals’ moral character.
P. 64. Research indicates that most people act ethically to the extent that they can benefit while also preserving their moral self-image.
P.67 HOW TO DEFRAUD DEMOCRACY (Cybersecurity)
A worst-case cyberwarfare scenario for the 2020 American presidential election.
P. 69. There are still major cybersecurity vulnerabilities facing the 2020 US presidential election, in part because the election system is based on faith. Foreign attackers could target voter-registration rolls and election machinery either to influence the outcome or to spread chaos and doubt.
P. 70 People assume that because there are data, the data must be true. But the truth is, all data are dirty. People [make the measurements that] create data, which means that data have flaws, just like people do.
To use the data, you have to do a lot of janitorial work. You have to clean and organize them; you have to check the math, and you also have to acknowledge the uncertainty.
P. 71 We’ve seen recent advances in using machine learning to synthesize video of people saying things that they never actually said on camera.
[This author claims that] election security challenges can be solved without any major scientific breakthroughs and for only a few hundred million dollars. It’s just a matter of political will.

P. 72 With the problems of truth and the powers of lies in mind, we now examine UNCERTAINTY

P. 74 TOUGH CALLS (Decision Science)
How we make decisions in the face of incomplete knowledge and uncertainty.
Psychologists study how humans make decisions by giving people “toy” problems. They described a hypothetical disease with two strains, and asked, “Which would you rather have, a vaccine that completely protects you against one strain, or a vaccine that gives you 50% protection against both strains?” Most people chose the first vaccine. The experimenters inferred that [subjects] were swayed by the phrase about complete protection, even though both shots gave the same overall chance of getting sick.
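[A quick check of that claim, using my own illustrative numbers and assuming the two strains are equally likely to be the one you encounter:]

% Vaccine A: full protection against strain 1, none against strain 2.
% Vaccine B: 50% protection against each strain.
% With each strain equally likely (probability 1/2):
\[
P(\text{protected} \mid A) = \tfrac{1}{2}(1) + \tfrac{1}{2}(0) = \tfrac{1}{2},
\qquad
P(\text{protected} \mid B) = \tfrac{1}{2}\left(\tfrac{1}{2}\right) + \tfrac{1}{2}\left(\tfrac{1}{2}\right) = \tfrac{1}{2}
\]
% Under that assumption the overall chance of protection is identical;
% only the framing ("complete protection") differs.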
When people assess novel risks, they rely on mental models derived from previous experience, which may not be applicable.
[In one experiment, experts found that average] people would make generally accurate predictions [about the nature of disasters], but that they would overestimate [the number of] deaths from causes that get splashy or frequent headlines (murders, tornadoes) and underestimate deaths from “quiet killers” such as stroke and asthma. [Thus] people who watch more television news worry more about terrorism than individuals who rarely watch. [The interviewers] found that the general public emphasized what could happen in an exceptionally bad year, whereas … experts in the field expressed what the average disaster would be like.
Lesson 1: The facts of science will not speak for themselves. They need to be translated into terms that are relevant to people’s decisions about their lives. [To minimize misunderstandings]: test messages before sending them. Ask people to read and paraphrase a message.
Lesson 2: People who agree on the facts can still disagree on what to do about them. A solution that seems sound to some can seem too costly or unfair to others.
For some individuals, scientific evidence or economic impacts are less important than what certain decisions reveal about their beliefs. These people worry about how their choice will affect the way others think about them, as well as how they think about themselves.
When scientists communicate poorly, it often indicates that they have fallen prey to a natural human tendency to exaggerate how well others understand them.

P.80 CONFRONTING UNKNOWNS How to interpret uncertainty in common forms of data visualization [such as graphs and pie charts].
[This article examines the idea that it is difficult to gather data that accurately represents what is being studied, and to present the data in a way that is easy to understand.] How likely is it that an approaching hurricane will veer away from your town? If the data says 50%, do you leave or stay?
How do you learn to evaluate probable outcomes of events that may affect you? For this, a statistics course is a great help, but in my experience, it takes an extremely good course to help one develop a useful mindset.
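[As one hedged illustration of that mindset, with hypothetical costs of my own invention (the symbols C_evac and C_hit are not from the article): a 50% forecast only becomes a decision once it is weighed against what each choice would cost.]

% Evacuating costs C_evac no matter what happens; staying costs nothing if the
% storm misses but C_hit if it strikes, and the forecast puts a strike at 50%.
\[
E[\text{cost of leaving}] = C_{\text{evac}},
\qquad
E[\text{cost of staying}] = \tfrac{1}{2}\, C_{\text{hit}}
\]
% On these assumptions, leaving is the better bet whenever C_evac is less than
% half of C_hit, that is, whenever a direct hit would cost more than twice the
% evacuation.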

P.85 RADICAL CHANGE (Social psychology) Uncertainty in the world threatens our sense of self. To cope, people embrace populism.
Rapid and overwhelming change can threaten people’s sense of self and identity. Self-uncertainty motivates people to seek out [1] stronger group identification as well as [2] leadership preferences that can encourage confirmation bias and populism. Both these factors are facilitated and exaggerated by the availability of unlimited information and access to extremist groups on the internet.

P.61 The way to decide a question of scientific fact is not to ask a community of nonexperts to vote on it, especially when they are subject to misinformation campaigns. What we need is a system that not only respects the processes and institutions of sound science as the best way we have of learning the truth about the world, but also respects core democratic values that would preclude any single group, such as scientists, dictating policy.

I propose that we have to start thinking of social media and the internet as a sixth sense, and hope that we can adapt to a new world in which they form a separate stream of stimuli. Their input will reach us mostly through vision and hearing, at least for now, and our brains must be trained to evaluate and apply what they gather.
Until the internet goes away for some reason, we need to control its input to us, and for that we must rely on the brain to construct mental filters. It may take generations, but without that ability, we won’t trust what’s true.
This is a difficult, complex issue, but it’s very current. The intent today was to describe the ways human animals try to cope with the modern world they have created. Go in peace.

About Dr. Lou Yock

Dr. Louis Yock is the minister of People's Church Unitarian Universalists. In this role, he is responsible for delivering a portion of Sunday services, providing pastoral care, conferring with all committees, and providing spiritual leadership for the congregation.