5 lessons on finding truth in an uncertain world
Adam Kucharski is a professor of epidemiology at the London School of Hygiene & Tropical Medicine and an award-winning science writer. His book, The Rules of Contagion, was a Book of the Year in The Times, the Guardian, and the Financial Times. A mathematician by training, he has worked on global outbreaks including Ebola, Zika, and COVID, and has advised multiple governments and health agencies. His writing has appeared in Wired, the Observer, and the Financial Times, among other outlets, and he has contributed to several documentaries, including the BBC's Horizon.

What's the big idea?

In all arenas of life, there is an endless hunt to find certainty and establish proof. We don't always have the luxury of being sure, and many situations demand that decisions be made even when there is insufficient evidence to choose confidently. Every field, from mathematics and tech to law and medicine, has its own methods for establishing truth and its own ways of proceeding when proof is out of reach. Professionally and personally, it is important to understand what constitutes proof and how to proceed when facts falter.

Below, Adam shares five key insights from his new book, Proof: The Art and Science of Certainty. Listen to the audio version—read by Adam himself—in the Next Big Idea App.

1. It is dangerous to assume something is self-evident.

In the first draft of the U.S. Declaration of Independence, the Founding Fathers wrote that "we hold these truths to be sacred and undeniable, that all men are created equal." But shortly before it was finalized, Benjamin Franklin crossed out the words "sacred and undeniable," because they implied divine authority. Instead, he replaced them with the famous line, "We hold these truths to be self-evident."

The term "self-evident" was borrowed from mathematics, specifically from Greek geometry. The idea was that there could be a universal truth about equality on which a society could be built. This notion of self-evident, universal truths had shaped mathematics for millennia. But the assumption ended up causing a lot of problems, both in politics and in mathematics.

In the 19th century, mathematicians began to notice that certain theorems long declared "intuitively obvious" didn't hold up when they considered things that were infinitely large or infinitely small. It seemed "self-evident" didn't always mean well-evidenced.

Meanwhile, in the U.S., supporters of slavery were denying what Abraham Lincoln called the national axioms of equality. In the 1850s, Lincoln (himself a keen amateur mathematician) increasingly came to think of equality as a proposition rather than a self-evident truth: something the country would need to prove together. Similarly, mathematicians of the period moved away from assuming that things were obvious and instead worked to find sturdier ground.

2. In practice, proof means balancing too much belief and too much skepticism.

If we want to get closer to the truth, there are two errors we must avoid: we don't want to believe things that are false, and we don't want to discount things that are true. It's a challenge that comes up throughout life. Where should we set the bar for evidence? If we're overly skeptical and set it too high, we'll ignore valid claims. If we set it too low, we'll end up accepting many things that aren't true.

In the 1760s, the English legal scholar William Blackstone argued that we should work particularly hard to avoid wrongful convictions. As he put it: "It is better that ten guilty persons escape than that one innocent suffer." Benjamin Franklin would later be even more cautious, suggesting that "it is better 100 guilty persons should escape than that one innocent person should suffer." But not all societies have agreed with this balance. Some communist regimes in the 20th century declared it better to kill a hundred innocent people than to let one truly guilty person walk free.

Science and medicine have also developed their own traditions around setting the bar for evidence. Clinical trials are typically designed in a way that penalizes a false positive four times more than a false negative. In other words, we don't want to say a treatment doesn't work when it does, but we really don't want to conclude that it works when it doesn't.

This ability to converge on a shared reality, even if occasionally flawed, is fundamental for science and medicine. It's also an essential component of democracy and justice. Rather than embracing or shunning everything we see, we must find ways to balance the risk that comes with trusting something to be true.
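To make that trade-off concrete, here is a minimal simulation sketch of a two-arm trial under the conventional design that sits behind the four-to-one figure: a 5% significance threshold (the tolerated false positive rate) and 80% power (a 20% false negative rate). The effect size, sample size, and use of a simple t-test are illustrative assumptions, not details drawn from the book.

```python
# Illustrative sketch (not from the book): the conventional error balance in
# clinical trial design, with a 5% false positive rate and ~20% false
# negative rate, i.e. roughly the four-to-one penalty described above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

ALPHA = 0.05       # tolerated false positive rate (significance level)
N_PER_ARM = 64     # ~80% power for an assumed medium effect (d = 0.5)
EFFECT = 0.5       # assumed true treatment effect, in standard deviations
N_SIMS = 5000      # simulated trials per scenario

def positive_rate(true_effect: float) -> float:
    """Fraction of simulated two-arm trials that reach p < ALPHA."""
    hits = 0
    for _ in range(N_SIMS):
        control = rng.normal(0.0, 1.0, N_PER_ARM)
        treated = rng.normal(true_effect, 1.0, N_PER_ARM)
        _, p_value = stats.ttest_ind(treated, control)
        if p_value < ALPHA:
            hits += 1
    return hits / N_SIMS

false_positive = positive_rate(0.0)         # treatment does nothing
false_negative = 1 - positive_rate(EFFECT)  # treatment genuinely works
print(f"False positive rate: {false_positive:.2f} (designed to be ~0.05)")
print(f"False negative rate: {false_negative:.2f} (designed to be ~0.20)")
```

Across many simulated trials, roughly 5% of studies of a useless treatment come out "positive," while roughly 20% of studies of a genuinely effective one are missed: the four-to-one imbalance described above.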
3. Life is full of "weak evidence" problems.

Science is dedicated to generating results that we can have high confidence in. But often in life, we must make choices without the luxury of extremely strong evidence. We can't, as some early statisticians did, simply