Special Edition – Strong Opinions Loosely Held: Part 4


Science, Psychology, Misinformation, Vaccines, and Prosocial Ways to Respond
By Dr. Chris Stout

Preface

I started by writing two pieces (Part 1 and Part 2) from a place of concern, spurred by a few reader comments in response to my weekly Newsletter. I have a wonderfully diverse group of friends, some of whom post things that may indicate a lack of understanding of science or public health concerning the pandemic. Working to keep things timely, I next compiled Part 3, and here is Part 4. I suspect there may be more. We'll see. I'm sure those comments were well-intentioned, but they nevertheless upset me. I did not want to create a stir or upset a friend by responding with a long-form explanation of what may have been an off-the-cuff repost of something that had resonated with their point of view but was incorrect and could have an unintended iatrogenic impact on other, less informed readers of theirs. So instead I have organized a multipart series of curated collections in response. They cover five areas, purposely organized in this order: Science, Psychology, Misinformation, Vaccines, and Prosocial Ways to Respond. This and subsequent pieces that I vet and compile follow the same areas but with newer points. Similar to my Newsletter's format, there will be less of my writing as I take on the role of editor and curate the content from the voices of authors and experts.

These are all long-form posts that take time to read. (They will take even longer to read if you choose to look into the linked original sources and listen to the associated podcasts, which I sincerely hope you will do.) They are not "TL;DR" pieces. Please be curious. Please read this from a perspective of wanting to learn and understand how others may hold a point of view that's different from yours. As Covey put it, "Seek first to understand." I am the poster boy for seeking to understand what I am puzzled by. I hope you find this of use.

Science

How to Make Better Decisions: Understanding Bias vs. Noise
By Peter Attia, MD, 14 November 2021

Bias is important, but so is noise.

Many of us are familiar with the term bias. It's one of those concepts that has made its way into common parlance, its meaning well understood as factors that sway judgment in a particular direction. The presence and pitfalls of biased decision making have long been on my radar, which I discussed in my podcast conversation with Carol Tavris and Elliot Aronson. In addition to bias, it turns out there is another, equally significant source of errors in judgment: noise. Both bias and noise are fundamental concepts that must be understood and accounted for in order to successfully evaluate science and make the most accurate decisions possible.

Want to learn more from Peter? Check out his article on why we're not wired to think scientifically (and what can be done about it), and his interview with John Ioannidis, M.D., D.Sc., on why most biomedical research is flawed and how to improve it. Subscribe to his free weekly newsletter so you never miss an article!

What Is Noise?

Noise is the unwanted variability in a set of responses, or judgments, about something. I say unwanted because the variability, in this case, is not beneficial but rather represents deviation, or error. A noisy system is one that has a large variation in decisions pertaining to a given topic. For example, if a patient consults with four doctors and each gives a different stage of cancer diagnosis, the determinations are collectively, undesirably noisy.

How Bias and Noise Work Together


This essay, written by Daniel Kahneman, Olivier Sibony, and Cass Sunstein, discusses how bias and noise are contextualized together. Importantly, bias and noise exist independently of one another, but both are always present to some degree in human decision making. To illustrate bias and noise together, the essay provides a useful example: a scale that gives an average reading that is too high or too low is biased. If the scale gives different readings in quick succession, it is noisy. In their recently published book on the same topic, the essay's authors illustrate the significance of noise's contribution to overall error (also called the mean squared error, or MSE) with the figure below, which shows how MSE equals the sum of bias squared and noise squared (Figure). In the two visual scenarios below, there is more noise than bias in one instance (left), and in the other there is more bias than noise (right). In both, MSE remains the same. The point is that while bias is perhaps more commonly accounted for in the decision-making process, reducing and preventing noise deserves the same emphasis. Ultimately, the aim is to improve accuracy by reducing both the unwanted variability (noise) and the average error (bias) in the decision-making process.
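The MSE identity described above can be checked numerically. The following is an editorial sketch (not from the essay) using invented scale readings: the mean squared error of a set of judgments about a known true value splits exactly into the square of the bias (average error) plus the noise squared (the variance of the judgments).

```python
# Minimal numeric check of the identity MSE = bias^2 + noise^2.
# Hypothetical example: five scale readings of a true weight of 100.0.
from statistics import fmean, pvariance

true_value = 100.0
judgments = [103.0, 97.5, 105.0, 99.0, 104.5]  # noisy, slightly biased readings

mse = fmean([(x - true_value) ** 2 for x in judgments])  # mean squared error
bias = fmean(judgments) - true_value                     # systematic (average) error
noise_sq = pvariance(judgments)                          # spread of the judgments

print(f"MSE              = {mse:.2f}")                   # 12.30
print(f"bias^2 + noise^2 = {bias ** 2 + noise_sq:.2f}")  # 12.30
```

Either component can dominate while the total error stays the same, which is the point of the left/right panels in the figure.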

Figure. The mean squared error (MSE) is equal to bias squared plus noise squared. This distinguishes noise as independent from bias but an equal (if not at times more significant) source of error. Image source: Noise

How to Identify Noise to Improve Decision Making

In their book, the authors delineate a disciplined process for identifying and preventing noise in order to improve decision-making accuracy. The first step is to undergo a "noise audit" to assess the degree of noise in the system. This audit involves evaluating a set of judgments and asking, "How much variation is there between independent judgments?" The second step in the process addresses ways to prevent noise by employing procedures called "decision hygiene" practices. The goal is to produce an independent, fact-based evaluative judgment. Some suggestions for reducing noise include aggregating and averaging the independent assessments and imposing structure on assessments. The authors also note that absolute scales have more noise than relative scales. As a more extreme solution to reduce noise, human decision making can sometimes be removed altogether and replaced with algorithms. But of course, using rules to replace human judgment has the potential to introduce its own systematic bias (not to mention that a person has to program the machines).
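One of the "decision hygiene" suggestions above, aggregating and averaging independent assessments, can be illustrated with a small simulation. This is an editorial sketch with made-up numbers, not the authors' own analysis; it shows the standard statistical result that averaging n independent, unbiased judgments shrinks noise by a factor of roughly the square root of n.

```python
# Sketch: averaging independent judgments reduces noise (spread shrinks ~ 1/sqrt(n)).
import random
from statistics import fmean, pstdev

random.seed(42)
true_value = 50.0

def judge() -> float:
    """One hypothetical independent judgment: unbiased but noisy (sd = 10)."""
    return random.gauss(true_value, 10.0)

# Compare the spread of single judgments with the spread of 25-judge averages.
singles = [judge() for _ in range(2000)]
averages = [fmean(judge() for _ in range(25)) for _ in range(2000)]

print(f"noise of single judgments:  {pstdev(singles):.1f}")   # close to 10
print(f"noise of 25-judge averages: {pstdev(averages):.1f}")  # close to 10/sqrt(25) = 2
```

This is why a "noise audit" that reveals wide variation between judges can often be answered by structured aggregation rather than by replacing the judges.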


Importantly, noise, understood as the variability in a set of judgments, is not always an undesired phenomenon. Take, for instance, the different approaches to treating anything from a headache to a torn ligament. There is not always a single "correct" approach to medical care. Further, different approaches can in fact be desirable (which is why I personally try to construct teams of subspecialists to consult on a single case). Even in an organization in which judgments and decisions are made with a singular voice (and thus less noise is desirable), individual opinions are still important. There is also value in understanding the reasons for variation between judgments in an organization, which can then inform strategies for increasing accuracy within the larger system.

After reading about noise, I realized that many of the most insightful researchers and analysts I know informally account for noise and bias as sources of error when analyzing the scientific literature to form their own opinions on a particular question. Evaluating the quality of evidence on a given topic involves collecting and aggregating independent studies to analyze, and without fail, the set of publications will feature varying degrees of noise. To reconcile noise in the data, the aggregation process accounts for the independent nuances between studies before the collection is reviewed together. Close attention is given to the differences between individual studies, which could be sources of bias. Only after adjusting for bias and noise in these ways will a good analyst look for trends in the best available data and derive a judgment or point of view. To make effective judgments, we not only have to have information, but we also need a system and process in place for navigating bias and noise. The good news is that there are clear procedures to account for bias and, now with a little help from Kahneman, Sibony, and Sunstein, for noise too.
Source: https://peterattiamd.com/how-to-make-better-decisions-understanding-bias-vsnoise/?utm_source=weekly-newsletter&utm_medium=email&utm_campaign=211114-NLbiasnoise&utm_content=211114-NL-biasnoise-email-nonsub-b

Cheat sheets: Evidence-based medicine
By Advisory Board, 5 September 2018

Been a while since your last statistics class? It can be difficult to judge the quality of studies, the significance of data, or the importance of new findings when you don't know the basics. Download our cheat sheets to get a quick, one-page refresher on some of the foundational components of evidence-based medicine.

• Evidence-based practice
• Observational studies
• Randomized controlled trials
• Systematic reviews
• Meta-analyses
• Statistical significance

Evidence-based practice

Evidence-based practice (EBP) is the explicit use of the best available medical evidence in making clinical decisions. Needing to improve cost efficiency and boost care quality, providers are investing significant time and resources in the creation and adoption of EBPs. These practices


guide clinicians in providing patients with the right care, at the right level, at the right time. Download The Cheat Sheet

Observational studies

In an observational study, researchers observe a population of individuals and measure their outcomes. Unlike in randomized controlled trials, researchers do not intervene in selecting which participants get a given intervention. Observational studies are the most common study type, although their takeaways are often limited because they cannot prove causality (something that is often misunderstood). Download The Cheat Sheet

Randomized controlled trials

In a randomized controlled trial, participants are randomly divided into separate groups: one group is the experimental group (given a treatment or intervention) and the other is the control group (not given the treatment or intervention). Then, researchers compare the two groups on a particular outcome. Randomized controlled trials, especially when analyzed together in a systematic review or meta-analysis, are the basis for evidence-based medicine and for forming new clinical guidelines. Download The Cheat Sheet

Systematic reviews

A systematic review is a summary of existing research on a given topic. It answers a particular research question by collecting and summarizing all empirical evidence that meets certain prespecified eligibility criteria. Systematic reviews are the strongest way to understand the 'scientific consensus' on a given issue and evaluate new clinical research. Download The Cheat Sheet

Meta-analyses

A meta-analysis is a statistical analysis that combines the data of multiple studies into a single estimate of an effect size. Meta-analyses, particularly of randomized controlled trials, are one of the strongest types of evidence and are often used to justify new clinical guidelines or medical approaches.
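To make the meta-analysis idea concrete, here is a small editorial sketch (not from Advisory Board's cheat sheets) of the common fixed-effect, inverse-variance method: each study's effect estimate is weighted by the inverse of its squared standard error, so larger, more precise studies count more. The study numbers below are invented.

```python
# Sketch: fixed-effect meta-analysis via inverse-variance weighting.
# Each (estimate, standard_error) pair is a made-up study result.
import math

studies = [
    (0.40, 0.20),  # small study, wide error -> small weight
    (0.25, 0.10),
    (0.30, 0.05),  # large study, narrow error -> biggest weight
]

weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")  # pooled effect: 0.295 (SE 0.044)
```

Note the pooled standard error is smaller than any single study's, which is why pooled estimates can support stronger conclusions than individual trials.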
Download The Cheat Sheet

Statistical significance

The result of a study is considered statistically significant when it is highly unlikely to have occurred by chance or due to sampling error (the sample being different from the general population). Statistical significance is often the basis for deciding whether a new health intervention or treatment is effective. Download The Cheat Sheet

Source: https://www.advisory.com/daily-briefing/resources/primers/evidence-based-medicine

Causality: Why you shouldn't use Bradford Hill criteria!
By Greg Martin, MPH, MBA, 27 October 2021

Determining causality isn't easy. Correlation doesn't mean causation. And yet where we see a strong correlation between an exposure and an outcome, we need to be able to determine whether there is a cause-and-effect relationship. Public health professionals and epidemiologists have typically used the Bradford Hill criteria to establish causality. I don't like the Bradford Hill criteria. Instead I


advocate for a process of exclusion (excluding chance, bias, confounding, reverse causation, and fraud). Learn more: https://www.youtube.com/watch?v=0JV1orh1QGE

9 Ways to Know if Health Info Is Actually Junk Science: Use these steps to figure out what's true
By Timothy Caulfield, Canada Research Chair in Health Law and Policy at the University of Alberta, 18 October 2021

Yes, there are fraudsters and cranks out there trying to sell products and personal brands, but most people aren't intentionally trying to pass along misinformation. Gordon Pennycook, Ph.D., is a cognitive psychologist at the University of Regina who studies how to battle the spread of misinformation. His research reveals that roughly 80 percent of people believe it's very or extremely important that the info they share on social media is accurate. However, it can be difficult to parse quality from quackery. What should you be considering when you see a blog comment, YouTube testimonial, or social-media rant from your loudmouthed uncle? Ask yourself these nine questions to help you figure out what the heck is true:

1. Is there evidence for this?

Be skeptical if a claim is based on a study that was done only in animals (if it's not proven in your fellow humans, don't bank on it), was small (in general, you need big studies, with thousands rather than dozens of people, to produce reliable results), or was only observational. (Observational studies mostly involve noting relationships between things, like eating a food and changing a disease outcome. These can't really prove the food caused the change. For that, you need randomized, controlled studies, which give a treatment to one group and not another to zero in on whether it has an impact.)
Although it may be tempting to be swayed by exciting new research, especially if it's contrarian in nature, it's important to consider the body of peer-reviewed evidence, meaning other scientists have checked it (a common practice before research is published), and to look at the broader scientific consensus. You might even want to make sure there's science referenced at all. A report or post that simply pontificates should be viewed as less pertinent than something relying on, you know, actual evidence.

2. Is the evidence preliminary?

While scientific speculation can be a valuable way to publicly debate emerging ideas, peer review adds much-needed oversight and credibility. During the pandemic, there has been dramatic growth in access to "preprint" papers, previews of studies that have not been reviewed by independent scientists or accepted for publication in an academic journal. Unfortunately, they often get reported on or shared without the important "this is just a preprint" caveat. For example, a preprint about the alleged benefits of the anti-parasite medication ivermectin in treating or preventing Covid helped drive interest in the drug. But after questions were raised about possible data manipulation, it was pulled from the preprint website. As is so often the case, however, the idea was out, and the narrative of benefit continues to circulate. While the peer-review process isn't perfect, it's a valuable tool. If you can't find the study in an academic journal, be skeptical.



3. Is the claim based on an anecdote?

Everyone loves a good story. It can pull at our heartstrings and astound. It can also scare the shit out of us. ("Swimmer eaten by shark!") There's a theory that we are hardwired to respond to narratives, especially if we can relate to the characters. ("Hey, that shark-dinner guy is like me!") They play to our cognitive biases because they're easy to recall. So if someone asks you to head to the beach for a swim, you think about that "man eaten" headline and not the rarity of shark attacks. Research from the University of Michigan has shown that a compelling anecdote can decrease our ability to think scientifically. And this is why stories have become a key driver of misinformation. A site called 1,000 Covid Stories is just a group of videos with anti-vaccine voices like Eric Clapton recounting bad vaccination experiences. There's no way to tell if those incidents had anything to do with a vaccine. Good scientific research is required to tease out a link between an event (like a vaccine) and an outcome. Stories can give meaning to cold data and help get statistics across. But there should also be facts. Don't rely on testimonials or celebrity musings alone. You. Need. Data.

4. Just your buddy's opinion?

Research (including studies from here, here, and here) has shown that we tend to trust someone we see as being similar to us, even if there is no science-informed reason for doing so. I'm not saying you shouldn't have a beer with your super pal and listen to his entertaining hypothesizing about lizard people running the government. It's just that, ya know, he could be wrong.

5. Is someone trying to sell you something?

Often misinformation is being pushed for personal gain. For example, Joseph Mercola, D.O., has built a multimillion-dollar empire selling products, some of which (including Dr. Mercola Quercetin & Pterostilbene Advanced) have been flagged by the FDA for improper claims that they can help fight Covid.

6. Is there fearmongering or ideological spin?

Is the author trying to be provocative or play to a particular worldview? Is it making you emotional? Last fall, a Canadian politician circulated a rumor that the federal government was building "internment camps" to force people to isolate. Conspiracy-theory websites immediately picked up the lie, which then quickly migrated to social media, where some posts used the seriously offensive phrase "concentration camps." This conspiracy theory blossomed in mere days, mostly because it was scary sounding and fit a certain anti-government narrative. Indeed, it was repeated by another politician opposed to the federal government's policies. (P.S.: I live in Canada. No internment camps.)

7. Is there scienceploitation?

Sciencey language can be exploited to make any information sound more credible, a tactic I call "scienceploitation." That can create the illusion of scientific authority, even if the topic is complete gobbledygook. ("That haunted house contains an abundance of exteriorized ectoplasm!") And



since many of us aren't well versed in what the sciencey terms actually mean (how many people have a solid handle on quantum physics?), the opacity can help misinformation slide through the cracks in our critical-thinking defenses. So be on guard: just because impressive terminology is being used doesn't mean good science is. Also watch for vague and meaningless phrases ("detoxifies the body") and overpromises ("revolutionary"). The "too good to be true" test works pretty darn well.

8. Does a post or person make you doubt the evidence?

A time-tested way to push misinformation is to constantly inject elements of doubt. In popular culture and on social media, the strategy of "just asking questions" (aka JAQing off) has become one of the dominant tools of the current crop of doubtmongers (check out our Misinformation Superspreader Hall of Shame here). It works because it feels benign. Always take a step back and think of the big picture. Scientists will always debate the details; don't let this distract you. We have concrete and actionable answers for much. (Vaccines work. Incredibly well.)

9. Can you fact-check it?

What are authorities on the subject and fact-checkers saying? Can you verify the source of the information, and does it look credible? And who is making this claim? Is it someone who has studied the topic and has relevant credentials? (Good places to check on an item's accuracy include Snopes, Factcheck.org, Stat News, and the CDC.) If the info's not credible, you can do something important to stop its spread: pause to consider an item's accuracy before you share.

Source: https://www.menshealth.com/health/a37912562/how-to-spot-junk-science-know-whatstrue/

Psychology

Doing the work required to hold an opinion means you can argue against yourself better than others can.
Source: The Work Required to Hold an Opinion

Insight

"If we are sincere in wanting to learn the truth, and if we know how to use gentle speech and deep listening, we are much more likely to be able to hear others' honest perceptions and feelings. In that process, we may discover that they too have wrong perceptions. After listening to them fully, we have an opportunity to help them correct their wrong perceptions. If we approach our hurts that way, we have the chance to turn our fear and anger into opportunities for deeper, more honest relationships."
Source: Thich Nhat Hanh, Fear: Essential Wisdom for Getting Through the Storm

Misinformation

Almost 80% of Americans believe or are unsure about at least one Covid falsehood




The vast majority of Americans either believe or aren't sure about at least one falsehood regarding Covid-19, according to a new Kaiser Family Foundation report. The eight false statements in the survey included that the Covid vaccines cause infertility and that ivermectin is a safe and effective treatment. About a third of respondents either believe or aren't sure about at least half of the statements. Here's more:

• 60% of adults think the government is exaggerating the number of Covid-19 deaths or aren't sure if that's true.
• 8% of adults believe Covid vaccines have been shown to cause infertility, and another 23% aren't sure.
• Almost half of Republicans believe or are unsure about at least one of the falsehoods, compared to 14% of Democrats.
• 95% of unvaccinated adults believed or were unsure about at least one of the incorrect statements, compared to 71% of vaccinated adults.

The Best (and Worst) Places to Find Reputable Health Information: Here's how to weed out misinformation superspreaders from experts you can trust
By Timothy Caulfield, Canada Research Chair in Health Law and Policy at the University of Alberta, 18 October 2021


Science is up against a lot. Inaccurate information is extremely contagious on social media: there's a 70 percent greater likelihood that misinformation will be retweeted than that true stories will. And the truth just can't keep up. While some social media platforms have taken to labeling some content misleading or false, it's often too little, too late. For instance, one election-fraud video on Instagram got 500,000 views before it was flagged as potentially misleading. But only 20,700 people saw the warning label after the post was flagged.

One way to get a feel for what's reliable information and what's probably not is to identify which sources you can trust. You'll find a starter list of some favorite good sources below. You'll also want to check the subtext of what a source is saying. Are they sowing doubt in what you thought was reality? Are they, you know, "just asking questions"? Do they stand to gain financially from what they're posting (this supplement protects against Covid, and what do you know, I sell it right here…)? Scientific facts are rarely going to be as clicky as untruths. So remember that if something "totally makes you click," you're going to want to take another second to put it through the BS detector before believing it or sharing it. (Shore up your own BS detector with these 9 Ways to Spot Junk Science.) To make it a little easier to get to accurate info, take a look at today's top misinformation superspreaders and their tactics, and then check out where to get reliable information, below.

Misinformation Superspreader Hall of Shame

Robert F. Kennedy Jr.
The classic-rock DJ of anti-vax bullshit. He can be relied upon to keep playing the same science-free, deeply misleading songs again and again and again. (Reality: No, vaccines do not cause autism!) RFK Jr. can doubtmonger with the best of them. Cherry-picking personified.

Tucker Carlson
A Just Asking Questions expert extraordinaire.
The Tuckster can inject partisan-spun doubt into any topic; science, sense, and human decency be damned. "Clean water? Really? A liberal conspiracy to get into your house? I'm just asking . . . ."

Gwyneth Paltrow
A wellness woo wizard who can magically transform a public-health crisis into an opportunity for testimonial-fueled profit. Think you might have long Covid? Try a plant-based diet, intuitive fasting, and Goop's detoxifying supplements. (Read: Don't.)

Joe Rogan
The dude is just pragmatic, open-minded, and inquisitive. Right? So giving a platform to unproven therapies, fad diets, and illogical and potentially harmful health-policy positions isn't really misinformation; it's just a couple of bros brainstorming!



Joseph Mercola, D.O.
Yep, he has the credentials of a health-care professional. But that hasn't stopped him from fearmongering, using vague and meaningless terminology, and leaning heavily on testimonials to build a for-profit pseudoscience empire.

Donald Trump
A Cornell University study found that Trump was mentioned alongside 37.9 percent of all the Covid misinformation in traditional and online media coverage from January through May 2020.

Judy Mikovits, Ph.D.
A conspiracy theorist adept at making absurd ideas sound like credible (but silenced by the Man) science. Exhibit A: the "documentary" Plandemic. Mikovits deploys misinfo tools such as casting doubt on public-health officials and fearmongering about alleged vaccine ineffectiveness.

Where to go for reliable info

Despite some less-than-ideal messaging (um, masks), official agencies like the CDC remain important sources of health information. But you need not rely only on boring government or university websites (although it is good to rely on them). There are also many other terrific, science-informed voices aiming to clarify what's good science and what's noise. A few favorites:

Websites

Snopes
The oldest and largest fact-checking site.

Factcheck.org
A project of the Annenberg Public Policy Center that checks the accuracy of statements by political figures.

Stat News
A health news site with trusted investigations; one of the first to look into Covid.

Instagram

@science.sam; Samantha Yammine, Ph.D.
This neuroscientist blends expert communication skills with broad biosciences knowledge.

@jessicamalatyrivera; Jessica Malaty Rivera
Expect jargon-free explanations of Covid science from this epidemiologist.

TikTok

@AlexDainis; Alex Dainis, Ph.D.
This geneticist's TikToks break down the facts on hot topics like PCR testing and gene editing.



@doctor.darien; Darien Sutton, M.D.
Get clarity on facts about chest pain, Covid, and more from this emergency medicine doc.

Twitter

@DrJenGunter; Jen Gunter, M.D.
A bold debunker unafraid of controversial topics.

@SabiVM; Sabina Vohra-Miller, M.Sc.
This clinical pharmacologist aims to make data make sense.

@EricTopol; Eric Topol, M.D.
A top doc putting medical research into perspective.

Podcasts

This Week in Science; @TWIScience
Hosted by Kirsten "Kiki" Sanford, Ph.D., a neurophysiologist who makes complex topics like CAR T-cell treatments engaging.

Science Vs; @ScienceVs
A team of leading journalists talks to scientists to separate fact from fad.

Hashtag

#ScienceUpFirst; @ScienceUpFirst
This social-media movement, cofounded by Canadian senator Stan Kutcher, M.D., and me, brings together diverse, science-informed voices to counter misinformation on social media.

Q&A: Aaron Carroll on medicine's 'dirty secrets,' how to fight coronavirus myths, and more
By Aaron Carroll, 13 October 2021

In a recent episode of Radio Advisory, host Rachel (Rae) Woods sat down with Aaron Carroll, a pediatric physician and the chief health officer for Indiana University. Carroll is also a popular New York Times contributor, the editor-in-chief of the health policy blog The Incidental Economist, and the host of the YouTube channel "Healthcare Triage," as well as the author or coauthor of several books, including "The Bad Food Bible." This transcript has been lightly edited for length and clarity.

Question: Medical myths were obviously a problem long before Covid-19, but I'm curious, what feels different about this moment?

Aaron Carroll: I think the stakes, to be very honest with you, are the biggest difference. People who are buying into misinformation or myths at the moment are not just putting others at risk; they're putting their own lives at risk.



And most of the time when I'm talking about medical myths, it's small-ball. It's things that might make a slight difference at the edge, or might make a tiny quality-of-life difference. Or even, if we're talking nutrition, it might make a broad years- or decades-long difference. But right now, buying into the wrong stuff could have an impact on mortality, and, like, in the very short term. It's a whole different game.

Q: Is there a moment when you started noticing more of this misinformation creeping in, either in your own practice as a physician or in your broader career? Is there a moment when you went, "Hmm, I'm getting a lot more questions from people that just don't make sense or are completely rooted in misinformation or maybe even disinformation"?

Carroll: I think things felt like they got tense when the country started the lockdown last year. Up until that point in January and February, the pandemic was something "over there." It wasn't even affecting us. And in March it was still hard to raise alarm bells; it wasn't a big deal. But by the time we got to April, when it felt like a lot of the country was locking down and people were taking it seriously, that's when I think we just started to see pushback, because people's lives and livelihoods were being really affected. And if you weren't in the health care system, you did not see Covid-19 every day back then, so it was hard to understand why we were doing that. And I feel that's when things started to pick up. It's when you just started seeing protests around lockdowns or protests around masking. That was when it felt like things were getting worse.

Q: I also feel like in the world of myths and misinformation, there's just some particular vulnerability in the medical and health care space, because it is so complex and misunderstood even by the folks within it, that it just becomes really hard to battle.
Carroll: People in general have no appreciation of how much uncertainty there is in medicine. One of the dirty secrets that we don't tell anyone is how much we're just making it up. The number of things for which there's rock-solid, randomized controlled trial evidence is really small. A lot of the time we're going with best guess, best practice. And we sometimes get it wrong, but we speak with the same level of authority no matter what that level of evidence is. And so this felt like a time when a lot of people were all of a sudden exposed to how much uncertainty we often have to deal with in medicine, but it was playing out right before their eyes, and people freaked out.

Q: Yeah, they didn't know how to deal with it. In their mind it's, "You're changing the goalposts on me. See, you don't know what you're talking about, so why should I trust you this time?" But they're not realizing this is inherent to the way that we study and ultimately practice medicine.

Carroll: I remember being on a podcast in, I want to say, April or May, I can't remember exactly, but it was about masks. And the host was like, "How can you live with this level of uncertainty?" And I'm like, "This is every day." I'm totally comfortable with this. I'm always playing small odds in one way or the other and understanding that even the best treatments have a number needed to treat of like one in 100, one in 1,000. Everything is incremental. And there's often a fair amount of uncertainty. So at the beginning, when we were talking about masks, it was focused on masks to protect you, meaning N95s, which were in short supply. And we needed to hoard them for those in the health care system who were at highest risk. And so I remember even saying or tweeting at one point, like, if you're wearing a mask like an N95 at this point, you're wasting a mask.


And then, months later, it was like, "OK, no, no, no, no. Now we know the coronavirus is airborne. Everyone should wear a mask." And people are like, "Well, you said…" And I'm like, well, different masks in a different circumstance. We're learning as we go.

Q: You've pointed out before that one of the biggest problems is that the very people with the least understanding of science tend to be the ones that oppose it the most. That's why large-scale efforts to educate the public tend to fail. We know that bombarding people with facts, figures, and data is not going to be that effective. Do we have any understanding of what does work to nudge behavior?

Carroll: I mean, there are some. Unfortunately it's hard. Obviously, if messages come from trusted voices in the community, they work better—but that's often hard to do, because the same people want to be the answer for everything, and that doesn't work. I also think, and this is more personal, it requires time and effort to truly get to understand where the lesion is. Where does the misinformation or misunderstanding come from? What's the concern? How do I address it?

But my biggest gripe is that the answers are often complicated. When someone asks, "Do masks work?", I'm like, that's going to take 10 minutes for me to answer. I cannot put it in a soundbite. And most people, unfortunately, consume their news from cable news, where if you're lucky, you get to say three sentences. And then someone else is going to say three contradictory sentences, and then they'll go, "People disagree," when really a nuanced, long answer is required. And there just isn't a lot of space for that in today's media—with the exception being podcasts, which is why it's one of the few things I'll say yes to, because there's an actual chance to have the long-form discussion where you might actually get into some of the nuances of the answer, as opposed to a quick hit on a panel.

Q: Exactly.
And that's one of the reasons why I think the physician-to-patient or clinician-to-patient relationship is so important, because we see that, generally speaking, people still trust their doctor. And in ideal circumstances, which of course aren't always there, there is a moment for trust-building in the physician office or through telehealth or in a portal message. What is your advice for how clinicians can, in the moment, try to get these messages to stick?

Carroll: Well, again, I think it's important to try to figure out where the problem is. Some of it is just misinformation where there's no negative intent. Some people think it still costs a lot of money to get vaccinated—it's free, but they just think it isn't. And so just making sure they understand that. For some people, it's literally a logistical barrier. It takes activation, energy, and time that they don't feel they have. If we can just figure out a way to get the vaccine to them, they might get it. Some of it is mistrust in the health care system, which has to be combated with long-term building of trust. And some of it is that they've just literally heard lies, and those have to be carefully and thoughtfully countered in a respectful and compassionate manner.

You're right, though, that this is something physicians should be able to do, because they should have that kind of relationship where they can probe and get the answers they want. Of course, office visits get shrunk and shrunk in terms of time. That's the problem.

Q: Yeah, that's a problem that I'm hearing. It's not that clinicians don't think that's their role or that they don't want to do it. It's that they're saying, "Hold on, I'm this overworked, I'm this understaffed. You've pushed me to be transactional in all of these different ways through
telehealth. I don't have time to build a trusted relationship, let alone spend time unpacking this information in the moment." What advice do you have for that pushback?

Carroll: Make the time. I know that's a flip answer, but we're the last line of defense here. Look, I'm a pediatrician. So for a long time, it's been difficult convincing some parents to vaccinate their children. This is not new, certainly for pediatricians, who have dealt with myths and misinformation about vaccines for decades. So this is part of what we're going to need to do for Covid as well.

I don't think we've relied as much on the health care system to distribute Covid-19 vaccines, to be very honest. They're not often delivered in the doctor's office the way most vaccines are. And so it's very different. And we've perhaps missed that opportunity where, if we were making this part of the regular doctor visit, maybe we could get a few more people, or at least a decent number of people, vaccinated.

Q: And maybe don't assume that you're going to change somebody's mind all at one time. I read this wonderful piece about a patient with HIV, who related very strongly to his physician, a fellow Black gay man. And the physician spent the better part of a year asking, at the end of each regular checkup, "What about vaccination? What about Covid-19?" And it took time. But the moment that the patient said, "Doc, I did it," made it all worth it for this physician.

Carroll: I would agree. And I think physicians should be used to that. If you've ever tried to counsel someone on diet and exercise, it doesn't happen in one visit. I mean, the way I've been talking, I've been assuming an established relationship where, if you already have the level of trust, you can build on it. But if you're seeing a new patient for the first time, of course, very little is going to be successful in that first contact. It's just the beginning.
And we just have to take the long road on this. It's a marathon, not a sprint.

Q: You brought up diet and exercise. I wonder if that means there's actually something that we can learn from old-school patient activation here. In population health, we assess patient activation because we want to know, should we intervene? And if so, when? Which of course means sometimes we don't. Do you think that there's some application here of choosing when, how, or whether to ignore medical misinformation, even when it shows up at our practice?

Carroll: Well, I mean, I will gauge sometimes—and granted, not as much in clinical practice, but more when I'm doing public health work—I will gauge how entrenched someone is. There are some people who are so angry or antagonistic about it that if I truly try to go deep and argue with them, I'm just going to entrench them further. So sometimes it's worth just backing off, because if I'm not the right person and I'm not the trusted individual, then if I argue, I'm just convincing them they're right and I'm wrong.

Q: What's the red flag for you, when you go, "Oh, I might actually be doing more harm than good. I might be entrenching them"?

Carroll: When people leap from argument to argument to argument, and then they start circling around again. If I have an answer to everything they say, and they just keep leaping into arguments, I'm like, OK, this is not going to work. But if people have a concern or a block, and I can address that and we can go in depth into that, then I feel like there's more progress that's likely going to be made. And especially if I feel like I can answer this in a way that maybe will stick and convince.

I mean, you can sometimes tell when people are being thoughtful about it or whether they're just using an excuse. I know plenty of people who are like, "Well, I won't get it because it's not FDA-authorized." And the day the Pfizer vaccine got authorized, then it was, "Well, now they authorized it too fast." They didn't. That was an excuse, not a reason. And that's fine. Just, if you'd said that, we could have saved both of us some time. Other people have genuine concerns, and I can explain why it feels like the vaccines were developed fast, but that doesn't mean that safety got skipped. Then many of those people can be convinced.

And so sometimes it's also opening it up, asking them if they have questions, seeing what kind of questions or concerns they have, answering the first few of them, and then getting a feel for, "Is this someone we're going to be able to make progress with today?" Or is it, "Let's just establish some relationship and trust and move along to the next time"? Part of that is being a clinician and establishing relationships with patients.

Q: Yeah, absolutely. I think if we're going to address misinformation, we have to understand how it spreads. And you mentioned one way that it spreads, which is through the cable news networks. But a lot comes from online information and online discourse. What strikes me is that I'm also seeing more clinicians, more researchers, more scientists online. You've obviously been doing this for a very long time. Do you believe the everyday clinician should be moving their guidance online and maybe even onto social media platforms?

Carroll: It depends how engaged and involved you're willing to be. I think the problem with social media, especially with things like Twitter, is that people think there's a magic tweet which will convince everyone that they were right and everyone else was wrong. And that never, ever happens. Most of the time you are preaching to the choir, in that your engagement is going to be mostly followers who already agree with you or people who just retweet it. And then you just get a mob of people who violently disagree with you.
I think very few people are ever convinced by anything on Twitter. So I've always seen Twitter as a tool. I use it to drive people to content that I think might make a difference: columns I've written, videos I've made, things other people have written, thoughtful articles by really good journalists, or data that might sway someone. But I'm always amazed that when something of mine goes viral, I'm like, this made no difference. No one was convinced by this. It made me feel better for five minutes.

Q: But are people convinced in the opposite way? I'm thinking there are a lot of videos containing misinformation that have been shared on newer platforms like TikTok. And I see nurses, I even see physicians, using their own medical background almost as armor to spread mis- and disinformation. Is it making a negative impact when clinicians are doing this?

Carroll: I mean, granted, there are people that absolutely believe the answer is yes. But this is where I'm taking the long view. Anti-vaccination sentiment has existed as long as there have been vaccines. I mean, we did a series on vaccination at Healthcare Triage. The massive worldwide misinformation backlash against MMR did not need social media. Now, does social media make it faster and easier? I imagine it does. But I don't know how much of it is actually to blame versus—it's easy to point and say, "Well, this must be what it is." I don't know.

Was anyone expecting the vaccination campaign in the United States to go much more smoothly than it has? I mean, we don't ever get more than this number in flu shots. I think I saw from the CDC that, right now, fewer than 20% of young adults are vaccinated against HPV. If we don't mandate vaccines, people don't take vaccines. That's how it goes.


Now, all the vaccines with very high levels of uptake are mandated, and the organizations and schools that have mandated the Covid vaccine achieve very high levels of vaccination. When we don't mandate them, it doesn't happen. Blaming it on social media may feel convenient, but I don't know that that's really the cause.

Q: Do you believe that health care organizations, medical boards, and professional boards are doing enough to enforce standards on physicians and nurses who are spreading harmful messages?

Carroll: Well, they're just starting to threaten to do something. And so they really haven't done much. Having said that, it's hard to police this stuff. It is very easy for physicians to couch themselves in specific patient information or uncertainty or levels of evidence, because again, we deal with uncertainty so often. I see it all the time where physicians are like, "I know this is what we're supposed to do in this situation, but my patient is different." And there's a lot of acceptance from both patients and physicians for that kind of attitude.

Q: In a lot of different ways. "My patient is different from a safety perspective."

Carroll: They hate guidelines, hate protocols, hate anything, because "my patient is different and I know better." And that has also existed long before Covid. So policing this is—I don't want to say a slippery slope, because I hate the term. But if they're going to start with this, there are lots of other areas where we also could say, well, this isn't right either, and that's not right and that's not right and that's not right. We just don't do that, unless things get really egregious. And maybe right now we're at "really egregious," but I'm sure it's hard for the organizations and licensing boards to want to wade into that.

Q: We're talking about combating misinformation between the physician and the layperson. But one interesting trend is there's just a lot more online communication between clinicians.
Clinicians are using open online platforms to actually debate with each other. Does that online communication quicken the pace of translating new research and new ideas into clinical practice? Or is there a downside?

Carroll: I think it's both. It probably does, but again, this is where I think it's important to understand that it's still probably a minority of clinicians engaging in this space. And so while it seems huge and pervasive, it's still mostly a smallish number of massively exposed people. And that goes across the board.

I think in general more transparency is better. The public understanding that there is uncertainty in a lot of what we do, and being able to ask open and honest questions of their clinicians—I think that's massively important. So I think that's great. And I don't think it's bad for doctors or any other clinicians to be on social media, or to have a presence, or to answer questions. I think that's great, but I do worry that not everything that's said is true, and people often hang their hat on credentials as if that's the metric by which we should trust. And that's a problem.

Q: Or, let's be honest: People, even experts, can see different things in data and come to slightly different conclusions. And that again could have a downstream impact on real people who are going, "Oh, they don't know what they're talking about."


Carroll: The angriest that professionals have ever gotten at me was probably two years ago, when there was a series of randomized controlled trials in Annals of Internal Medicine that looked at the real danger of eating meat—and found that the evidence is not great. So I wrote an editorial on it. And I would argue I was taking a reasonable take of, "Let's assess the evidence." And clinicians lost their minds, because whatever side of the meat wars you fall on, the other side's advice is going to kill you. People have this anger and vitriol. And I would be like, this is the issue: we don't know, but both sides are convinced they absolutely do know, and that the other side is lying. And I could see how, for the general public, that could be massively confusing.

Q: And it comes back to your comment about policing. We talked about medical boards policing in a very specific, strong way, but is there a role where you do want clinicians online to be policing each other and saying, "Hey, maybe we shouldn't do this publicly," or, "You are wrong," or, "You are spreading misinformation"?

Carroll: The issue was less that it was public than how angry it got. I think it was good, honestly, to have a discussion of how questionable the evidence is in some of these cases. People understanding that there is some gray in health care, and that we're all doing our best to understand it better—if people understood that and saw it play out, that might increase trust. People viewing us screaming and yelling at each other like children will only decrease trust. And so it's the way we do it sometimes, not just that we do it.

Q: I think you are spot-on. If there's one thing we know that works, it's the trusted relationships that patients have with their clinical team. We need to figure out how to use the media, how to use the internet, how to use the existing relationships we have to keep building that trust—which might mean being transparent about what we don't know, saying that this is a gray area.
Carroll: Yeah, I say "I don't know" all the time. I don't understand why people are so afraid of that. Sometimes it means "I don't know, I've got to go look that up," and sometimes it means "I don't know because we don't know."

Even when we talk about things like masks—people talk about masks with a fairly large amount of surety. And I'm like, OK, there are situations where we have a knowledge base about when masks might be useful. But then there are times where the absolute benefit is probably getting small—I mean, if you're talking about, "Should I wear a mask if I'm sitting outside with someone 20 feet away?" But other people are like, "No, masks are always needed." It's like, OK, now we've got to be able to talk about the nuance here. And we've got to be able to do so carefully and honestly and dispassionately, and not assume the worst in each other. But especially on social media, too many of these discussions become just yelling at each other.

Q: And this is where good digital citizenship becomes really, really important. You use all sorts of platforms to communicate with your peers and with the public. What advice do you have for other clinicians who might be thinking about getting a little bit deeper into their online presence? How should they be practicing good digital citizenship?

Carroll: For me, at the beginning, I tried to ground almost everything I said in evidence. Even when we started the blog in 2009, 2010, it was not that I wanted to come and tell you my opinion. It was that I wanted to explain, "Here's the reason I believe this. And here's all the evidence."


And if you disagree, there's a comment section, and let's talk about it. But it wasn't "trust me." It was "let me explain why." And I like to think that that's what my columns are too: they're full of links to research, and I'm explaining why this study matters, what the evidence and the caveats are, and how I get to this opinion—not "I just believe it because." Then we can debate the rationale behind it, as opposed to just having a yelling match about what we each believe.

Q: I love that. "Let me explain" instead of, "Just trust me implicitly; everything that comes out of my mouth should be chapter and verse for you." I love that.

Carroll: And it builds trust. I agree with you. I don't expect that if I show up with one blog post, people are going to believe me. And in the beginning, no one came to the blog. But over time, people said, "OK, these guys are rational and they're explaining it and they get it." And journalists started to pay attention, and it built an audience. Healthcare Triage is the same way. It takes time to build that level of trust, and you don't ever want to squander it. So I try to be very careful, but as I said before, it takes time. I think people often want to show up on social media and think, "Let me go viral as quickly as I can."

Q: Which is dangerous.

Carroll: You can do that. It can be done, but that's never been my goal. It's more that I want to build a level of trust. And that is one of the things I will say I like about social media: I can follow journalists that I trust, as opposed to just reading outlets. Even during the pandemic—Ed Yong, Amanda Mull, or Olga Khazan at The Atlantic; Helen Branswell or Matt Herper at STAT News. And I have colleagues at the New York Times that I really follow. I follow individuals and journalists that I've learned to trust, as opposed to just reading the New York Times.

Q: Yeah.
How do you handle the trolls?

Carroll: Mostly two different ways. If they're horrible people, I ignore it. But if somebody sends me an email and they took the time to write, and they at all seem reasonable, I will sometimes answer them and surprise them. And nine times out of 10, people respond like, "Oh, now I feel terrible. It didn't occur to me that you're a human being and you might actually read this and respond."

Q: Because you see it as an opportunity to build trust.

Carroll: Yeah. And so sometimes you will break through. But clearly, if somebody is just being terrible, I just ignore it.

Q: Yeah. Well, this has been unbelievably helpful for me, also as somebody with her own kind of social media presence. And I know that our listeners and the clinicians who are listening to this podcast will find it valuable as well. At the end of my episodes, I always want to give our guests the platform and the chance to speak directly to our audience. So when it comes to the world of medical misinformation and disinformation, is there one thing that you want health care leaders of all kinds to focus on or act on right now?

Carroll: The biggest thing is, don't miss an opportunity to connect with patients. I know everybody is busy, and I know that it's really hard, and this has been an incredibly stressful year and a half—and it's ongoing. But it is amazing to me that, in poll after poll, doctors remain the most trusted source of information.


Above anyone you see on TV, above any politician, above any "expert," people trust their doctors. And we should make use of that. And take it out of duty if you can. Connect with patients. You'll probably do more to convince someone to get vaccinated, or to do the right thing, than all the other messaging.

Q: And if you're an administrator, make sure that your clinicians have the protected time to do that, because I agree this is an untapped resource that we need to use going forward.

Carroll: Yeah, absolutely.

Source: https://www.advisory.com/daily-briefing/2021/10/13/radio-advisory-90

The Golden Age of Junk Science Is Killing Us: Misinformation is being spewed, weaponized, and consumed at a deadly rate. Fortunately, there's a way out. Here's how to make sense of what you're seeing.

By Timothy Caulfield, Canada Research Chair in Health Law and Policy at the University of Alberta, 18 October 2021

We've all been exposed to quacks, hacks, and fonts of bullshit before. But not like this. We're at peak misinformation now, and it's hurting our health.

In a recent Economist/YouGov poll, 20 percent of U.S. citizens surveyed said they believe that Covid vaccines contain a microchip. Think about that. A fifth of respondents subscribe to a theory that has its roots in the idea that Bill Gates wants to track your activities. This survey also found that only 46 percent of Americans were willing to say that the microchip claim is definitely false, even though there's no plausible way it could be happening.

These stats are troubling. But given our frenetic information environment, they are also, well, understandable. It is becoming harder and harder to tease out the real from the unreal. Sense from nonsense. Magical thinking from microchips.

Not long ago, I was shocked by a headline about "Covid parties"—people allegedly gathering to intentionally infect themselves and others.
Infuriated and without pausing to reflect (or to do sufficient fact-checking), I immediately took to social media to rage about how irresponsible this was. Reality: Covid parties are mostly an urban legend. I was just adding to the noise and our collective angst.

I study misinformation. This is my job at the University of Alberta, where I am a professor of law and public health and specialize in health policy and the public representations of science. I should have known better. But the story played to my values, emotions, interests, and professional passions. Cringe.

This is truly the golden era of misinformation. We are, as the World Health Organization declared in early 2020, in the middle of an "infodemic"—a time when harmful misinformation is spreading like an unstoppable infectious disease.



Part of the problem is that we have normalized nonsense in some very subtle and some very obvious ways. Heck, there are a host of (very) successful wellness gurus who have embraced pseudoscience as a core brand strategy. And thanks to people like Andrew Wakefield—the disgraced former physician who started the vile "vaccines cause autism" fallacy in a paper published in, and later retracted by, The Lancet—misinformation about vaccine safety has continued to spread and find new audiences.

A sad truth: Misinformation and men are an especially bad combo, and it's hurting our health. Research from the University of Delaware tells us that men are more likely to believe Covid conspiracy theories, and other research suggests they may be less concerned about the harmful effects of misinformation. Men are also less likely to get the Covid vaccine. While there are myriad reasons for this hesitancy, the male inclination to accept and be influenced by Covid conspiracy theories is a key part of the story. So it's especially important right now for men to use the strategies here to ingest a healthy diet of information and wash it down with some skepticism.

Our information environment has become a chaotic, confusing, exploitative shitstorm that is destroying our health and well-being. There are a variety of forces making it increasingly difficult for us to avoid—or even recognize—the harmful hogwash and polarizing pandering. And all this is happening at the exact moment in history when we crave and so desperately need facts and a bit of clarity.

The infodemic has helped foster an erosion of confidence in scientific institutions, as those who spread misinformation frequently seek to promote doubt and distrust. The scientific community deserves some blame, too, with occasional bad research and poorly communicated results creating confusion. (Masks don't work./Yes, they do.)
But that’s how science works; evidence evolves and recommendations change, and being transparent about those changes is essential. Just be aware that alternate and often science-free voices will try to be definitive when actual scientists don’t have the data or facts to get there quite yet. You’re better off waiting until they do. But there is a way forward! By using a few critical-thinking tools and being aware of the tactics used to push misinformation, we can cut through the noise. So How Did We Get Here? There’s no single reason that half-truths, deliberate untruths, and simple misunderstandings are undermining the acceptance and sharing of science-backed information. It’s a complex tangle of many factors. But if I’m forced to pick the one that has done the most to supercharge this era of bad info, the choice is obvious: social media. In July, President Joe Biden went so far as to say that misinformation on social media is “killing people,” a lament that is both alarming and supported by a growing body of evidence. If you get your news from social media, you are more likely to believe and spread misinformation, according to a 2020 study from McGill University. An analysis by Pew Research Center came to
a similar conclusion. Other research has traced the origins of Covid misinformation circulating in popular culture to specific platforms. For example, a 2020 Press Gazette analysis of more than 7,000 misleading claims about Covid on the global Poynter Coronavirus Facts Database found that more than half had originated on Facebook.

We know that misinformation can spread fast and far. In August, Facebook released a report on its most widely viewed content from January through March 2021. The winner? The post seen more times than any other was a misleading article implying the Covid vaccine had killed someone. This nugget of misinformation was viewed nearly 54 million times by Facebook users in the U.S. in that three-month period and has been leveraged by countless anti-vaccine advocates, compounding its impact.

This kind of noise has, as noted by Biden, done great harm: leading to deaths and hospitalizations, increasing stigma and discrimination, and skewing health and science policy. One study from early in the pandemic linked more than 800 deaths and thousands of hospitalizations to a rumor, spread primarily through social media, about the use of methanol as a cure for Covid. A study published this year by Heidi Larson and her colleagues at the Vaccine Confidence Project at the London School of Hygiene and Tropical Medicine found that the spread of online misinformation about Covid vaccines has had a significant impact on hesitancy—jeopardizing our ability to reach herd immunity.

There are many reasons why this happens. Our current information ecosystem is a frantic space that doesn't really invite a careful consideration of the facts, especially if the headline plays to our emotions. We react quickly to the impressions that content creates. We know, for example, that humans are evolutionarily predisposed to remember and respond to negative and scary information. This negativity bias is universal.
Media experiments have found, for instance, that negative headlines outperform positive ones. There is also a growing body of evidence that exposure to social media may be stressing us out. And when we're stressed out, we may be more likely to believe and spread misinformation—thus creating an accelerating cycle of angst, dread, and bunk. There is some ironic truth in the term "deathscrolling."

Adding to the gravitational pull of this vortex is the reality that the algorithms used by social-media platforms to decide what we see continue to ensure that harmful—and often fearmongering—misinformation floods our feeds. This can pull people into microchip-infused, anti-vax, 5G-caused-Covid rabbit holes that are specifically designed to play to our interests and values. The impact of this personalized media curation can be staggering. A 2020 analysis by the activist group Avaaz estimated that the algorithm Facebook uses generated 3.8 billion views of health misinformation in just one year.

Another problem: Lies, fake news, and pseudoscience can be made more compelling (microchips in the vaccines!) than the boring old truth (safe, clinical-trial-tested, actual vaccine ingredients).



Indeed, research has found that, yep, as the saying goes, "a lie can travel halfway around the world while the truth is putting on its shoes."

Misinformation and conspiracy theories can also draw us in because they may provide a complete narrative as to why things are happening. They can offer answers to questions that, from a scientific perspective, remain unresolved. During the pandemic, for example, much was—and still is—unknown. This can feel discombobulating. A story that gives answers, even a seemingly zany conspiracy theory, can be comforting—especially if that story reflects our preexisting values and beliefs. (Ah, it was that evil Bill Gates and his microchips!)

And this brings us to ideology. The spread of misinformation has always had an element of ideological spin. Crafting a message that fits with a particular worldview is a surefire way to make misinformation more appealing, at least for those who subscribe to that worldview. In addition to leveraging our confirmation bias—that is, the strong psychological tendency to see, process, and remember information that confirms our preexisting beliefs—using ideology as the hook allows those who are pushing misinformation to sidestep the actual science. The message becomes about an ideological position, not what the science says or doesn't say.

I want to be clear that I'm not judging anyone's political leanings. The ideological spin of science happens across the belief spectrum. The point is that when it comes to accepting and sharing misinformation, ideology matters. We all—right, left, center—need to be aware of that.

David Rand, Ph.D., a professor at MIT's Sloan School of Management, has conducted a boatload of studies on the connection between ideology and misinformation. "We have found that when deciding what to share on social media, people are much more likely to share content that aligns with their political partisanship—even if it's false," he says.
Experts have long recognized the double-edged nature of social media. It can bring us together, connecting us to communities, friends, and families. But it can also drive us apart—especially around ideology. "Social-media [platforms] are amplifiers of political polarization," says Kate Starbird, Ph.D. She is an associate professor at the University of Washington and an expert on the spread of misinformation. "They make polarization worse and allow for that polarization to be leveraged in new ways by those seeking to exploit our differences for their gain."

Public discussions about Covid became politically polarized almost as soon as the pandemic was declared. And an analysis from the University of Cincinnati that examined social-media interactions from the beginning of the pandemic found that some of the most influential voices were politically motivated.

Increasingly, our information environment is dominated by social media and fueled by a toxic stew of fear, distrust, uncertainty, and political polarization. Recognizing the forces that drive misinformation is an important step in stopping its spread. Indeed, Starbird told me that her top recommendation for spotting misinformation is to tune in to your emotions. Find 9 more ways to identify it here.



“Whenever some piece of content makes me feel politically self-righteous—like I’m about to spike a political football—that’s when I know I need to be extra careful about sharing,” she says. “Because there’s likely a misinformation flag somewhere on the field.”

You can do even more than that to stop the spread of misinformation. The rest of this series helps you figure out whether or not you should believe the information you’re hearing.

Source: https://www.menshealth.com/health/a37910261/how-junk-science-and-misinformation-hurt-us/

Vaccines

Medicine Must Sanction the COVID Quacks: Intent doesn't matter when patients are being harmed

By Matthew K. Wynia, MD, MPH, 17 October 2021

The classic definition of a "quack" -- dating as far back as the 1500s -- is a medical charlatan, a "fraudulent pretender to medical skills." Derived from the old Dutch kwakzalver, or hawker of salves, quacks typically mislead patients into buying useless or even harmful therapies by falsely promising miraculous cures. Picture a snake oil salesman peddling a proprietary elixir or "tonic" from the back of a wagon, then moving on quickly to the next town before folks start asking for refunds. But today's COVID quacks force us to rethink this common stereotype, which is creating challenges for state medical boards and other organizations charged with self-regulation of the medical profession.


Today, many doctors acting like quacks see themselves not as purveyors of snake oil but as mere medical iconoclasts, willing to challenge the status quo. They seem to start down the path to quackery by convincing themselves that the unprecedented circumstances of the pandemic should lower the bar for what counts as practice-changing evidence -- not an entirely unreasonable assertion. Next, however, they argue that it's actually unethical to wait for research published in peer-reviewed journals and vetted by experts with years of experience in relevant fields. And then they convince themselves they have personal experience that helps them see patterns and interpret data that experts in epidemiology, public health, and infectious disease either can't or won't.

Of course, in some ways today's COVID quacks are just like the quacks of yore. They often carry medical credentials of some sort, and they often claim to have uncovered secrets their mainstream colleagues are either too dumb or too corrupt to see. Some are certainly scammers, seeking to make a buck. Almost all display the quintessential mark of a quack -- offering patients a false level of certainty and the promise of miracle cures.

But more often than not, today's COVID quacks appear to believe the stories they tell. Most are not getting rich off the pandemic, and we can presume they are being honest when they claim to be frustrated by the lack of mainstream acceptance of their fringe ideas. Many have convinced themselves they are saving lives by standing up to a medical establishment they view as ignorant or corrupt. In other words, they are misguided, but most are not intentionally hurting anyone, because their beliefs are sincere. This means one might better think of today's COVID quacks as followers in a cult, or as doctors with an addiction, rather than as snake oil salesmen. That is, they are probably more victims than villains.
When this is the case, we should seek to understand how they ended up down conspiracy theory rabbit holes and use exit counseling methods to gently guide them back to the real world that the vast majority of physicians thankfully still inhabit.

But these well-intended COVID quacks pose a serious question for the medical profession, which prides itself on its comprehensive, if admittedly imperfect, system of self-regulatory structures like board certifications, state licensure, and codes of ethics. Namely, does the fact that most of these doctors don't mean to harm anyone matter when it comes to professional self-regulation?

Making this question even more challenging is that medical science values challenges to the status quo. For us this isn't just a matter of free speech -- proving theories wrong is how medicine moves forward. It is not a coincidence that statistical testing in medical research (the P-value) isn't used to prove a hypothesis is true; it's used to determine how likely the hypothesis is to be false.

It also doesn't help that some of the same doctors promoting unproven therapies today also promoted both hydroxychloroquine and steroids for patients with severe COVID-19 early in the pandemic. It turned out they were wrong about hydroxychloroquine, but they were right to question the conventional wisdom on steroids. Sometimes, medical iconoclasts are on the right side of history. Of course, it is also a demonstration of the value of the scientific method that it was solid research -- not anecdotes -- that quickly proved hydroxychloroquine doesn't work but steroids are helpful for patients with severe COVID-19.

The bottom line is that we don't want a profession where everyone goes along unquestioningly. We need a profession where conventional theories are tested, where new information changes practice. Especially in a rapidly evolving pandemic, gathering and using new information is critical. We want doctors who can change their minds based on data. But we don't want doctors making outlandish promises based on very limited data, or worse, those who are unable to give up on their pet theories no matter what the data show. And we cannot tolerate a profession where doctors clinging to disproven theories are killing patients.

Of course there are many unknowns and many areas of legitimate disagreement in medicine but, to be blunt about it, any doctor still promoting hydroxychloroquine, or suggesting that ivermectin is a "wonder drug," or that vaccines make COVID-19 infections worse, is hurting and killing patients -- whether they intend to or not. Every time a pregnant woman listens to one of these doctors, decides not to get vaccinated, and then ends up in the hospital or the morgue, she and her baby have been harmed by these doctors. The bottom line is that neither intent nor the ability to acknowledge the harms one has caused matters when patients are being hurt -- this means they don't matter for professional self-regulation.
This might seem counterintuitive, since intent often matters in adjudicating guilt in a courtroom, and the processes of professional self-regulation used by state boards, specialty boards, and medical societies typically follow legalistic procedures, with considerable attention to due process and the right to appeal. But there is a key difference between a courtroom and a licensure board: intent does not matter in adjudicating whether harms have occurred. And our task in professional self-regulation is not to decide whether a physician is innocent or guilty; it's to prevent our peers from harming patients. The purpose of professional self-regulation is to protect public safety -- that's it. When significant harms are arising due to a doctor's persistent and demonstrably false beliefs, good intentions and sincerity in holding the false beliefs no longer matter. The medical profession must sanction the COVID quacks.

Source: https://www.medpagetoday.com/opinion/secondopinions/95085?xid=nl_secondopinion_2021-10-19&eun=g1024768d0r

Prosocial Ways to Respond

What You Can Say to Pull Someone Out of a Junk Science Rabbit Hole: Use these strategies, but listen first


By Timothy Caulfield, Canada Research Chair in Health Law and Policy at the University of Alberta, 18 October 2021

From your gym buddy to your otherwise-sensible mom, someone in your life is going to be stuck on the horns of junk science. They might be about to take an unproven or potentially dangerous treatment based on something they read on social media. A report published in 2021 in the Journal of the National Cancer Institute found that of the 200 most popular cancer treatment articles mentioned on social media, one-third contained misinformation. And that misinformation? Seventy-five percent of it was potentially harmful.

That’s just one example. It’s easy to see why: It can be really hard to know what’s supported by evidence and what’s just spin, especially in a world where science has become politicized. Yet it can also be really frustrating to talk to someone who’s vehemently headed down the path of hurting their own health—or the health of others—due to misinformation. You might want to rant. Throw a mountain of evidence at them. Keep trying to convince them at every turn that they’re not seeing the truth. Which pretty much, as you probably know by now, won’t work.

The first step to having a conversation with someone about potentially dangerous misinformation is this: Don’t talk. Listen. And the second step is to force yourself to actually listen—which doesn’t mean watching your friend’s mouth move while waiting for your turn to talk. I know this isn’t always easy, but it is essential to understanding their perspective and their specific concerns. Not everyone who is hesitant about vaccines, for example, is a hardcore conspiracy theorist. There is a vast range of viewpoints. Take the time to find out where someone is coming from before you start talking. (You may also find yourself with more compassion when you take a look at why we're all so vulnerable to misinformation and how we got to this apex moment.)
Then you can use these strategies, too:

Stay calm
Don’t mock them, and don’t just throw a mountain of facts at them. Try to get a sense of what information would help them feel better about the issue. Ask where their sticking points are. And recognize that they may have legitimate reasons for mistrusting the relevant institutions. A little empathy can go a long way.

Try to find common ground
Perhaps it is mutual concern for the safety of family members or frustration about the uncertainty of emerging science. And use your own stories—such as a positive experience with vaccination—to support your perspective.


Don’t get bogged down in the details
Appeal to critical thinking and emphasize the big picture, science-wise. Research has shown, for example, that providing credible information and highlighting the rhetorical tricks used to push misinformation can make a difference. (I even wrote a paper on it that you can see here.)

Give them a path to good science
It is very rare for someone to change their mind right in front of you. (And be aware that it’s impossible to change some people’s minds.) Humans are pretty darn stubborn. But, over time, your discussion might have an impact. Be patient.

Source: https://www.menshealth.com/health/a37928531/how-to-talk-to-someone-about-misinformation/

Can ex-holdouts persuade current ones?

By Kristen V. Brown, 24 October 2021, via email

Bloomberg Prognosis picks one question sent in by readers and puts it to experts in the field. This week's question comes to us from Janice in Ottawa, Canada, who is wondering what impact the recent wave of Covid-19 cases has had on vaccine holdouts. She asks: I’ve read many testimonials of very ill or dying unvaccinated Covid-19 patients who are now encouraging others to get vaccinated. Is there any evidence these tragic stories persuade previously unvaccinated people to get the jab?

Vaccine hesitancy is a huge problem in the U.S. and Canada. While in some parts of the world access to Covid vaccines is limited, in North America there are plenty of shots but not enough takers. In the U.S., about one-quarter of the eligible population still hasn’t received a first dose.



But after an aggressive wave of the delta variant swept through the U.S. in the summer, hard-hit Southern states saw many previous holdouts rushing to get vaccinated. Arkansas, for example, reported a 300% uptick in vaccination in a single four-week period as many people watched friends and family members contract the virus. There are, of course, still a large number of holdouts in those states, as there are elsewhere in the U.S. The question is how best to reach them.

For some insight, we turned to Matt Motta, a political scientist at Oklahoma State University who studies vaccines and public opinion. “It's quite likely that Covid vaccine skeptics may be effective at reaching those who share their distrust of vaccine safety and efficacy, when they encourage others to pursue vaccination in order to avoid bodily harm,” says Motta. “But this is still, to some degree, an open question.”

The key, says Motta, is source credibility. “In general, people tend to trust most those who share our views on political, social, religious, and other matters of great personal importance to us,” he says. Former holdouts, then, could be a good way to reach current ones. “Previous studies in the field of science communication find that when those with whom we have something in common — say, our political beliefs — change their minds about a particular issue — say, whether or not climate change is caused by human activities — we tend to be more likely to update our own views in response,” Motta says.

In a paper by Motta and colleagues still undergoing peer review, preliminary evidence suggests that such “reversal narratives,” as Motta puts it, can increase other skeptics' confidence in both childhood and adult vaccines. In general, showing empathy and building trust are known to be two of the best approaches to help motivate vaccine-hesitant people to get the shot.
Focusing on those approaches, rather than simply the facts, could go a long way toward increasing confidence in vaccines — for Covid as well as other diseases.

If you have a question, write to CovidQs@bloomberg.net —Kristen V. Brown

#  #  #

If you'd like to learn more or connect, please do; just click here. You can join my email list to keep in touch. Tools and my podcast are available via http://ALifeInFull.org. Click here for a free subscription to our Weekly LinkedIn Newsletter.



If you liked this article, you may also like:

Special Editions – Strong Opinions Loosely Held: Part 3 – Science, Psychology, Misinformation, Vaccines, and Prosocial Ways to Respond
Special Edition – Strong Opinions Loosely Held: Part 2 - Misinformation, Vaccines, and Prosocial Ways to Respond
Special Edition – Strong Opinions Loosely Held: Part 1 - Science and Psychology
The Reproducibility Problem in Science—What’s a Scientist to do? (Part 3 in a series of 3)
The Reproducibility Problem in Science—Shame on us? (Part 2 in a series of 3)
The Reproducibility Problem—Can Science be Trusted? (Part 1 in a series of 3)
Can AI Really Make Healthcare More Human—and not be creepy?
How to Protect Yourself from Fad Science
Technology Trends in Healthcare and Medicine: Will 2019 Be Different?
Commoditization, Retailization and Something (Much) Worse in Medicine and Healthcare
Fits and Starts: Predicting the (Very) Near Future of Technology and Behavioral Healthcare
Why I think 2018 will (Finally) be the Tipping Point for Medicine and Technology
Healthcare Innovation: Are there really Medical Unicorns?
Can (or Should) We Guarantee Medical Outcomes?
A Cure for What Ails Healthcare's Benchmarking Ills?
Why Global Health Matters


