
Combatting the Health Hoaxes


Disinfectants are a powerful way to kill viruses on surfaces, but they were never designed for human consumption. Ingesting concentrated cleaning solutions like bleach not only risks poisoning and even death, it’s also entirely ineffective.

62-year-old Miami resident Mark Grenon and his three sons were recently charged with producing and selling tens of thousands of bottles of a toxic industrial bleach solution as a cure for COVID-19, cancer, autism and other serious medical conditions, according to the U.S. Attorney’s Office for the Southern District of Florida.

According to an article published by Spectrum News, the Grenons allegedly received over $1 million from manufacturing and promoting “Miracle Mineral Solution,” a mix of sodium chlorite and water that, when ingested, forms chlorine dioxide, a chemical more commonly used to treat or bleach textiles.

Despite FDA warnings that the product and others like it could cause severe side effects, including life-threatening low blood pressure, people were still adamant about trying it for themselves.

Thanks to the internet, it doesn’t take much for a bogus health claim to make the rounds these days. From conspiracy theories to outlandish headlines proclaiming the next miracle cure, science and health misinformation is seeping into conversations and media outlets left and right.

And it isn’t purely an inconvenience to sift through the lies. It can be deadly.

Fact-Checkers to the Rescue

A study published in August 2020 by the American Journal of Tropical Medicine and Hygiene found that nearly 800 people died from methanol poisoning after listening to a rumor that high concentrations of alcohol could disinfect the body and kill COVID-19. Approximately 6,000 people around the world were reportedly hospitalized for similar reasons.

Last July, the CDC conducted a survey on the use of household disinfectants. Of the respondents, nearly 20% said they washed their produce with bleach. The same percentage reported using a household disinfectant on their hands or skin. And 4% even confessed to consuming or gargling bleach, soapy water and alcohol-based disinfectants in an effort to protect themselves from COVID-19.

In an effort to fight the spread of false health claims like these, the International Fact-Checking Network (IFCN) at the Poynter Institute launched the Coronavirus Facts Alliance, a database featuring fact-checkers in over 70 countries with articles published in 40 languages (https://www.poynter.org/ifcn-covid-19-misinformation/). Within its first two months of operation, the alliance published over 3,500 fact-checking articles and weekly reports centered on COVID-19 hoaxes and misinformation.

One of the latest health claims to be debunked on their page pertains to the use of the drug ivermectin in treating COVID-19. Last week, a video made the rounds on Instagram featuring Dr. Pierre Kory, a critical care specialist and co-founder of the Front Line COVID-19 Critical Care Alliance.

Kory and a team of doctors appeared to advocate for the drug as being safe and effective. But according to the FDA, ivermectin is not intended to treat viral infections. It’s more often used in animals to prevent heartworm disease and parasites.

“It’s crazy. The internet provides us all of this access to this wonderful information. But it’s also terrible, because it provides us all of this access to this bad information,” Bill Adair, founder of PolitiFact and Duke University journalism professor, said. “You’re almost better off waiting until the information is distilled by authoritative sources than doing your own research.”

Adair joined the Coronavirus Facts Alliance alongside 98 other fact-checking leaders when the pandemic first erupted. Managed by the Poynter Institute, PolitiFact sets itself apart by rating claims on a gadget it calls the “Truth-O-Meter.”

Each statement gets rated and converted into a tally, which allows for a level of accountability that other fact-checking sites can’t match, Adair said. “I believe it’s the largest fact-checking effort in the world, and it’s the model for many fact-checking ventures around the world.”

Along with poorly researched COVID-19 “cures,” another major target for misinformation is vaccines.

The IFCN just announced a new Vaccine Grant Program, in which seven fact-checking organizations around the globe will receive a total of $500,000 in grant funding to support efforts to fight the spread of COVID-19 vaccine misinformation.

And it doesn’t just stop there. Cognitive scientists have teamed up to create and publish user-friendly guidebooks to relieve some of the hesitancy and lies surrounding the COVID-19 mRNA vaccines.

Addressing Vaccine Hesitancy

Anti-vaxxers around the world are using social media platforms to spread false claims with massive repercussions. Last December, a video made the rounds on Facebook warning that the COVID-19 vaccine would contain a tracking microchip. Even after the theory was debunked, whispers traveled far and wide.

“For vaccines to be effective, you need to reach a certain level of everyone being vaccinated,” Adair noted. “So, when people fall for these falsehoods, it hurts everyone.”

Extreme viewpoints and conspiracies aside, there’s another group that’s quietly on the rise: the vaccine-hesitant. Out of 40 Auburn University students who participated in a survey on health misinformation and the media, 52% said they experienced some or a lot of hesitancy in receiving a COVID-19 vaccine.

But vaccine hesitancy didn’t just spring up in the last year. In 1998, British doctor Andrew Wakefield published his infamous study claiming that the measles, mumps, and rubella (MMR) vaccine could cause childhood autism. Studies refuting the link were conducted and published almost immediately afterwards, but the damage was already done.

Wakefield lost his medical license, but over two decades later, parents are still afraid to vaccinate their children.

In an effort to target the rumors early on, Dr. Stephan Lewandowsky collaborated with over 20 expert scientists and authors to create the new “COVID-19 Vaccine Communication Handbook.” (https://www.movementdisorders.org/MDS-Files1/The_COVID-19_Vaccine_Communication_Handbook.pdf).

The 18-page digital guide delves into the truth behind vaccines while addressing some of the most common anti-vaccination misinformation. Each section links to a “wiki” with more detailed information.

As one of the main contributors to the guidebook, Northwestern University cognitive psychology professor David Rapp explained how drawing false equivalences between diseases can misrepresent the effort that went into creating the COVID-19 vaccines.

“We’re talking about tons of medical labs around the world all converging and trying to find evidence,” Rapp said.

Public Mistrust in the Science Community

Science isn’t static, nor is it black and white. Health claims are refuted and retracted by medical experts daily. And as new data replaces the old, sometimes seemingly overnight, people’s minds aren’t quite so eager to adjust.

In September 2020, the CDC stirred up confusion after shifting its stance on how the coronavirus can be transmitted through airborne particles. The updated guidance claimed that the virus can remain suspended in the air and spread beyond a distance of six feet. Three days later, however, federal officials said the post was a mistake and had been published before undergoing a full review.

Previous CDC guidance also suggested that a close contact occurred when a person was within six feet of an infectious individual for 15 consecutive minutes. But according to the agency’s website today, even brief contact can lead to transmission: spending a total of 15 minutes with an infectious person over the course of 24 hours is enough to put someone at risk.

“Our understandings get more refined and better informed by evidence,” Rapp said. “There needs to be more of an understanding provided by the medical community, and provided by science, that these ideas change over time, and our understandings develop over time.”

With new research being released regularly, public health claims are bound to change. Dr. Fred Kam, Auburn University’s medical director, said he spends hours a day examining data from multiple medical journals.

“As time goes on, we get other sources of data,” Kam noted. “But, unfortunately in today’s world, there’s not one major repository.”

Since September, Dr. Kam and his team at the Auburn University Medical Clinic have been posting weekly video updates on social media to keep students and faculty informed on the latest COVID-19 and vaccine distribution news. Kam said the goal of the videos is to confront misinformation head-on and combat any confusion that may arise.

Another way that Auburn University has reached out is by creating the COVID-19 Resource Center. With plans to transition back to normal operations in Fall 2021, the CRC will now be emailing newsletters to students, faculty and staff on a biweekly basis. For questions about vaccines, university policies and other virus-related concerns, the CRC can be reached by email at covidresourcecenter@auburn.edu or by calling 334-844-6000.

Perhaps health misinformation stems from errors in communication. Maybe a lack of transparency is the predominant issue.

But according to cognitive psychology, part of the reason lies in the way our brains are wired.

Dr. David Rapp has been conducting studies to uncover the cognitive reasons behind why people fall victim to misinformation.

“We’re less likely to look for information that contradicts what we know, or pulls into question our really strongly held beliefs,” Rapp said. “And often, those beliefs connect with our identities.”

People are more willing to believe facts that align with their existing viewpoints, but they fail to step back and examine the bigger picture. By this logic, anti-vaxxers will seek out information suggesting the dangers and risks associated with vaccines, while those in favor of vaccines will look only for data that endorses the benefits.

“Another hard-wired mechanism is when we see information over and over again, we tend to think that information is more true,” Rapp added.

Here is where echo chambers come into play. With platforms like Facebook and Twitter taking precedence over print news sources, people who share the same views can easily validate false information with a quick comment or retweet.

Lisa Fazio, assistant professor of psychology at Vanderbilt University, has conducted research finding that repetition can impair a person’s ability to detect truth, particularly through viral posts on social media.

In an effort to flatten the curve of the “infodemic” and fight back against cognitive biases, some scientists are looking for ways to prompt people to consider accuracy before sharing content online.

Boosts and Nudges

While it is certainly possible to correct existing misinformation, the greater challenge lies in slowing its spread. Research from MIT cognitive scientist David Rand suggests that boosts and nudges can positively influence consumer behavior.

These modest interventions are “designed to help people think more critically, evaluate information, or neglect the biases that they have when they think about the world,” Rapp said.

In his most recent study, Rand tested a series of interventions designed to increase the accuracy of information shared on social media. Results showed that tagging some stories as false made readers more willing to believe and share untagged stories, even when those untagged stories were also false.

It’s not an entirely new concept. Psychological nudges and boosts have been used in the past to promote healthier eating habits and improve attendance rates via encouraging text messages. Fast food restaurants also use the technique to upsell their products, suggesting larger sizes or side items with the meal.

In an effort to fight science and health misinformation, these interventions can play a critical role in urging people to reconsider what they choose to post or retweet.

“If they get these brief nudges, and it pushes them to do things differently afterwards- and you do that consistently- maybe it starts to become a practice that changes,” Rapp said.

Generating Healthy Skepticism

Questioning sources is critical in combatting the spread of misinformation. And when it comes to science, claims should never be taken as gospel.

“Anyone can put just about anything on the internet these days,” Auburn University biomedical science and journalism student Bayley Beasley said. “You just have to be really careful what you’re looking at.”

In her former persuasion and health communications classes, Auburn University professor Dr. Debra Worthington said she made it her goal to turn her students into healthy skeptics. “If you always hang on to that little bit of skepticism, it’s going to lead you to investigate more,” she said.

Part of her course curriculum involved having her students examine TV drug advertisements. “It’s to get them to think about what it is that you’re seeing…to step back, look at a message, and critically evaluate that message,” Worthington said.

In the age of accessibility, the line between truth and falsehood is thin. And the gateway to information is wide open, leaving plenty of room for misinformation to follow suit.

It’s proliferating at a dangerously rapid rate. And no one is immune from it.

“It’s like an exponential growth. You send it to five people…then five people send it to five other people. And it just keeps going. Then it’s hard to pull it back,” Kam said.

Fortunately, treatments do exist. But it’s no easy fix.

Readers have the daily task of digging deeper. Meanwhile, journalists ought to promote the benefits of scientific interventions rather than zeroing in on the ugly. And as for medical practitioners and scientists, transparency is key.