COVID-19 Facts or Fiction: 1 in 4 YouTube Videos Misleads Viewers

WEDNESDAY, May 13, 2020 (HealthDay News) — More than one-quarter of popular English-language COVID-19 information videos posted to YouTube are misleading, researchers warn.

Some posts, for example, falsely claim that drug companies already have a cure for COVID-19 but won't sell it, or that different countries are battling stronger strains of the coronavirus, a new study finds.

YouTube viewers “should be skeptical, use common sense and consult reputable sources — public health agencies or physicians — to fact-check their information,” said study lead author Heidi Oi-Yee Li, a medical student at the University of Ottawa in Canada.

With billions of viewers, YouTube has enormous potential to bolster or hamper public health efforts, Li and her colleagues said in background notes. But what they turned up in their recent YouTube search is “alarming,” Li said.

“In an ideal world, social media platforms should take more responsibility for content uploaded,” she said. But “this is an unrealistic expectation, given the billions of users uploading information every second across the globe.”

Li’s team did a simple keyword search for “coronavirus” and “COVID-19” on March 21, 2020.

After compiling the top 75 videos for each search term, the team excluded non-English clips, videos longer than an hour, duplicates, and anything not actually about COVID-19.

The remaining 69 videos had already been viewed nearly 258 million times, they said. Just under one-third (29%) were clips from TV network news. Consumer-generated postings and entertainment news each accounted for about one in five clips.

Internet-based news made up 12%, while “professional” advice and information gleaned from newspapers accounted for less than 10%.

Only 2% of the clips were posted by government agencies, such as the U.S. National Institutes of Health or the Centers for Disease Control and Prevention. Clips from educational institutions also accounted for just 2% of the total.

Each video was scored for accuracy based on the information it offered about how COVID-19 spreads, typical symptoms, prevention, treatments and infection patterns (epidemiology).

While nearly three-quarters of the videos were deemed accurate, almost 28%, accounting for 62 million views, were not.

Misleading information included racist remarks, inappropriate recommendations or conspiracy theories.

One-third of the misleading videos were from entertainment news; one-quarter came from network TV news; and another quarter represented internet news postings. Consumer-generated content made up 13%, the investigators found.

The good news: None of the government-generated postings contained misinformation.

The bad news: Government-generated postings made up a fraction of the most popular COVID-19 videos.

And therein lies the rub, said Li. “Should you really be getting health care information from a random YouTuber that probably knows as much as you do? I think not.”

If trusted health sources want to reach people where they are, she added, they’ll need to step up their video production game.

“Public health agencies must ensure their message gets out to the public by producing videos that are more entertaining,” she said, “because that’s what people watch.”

According to Li, the most-viewed video in the study, with over 20 million views, was produced by a popular YouTube and television celebrity. Meanwhile, the most popular government video reached only 4 million views.

But it’s not as simple as producing more watchable video content, said Matt Motta, an assistant professor of political science at Oklahoma State University.

“We know from past research that YouTube is a place that people who want to be exposed to conspiratorial content go to consume that information,” he observed.

Simply “presenting people with the facts is no guarantee that they will change their minds,” even if the presentation is entertaining, Motta added. That’s because “when ‘factual corrections’ challenge people’s deeply held political beliefs or social worldviews, people sometimes double down on those views.”

But he also suggested that “de-platforming bad information on sites like YouTube may have the potential to cut off the availability of misinformation.”

With public interest in reliable health advice at an all-time high, “increasing the supply of accurate information, while decreasing the availability of false information, could reduce the number of people who mistakenly hold misinformed beliefs,” Motta said.

The study findings were published online recently in BMJ Global Health.

More information

There’s more about COVID-19 at the U.S. Centers for Disease Control and Prevention.