Misinformation is a pandemic that doesn’t have a vaccine



Conspiracy theories and misinformation about QAnon, COVID-19 and 2020 election fraud took a deadly turn in 2021. As bad as things were last year, experts worry it’ll get worse in 2022.

“I think we’re going to see an acceleration and expansion of the conspiracy theories,” said Mike Caulfield, research scientist at the University of Washington Center for an Informed Public. “They’re going to go bigger, they are going to play even more loosely with the truth.”

This expected ramp-up could mean a widening divide among Americans, more outlandish ideas being shared and, as shown in this past year, potentially more lost lives. We’ll see a real-world test of how bad this could get with the approach of the 2022 midterm elections, around which misinformation peddlers are expected to continue their onslaught on the truth. 

One reason it could get worse is that the federal government and tech companies aren’t getting ahead of the problem. 

“We are still very much in a reactive mode, and until we get ahead of some of this, we can expect each cycle to be worse than the last,” Caulfield said. 

There is good news. This can be fixed, but it’ll take some effort.

In his study, The Perfect Storm: A Subcultural Analysis of the QAnon Movement, Chris Conner, visiting assistant professor of sociology at the University of Missouri, Columbia, argues that people support the conspiracy theory out of mistrust of the government and public officials after the social systems in the US failed them, whether through economic hardship or a lack of proper coverage for mental illness. That failure left believers feeling alienated and dissatisfied with how their lives turned out. 

If these hardships aren’t addressed, many will continue to go further down the rabbit hole. Others could end up martyrs for what they believe to be a just cause. 

“What’s going to be productive is listening to these people and taking them seriously about what it is they were responding to,” Conner said.

Social media’s response

For the most part, social media companies are sticking with what they did in the past year to tackle misinformation, including removing false content, banning popular influencers who spread misinformation and improving their systems for flagging and reporting content. That, however, might not be enough.

Experts have repeatedly pointed to social media as being a driving force in the spread of misinformation. 

“It is absolutely 100% on the backs of the social media companies to continue to crack down on these movements and maybe risk one tiny little sliver of their profits in the service of doing the right thing for the greater good,” said Mike Rothschild, conspiracy researcher and author of The Storm is Upon Us, which provides a history of the conspiracy theory. “I don’t know that QAnon would have spread if Facebook, Twitter and YouTube had cracked down on it in 2018.”

QAnon — a pro-Donald Trump fringe conspiracy theory that claims the former president was fighting a war against Satanist pedophiles in Hollywood and the Democratic Party — continues to fester even though President Joe Biden has been in office for nearly a year. Q supporters are likely to continue spreading misinformation across social media platforms, and dozens are planning to run for office in 2022, according to Media Matters. 

Social media companies say they are prepping for the 2022 elections and some have learned from how things went in 2020. 

One company that’s been lambasted after multiple whistleblowers came forward in October is Facebook, now rebranded as Meta. Former employees accused the social media platform of allowing hate speech and misinformation to escalate for higher profits. 

As for voter fraud conspiracies surrounding the 2020 elections, Facebook did take down some groups that spread false information, one with more than 300,000 members, along with individual accounts, and the company says it’s already focusing on 2022. 

“We’re enforcing our policies against voter interference content, and we’ll continue to refine our strategy to combat content that questions the legitimacy of voting methods, like voter fraud claims,” Monika Bickert, vice president of content policy at Facebook, said in a November press call referred to CNET by a Facebook spokesman. “While each election will bring its own unique set of challenges, we’re working diligently to apply the lessons we’ve learned from previous years to elections in the US and other countries in 2022 and beyond.” 

Twitter cracked down on misinformation accounts throughout 2021 and has suspended politicians’ accounts, such as that of Rep. Marjorie Taylor Greene, a Republican from Georgia, for spreading vaccine misinformation. The social media company also instituted a new report option for tweets spreading misinformation about health and politics. Twitter says it’s “committed to improving the health and integrity of the public conversation.”

Twitter and Facebook also banned Trump days after the Jan. 6 Capitol riot, in which his supporters stormed Congress as the 2020 election was being certified. 

YouTube stepped up its misinformation policies in 2020 and 2021 by banning COVID vaccine misinformation and anti-vax misinformation. YouTube, which is owned by search giant Google, also removed more than 1 million videos in 2021 related to COVID misinformation. The company says it’s looking to keep improving its systems to weed out misinformation.

TikTok began dealing with misinformation about the elections and COVID vaccines in 2020. This resulted in hundreds of thousands of videos being deleted from the platform earlier in 2021. The company also instituted a warning prompt to help stop the spread of flagged videos. 

“While TikTok isn’t the first place people look to for political content, we’re committed to doing our part to help stop the spread of disinformation and connect our community to authoritative information on elections,” a TikTok spokesperson said.

Social media companies have also implemented other tools that experts agree have worked, including labeling misinformation and fact-checking posts. Another useful tactic is simply slowing the spread of certain misinformation content soon after it’s posted. 

Even though these larger social media platforms are trying to stomp out misinformation, people spreading false information and conspiracy theories have found other platforms. Telegram, an encrypted messaging app, has become a safe haven for QAnon influencers who talk to their hundreds of thousands of followers. Rumble and Odysee are two video platforms filled with misinformation and conspiracy theories that would be quickly removed from YouTube. 

Along with social media companies, Rothschild also points to payment platforms as having a role in spreading misinformation. 

The larger conspiracy theory influencers profit from the misinformation they peddle. Patreon, PayPal and other services that let people pay money to creators have instituted policies that attempt to prevent funds from going to people making misinformation content, but influencers continue to find ways around those policies or move to more lenient platforms such as SubscribeStar.  

What’s in store for 2022? 

A big factor in 2022 could be the person who was at the center of much misinformation in 2021: Trump. In October, the former president said he would launch his own social media network called Truth Social that will “stand up to the tyranny of Big Tech.” 

Since the announcement, however, there has been little talk from Trump about his social media network. The site lets people sign up for accounts and be put on a waiting list, but so far no launch date has been announced. Those who wish to reserve a username will have to make a donation to the National Republican Senatorial Committee.  

Other pro-Trump social media platforms exist, including Gab, Parler, Gettr and Frank, but none has an active user base comparable to the more popular platforms. 

These platforms, along with Telegram, will likely go into overdrive in 2022 because of the midterm elections. There will be 34 Senate seats up for grabs, as well as every seat in the House of Representatives, plus 36 gubernatorial elections and other state and local offices. For pro-Trump misinformation peddlers, there’s a lot at stake, although it might not be so clear at first glance. 

“The bigger thing that people in that [misinformation] universe are trying to do with 2022 is to win the narrative battle so that they are sitting in a much better place going into 2024,” Caulfield said. “If they are able to convince large swaths of the public that the 2022 elections are illegitimate, then they are more likely to get the sorts of legislative changes that they want.” 

While all of this misinformation seems overwhelming, there are things you can do to ensure you’re not taken in, from running Google reverse image searches to verify photos to just generally being aware of the problem. It also helps to take stock of your emotions when a certain social media post shows up. Some content is designed to make people outraged, especially when it comes from dubious sources. 

Even though academic experts, government entities and tech companies are all aware of how bad the misinformation problem is, and will be, stopping its spread will still be an uphill battle. 


