December 19, 2024

In a recent Joe Rogan podcast, Meta CEO Mark Zuckerberg explained the chain of events that led Facebook to censor the Hunter Biden laptop story and how that censorship took place. The story, which many argue was a key factor in the 2020 election, was infamously banned by Twitter shortly after it came out. At the time, Facebook announced they were “reducing its distribution” on their platform, but offered minimal details as to how or why.
In a Senate hearing in late October 2020, Zuckerberg had explained that the FBI warned Facebook to be on “heightened alert” about “hack and leak operations” leading up to the election, which was part of what led the company to temporarily throttle the story.
Though Zuckerberg did not divulge any particularly new information with Rogan, the conversation is causing quite the stir and has increased public attention on the federal government’s role in social media censorship.
“Basically the background here,” Zuckerberg said, “is the FBI I think basically came to us, some folks on our team, and was like ‘Hey, just so you know, you should be on high alert. We thought that there was a lot of Russian propaganda in the 2016 election. We have it on notice that basically there’s about to be some kind of dump similar to that, so just be vigilant.’”
Zuckerberg stressed that Facebook didn’t outright ban the story like Twitter did, but only reduced its distribution.
“For the…I think it was five or seven days when it was basically being determined whether it was false, the distribution on Facebook was decreased, but people were still allowed to share it. So you could still share it, you could still consume it,” Zuckerberg continued.
“So when you say ‘the distribution is decreased,’ how does that work?” Rogan asked.
“Basically the ranking in news feed was a little bit less. So fewer people saw it than would have otherwise,” Zuckerberg replied.
“By what percentage?”
“I don’t know off the top of my head. But it’s…it’s meaningful.”
After commenting on the “hyper-political” nature of the story, Zuckerberg reiterated how the FBI’s comments were a key factor in the decision.
“We just kind of thought, ‘hey look, if the FBI’—which I still view as a legitimate institution in this country, it’s very professional law enforcement—‘if they come to us and tell us that we need to be on guard about something, then I want to take that seriously.’”
“Did they specifically say you needed to be on guard about that story?” Rogan asked.
“No, I don’t remember if it was that specifically,” Zuckerberg replied, “but it basically fit the pattern.”

For years, content creators have wondered whether Big Tech platforms like Facebook have been silently censoring their content. Because these platforms are such black boxes, it’s impossible to say for certain why some content goes viral while other, equally interesting material seems to struggle. Rumors have long circulated that “the algorithm” favors some stories and accounts more than others, but without access to the back end this has been difficult to prove.
Even with Zuckerberg’s recent comments, we still don’t know much about how Facebook engages in censorship or exactly what kind of content is being censored. But reading between the lines of his comments, we can deduce this much: there is a group of people who control which content gets proliferated, and if they are even suspicious of your content, that’s going to hurt.
This alone should be chilling. The idea that a panel of “experts” is deciding which stories become national news and which ones never take off is deeply troubling. Yes, misinformation exists, but the dangers of allowing a small, likely biased group to control the national conversation are far greater than the dangers of allowing “wrong” or “misleading” information to spread.
And besides, as the last two-and-a-half years have repeatedly shown, today’s “misinformation” all too often becomes tomorrow’s established fact.
But while social media companies engaging in censorship is problematic in its own right, the FBI being involved points to a much bigger problem.
One of the points that gets brought up a lot in the Big Tech censorship discussion is the fact that social media platforms are private companies and that they therefore have the right to moderate content as they see fit. This is true, and so-called “free speech” laws which mandate loose content moderation policies should be opposed on these grounds.
But as Dan Sanchez and Liam McCollum astutely point out, just as governmental laws mandating less censorship are wrong, governmental pressure to have more censorship is also wrong for the exact same reason. When the government is threatening to regulate companies for allowing misinformation to spread, leading to a certain degree of “self”-censoring, this too is a violation of their right to choose their own content moderation policies.
“When a government doesn’t like the content coming out of a media industry,” they write, “it doesn’t always have to enact formal laws to censor it. Sometimes all politicians and bureaucrats have to do is make their displeasure over the content abundantly clear and to threaten (whether implicitly or explicitly) to crack down on the industry. Generally, a threat is all it takes to intimidate private companies into censoring themselves to preempt or prepare for the imminent crackdown.”
This is exactly what’s been happening. Big Tech CEOs have been hauled before Congress again and again to be threatened with regulation should they fail to “crack down” on misinformation. Can you really blame Facebook, then, for being quick to acquiesce when the FBI hints it wants more censorship? You can say they were technically free to make their own decision, but being technically free doesn’t mean they were actually free.
To be sure, Big Tech should not be let off the hook. They are part of this problem. But the blame for social media censorship lies first and foremost with the government, not the tech companies.
It’s not Big Tech that we need to rein in. It’s Big Government.

Patrick Carroll has a degree in Chemical Engineering from the University of Waterloo and is an Editorial Fellow at the Foundation for Economic Education.
This work is licensed under a Creative Commons Attribution 4.0 International License, except for material where copyright is reserved by a party other than FEE.
Please do not edit the piece; ensure that you attribute the author and mention that this article was originally published on FEE.org.
