Social media has long touted itself as a great equalizer that enables anyone with an Internet connection to reach a limitless number of people across the globe.
For the past several years, that promise has been merely aspirational. The algorithms that dominate social media platforms directly control the content we see and how we see it, in ways that are far from equitable or transparent.
After the 2016 election, it became clear that Facebook’s algorithm had a quirk that allowed polarizing content and misinformation to spread quickly by virtue of their high engagement levels.
Despite the company’s guise of post-election soul-searching and its supposed efforts to improve the situation, Facebook’s actions show the company is committed to serving up manipulative content that drowns out independent thought.
Writer Zadie Smith touched on this and her arm’s-length relationship to technology on a recent episode of the NPR show On Point.
“I can’t think in an algorithm. I don’t feel free, and I’m very addicted to freedom,” she said. “So, I try to work outside of it as much as possible.”
Facebook’s strong-arm influence over how its users think and feel is bad enough, but many of the company’s frequent algorithm tweaks have lowered the prevalence of news in people’s feeds. In that way, Facebook aids in the suppression of journalism that would, in theory, provide its users with the information they need to form their own conclusions about the world.
Yesterday, independent journalist Judd Legum wrote on Twitter about what this looks like if you’re a publisher.
Last week, Legum posted a link to his article about Facebook’s recent moves to give conservatives and the far right tremendous advantages on the platform – things like permitting falsehoods in political ads and allowing The Daily Caller to join Facebook’s befuddling fact-checking program.
According to Legum, Facebook showed that post to just 632 of his nearly 11,000 followers. That means that of the thousands of people who signed up to receive updates from Legum, fewer than 6 percent actually saw the post in their feeds.
Legum acknowledged he has a bigger following on Twitter than Facebook, but he said the same link spread to more than 800,000 people on Twitter.
The bottom line, as Legum writes, is that Facebook “is (objectively) not a free speech platform,” despite CEO Mark Zuckerberg’s regular claims to the contrary.
Legum’s example shows how Facebook arbitrarily limits the reach of content and impacts publishers’ ability to maintain and grow their audiences on the platform. These are things that I’ve thought about a lot this year while watching Big If True’s traffic from social media plunge to previously inconceivable depths.
Last year, social media made up 60 percent of the traffic for Big If True. According to our most recent analytics, that number has fallen to 19 percent.
Big If True has far fewer Facebook followers than Legum, but I can’t tell you the number of times this year that I have watched our posts reach a microscopic fraction of our audience before the platform cuts us off.
The advice I’ve received from industry experts – and turned up in my own research – on ways to address this has run the gamut.
I’ve heard we could improve our standing in the algorithm by posting original photos and video, which Big If True doesn’t have the resources to create. I’ve heard that Facebook may be dinging us because our content simply isn’t relevant enough to our followers, even though it used to be. Some journalism organizations have made up the difference and then some with newsletters, while others say at least some of their readers get what they need from the email product and are then less likely to visit the site itself. And then there is the reality check: Journalists over-relied on social media in the first place, and traffic from Facebook is as good as gone, permanently.
Should Big If True invest more time in crafting posts that are somehow more engaging? Probably, but this has the look and feel of a lost cause.
I’m sharing this because I want fellow journalists (and ideally, Facebook itself) to know that this dynamic creates formidable barriers for anyone attempting to start a news site without tremendous resources going in. People have historically been exposed to print publications through the physical world – newsstands and bookstores and clippings shared with friends and family. Social media is one of the main places people live online, but it isn’t a substitute for things like newsstands, which no one can magically will only 6 percent of passersby to see.
So, this much is clear. Facebook is choosing what we see – the ideas we’re exposed to, the people and organizations we can engage with and learn from. Facebook is also choosing to show us content that’s divisive and manipulative, and it’s barring us from counterbalancing that with facts within the platform itself.
In a too-perfect example of how Facebook handles journalism, earlier this month, the platform flagged posts from Alaska news outlet KTOO as clickbait. As punishment, Facebook lowered KTOO’s reach on the social network.
The only hitch was that KTOO doesn’t share clickbait. Two posts that Facebook flagged were about a local election and an old mining town, and both KTOO’s headlines and its post content accurately represented the stories. This bizarre decision left me wondering what, if anything, I could do if something similar happened to Big If True.
Yet, entities that repeatedly post propaganda continue to function and thrive on Facebook. Despite persistent complaints that big tech is targeting conservatives with account suspensions and limited reach, traffic at some conservative sites has actually gone up this year. How could they do that without some buy-in from social media platforms?
That implicit buy-in – a silent endorsement, really – happens whenever Facebook allows doctored videos of politicians to stay online, and whenever social media companies let despicably untrue claims gain traction so that as many people as possible can see them.
Smith wasn’t the first to make this point, but in her On Point interview, she emphasized that none of this was inevitable. Companies like Facebook made deliberate choices that turned social media into something a lot less fun and a lot more toxic. They’ve had years to consider the fallout of those decisions and how to reduce the harm they’ve created, and they’ve had years to recalibrate and turn their black holes of Internet space into something even slightly less terrible.
Quitting these platforms or complaining about them isn’t enough to get them to change. The proof is in the pudding: They simply don’t give a shit.
The solution is to give other companies a shot at designing social media platforms that suck less – ideally, to put it mildly, companies that aren’t required to follow China’s censorship policies.
There’s a market for it. The internet’s evolution to what it is today wasn’t written in the stars. And it doesn’t have to stay this way.
Contact Big If True founder Mollie Bryant at 405-990-0988 or bryant@bigiftrue.org. Follow her on Facebook and Twitter.
We’re nonpartisan and nonprofit. Support Big If True.