Technically, Facebook can’t ban people’s accounts just for sharing 50-plus false, sensational or clickbaity news articles per day. It doesn’t want to infringe on anyone’s right to share. But there’s nothing stopping it from burying those links low in the News Feed so few people ever see them.
Today Facebook announced an algorithm change that does just that. It identifies links that are aggressively shared by suspected spammers, and deprioritizes them in the News Feed. Facebook’s research shows that these links “tend to include low-quality content such as clickbait, sensationalism, and misinformation,” so showing these links less prominently could improve the quality of what people see on the social network, even though the change doesn’t analyze the specific content behind the links.
If you just like to post a lot on Facebook, this shouldn’t affect you. Pages are free to post as much as they want, as this change only applies to individual user accounts. And Facebook says only Pages that depend on these spammers for traffic will see a drop in their distribution.
That’s because Facebook’s VP of News Feed Adam Mosseri tells me it’s targeting “People who are sharing deliberately and purposefully . . . who purposely share vast quantities of things. Fifty-plus posts a day. Very significant statistical outliers. People having an outsize impact, inundating the News Feed with public content because they have some sort of agenda.”
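The mechanism Mosseri describes is a down-ranking, not a ban: links shared by statistical-outlier accounts (50-plus public posts a day) get a lower feed score, while everyone else is untouched. A minimal sketch of that idea, with hypothetical names and thresholds (Facebook has not published its actual code or demotion factor), might look like:

```python
# Illustrative sketch only -- not Facebook's actual implementation.
# Assumption: each candidate link carries a base ranking score and the
# daily public-post rate of the account that shared it. The 50-posts/day
# cutoff comes from the article; the demotion factor is invented here.

from dataclasses import dataclass

SPAMMER_POSTS_PER_DAY = 50  # "Fifty-plus posts a day. Very significant statistical outliers."


@dataclass
class SharedLink:
    url: str
    sharer_posts_per_day: float  # how often the sharing account posts publicly
    base_score: float            # score from the rest of the ranking pipeline


def adjusted_score(link: SharedLink, demotion_factor: float = 0.1) -> float:
    """Deprioritize (never remove) links shared by suspected spammers."""
    if link.sharer_posts_per_day >= SPAMMER_POSTS_PER_DAY:
        return link.base_score * demotion_factor
    return link.base_score
```

The key design point matches the article: the account keeps its right to share, and the link stays in the feed; only its ranking drops, so few people ever see it.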
Facebook began its ongoing battle with clickbait back in 2014 by demoting links to websites people opened and then immediately bounced back from to Facebook. Since then it’s changed the algorithm to show fewer hoaxes, trained AI to weed out clickbait and spam, tackled fake news with reporting options and fact checkers, demoted links to trashy ad-filled sites and expanded its attack on clickbait to nine more languages.
“We’re trying to do as much as we can to get false news, clickbait and sensationalism off our platform,” Mosseri insists. “We’re a platform that tries to empower people to share. And so these people are in a gray area. They’re spamming but not necessarily violating any specific policies that we have so we think this is the right type of approach.”
If Facebook can purge this kind of content from its feed, people will spend more time reading, be more confident about what they click and Facebook can better fulfill its new mission statement to “bring the world closer together.” While overtly false news gets most of the attention, it’s only a small part of what’s shared. Sensational and biased content that polarizes society is a more subtle but sizable problem. Downranking this content in the News Feed could help liberals and conservatives build stronger bridges across the ideological divide.