Your Facebook News Feed Sucks Because That’s How Advertisers Like It
At a time when Americans’ trust in the news media is plummeting to record lows, one recent Pew poll reported that about a third of U.S. adults say they regularly get their news fix from Facebook’s feeds—even if their trust in Facebook itself is plummeting to new lows for, oh, whatever reason. Needless to say, the News Feed is a hugely consequential space that carries enormous sway in the lives of the countless people scrolling through it every day.
While we’re still unearthing even more damning evidence detailing the piss-poor job that the company’s done handling its outsized societal role, there’s one question underlying every one of these pieces that nobody seems able to answer: How the hell do tools like News Feed even work? Is the shadowy algorithm that can bump the most polarizing content to the top of your screen doing so just to mess with us? Is there something more sinister at play here? What gives?
The company’s most vocal critics would probably answer yes to any number of those questions—and they might be right! But a new theory suggests that the driving force behind our feeds might be hiding in plain sight.
Like the other hugely consequential pieces of Facebook’s ecosystem, the News Feed is dictated by a series of algorithms, arcane ranking systems that decide which users get cute kittens at the top of their feed, which ones get potentially dubious medical information, and which ones get yet another wedding announcement from yet another high school pal. Writing about the News Feed algorithm in 2016, the Atlantic compared its influence over the media to that of a traditional news mogul like Rupert Murdoch. But while Murdoch’s agenda is pretty easy to spot, an algorithm is, well, an algorithm—save for a few vaguely worded blog posts over the years, the company keeps a pretty tight lid on the system’s specifics.
If you ask critics—including a mounting number of ex-Facebook employees—about the algorithm’s agenda, however, one word keeps bubbling up again and again: engagement. As Facebook whistleblower Frances Haugen succinctly put it in a recent 60 Minutes interview, the company is constantly “optimizing for content that gets engagement, or reaction,” out of its users. Usually, this criticism of engagement is linked to content that’s polarizing, hateful, and divisive.
But what if the way Facebook designs your personal News Feed isn’t solely dictated by what you find “engaging”? What if your end behavior—your clicks, your views, your angry reposts—was being shaped by the advertisers that pay Facebook to squeeze that behavior out of you? It’s a pretty out-there idea that’s the subject of a recent paper delving into what’s known as “mechanism design,” a Nobel Prize-winning branch of economic theory that appears to underlie some of the most basic choices platforms like Facebook (and Google Search, and Uber, and others) make when designing their products. In this case, the researchers propose that Facebook’s News Feed is chiefly designed to get users to do one thing: click on and/or react to the ads that they see. The whole “promoting extremist content” thing is just an unfortunate side effect, or so the theory goes.
“When they optimize outrage and engagement, those sorts of things … I think they’re all stopping points on the way to optimizing revenue, which is what they care about,” Lee McGuigan, one of the paper’s co-authors, told me over the phone last week.
“Facebook’s in the business of generating evidence of behavior for its advertisers first and foremost,” he went on. “Sometimes they call that behavior ‘attention,’ or they call it ‘website visits’ or ‘clicks’ or whatever.” It just so happens that extremist, polarizing content is surprisingly good at dredging up that behavioral evidence because, as Haugen and others have pointed out, it’s more engaging. It’s certainly more engaging than some milquetoast, even-handed post at the top of your Facebook feed. And at the end of the day, McGuigan said, “an equation doesn’t care about specifics. Engagement looks like engagement, and even if it’s not exactly the kind of engagement advertisers are looking for, it’s just a byproduct of the business.”
McGuigan is quick to note that he doesn’t have any firsthand knowledge of how Facebook’s algorithms are constructed. (We reached out to Facebook for comment on the study but didn’t hear back.) What he does have is an hour-long talk given by one of Facebook’s economics researchers, Eric Sodomka, back in 2015, along with a stack of independent studies published in the years since that dig into how the company’s ad auctions operate.
As the paper points out, these systems aren’t unique to Facebook—sprawling, anonymous digital auctions are the typical way that digital ad space on a site gets sold off to advertisers, and have been for more than a decade. As you’re reading this right now, there are marketers bidding for slots on Twitter, TikTok, LinkedIn, and any other major platform you can think of.
What’s been unusual about Facebook’s auctions—at least historically—is that they take nearby organic content (like users’ posts and videos) into account before deciding which ad will win the auction and appear in a person’s timeline. Compare that to the auctions that dictate, say, which Google Search ads appear at the top of your screen; rather than worrying about whether the search results themselves might impact an ad’s performance, those auctions really only care about which other Google Search ads an ad appears alongside.
If Facebook predicts that a given ad (for, say, hemorrhoid cream) might not play nicely with whatever organic content is in your line of sight (let’s just say that content came from your parents), that ad might lose the auction, or get placed in a different spot on the page, next to user content that’s a better fit for it.
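To make that concrete, here’s a crude back-of-the-napkin sketch, in Python, of how a context-aware auction like the one Sodomka described might pick a winner. To be clear, none of this is Facebook’s actual code: the ad names, the numbers, and the simple “context fit” multiplier are all made-up assumptions meant to illustrate the idea.

```python
# Hypothetical sketch of a context-aware ad auction. Every name and
# number below is invented for illustration; this is not Facebook's code.
from dataclasses import dataclass

@dataclass
class AdCandidate:
    name: str
    bid: float              # what the advertiser pays per "event" (e.g., a click)
    base_event_rate: float  # predicted chance of that event in a neutral context

def context_fit(ad: AdCandidate, surrounding_posts: list) -> float:
    """Crude stand-in for a model scoring how well an ad sits next to
    nearby organic content (1.0 = neutral, lower = awkward pairing)."""
    # e.g., hemorrhoid cream next to your parents' posts scores poorly
    awkward = {"hemorrhoid-cream": ["post-from-mom", "post-from-dad"]}
    penalties = sum(post in awkward.get(ad.name, []) for post in surrounding_posts)
    return max(0.1, 1.0 - 0.4 * penalties)

def expected_value(ad: AdCandidate, surrounding_posts: list) -> float:
    """Expected revenue from giving this ad the slot:
    bid x P(event), adjusted for the content around it."""
    return ad.bid * ad.base_event_rate * context_fit(ad, surrounding_posts)

feed_context = ["post-from-mom", "cute-kitten-video"]
candidates = [
    AdCandidate("hemorrhoid-cream", bid=2.00, base_event_rate=0.03),
    AdCandidate("sneaker-drop", bid=1.50, base_event_rate=0.03),
]

# The "auction": the highest expected value wins the slot.
winner = max(candidates, key=lambda ad: expected_value(ad, feed_context))
print(winner.name)  # sneaker-drop wins here, despite its lower bid
```

The point of the toy example: the hemorrhoid cream ad bids more, but once the model discounts it for sitting next to your mom’s post, the sneaker ad’s expected value comes out ahead.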
The News Feed’s gone through its fair share of changes since Sodomka’s talk was recorded, but at its core, the algorithm that Facebook blurts onto all of our timelines in 2021 is trying to solve the same problem Sodomka described six years ago: how do you decide which posts and ads get shoved in front of each user when they log on, given that there are literally millions (if not billions) of possible combos?
“Facebook’s optimization problem is that they have a certain amount of space in front of you that needs to be filled with content of some kind or another,” McGuigan said. “So then they’re asking: what’s the best way to fill those slots to keep people on the platform and interacting?”
“Ultimately, those interactions are what the company wants since that generates evidence of a specific behavior,” he went on, adding that the more behavior you funnel into your News Feed over time—the more clicks, likes, shares, and so on—the better the data profile the company potentially has on you, and the easier it becomes to fill your News Feed with personalized content—and ads, of course.
Sodomka explained that which ads get plopped where on your feed—and the content that’s served around them—is decided by plugging a slew of factors into a complex mathematical model, the details of which will immediately put you to sleep. Thankfully, he broke down some of the most basic elements about halfway through his talk:
There’s some probability that you will get to this [post on your timeline], and then it will have your attention, and given that it has your attention there’s some probability that you’ll click, or that you’ll do whatever event [to the ad below it].
The “event” Sodomka mentioned might be hitting the like button, clicking a link, saving the advertiser’s post, or tapping the heart button. Or it might be something completely different, depending on the campaign that advertiser’s running. Each event and each advertiser carries its own “cost” that Facebook weighs when deciding where to squeeze an ad into your News Feed. Too many ads might push that cost too high (and make the platform too annoying to use), which means you risk losing users. Shoving a loud, flashy ad right below a particularly emotional post from your aunt might have the same effect. The idea, McGuigan explained, is that you want to minimize those sorts of icky encounters while maximizing the “events” you wring out of the end user, since that’s where all the ad money is.
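Here’s one more hypothetical sketch of that tradeoff: greedily filling feed slots with whichever post or ad scores best, where each additional ad gets dinged by a made-up “annoyance” penalty. Again, the scoring rules, weights, and data here are illustrative guesses, not anything Facebook has published.

```python
# Hypothetical sketch of the slot-filling tradeoff: maximize expected
# ad "events" while keeping the feed tolerable. All weights are invented.

def fill_feed(organic_posts, ads, num_slots, max_ads=2, annoyance_weight=0.5):
    """Greedily fill feed slots; an ad's score is its expected event
    revenue minus a penalty that grows with each ad already shown."""
    feed, ads_shown = [], 0
    posts = list(organic_posts)
    ad_queue = sorted(ads, key=lambda a: a["expected_value"], reverse=True)
    for _ in range(num_slots):
        ad_score = float("-inf")
        if ad_queue and ads_shown < max_ads:
            # Each extra ad annoys users a bit more, so discount accordingly.
            ad_score = ad_queue[0]["expected_value"] - annoyance_weight * ads_shown
        post_score = posts[0]["engagement"] if posts else float("-inf")
        if ad_score > post_score:
            feed.append(ad_queue.pop(0)["name"])
            ads_shown += 1
        elif posts:
            feed.append(posts.pop(0)["name"])
    return feed

posts = [{"name": "aunt-vacation-pics", "engagement": 0.9},
         {"name": "kitten-video", "engagement": 0.8},
         {"name": "wedding-announcement", "engagement": 0.4}]
ads = [{"name": "sneaker-ad", "expected_value": 0.7},
       {"name": "mattress-ad", "expected_value": 0.6}]

print(fill_feed(posts, ads, num_slots=5))
# ['aunt-vacation-pics', 'kitten-video', 'sneaker-ad',
#  'wedding-announcement', 'mattress-ad']
```

Notice the ordering in the toy output: the most engaging posts go first to hook you, the strongest ad slides in once the organic material thins out, and the second ad waits until the annoyance penalty says the feed can bear it.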
“One of the more interesting things about Facebook is that when you buy ad space on its platform, it’s understood that you’re not just buying exposure, but you’re buying a certain outcome,” he said. “And Facebook’s just supposed to deliver that.”
Starting at the end (the ad clicks) and then massaging the surrounding content to make those clicks happen sounds kinda like a topsy-turvy version of economics 101. “Typically, economists are looking at a set of circumstances and trying to predict what the outcome will be,” McGuigan said. “Like, ‘what happens to the labor market if there’s a change in inflation?’ or questions like that.”
But this kind of ass-backward methodology is ultimately what “mechanism design” is: instead of predicting the outcome from a given set of rules, you start with the outcome you want and engineer the rules to produce it. Even though Facebook’s a platform where we constantly have the choice to click on an ad or post an angry comment on our uncle’s latest anti-vaxx post, the company’s entire design is geared towards “getting the outcome that advertisers want to see,” he said.
There’s a good number of tech critics (and at least one documentary filmmaker) who will hear this idea of a platform actively molding users to advertisers’ wills and immediately think: mind control. As cool/horrifying as that would be, what’s happening here is way, way less dramatic; it’s more like a nudge. Facebook’s design choices, McGuigan said, aren’t going to make you click on every ad you see, or spend all your cash on the products the company advertises at you. What they will do is make you a smidge more likely to click on that ad or buy that product, and when that smidge happens over and over, every day, across a platform of Facebook’s size and scale, it’s still going to be profitable.
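For a rough sense of why a smidge matters at that scale, plug in some entirely made-up numbers:

```python
# Purely illustrative figures: a tiny per-impression nudge, multiplied
# across Facebook-scale traffic, still moves real money.
daily_ad_impressions = 10_000_000_000  # hypothetical: 10 billion ad views/day
baseline_click_rate = 0.0090           # 0.90% of views become clicks
nudged_click_rate = 0.0095             # design nudges add 0.05 points
revenue_per_click = 0.50               # hypothetical average, in dollars

extra_clicks = daily_ad_impressions * (nudged_click_rate - baseline_click_rate)
print(f"{extra_clicks:,.0f} extra clicks per day")                 # 5,000,000
print(f"${extra_clicks * revenue_per_click:,.0f} extra per day")   # $2,500,000
```

Whatever the real numbers look like (Facebook doesn’t say), the shape of the math is the same: a fraction of a percentage point, repeated billions of times a day, is a business model.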
If we’re actually serious about holding Zuckerberg and Co. accountable, then we need to be investigating these minute design choices alongside the researchers and regulators currently sounding alarms about Facebook’s more, um, obvious offenses. If we solely focus on those big swings, we risk letting the worst parts of the platform slip through the same cracks that have been right under our noses for roughly the past decade. And honestly, that isn’t an option anymore.