The Atlantic

How Facebook Works for Trump

Donald Trump won the presidency by using the social network’s advertising machinery in exactly the way the company wanted. He’s poised to do it again.

Updated on April 18 at 2:00 p.m.

Look at a thousand of the millions of Facebook ads Donald Trump has run, and it’s hard to believe that they represent a winning strategy. They recycle the same imagery and themes, over and over: Trump, photoshopped in front of a flag, points a finger. Trump claps before an audience. Trump gives a thumbs-up, or smiles at a microphone. Each image is washed in patriotic red or blue. The text almost always issues a call to action: Buy this hat, sign this petition, RSVP to this rally.

They are notable only in their banality, and in their sheer volume. During the 2016 election cycle, Trump’s team ran 5.9 million ads on Facebook, spending $44 million from June to November alone. Hillary Clinton’s campaign ran only 66,000. In 2020, Democrats are still buying fewer ads: According to the Facebook ad archive, only Michael Bloomberg approached the ad volume of the Trump campaign, running more than 50,000 ads in February of this year, his last month in the race. During that time, Bernie Sanders bought only 8,400, Elizabeth Warren and Joe Biden even fewer. Everyone is using Facebook, but Trump is doing something different and, by most accounts, better.

At the start of the year, Andrew “Boz” Bosworth, who led Facebook’s ad team during the 2016 election, wrote that Trump “ran the single best digital ad campaign I’ve ever seen from any advertiser.” Trump’s team agrees, of course.

But that might not mean what you think it does. Trump didn’t master Facebook because of foreign interference by Russia or psychographic exploitation via Cambridge Analytica. Nor, despite conventional understanding, did he do it via microtargeting—the ability to send highly differentiated audiences just the right messages to change attitudes or inspire action. His campaign mastered the platform through pure, blunt constancy, using Facebook in exactly the way the tech giant intended: pouring heaps of money and data into Facebook’s automated advertising system.

Trump’s 2020 digital director, Gary Coby, compared the strategy to high-frequency financial trading: Facebook has built an algorithmic ad-buying system with a mercenary drive toward results, and Trump’s campaign exploits it tirelessly. In the artificial-intelligence field, this system is the opposite of self-driving cars or robots or virtual assistants: a deeply boring, basically invisible application of machine learning that is dramatically reshaping our lives, not someday but right now.

Facebook wired a machine into electoral decision making. Political campaigns have ceased to communicate with voters and have begun to communicate with AI instead. Facebook’s artificial intelligence for delivering advertising is already a crucial component of a winning 2020 campaign—perhaps the crucial component. And it works in a tangled, outlandish way that no human, not even at Facebook, can ever fully understand.


Perhaps calling Facebook ads “advertising” in the first place is misleading. The pictures and text that appear on its website do indeed conform to the traditional meaning of that term. But the plumbing that selects and delivers those ads is wholly unlike what came before Facebook.

In the old days, advertisers bought guaranteed placements in print publications, on outdoor displays, or in media broadcasts. They would select these placements, in part, based on the audiences those media might reach: a glossy magazine for women interested in fashion, or a billboard that thousands of downtown commuters pass daily. At the dawn of the internet, advertisers did the same thing on websites: If a business wanted to get in front of a particular audience, it could buy space adjacent to the content that brought in that audience. Eventually, it could also buy space atop search results, bidding for placement based on terms typed into Google.

Facebook upended that. A “Facebook ad” is less an ad and more a machine for producing ads. Instead of paying to put particular media in front of a specific audience, an advertiser now pays Facebook to deliver a selected outcome from a certain stripe of people. For example, a clothing manufacturer might pay Facebook for webpage visits from women in their 30s who live in Los Angeles, or for likes by parents with college degrees whose online behavior is similar to that of users who had previously made purchases. How those ads get to which matching users is up to Facebook. Given some starting information, its system learns how to tune the delivery of the ad, in relation to all the other advertisers out there. In short, Facebook chooses which ads will be shown to whom at what price.
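To make the shift concrete, here is a minimal sketch, in code, of what an advertiser hands over under each model. The field names and values are ours, invented for illustration; this is not Facebook’s real ad-buying interface. The contrast is the point: the old buy names a slot, the new one names an outcome, a kind of person, and a budget, and leaves the rest to the machine.

```python
# A hypothetical sketch of the two buying models described above.
# Field names are illustrative, not Facebook's actual Marketing API.
from dataclasses import dataclass, field


@dataclass
class PlacementBuy:
    """Old model: pay for a guaranteed slot next to known content."""
    publication: str
    slot: str
    price: float


@dataclass
class OutcomeBuy:
    """Facebook's model: pay for a result from a kind of person."""
    objective: str                                 # e.g. "link_clicks", "page_likes"
    audience: dict = field(default_factory=dict)   # who to optimize toward
    daily_budget: float = 0.0                      # Facebook decides placement and price


magazine_ad = PlacementBuy("glossy fashion monthly", "inside front cover", 48_000.0)

facebook_ad = OutcomeBuy(
    objective="link_clicks",
    audience={"gender": "women", "age_range": (30, 39), "location": "Los Angeles"},
    daily_budget=500.0,
)
```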

This utterly changes what it means to create and deploy advertising. Today’s advertisers simply assemble the raw information—data about actual and potential customers—that lets Facebook do the work on their behalf. When those ads successfully push users to take action, those actions generate ever more data, which in turn get funneled right back into Facebook … to help target even more ads.

In other words: Advertising previously involved identifying a market for products and services and then placing media to address that market. Now it means training a tech behemoth’s artificial intelligence to assemble the right audience from the scattered fragments of potentially similar ones.

Most political advertising on Facebook begins with lists of targetable users, which Facebook calls “custom audiences.” A custom audience can be created from data an advertiser has collected or acquired outside Facebook—lists of phone numbers or email addresses, for example. A custom audience can also be made of anonymized Facebook IDs generated when users take an action on a webpage marked with a Facebook pixel, a way of tracking online activity back to its source. Facebook offers tracking for 17 standard actions—such as donating to a cause and viewing a video—or marketers can create their own. We could have placed a pixel here at this spot in the page, and then we’d have you in a custom audience, re-targetable whenever we were looking for readers who actually read long-form political-technology articles.
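For readers who want to see the mechanics, here is a rough sketch of how an advertiser might prepare a contact list for a custom-audience upload. The sample rows are invented; the normalization-and-hashing step reflects the commonly documented approach (contact fields are hashed, typically with SHA-256, before leaving the advertiser’s hands), and the upload call itself is omitted.

```python
# Rough sketch: preparing a customer list for a "custom audience" upload.
# Emails and phone numbers are normalized and hashed so raw contact data
# never travels in the clear. Sample data is invented; no upload is made.
import hashlib


def normalize_email(email: str) -> str:
    return email.strip().lower()


def normalize_phone(phone: str) -> str:
    # Keep digits only, e.g. "(414) 555-0123" -> "4145550123"
    return "".join(ch for ch in phone if ch.isdigit())


def hashed(value: str) -> str:
    return hashlib.sha256(value.encode("utf-8")).hexdigest()


supporters = [
    {"email": " Donor.One@Example.com ", "phone": "(414) 555-0123"},
    {"email": "rally.rsvp@example.org", "phone": "+1 414 555 0188"},
]

upload_rows = [
    {"email_hash": hashed(normalize_email(p["email"])),
     "phone_hash": hashed(normalize_phone(p["phone"]))}
    for p in supporters
]

for row in upload_rows:
    print(row)
```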

After advertisers have identified their audience, the savvy ones then employ a related tool that builds “look-alikes”—groups of people who Facebook predicts will act similarly to those included in a custom audience. When you find an advertising process that yields a good harvest—more people signing up for a rally, buying a hat, giving up a phone number—the seeds can be saved and propagated, the methods for raising them used over and over. If it’s working right, the system can grow organically—more money means more ads means more money means more ads. If it’s not, the cycle of decline is just as inevitable. This is great news for Facebook, because once you start buying ads there, it becomes hard to stop.
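Facebook does not disclose how look-alikes are actually computed, but the underlying idea can be sketched with a toy model: describe every user as a vector of behavioral features, then rank outsiders by how closely they resemble the seed audience. The sketch below, with made-up data, is a conceptual illustration only.

```python
# Conceptual sketch of a "look-alike" audience, not Facebook's actual method:
# rank candidate users by their similarity to the centroid of a seed audience.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 seed users (the custom audience) and 1,000 candidates,
# each described by 20 anonymous behavioral features.
seed_users = rng.normal(loc=1.0, size=(5, 20))
candidates = rng.normal(loc=0.0, size=(1000, 20))

centroid = seed_users.mean(axis=0)


def cosine_similarity(matrix: np.ndarray, vector: np.ndarray) -> np.ndarray:
    num = matrix @ vector
    den = np.linalg.norm(matrix, axis=1) * np.linalg.norm(vector)
    return num / den


scores = cosine_similarity(candidates, centroid)
lookalike = np.argsort(scores)[::-1][:100]   # the 100 most seed-like candidates
print(lookalike[:10])
```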


Trump spends tens of millions of dollars on Facebook marketing. But his boring, forgettable ads work so well because his campaign has been willing to cede control to Facebook’s ad-buying machinery, and then to cultivate the results into even more Facebook ads.

Facebook wants users to spend as much time as possible on its site. To incentivize that, the company makes more engaging ads less expensive to run. So to lower their costs, marketers can either make ads that perform better or find audiences that respond well to the ads they’ve already created. Trump’s team has bet on the latter.
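Facebook has described its auction, in broad strokes, as ranking candidate ads by a “total value” that combines the advertiser’s bid with the estimated likelihood of the desired action and a quality signal. The numbers and weights below are invented, but the toy example shows why a more engaging ad can win an impression while bidding far less, which is exactly the discount the Trump campaign chases.

```python
# Simplified illustration of why engaging ads cost less. Facebook has publicly
# described its auction as ranking ads by a "total value" blending the bid,
# the estimated probability of the desired action, and an ad-quality term.
# All numbers here are invented for illustration.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    bid: float                 # what the advertiser will pay per action
    est_action_rate: float     # predicted chance this user takes the action
    quality: float             # engagement / relevance signal, 0..1

    def total_value(self) -> float:
        return self.bid * self.est_action_rate + self.quality


dull_ad = Candidate("generic banner", bid=5.00, est_action_rate=0.01, quality=0.2)
engaging_ad = Candidate("rally RSVP", bid=2.00, est_action_rate=0.08, quality=0.6)

winner = max([dull_ad, engaging_ad], key=Candidate.total_value)
print(winner.name, round(winner.total_value(), 3))
# The engaging ad wins the impression despite bidding far less.
```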

To illustrate the process in practice, we examined the set of ads the Trump campaign ran on Facebook to promote a single rally on January 14, 2020, in Milwaukee, Wisconsin. The state helped decide the 2016 election, and it remains a 2020 battleground. The ads we reviewed ran during the week before the rally took place. (We are grateful to Northeastern University research scientist Piotr Sapieżyński for supplying the initial data set.)

All of the Trump Milwaukee-rally ads—almost 1,800 of them—in the order they ran on Facebook. The ads are so similar that they are often hard to distinguish from one another. (Facebook / Donald J. Trump for President)

For this campaign, Trump’s team bought a lot of ads—1,780—but they looked almost identical to one another. Remember, those 1,780 “individual ads” are just outputs of the machine. Each ad used one of four versions of ad copy, matched to one of six pictures of Trump, composed and cropped in three different ways to fit different ad slots across Facebook and Instagram. As you can see from the ads above, the results are unique but hardly distinct.
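The arithmetic behind those 1,780 ads is straightforward combinatorics: four pieces of copy, six photos, and three crops yield only 72 creative variants; the rest of the volume comes from crossing them against many audience definitions. The sketch below assumes roughly 25 audience segments, a guess on our part since the campaign’s actual lists aren’t public, to show how quickly the numbers multiply.

```python
# The combinatorics behind ~1,800 near-identical ads: a few creative pieces
# crossed with many audience definitions. The 25 audience segments are a
# placeholder assumption; the campaign's real lists aren't public.
from itertools import product

ad_copy = [f"copy_{i}" for i in range(1, 5)]         # four versions of text
images = [f"trump_photo_{i}" for i in range(1, 7)]   # six photos
crops = ["feed", "story", "right_column"]            # three crops/placements
audiences = [f"segment_{i}" for i in range(1, 26)]   # ~25 custom/look-alike lists (assumed)

ads = [
    {"copy": c, "image": img, "crop": crop, "audience": aud}
    for c, img, crop, aud in product(ad_copy, images, crops, audiences)
]

print(len(ads))  # 1,800 distinct ad buys from only 13 creative ingredients
```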

This was not microtargeting: Only 3.6 percent of the Milwaukee-rally ads targeted a narrow demographic group. The campaign also spent very little money on any one placement, suggesting that it was not conducting trial-and-error tests for specific audiences. Almost all the ad buys—98 percent of them—cost less than $99 each, perhaps far less. The largest was only $599. (By comparison, Trump’s two Super Bowl spots cost about $11 million.)

The Milwaukee-rally ads were mostly very low cost, under $100, and relatively evenly distributed across the gender and age categories that Facebook offers for demographic targeting. These were not microtargeted ads, nor were they small-scale tests from which a few successes were ramped up into much larger spends.

Rob Goldman, the former vice president of ads at Facebook, knows exactly how an ad campaign such as this might have worked. The Trump campaign would likely have had dozens of custom audiences from which to start, he said, such as donor lists and recent website-visitor lists. “Maybe there is a local politician and they’ll have a list from that person.” Or lists of people grouped by interest in specific issues, such as immigration, collected around Milwaukee, Goldman suggested. Those data form the hidden variables that generate so many ad variants. Smaller audiences, such as the ones Goldman postulated, could make good targets, but that is not guaranteed. The only way to really find out is to run ads against them and see what works.


Building target lists has been a campaigning cornerstone for decades. But unlike direct-mail marketing, Facebook offers no guarantee that the people an advertiser targets will see an ad run with that targeting. This isn’t like buying a list of periodontists and mailing out flyers to offices. Instead, each slice of a user’s propensities is sold in a complex auction, conducted billions of times a day, once for every ad impression.

Finessing a Facebook-marketing campaign requires getting the machine to actually show the ads to people. Making very high bids would do the trick, but that would become prohibitively expensive, even given a lot of money to burn.

Remember, though, “an ad” on Facebook is not just an individual image and link. It’s the combination of content and the prospective audiences that advertisers want to push toward an action. For years, Facebook has helped advertisers deliver that objective at the lowest price. Now the company encourages advertisers to hand over the reins entirely, letting Facebook allot spending among ads, audiences, schedules, and budgets. The company even automates the bidding on the advertiser’s behalf. As one Facebook marketer put it, “Facebook’s advertising algorithm has gotten so much better at automating campaign management that it can now easily outperform a human manager.” That’s why the company advises advertisers to “embrace a certain agnosticism towards placement, platforms and yes, even audience. This gives systems more opportunities to consider when assessing which will deliver the best performance.”
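Mechanically, “handing over the reins” means letting the platform continuously shift money toward whatever is converting right now. The sketch below is a generic bandit-style allocator, not Facebook’s proprietary system, but it captures the shape of the behavior: spend follows observed results, with a little left over for exploration.

```python
# A generic sketch of automated budget allocation (not Facebook's proprietary
# system): money flows toward whichever ad sets are currently converting,
# using Thompson sampling over each ad set's observed successes and failures.
import random

ad_sets = {
    "donor_lookalike": {"conversions": 40, "impressions": 1000},
    "rally_rsvp_list": {"conversions": 12, "impressions": 1000},
    "broad_wisconsin": {"conversions": 3, "impressions": 1000},
}


def allocate(daily_budget: float, draws: int = 10_000) -> dict:
    wins = {name: 0 for name in ad_sets}
    for _ in range(draws):
        # Sample a plausible conversion rate for each ad set, pick the best.
        samples = {
            name: random.betavariate(s["conversions"] + 1,
                                     s["impressions"] - s["conversions"] + 1)
            for name, s in ad_sets.items()
        }
        wins[max(samples, key=samples.get)] += 1
    return {name: daily_budget * w / draws for name, w in wins.items()}


print(allocate(daily_budget=1000.0))
# Nearly all of the budget lands on the best-performing ad set,
# but weaker ones keep getting small exploratory spend.
```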

To turn the crank that starts that machine, according to Goldman, marketers must build their own knowledge of the users they have or want to reach. “Hand that to Facebook in the form of a custom audience or a look-alike against a custom audience,” Goldman said, “and then you’ve given Facebook a hint about where to learn.”

Learn in this case means “learn by machine.” Initially, the Trump campaign allocated very little money across a small number of ad variants. Then Facebook’s machine took over, trying to maximize the “value” of advertising amid posts from friends (what Facebook calls “organic content”). Facebook has to sift through billions of bits of organic content and trillions of possible marketing messages in order to render your newsfeed on the fly. To do so, Facebook looks at everything on its platform in the same way—ads are no different from kid photos or gripes about work.

As with all machine-learning systems, outcomes can be difficult to trace back to causes. When Facebook shows you a particular ad, it does so based on a shadowy projection of you, formed from actions you have (or haven’t) taken on content that Facebook tracks. The things you click on compose part of that signal. The way that others with similar preferences had engaged with similar ads does too.

Goldman said that in many campaigns, a few ads become the runaway winners. In the case of the Milwaukee rally, the system did find a few winners—the 35 or so ads that the algorithm put more money behind. Using the data Facebook makes available, we can’t say why those ads worked and others did not. The truth is, Facebook probably can’t say either. Neither can the Trump campaign. That’s why it has to continue to run so many ads.

On January 8, 2020, at 12:23 p.m. EST, the Milwaukee-rally ad campaign begins. The ad sets are loaded up in order, walking through all the combinations of creative and basic targets, as the image above shows. This is probably done by a human who created the campaigns and issued the initial ad buys.

Eight hours later, at 8:22 p.m. EST, Facebook has collected some data about the performance of these ads. The algorithm takes over, buying the placements thumbnailed below within the next 45 minutes. In comparison to the first set, the buys seem haphazard, almost random, driven by the unknowable rationales of Facebook’s AI.

If this sounds murky and evanescent, that’s because it is. Campaigners have to teach a machine they can’t fully understand (because no one can) how to find their tribe. Then that machine finds people who seem similar, in ways nobody can identify, based on factors that no one knows. Then it interprets bids and places ads based on even more ultimately untraceable factors.

Facebook now bases its predictions on 2 million distinct “features,” Goldman said. These might be the last place a person seeing an ad ate a hamburger, or the minute an ad auction was launched, or the percentage of battery life left on someone’s phone. If it exists anywhere within the Facebook data-collecting universe, the machine-learning models have tried to account for it. “Everything you could imagine is a feature,” he concluded. And, clearly, a whole lot of things you couldn’t.
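One standard way to cope with millions of ragged, human-unreadable signals is the “hashing trick”: every feature, however odd, is mashed into an index in one enormous sparse vector that a model can weigh. The sketch below illustrates that general technique with invented signals; it is not Facebook’s actual pipeline.

```python
# The "hashing trick," a standard way to fold millions of arbitrary signals
# (battery level, minute of the auction, last hamburger venue...) into one
# fixed-size sparse feature vector. A general illustration, not Facebook's
# actual pipeline.
import hashlib

NUM_BUCKETS = 2_000_000  # on the order of the feature count quoted above


def feature_index(name: str, value: str) -> int:
    digest = hashlib.md5(f"{name}={value}".encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS


raw_signals = {
    "phone_battery_pct": "17",
    "auction_minute": "14:03",
    "last_burger_checkin": "Milwaukee_WI",
    "clicked_maga_hat_ad": "true",
}

sparse_vector = {feature_index(k, v): 1.0 for k, v in raw_signals.items()}
print(sparse_vector)
# A model then learns a weight per bucket; no human ever needs to know,
# or could say, what any individual bucket "means."
```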

Do the predictions make a good model of a person’s actual inner desires? Do the ads “work”? It doesn’t matter. Facebook’s ad software doesn’t try to get someone to buy a product or vote for a candidate. It merely tries to produce the results that advertisers declare they want, by serving ads to users similar to the ones who furnished those results on earlier, similar ads. Each action a user takes or doesn’t take—clicking, liking, sharing, commenting, donating, hovering, buying, filling out a form—slightly changes the complex network of predictions that form Facebook’s picture of a person, which is to say, a consumer. From Facebook’s perspective, the big political advertisers are just sources of more data for the company to incorporate and optimize—not contenders competing to lead the free world.


The brave new world of machine-learning-automated ad buying is throwing human marketers for a loop. The algorithm provides some feedback in the form of campaign outcomes—site visits, sign-ups, and the like—but no one really knows which inputs will get ads shown at a fairly consistent price and volume. Trying the same ad tomorrow might produce different results than it did today.

Over time, Facebook has exerted more and more control over what ads get shown, to whom, and for how much money. Paradoxically, that means marketers have to constantly monitor what Facebook is doing, lest they end up spending too much or too little. On an episode of the Facebook-marketing podcast Perpetual Traffic, the Facebook-media buyer Nehal Kazim, CEO of AdPros, reflected on how it felt to let Facebook take over a greater degree of decision making. “It feels like day trading. It feels like gambling,” he said. “What am I doing?”

What he’s doing is learning to communicate with an incredibly complex artificial intelligence, so that it will deliver ads that produce the results he thinks he desires. No wonder it’s confusing! Picture those scenes from the movie Arrival, where the octopus-like aliens convey some signs and the humans have to figure out how to write back. Facebook marketers use brute force to make their way through this process, hoping to send the right messages for those few hours, and then the next, and then the next.

None of this is magic. Democrats did well in the 2018 midterms, a fact that’s not lost on someone like Charlie Rybak, who ran millions of dollars’ worth of Facebook ads for the Democratic PAC Priorities USA in 2018. “The Trump campaign isn’t doing anything special on Facebook, just like they didn’t do anything special in 2016,” Rybak said, taking a jab at Brad Parscale, Trump’s 2016 digital director and his current campaign manager. “Parscale sold them on the idea that he has a secret formula, but he’s really just launching thousands of ads and spending millions of dollars, which would work for anyone that had the resources to do it.”

According to one source close to the 2016 Trump campaign, its advantage came from a surprising inspiration: the mobile-gaming company Machine Zone. Machine Zone was an early poster child for high-volume Facebook advertising. At one point, the company claimed it was “probably the world’s largest direct-response marketer,” thanks to its ubiquitous ads for games such as Mobile Strike and Game of War. Machine Zone had even built its own tools for this kind of audience building and Facebook-ad buying before Facebook and other service providers offered ready-made ways to do it.

Machine Zone does not appear to have worked directly on the Trump campaign (the company didn’t respond to our requests for comment), but according to a former Machine Zone employee, its methods inspired Trump’s digital team to adopt that kind of advertising strategy. Before this, Parscale’s firm was a mid-tier online-ad agency, creating websites for local gastroenterology offices and the like. Machine Zone’s co-founder and CEO, Gabriel Leydon, was friendly with Jared Kushner, a connection that might have helped Parscale make the transition from prosaic web marketer to presidential promoter.

From there, the Trump team created a small-dollar fundraising machine on Facebook. It didn’t take technical genius as much as help from people like Coby at the Republican National Committee, James Barnes, a Facebook employee assigned to work on the Trump campaign, and their worker bees at ad-buying firms. Together, they have made modern elections fully cybernetic, a complex and dynamic interweaving of human and machine that’s utterly unlike what came before it.


As the 2020 general election approaches, Trump has a new form of incumbent advantage. People have marveled that Trump never stopped running Facebook-ad campaigns. And the reason is, he couldn’t. The whole point is that the campaign has to keep fresh data flowing through the system. Most of the time, it can optimize for cost per acquisition, hoovering up money and data from the Facebook users it targets. Then, at strategic moments, the team reverses the machine, spending whatever money is required to get the highest penetration and the widest reach among its people.

The COVID-19 pandemic has entrenched Trump’s upper hand on Facebook, but it has also created new opportunities for his Democratic competitors. By mid-March, as the novel coronavirus began to shut down America, the cost of Facebook ad placements had plummeted by as much as 40 percent, according to Kazim, the Facebook-marketing-firm founder. That’s because—although more people have been spending more time on Facebook, looking for information and comfort—more advertisers have pulled back their spend as a result of economic calamity. It’s thrown a wrench into Facebook’s advertising machinery, offering an advantage to advertisers who are paying attention.

The Trump campaign made the most of the opportunity by buying thousands of now-cheaper ads—for Trump/Pence Keep America Great dog collars, for American Worker caps and yard signs, for renewed calls to donate. Those might seem like unlikely appeals when a terrifying virus is ravaging the nation’s citizens and its economy, but Kazim thinks they make perfect sense: The smart advertiser, he said, would double down on messaging that produces engagement among people already predisposed to respond well to it. Even if those users don’t buy many dog collars, the ads will drive the users to Trump websites, where Facebook pixels will slurp up revised data that will help refine the Trump campaign’s advertising even more.

These same conditions have also leveled the playing field for Biden, Sanders, and others. But so far, the Democrats haven’t taken advantage of the opportunity. The Biden campaign was still running 96 percent fewer ads than Trump in the week after St. Patrick’s Day. Sanders bought a single ad that week: a lament about Biden’s super PAC out-fundraising him.

This spring, the coronavirus changed almost every aspect of American life. But Facebook marketing has still persisted much as before, at least for the advertisers prepared to use it effectively. The endless hustle of Facebook marketing means that the organizations already playing the game are best positioned to alter course. By the time we spoke with Kazim, he had already published multiple videos for his clients about managing Facebook-ad campaigns in the age of COVID-19, outlining a 13-point strategy for adjusting to the new conditions. He trusts Facebook’s ad-buying algorithms as much as before, provided advertisers adjust for changes in the platform’s behavior. “If your messaging resonates, and you’re being fluid,” he said, “then you can’t ask for a better situation.”

Even amid a historic pandemic, this election will be a lot like the last, at least as far as Facebook’s ad machinery is concerned. What might be different is that to get the best results, each campaign will have to give up more control to the platform itself.

That takes resources, but it also requires the somewhat cynical will to carry it out adeptly—candidates for whom the sensation of day-trading democracy might feel exhilarating rather than estranging. Bloomberg, whose short-lived campaign spent $45 million on Facebook, made his fortune selling computer software for exactly that purpose, to exactly those people, and perhaps that made his campaign predisposed to embrace algorithmic advertising.

* * *

Trump’s 2016 campaign never really ended, and churning many generations of data in and out of Facebook in the meantime has probably given him an advantage. Whether it actually has is hard to say, because there is no way to see how data flow back and forth between political campaigns and Facebook.

That’s a problem. Facebook prides itself on the supposed transparency offered by its political-ad archive, but that resource provides insufficient detail. If policy makers or election officials wanted to throw a real wrench into political advertising on Facebook, they could ban the use of custom and look-alike audiences for campaigns. That said, given how Facebook-ad delivery works, eventually the “algorithm” will decide who sees the ads anyway.

The Cambridge Analytica scandal was comforting by contrast: There were villains, data were taken, and psychographic prediction sounded scary. But people more familiar with Facebook’s advertising system scoffed at the uproar. Unlike psychographic traits—a fondness for Björk supposedly signals open-mindedness, say—few, if any, of the real factors that Facebook uses are labeled in a way humans can understand.

As Facebook’s profits grow, the machine-learning ad-delivery system keeps expanding—quietly, perpetually. Occasionally, glimpses of its massive scope surface from the murk. Earlier this year, Facebook announced the launch of its “Off-Facebook Activity” management tool, which purports to allow users to view and control how organizations such as the Trump campaign can use information collected outside of Facebook—from purchases and voter registrations, for example—to retarget them. Those data can be harrowing to see, as The Washington Post’s Geoffrey Fowler learned: “My Post colleagues found that Facebook knew about a visit to a sperm-measurement service, log-ins to medical insurance and even the website to register for the Equifax breach settlement.”


New controls allow users to opt out of allowing Facebook to use off-Facebook activity to target them, but doing so doesn’t stop organizations such as the Trump campaign from sending those data to Facebook, nor does it prevent Facebook from using the data it receives to train and operate its machine-learning ad-targeting systems, which subsequently improve and become ever more incomprehensible to humans.

Facebook marketing isn’t an advertising strategy so much as a new way of life conducted amid an unseen alien intelligence. Bewildered but willing, advertisers such as Kazim are embracing their alien impresario—partly because they have no other choice: “I believe letting Facebook do its thing is the future,” he said. They are not alone. Users—who are also citizens—similarly have no way out. Letting Facebook do its thing has become a requirement for electoral politics, and democracy’s future is entwined with the results.

Ian Bogost is a contributing writer at The Atlantic.
Alexis Madrigal is a contributing writer at The Atlantic and the host of KQED’s Forum.