As public complaints mounted that Facebook was refusing to police dangerously deceptive U.S. political ads, the company stuck for nearly a year to a hard line CEO Mark Zuckerberg had drawn: “I don’t think it’s right for a private company to censor politicians or the news in a democracy.”
This month has been an about-face: First, Facebook announced that it is banning new political ads in the week before Election Day to prevent last-minute attempts to deceive voters. Then this week the company took it further, saying it will reject ads that claim victory prematurely as worries rise that President Donald Trump might do just that.
The path was neither direct nor swift. Those involved in the discussions over political ads say Facebook officials spent nearly a year wavering between its founder’s declarations on free expression and a desire to avoid becoming a presidential-election villain yet again.
A look into that year of deliberations reveals a company holding back from big public moves while it searched for a solution that would satisfy both its critics and its CEO — until a sense of emergency kicked in.
“It’s insanely close to Election Day for Facebook to make decisions like this,” said Mark Jablonowski, managing director of the Democratic-leaning digital ad firm DSPolitical, who argued that Facebook was after a public relations victory rather than a meaningful prophylactic against election mayhem. Facebook faces a barrage of charges that it is once again allowing misinformation, the inflaming of racial tension, and conspiracy theories to circulate in the run-up to a major American election.
Inside Facebook, the mood is still rattled as executives confront the chaos injected into the race by both Covid-19 and President Donald Trump’s high-pitched claims of election fraud.
“‘Holy shit’ is what we’re thinking most of the time, like everybody,” one Facebook executive said of the current feeling about the U.S. election inside Facebook. The person spoke anonymously to discuss the mood inside the company.
The current firestorm over the social-media firm’s handling of political ads can be traced back to nearly a year ago, when Facebook vice president for policy and communications Nick Clegg said that the company would not be fact-checking political advertising in the run-up to the U.S. election.
Critics said that Facebook and Zuckerberg were abdicating their responsibility to democracy.
Under attack, Facebook quietly started floating a menu of possibilities for handling political and issue ads among campaigns, advertisers and activists, according to those familiar with the conversations. Most spoke anonymously to discuss private talks.
The options ranged from what one Democratic political strategist involved in the conversations called the “do nothing” approach, the policy articulated by Clegg, to the “nuclear” option, a full-and-forever ban on all political advertising on Facebook.
But field testing revealed that none of those options would satisfy critics on both the left and the right.
Democrats and liberal activists argued that the problem wasn’t political ads, it was Facebook’s refusal to police them. Republicans and conservative activists argued that given the Trump campaign’s long-running reliance on Facebook advertising, this was just Facebook’s attempt to silence him.
Meanwhile, Facebook’s social media rivals took action. Twitter announced in late October that it was, indeed, going nuclear: pulling the plug altogether on political and issue ads on its platform. Google said in November that it would limit the way political ads can be targeted — a method some argued had been used to manipulate certain demographic groups in 2016.
With no perfect path forward in sight, Facebook announced in January that while it was making some changes, improving its public archive of political ads to allow for more transparency and letting Facebook users opt out of political ads, it was keeping its hands off the ads themselves.
Then came Covid-19.
Zuckerberg, say those in and around the company, became consumed with the coronavirus. They say he was quick to realize that the virus would upend just about everything, crediting in part early reports Facebook was receiving from its global network of offices about the pandemic’s severity. Earlier than much of corporate America, Zuckerberg shut down Facebook’s offices in the U.S. and beyond.
And the company began trying to tackle misinformation about the disease, such as using artificial intelligence to detect ads that broke the site’s rules against misleading health-related content.
Those close to Facebook say that the coronavirus outbreak was eye-opening for the company — in terms of realizing both that it could battle back against bad information and how fragile the United States was heading into the 2020 election.
The pandemic was a catalyzing experience for Facebook’s critics, too, who took away from it that Facebook could take action against harmful or misleading content when it wanted to.
Rashad Robinson, president of the civil-rights advocacy group Color of Change, said the episode made him think, “If they could mobilize that quickly after Covid, why are we here after 2016?”
That new thinking helped push Color of Change and others to organize an advertiser boycott of Facebook a few months later.
Meanwhile, as spring turned to summer, Zuckerberg also became increasingly alarmed by the inflaming of political and racial tensions in the United States, according to those close to the situation.
That included by Trump himself, as he sowed doubt about the legitimacy of the mail-in ballots many are expected to use to cast votes amid the pandemic and amplified the furor on the right over Black Lives Matter protests. Zuckerberg told employees he personally felt “disgust” over a Trump post that included the line “When the looting starts, the shooting starts” during protests over policing in Minneapolis in May.
Facebook struggled with how to treat some of Trump’s more controversial posts because it wasn’t in the habit of deploying the same sort of punitive labels Twitter was using, said the same Facebook executive who discussed the company’s thinking about the election. So the company weighed only whether to remove the posts or leave them up, rather than pursuing a middle road.
Facebook eventually rolled out similar labels. It placed its first flag calling into question a Trump post about voting in early September, three months after Twitter first did so.
Around the same time, Facebook started to feel it had to act on ads as well: the country had become so tense and the election so fraught that its hands-off approach was no longer tenable.
It took some steps to block ads that broke the site’s rules — for example pulling down a Trump campaign ad decrying “far-left groups” that included a symbol once used by Nazis — but the moves served to draw attention to its generally hands-off approach to political ads.
And with Election Day bearing down, and after months of outside consultations and internal debate, Zuckerberg landed on a final decision.
The new ad-pause rule was seen inside Facebook as a middle path, what one source close to the situation calls the “best of both worlds” approach: not so aggressive as to turn advertisers off Facebook, but enough to address the concerns of some critics.
Facebook spokesperson Andy Stone said the move is an attempt to strike a sensible balance, robbing bad actors of the chance to mess with the election while still allowing “campaigns to make their closing arguments and run aggressive get out the vote efforts.”
Stone called this week’s pledge to reject ads that prematurely claim victory a “further clarification” of Facebook’s Sept. 3 announcements about the handling of election advertising.
One real benefit of the new ad pause: the middle path didn’t completely undercut Zuckerberg’s own rationale, laid out in a high-profile Georgetown University speech last year, for why the company had in the past taken a hands-off approach.
There was nothing wrong with Zuckerberg’s idea that politicians should be free to speak, as long as there was the chance for others to vet that speech and opponents to fire back, the thinking went. It was just that there wouldn’t be enough time for that to play out in the final days before the vote.
Facebook argues that campaigns should have plenty of time to adapt to the change. Some digital strategists say that they and others are already preparing to place limited runs of ads before the Oct. 27 cutoff as a sort of stockpile and will choose which to throw dollars behind as election events unfold.
Some civil-rights advocates who have argued for years that Facebook has not done enough to thwart those who’d use the platform to stir up racial and ethnic conflict say they wonder, too, why it took Facebook until now to act.
“The fact that we’ve been fighting for these types of changes for years and then [to see movement] in early September before the election just shows how unwilling the organization is to do what’s right,” Color of Change’s Robinson said. He characterized Facebook’s moves in early September as “half steps” forward.
And in the end, Facebook’s shift on political ads has received a mixed reaction in the politics world.
Eric Wilson, a Republican digital strategist who led Sen. Marco Rubio’s digital efforts in his own 2016 White House run, called it “a colossally terrible idea,” saying it would interfere with efforts to turn out voters during election week.
Meanwhile, Henry Fernandez, a senior fellow at the left-leaning Center for American Progress and a Facebook critic, said that the election-week restrictions fundamentally misunderstand the 2020 election and the role being played by mail-in ballots and early voting. More than 40 states, and the District of Columbia, allow early voting.
By the time Facebook’s ban on new political advertising kicks in on Oct. 27, said Fernandez, “Millions and millions of people will have already voted.”