Mark Zuckerberg knew his keynote speech at F8 this year would not be like any other. His previous appearances at Facebook’s annual developers’ conference were all about the new products and technology Facebook was announcing that day, and the vision he would share for future triumphs.
But in the wake of Cambridge Analytica, fake news, Russian election-tampering, incendiary hate speech and did we mention Cambridge Analytica, Facebook has dished out serial apologies and embarked on a steady march of product adjustments and transparency initiatives, a course that is nowhere near completion. Zuckerberg understood that at this F8, he could not give short shrift to the near-existential crisis his company is undergoing. But he also didn’t want to ignore the main function of F8—refilling the pipeline of new products and visions.
“The hardest decision this year hasn’t actually been investing so much in safety and security,” he says. “I mean, that was obvious—there was no choice to not do that. The real question is how do we also find a path to move forward on all the other things that our community expects from us.”
Zuckerberg is telling me this as we meet offsite on the eve of F8, during his last-minute preparations for the event. We talked for almost an hour, discussing his keynote, some of the new products he’s announcing, his feelings about his ten-hour congressional testimony, the question of whether the company censors conservative speech, the need to make Facebook more proactive in policing content, and why it will take three years to do it.
But first we talked about how he was going to thread a tiny needle on stage: rebuilding trust while also fulfilling the expectations of fans who want cool stuff and developers whose businesses depend on Facebook’s continued evolution. “That’s going to be what this whole conference is about,” he says. “On the one hand the responsibilities around keeping people safe—the election integrity, fake news, data privacy and all those issues are just really key. And on the other hand, we also have a responsibility to our community to keep building the experiences that people expect from us. Part of the challenge of where we are is making sure that we take both seriously. F8 is going to be a balance of those two points.”
The mea culpas come first. After the initial revelation of the Cambridge Analytica episode, which exposed the data of 87 million users, there was a five-day period when Zuckerberg and his operating partner Sheryl Sandberg were harder to find than the Golden State Killer, a mistake that Zuckerberg acknowledges. “We were initially very slow to respond,” he says. “We were trying to understand all of the internal details around what happened. And I think I got this calculation wrong where I should have just said something sooner even though I didn’t have all the details. Since we dug in and learned all of it, I think we’re doing the right thing. It’s just that we should’ve done it sooner.”
In his keynote the plan is to acknowledge Facebook’s problems and to introduce yet more features to address them. But he’s also getting past the apologies. The dilemma he faces at F8 is a paradigm for his larger problem at Facebook. “The question isn’t, do we feel bad. Of course we feel bad. But what we owe the world is, ‘Here’s what we’re going to do to make sure that doesn’t happen [again].'”
Zuckerberg will then introduce one welcome new trust feature: the ability for users to wipe clean information that Facebook has gathered about them from their activities off Facebook, like web browsing. Zuckerberg compares this to cleaning the cookies out of one’s browser, a form of digital hygiene that he occasionally practices himself. Apparently, this improvement was an indirect product of Zuckerberg’s ten hours in the congressional hot seat last month. Zuckerberg tells me he had anticipated that the legislators’ questions would largely focus on Cambridge Analytica and maybe the Russians, but instead they fired a broad range of questions at him, many of them involving the deep weeds of Facebook’s operations.
“I figured other product questions that came up, I’d be able to answer, because I built our product,” he says. That was overly optimistic. “One of my takeaways was that I actually felt like I didn’t understand all the details [on things like] how we were using external data on our ad system, and I wasn’t OK with that,” he says. “On the plane ride back, I scheduled a meeting. I was like, ‘I’m going to sit down with this team and learn exactly all this stuff that I didn’t know.’ ” The result of that remedial education was an option for users to cut that information loose.
Zuckerberg has other takeaways from his hours on the Hill. “There were more questions about bias than I had expected. I think that that reflects a real concern that a lot of people have about the tech companies, which are predominantly based in Silicon Valley and Seattle and these extremely liberal places. That depth of concern that there might be some political bias really struck me. I really want Facebook to be a platform for all ideas.” Watch for some ideas on that.
When I ask how he judged his performance in DC, he corrects me. “I didn’t view it as a performance,” he says. “I think the point is to try to get people the information they need to do their jobs.” I mention that a number of the senators and representatives seemed less interested in hearing information than they were in the dulcet tones of their own voices. But Zuckerberg (savvy fellow) didn’t take that bait. “Thinking about the ratio of how many people raised serious questions to people who just wanted to make a point, I came away feeling heartened about our democracy,” he says.
Well, okay. I move on to developers, who are supposed to come to F8 shivering with excitement and leave with the fervor of empowered believers. Aren’t they going to be worried about the restrictions put on them as Facebook tightens control of information after Cambridge Analytica? Certainly they weren’t happy when Facebook suspended app reviews in March, essentially freezing their new products.
“I think there is concern, and it’s clear that our priorities are making sure that people’s data is secure,” he says. “The reality is the vast majority of developers have good intent and are building good things. So I think if you’re a good developer, it’s annoying that app reviews got stopped, but you’re not really worried long term about the direction of the platform.”
App reviews apparently will resume after F8. And Zuckerberg does think that the F8 announcements—including some involving Messenger, Instagram, and Oculus—will thrill developers.
Speaking of announcements, Zuckerberg brings up the most surprising one he plans to make—a new Facebook service called Dating. This Tinder-esque development allows users to create separate profiles to pursue romantic connections, with Facebook acting as an algorithmic yenta. In any other year, this type of service would make sense. But considering that the company is facing its biggest crisis ever because of its handling of personal data, doesn’t it seem a little risky to be adding a new data set with some of the most personal information ever?
At first, Zuckerberg answers the question by explaining that the new feature builds on the fact that people have always used Facebook for dating, and by noting the product’s various protections. (Facebook, for instance, will not use that information in targeting ads.) So I change the subject and move on to other questions. But a couple of minutes later, he returns to the Dating discussion, clearly disturbed at my implication. “Obviously, you’re asking this question,” he says. “But do you think that this is a bad time to be talking about this?”
I tell him that I get that Facebook is taking steps to isolate the information from one’s regular profile. But isn’t he worried that people might look at Dating and say, “Wow, Facebook wants to know this about me, too?”
Zuckerberg straightens up in his chair—this issue is dead in the center of the devilish tensions between trust-building and maintaining momentum. “This is the threading of the needle we talked about up front,” he says. Of course, Facebook has to keep introducing new products, announcing stuff on Marketplace, introducing a new augmented reality camera platform, shipping the standalone VR headset Oculus Go. But he doesn’t want people to think that because the company is moving forward, it’s any less serious about winning back its users’ trust. “Because my top priority is making sure that we convey that we are taking these things seriously,” he says.
Before we wrap up I ask him—has this crisis made Facebook different?
His answer is both no—the mission is the same—but, in a way, yes. “I really think the biggest shift is around being more proactive, around finding and preventing abuse. The big learning is that we need to take a broader view of our responsibility. It’s not just about building tools and assuming that humans are on balance good, and so therefore the tools will be used for on balance good. It is no longer enough to give people tools to say what they want and then just let our community flag them and try to respond after the fact. We need to take a more active role in making sure that the tools aren’t misused.”
Zuckerberg recognizes the difficulty of remaking his systems to proactively catch harmful content. “I think this is about a three-year transition to really build up the teams, because you can’t just hire thirty thousand people overnight to go do something,” he says. “You have to make sure that they’re executing well and bring in the leadership and train them. And building up AI tools—that’s not something that you could just snap your fingers on either.”
But Zuckerberg says the three-year journey is already well under way. “The good news is that we started it pretty early last year. So we’re about a year in. I think by the end of this year we’ll have turned the corner on a lot of it. We’ll never be fully done. But I do really think that this represents a pretty major shift in the overall business model and operating model of the company.”
Meanwhile, judge for yourself whether Facebook is changing by what happens in Mark Zuckerberg’s most unusual F8 keynote yet.