Mark Zuckerberg Speaks, But Is He Listening?

Facebook's CEO addresses the Cambridge Analytica mess, but avoids the bigger questions.

After five days of silence, Mark Zuckerberg undertook an apology tour today, posting to Facebook and speaking to a few media outlets (including WIRED) about the Cambridge Analytica scandal that has embroiled his company. He acknowledged to WIRED that trusting Cambridge Analytica was "one of the biggest mistakes we made." He referred to transparency as "an area where we need to do a lot better."

Still, he implied that he and his company were doing as well as anyone could expect. After all, as he told WIRED, "there's no way that sitting in a dorm room in 2004 you're going to solve everything upfront." And, of course, he pointed out the "good news" that "big actions that we needed to take to prevent this from happening today, we took three or four years ago."

Zuckerberg's performance repeated the familiar beats of a traditional Facebook mea-kinda-culpa. It's similar to the tone struck by executive Andrew Bosworth when he tweeted that, well actually, Cambridge Analytica's purloining of Facebook users' data for use in political subterfuge is "unequivocally not a data breach" because "no systems were infiltrated." It also smacked of Zuckerberg's regrettable statement in November 2016 that it was "a pretty crazy idea" to think that fake news had an impact on the election since, if you looked at the data, it represented a small percentage of all the communications posted to Facebook. (Zuckerberg has since walked back that statement.)

None of this is to dispute the facts as Zuckerberg presents them. Facebook really did shut down the program that allowed researchers to hoover up data on 50 million Facebook users, most of whom never consented to it—so, technically, a non-breach like Cambridge Analytica could not happen today. But that is missing the forest for the user IDs. Yes, Facebook needs to do more to protect our data, but it also needs to reckon with the function it has come to perform in our communications infrastructure—providing advertisers and politicians with powerful tools to manipulate its users' opinions and moods without sufficient transparency.

The classic non-apology relies on the passive voice: "mistakes were made." To his credit, Zuckerberg used the active voice in his Facebook post when he announced that "we also made mistakes." But in his interview with WIRED, he added his own chestnut to the agency-dodging canon: "The world is changing quickly and social norms are changing quickly." Zuckerberg offers this as a reason why Facebook sometimes falls short—who could be expected to adapt flawlessly to such constant change? But in doing so, he neglects the fact that Facebook itself is the source of much of that change. The debates around fake news and hate speech did not simply befall Facebook; we are having them in part because of Facebook. Zuckerberg often paints his company in this light, as fundamentally reflecting its users' and society's behavior rather than shaping it. Fixing fake news is a "hard problem," a framing that ignores Facebook's role in creating the problem in the first place. The social norms around privacy are "just something that's evolved over time," a stance that elides his company's interest in nudging that evolution along.

Zuckerberg is treating the Cambridge Analytica scandal similarly. The company was a bad actor, an aberration that Facebook had to deal with, and so Facebook has responded with policy solutions. That's reasonable, but the bigger question is whether Facebook bears some responsibility for creating the environment that allowed bad actors like Cambridge Analytica to flourish. As critic Neil Postman presciently wrote in 1992, "we are surrounded by the wondrous effects of machines and are encouraged to ignore the ideas embedded in them." Facebook's users are beginning to pay attention and challenge those embedded ideas. Is more connection necessarily a good thing? Is it still a good deal to trade our data for more effective digital products? Can the value of a piece of communication be quantified in likes, shares, and comments?

These are questions that Zuckerberg has not asked, and maybe cannot ask. But the rest of us can—if it's not too late. Postman was not optimistic. "Once a technology is admitted," he wrote, "it plays out its hand; it does what it is designed to do. Our task is to understand what that design is—that is to say, when we admit a new technology to the culture, we must do so with our eyes wide open."