Mark Zuckerberg used nearly 6,000 words to describe the future of Facebook Thursday, but you could sum it up in two: global domination.
Sure, Facebook’s CEO appears more “woke” than ever. He meditates on substantive issues like inclusivity, the eradication of disease, responsible artificial intelligence and the future of media.
And yet. In the simplest terms, his manifesto is about how the social network will continue to be a relevant online product as more of the world becomes connected. It explores how Facebook can become a key part of global “infrastructure,” to borrow a word Zuckerberg uses literally 24 times, that will make it an indispensable part of daily life for people across the planet.
Let’s be very clear about one thing: Facebook is not medicine. It is not a job that puts money in your pocket or a roof over your head. Nor is it the phone that connects you to your mom several states away, or the plane that takes you to her. It is an online platform where posts from estranged friends and family members are interrupted every so often by ads for “3 free soups.”
Facebook exists to grow and to make money. It treats expansion as a merit unto itself, as if there is some inherent quality to people being on Facebook that betters society.
Consider how Zuckerberg grapples in his manifesto with the idea of disturbing content.
“The guiding principles are that the Community Standards should reflect the cultural norms of our community, that each person should see as little objectionable content as possible, and each person should be able to share what they want while being told they cannot share something as little as possible,” he writes.
There’s a leap there—that someone seeing “objectionable content” is in effect a “bad” thing that should be avoided at all costs. You might think Zuckerberg is referring to extremely disturbing content, like child pornography or videos of suicide, content that no one would argue should be on Facebook — but he is not. Rather, his framing calls to mind a report from November suggesting Facebook would be open to news censorship to break into the Chinese marketplace.
“Even within a given culture, we have different opinions on what we want to see and what is objectionable,” he writes. “I may be okay with more politically charged speech but not want to see anything sexually suggestive, while you may be okay with nudity but not want to see offensive speech.”
Zuckerberg doesn’t grapple in the manifesto with the idea that things that are disturbing could be important to see, perhaps precisely because they’re “objectionable.”
Furthermore, his idea for solving this “problem” should raise eyebrows:
The approach is to combine creating a large-scale democratic process to determine standards with AI to help enforce them.
The idea is to give everyone in the community options for how they would like to set the content policy for themselves. Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them. Of course you will always be free to update your personal settings anytime.
Let’s put this another way: In Zuckerberg’s idealized, and likely upcoming, version of Facebook, the default option for what is “appropriate” in your News Feed will be determined by groupthink that is specific to your area. The manifesto isn’t overly specific, of course: Regions could be a town, city, country, continent or national park for all we know. The devil will be in the details of how this is rolled out.
But you can see the trouble already: Even as Zuckerberg concedes in his note that Facebook has a “filter bubble” problem, he outlines a system that delivers content according to a moral standard set by a majority of people. Godspeed if you find yourself in a minority of people interested in “politically charged speech” about abortion in Forsyth County, Georgia. Check those News Feed settings, folks!
This definitely isn’t going to pop anyone’s Facebook bubble.
It’s the exact type of unprincipled thinking that has ruined Facebook in the past. Rather than take a meaningful stance in favor of the free spread of information, Zuckerberg, as ever before, walks a middle course that serves Facebook’s aims—to be a happy place for all people, thus ensuring its user base can grow without provoking the ire of tyrants or censors. Individuals are not served by this thinking; they’re limited by it, because by default, they won’t engage with news or content that unsettles.
And we get it: Facebook is a business, it can do whatever it wants, and of course its major incentive is to grow and be all things to all people. The concern comes when Zuckerberg intertwines these motives with something ideological, because Facebook has frequently been a threatening force in the world.
Remember when it allowed hoaxes and propaganda to spread uninhibited in the lead-up to the election of Donald Trump? When the company tried and failed to become a dominant internet service provider in India? When it removed a line from this very manifesto suggesting it could use AI to monitor private communications and profile people? Or when it allowed advertisers to discriminate on the basis of race?
And how does Zuckerberg presume to know which approach will work best for everyone on this planet when 71 percent of his company’s senior leadership is white and 73 percent male?
His solution is to steer clear of politics himself and design technology solutions that make the hard choices for his company. Yet again Zuckerberg is deluding himself by asserting that refusing to fully own a position means he isn’t taking one.
“In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us,” the CEO writes.
Or, as he put it a bit more specifically to Recode’s Kara Swisher: “Our approach is to try to get community to do it and I would rather that it come from community rather than us.”
That’s nice in a sense—the manifesto also includes a rather heart-swelling passage about Zuckerberg wanting Facebook to better empower administrators of the network’s groups, thereby creating “meaningful” interactions even outside of cyberspace—but this is just a remix of the same old song.
Just as Facebook has refused to take responsibility as a media company when things go wrong with the editorial content it serves, Facebook will be able to shrug it off when its “social infrastructure” is used for prejudice or violence. Don’t forget that this is the same company that, as recently as October, couldn’t stop its new “Marketplace” feature from being overrun with illegal weapons, drugs and wildlife.
All this to say: It’s nice that one of the most important companies on this entire planet has a CEO who’s apparently done a little bit of soul-searching as the world cascades into hellfire, but Facebook has failed to earn our trust as consumers of its product. The problem is that it doesn’t need it. Facebook will continue to grow and morph and harvest our data, and so many of us are a little too over-invested in the social network to log off or demand something better.
There’s no question that Facebook has already changed the world, perhaps irrevocably. It’s the product that conditioned us to share photographs, videos and “status updates” from our personal lives online without hesitation. It has used the mass data created by its 1.86 billion users for astounding projects. The ability for AI to recognize and describe elements of photographs to the blind is a striking example, but Facebook’s automated “Trending” news feature, which has been tweaked to better understand how we all consume media, is also substantial.
We’ll no doubt continue to see amazing things as Facebook and its technology mature. But don’t be shocked if (when) Zuckerberg’s 6,000-word idealism coalesces into something a bit less pretty.